Physics and Computing explained in plain English as my Theory Of Everything

Project: The Total Library

I know that I know nothing and from that I start to know one thing. I often rebuild large parts of my thoughts that way. Usually it leaves me with what I thought I already knew, but this time a puzzle is fitting together...

Its technical name is "Fourier Boltzmann Bell Bayesian Turing Complete Sparse Dimensional Manifold". I'll explain each of those words.

The Fourier Transform, in its simplest form, takes the dot-product (the overlapping parts of different angles) of an arbitrary function with sine waves. Example: each of the -pi to pi positions in a unit sine wave is a vector from zero to its height, ranging -1 to 1. By multiplying an arbitrary function by that height, and summing to measure the alignment, we get the amplitude of that sine wave as a measurement of the other function/wave. A complex number is normally used for the phase alignment, so the magnitude stays constant at 1 while it still rotates, like tracing a spring.
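A minimal sketch of that dot-product view of the Fourier Transform (the function name `fourier_coefficient` and the sample count are mine, for illustration): multiply the samples by a complex sinusoid, sum, and the magnitude of the result is the amplitude of that wave inside the samples.

```python
import math

def fourier_coefficient(samples, k):
    """Dot-product of a sampled function with the k-th complex sinusoid.

    The magnitude of the result is the amplitude of that wave inside the
    samples; the complex angle is its phase alignment.
    """
    n = len(samples)
    re = sum(s * math.cos(-2 * math.pi * k * i / n) for i, s in enumerate(samples))
    im = sum(s * math.sin(-2 * math.pi * k * i / n) for i, s in enumerate(samples))
    return complex(re, im) / n

# A pure sine wave at frequency 3: all of its energy lands on k=3.
wave = [math.sin(2 * math.pi * 3 * i / 64) for i in range(64)]
print(abs(fourier_coefficient(wave, 3)))  # 0.5 (half in k=3, half in k=-3)
print(abs(fourier_coefficient(wave, 5)))  # near 0.0
```

Measuring a wave against itself gives its full amplitude; measuring it against an unaligned wave gives nearly nothing.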

A Boltzmann Machine is a simple statistical algorithm, and such systems can compute anything because NAND is Turing Complete. NAND is a 3-bit logic constraint or operator: the output bit is 1 unless both input bits are 1. A Boltzmann Machine moves on the path of least resistance based on weights between pairs of variables. You pay a higher cost to have both variables of a pair true. That cost is the weight between those variables, plus the cost of other changes to the network on and adjacent to those variables if you change them.
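A tiny sketch of both halves of that paragraph (the names `nand`, `satisfying`, and `energy` are mine): NAND viewed as a 3-bit constraint, and a Boltzmann-style pairwise cost where a positive weight is a penalty paid only when both variables are true.

```python
import itertools

def nand(a, b):
    """Output is 1 unless both inputs are 1."""
    return 0 if (a == 1 and b == 1) else 1

# NAND as a 3-bit constraint: exactly 4 of the 8 (a, b, out) triples satisfy it.
satisfying = [(a, b, out)
              for a, b, out in itertools.product([0, 1], repeat=3)
              if out == nand(a, b)]
print(satisfying)  # [(0, 0, 1), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

# Boltzmann-style cost between pairs of variables: a positive weight is a
# penalty paid only when both variables in the pair are true.
def energy(state, weights):
    return sum(w * state[i] * state[j] for (i, j), w in weights.items())

weights = {(0, 1): 2.0}  # it costs 2.0 to have variables 0 and 1 both true
print(energy((1, 1), weights))  # 2.0
print(energy((1, 0), weights))  # 0.0
```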

But I can't use a Boltzmann Machine directly because NAND is not symmetric: its inputs and output are not interchangeable, while Boltzmann weights are. We see this paradox in the Toffoli Gate of quantum computing, which is their attempt to make it symmetric (reversible), but the Toffoli Gate is a proof, not practical quantum hardware.

What I've been working toward for all these years, although I didn't know it the whole time, is a Turing Complete flowing wave spread across the Internet that we can play and calculate with. I'm going to turn the Internet into one big wave-based computer. It will be able to calculate anything a normal computer can calculate, but only in a statistical way, so it will be compatible with grids of quantum computers and with simulations of waves we can play with on the screen. It will not be compatible with existing programs, because they break when a 1 changes to a 0, and that is the nature of Statistical Turing Machines.

So instead, I started with creating rotations in high dimensions and getting the waves on the screen that we can play with. That's working, but it's still lacking a clear wave behavior of NAND. The potential is in there.

Then I remembered my BayesianCortex software, which uses Bayes Rule (conditional probability) to do interactive graphics effects you paint with the mouse while they change.

Bayes Rule can implement NAND as a 3-variable statistical logic constraint, but Bayes Rule is still linear. It works in flat space where probability ranges 0 to 1. It doesn't work for hyperspheres... until you do this...
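Here is one way Bayes Rule can implement NAND as a 3-variable statistical constraint, as a sketch (the joint distribution and the function `p_out_given` are my illustration, not code from BayesianCortex): put probability mass only on the triples that satisfy NAND, then conditioning on the two inputs forces the output.

```python
import itertools

def nand(a, b):
    return 0 if a == 1 and b == 1 else 1

# Joint distribution concentrated on the 4 (a, b, out) triples satisfying NAND.
joint = {}
for a, b in itertools.product([0, 1], repeat=2):
    for out in (0, 1):
        joint[(a, b, out)] = 0.25 if out == nand(a, b) else 0.0

def p_out_given(a, b):
    """Bayes Rule: P(out=1 | a, b) = P(a, b, out=1) / P(a, b)."""
    num = joint[(a, b, 1)]
    den = joint[(a, b, 0)] + joint[(a, b, 1)]
    return num / den

print(p_out_given(1, 1))  # 0.0, the NAND of 1 and 1
print(p_out_given(0, 1))  # 1.0
```

With a softer joint distribution the same constraint gives probabilities strictly between 0 and 1, which is the statistical version of the logic gate.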

If you multiply the heights of 3 bell curves, one along each of the x, y, and z dimensions, you get a 3d bell curve that is also a bell curve at every angle and position through its center. It looks like a sphere whose density is the height of a bell curve aligned at its center. This works in any number of dimensions.
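A quick check of that claim (the names `bell` and `bell3d` are mine): the product of 3 axis-aligned bell curves depends only on distance from the center, so two points at the same radius but different angles have the same density.

```python
import math

def bell(t):
    """Height of a standard 1d bell curve at t."""
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def bell3d(x, y, z):
    """Product of 3 bell curves, one along each dimension."""
    return bell(x) * bell(y) * bell(z)

# Two points at the same distance from the origin but different angles
# have the same density: the 3d bell curve looks the same from every angle.
p = bell3d(1.0, 2.0, 2.0)  # radius 3
q = bell3d(3.0, 0.0, 0.0)  # also radius 3
print(abs(p - q) < 1e-12)  # True
```

This works because exp(-x^2/2) * exp(-y^2/2) * exp(-z^2/2) = exp(-(x^2 + y^2 + z^2)/2), and x^2 + y^2 + z^2 is the squared radius.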

Therefore, by using the integral of a bell curve as the distance function, a bell curve becomes a linear space, and its positions can be used as bayesian variables in range 0 to 1.
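That integral of a bell curve is its cumulative distribution, which Python's standard library can compute through the `erf` function. A sketch (the name `bell_cdf` is mine) of how it maps any position on the bell curve into the 0-to-1 range a bayesian variable needs:

```python
import math

def bell_cdf(t):
    """Integral of the standard bell curve from -infinity to t, range 0 to 1."""
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

# A position along the bell curve becomes a flat 0-to-1 coordinate,
# the range that Bayes Rule expects for a probability-like variable.
print(bell_cdf(0.0))   # 0.5, the center maps to the middle
print(bell_cdf(-3.0))  # near 0
print(bell_cdf(3.0))   # near 1
```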

An Affine Transform is simply a rotation, tilt, or movement. It's done by defining a new vector for each dimension; a point's dot-product with each such vector (its position in that dimension) transforms it into the new coordinate space.
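A minimal sketch of that (the function name `affine` is mine): one dot-product per new axis, then an offset for the movement.

```python
def affine(point, axes, offset):
    """Map a point into a new coordinate space: one dot-product per axis
    vector, then a translation. Rotation, tilt, and movement are all this."""
    return tuple(
        sum(p * a for p, a in zip(point, axis)) + off
        for axis, off in zip(axes, offset)
    )

# 90-degree counterclockwise rotation in 2d: the axis vectors are the
# rows of the rotation matrix.
rotated = affine((1.0, 0.0), axes=[(0.0, -1.0), (1.0, 0.0)], offset=(0.0, 0.0))
print(rotated)  # (0.0, 1.0)
```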

1d and 2d bell curves are everywhere on the 3d bell curve. They can be defined by affine transforms, mostly the rotation I'm interested in, while letting the tilt and position come from bigger systems built from these simple data-structures and calculations.

Start at a center point. Choose a random vector and put a bell curve there. Repeat this until you have many overlapping bell curves.
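The steps above can be sketched like this (the names `random_direction` and `curves` are mine, and the bell-curve record is a placeholder for whatever the real data-structure holds). A uniform random direction comes from normalizing independent bell-curve samples, which works in any number of dimensions:

```python
import math
import random

def random_direction(dims=3):
    """Uniform random unit vector: normalize independent bell-curve samples."""
    v = [random.gauss(0, 1) for _ in range(dims)]
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

# Start at a center point. Choose a random vector and put a bell curve
# along it. Repeat until many overlapping bell curves cover the space.
center = [0.0, 0.0, 0.0]
curves = [{"direction": random_direction(), "mean": 0.0, "stddev": 1.0}
          for _ in range(20)]
print(len(curves))  # 20
```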

Problem: Some of those bell curves are closer to each other than others, so they would distort the data if given equal weight.

Solution: Their "amount of dimension" is the convergence of their positions defined in terms of each other, each weighting the others as dimensions using the others' current "amount of dimension". The exact calculation is in my older software, which I'll copy/paste when needed. It runs in approximately log number of cycles to calculate the amount of dimension of each such vector, and the total "amount of dimensions" will sum to slightly less than 3, since they are embedded in a 3d space, a 3d bell curve.

So now we have a sparse bell curve which can be more or less dense than a standard bayesian node, which is like a flat 3d space for 3 bayesian variables each in range 0 to 1. Instead, we can have many or few bell curves at angles and use Bayes Rule to calculate the conditional probability of specific points on that 3d bell curve surface.

Not all parts of the bell curve are equally dense, but overall it must still have average 0 and standard deviation 1 from every angle. That means we need a network of these sparse dimensional variable density bell curves, so we can have them unbalanced in a variety of ways but still summing (or multiplying? or whatever way these data-structures are used like numbers) to an n-dimensional bell curve of the expected density everywhere. Density must also be on a bell curve, so it's just another dimension.

I found that by using circles (a complex number held at radius 1 while its angle varies) as node data, numbers on bell curves as edge data (between nodes), and various functions of the dot-product between those angles to adjust the edge numbers, a fourier-like effect happens. They blob together in various ways depending on what function you use. The nodes are updated as weighted sums of the angles of their adjacent nodes. The whole thing flows smoothly and in interesting patterns when you create alignments between the angles using the mouse. This is a template to put in the Fourier Boltzmann Bell Bayesian Turing Complete Sparse Dimensional Manifold as applying to those edges, or possibly some other combination of the node data. Or maybe the node data should be a hypersphere instead of just a circle, maybe of 4 dimensions (kept at radius 1) as in the Poincare Sphere, a 3d surface of a 4d hypersphere, since you need 3 dimensions of some kind for NAND.
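A small sketch of that node-and-edge dynamic, under my own simplifying assumptions (all-to-all edges started equal, a fixed update rate, angles started within a half-turn of each other so the blobbing is visible; none of this is copied from the actual software):

```python
import cmath
import random

random.seed(1)
n = 8
# Node data: a circle, a complex number held at radius 1 while its angle varies.
nodes = [cmath.exp(1j * random.uniform(-1.0, 1.0)) for _ in range(n)]
# Edge data: a number per pair of nodes, here all started equal.
weights = {(i, j): 0.1 for i in range(n) for j in range(n) if i != j}

def step(nodes, weights, rate=0.05):
    new_nodes = []
    for i, z in enumerate(nodes):
        # Each node is updated as a weighted sum of its adjacent nodes'
        # angles...
        pull = sum(weights[(i, j)] * nodes[j] for j in range(n) if j != i)
        moved = z + rate * pull
        new_nodes.append(moved / abs(moved))  # ...then held back at radius 1.
    # Each edge adjusts by a function of the dot-product between its two
    # angles: aligned nodes strengthen their edge, opposed nodes weaken it.
    for (i, j) in weights:
        align = (new_nodes[i] * new_nodes[j].conjugate()).real
        weights[(i, j)] += rate * align
    return new_nodes

for _ in range(200):
    nodes = step(nodes, weights)

# Magnitude of the mean node: 1.0 means all angles have blobbed together.
print(abs(sum(nodes) / n))
```

With a different edge-adjustment function the angles blob into different patterns instead of one cluster, which is the fourier-like effect described above.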

There are various ways to do it, but I'm certain there is no way around the use of a Fourier Boltzmann Bell Bayesian Turing Complete Sparse Dimensional Manifold, since it is the only kind of math which can turn the Internet into a high dimensional wave and still be fully capable of general computing as patterns in that wave.

While most parts of the world are organized in a way that makes bigger systems and more complexity appear advanced, this is more like E = m c^2 / squareRoot(1 - velocity^2 / c^2), in the way that small accurate things are most valuable.

It is also relevant to that equation because you can do algebra on it to get (mass/energy)^2 + (distance/time)^2 = 1 (in units where c = 1), the equation of a circle with those 2 dimensions. We also see hyperspheres and bell curves all over the quantum equations.
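The algebra, written out (starting from the relativistic energy formula above, squaring both sides and dividing through by E^2):

```latex
E = \frac{m c^2}{\sqrt{1 - v^2/c^2}}
\;\Rightarrow\;
E^2 \left(1 - \frac{v^2}{c^2}\right) = m^2 c^4
\;\Rightarrow\;
\left(\frac{m c^2}{E}\right)^2 + \left(\frac{v}{c}\right)^2 = 1
```

In units where c = 1, this reads (m/E)^2 + v^2 = 1: mass over energy on one axis, distance over time on the other, tracing a unit circle.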

A Fourier Boltzmann Bell Bayesian Turing Complete Sparse Dimensional Manifold is simple. Our minds are complex and look for complexity where it isn't.

A bell curve is simply what happens when you repeatedly move left 1 or right 1: do that n times, and you get a bell curve with squareRoot(n) standard deviation. Similarly, the alignments of these sparse dimensional variable density bell curve bayesian nodes will create more bell curves simply by aligning with each other in various directions, and then we use those same data structures to model that statistical property. It's a self-referencing system that way.
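That squareRoot(n) claim is easy to check by simulation (the function name `random_walk` and the sample counts are mine):

```python
import random
import statistics

random.seed(1)

def random_walk(n):
    """Move left 1 or right 1, n times, starting from 0."""
    return sum(random.choice((-1, 1)) for _ in range(n))

# Many such walks pile up into a bell curve whose standard deviation
# is near squareRoot(n).
n = 400  # squareRoot(400) = 20
walks = [random_walk(n) for _ in range(5000)]
print(statistics.pstdev(walks))  # near 20
```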

I don't know the details of how I'll put this all together, but I do know all these components and that there is a simple way they fit together. These areas of math form a continuous manifold, and I'll find their intersection by exploring what I already see to be a continuous path between each 2 of them. But I can't use them in pairs. I need a clique of all these areas of math, the absolute intersection of them in one data-structure and one calculation. I'm not talking about a system made of many different kinds of math.

I'm talking about one math calculation done in many combinations. That one math calculation is where this all leads. Want to help me find it?

My latest research is usually uploaded to

http://sourceforge.net/projects/physicsmata

http://sourceforge.net/projects/bayesiancortex

and a few of my other projects at http://sourceforge.net/users/benrayfield

There will be no need to search to find this when I get it working. We will start connecting it to almost everything to organize the world better.