For a long time, theoretical physicists have dreamed of the day when the general theory of relativity and quantum mechanics would be combined to create the Theory of Everything. It is often stated that such a theory would be so simple and concise that the whole thing could be condensed into a simple equation that would fit on a T-shirt.
It was clear to me that classic material reductionism could not provide a path to that laudable goal, so I undertook an investigation to see what could replace it. That investigation spanned almost 4½ years, and it was documented step-by-step in my essay Order, Chaos and the End of Reductionism. This research led me to several dead ends, blind alleys, and self-contradictions. What I ultimately discovered was that Einstein's field equations of the general theory of relativity actually provide an exact solution for the universe as a whole, whereas these laws are recapitulated on smaller scales as approximations for weak-field interactions.
Combining this principle with the principle of maximal entropy led to some surprising conclusions, summarized by a simple equation of state that fits easily on a T-shirt and captures the essence of the Theory of Everything.

© All Rights Reserved




by John Winders

Note to my readers:

You can access and download this essay and my other essays through the Amateur

Scientist Essays website under Direct Downloads at the following URL:

https://sites.google.com/site/amateurscientistessays/

You are free to download and share all of my essays without any restrictions, although it

would be very nice to credit my work when quoting directly from them.

For a long time, theoretical physicists have dreamed of the day when the general theory of relativity

and quantum mechanics would be combined to create the Theory of Everything. It is often stated that

such a theory would be so simple and concise that the whole thing could be condensed into a simple

equation that would fit on a T-shirt.

It was clear to me that classic material reductionism could not provide a path to that laudable goal,

so I undertook an investigation to see what could replace it. That investigation spanned almost 4½

years, and it was documented step-by-step in my essay Order, Chaos and the End of Reductionism,

which you can access from my Amateur Scientist Web Page. This research led me to several dead

ends, blind alleys, and self-contradictions; however, I never deleted or changed any of my mistakes

in order to preserve and document the evolution of my thinking along each step of the way.

The essay presented here is just a condensation of that much longer essay. The equation on the

cover would easily fit on a T-shirt and I think it really does capture the essence of the Theory of

Everything.

dEU / dtU = 2 π k T c³ tU / ħ G

You'll note that the four fundamental constants of nature are here: Boltzmann's constant, k, the

speed of light, c, the reduced Planck constant, ħ, and Newton's gravitational constant, G. Astute readers

who are familiar with the Bekenstein-Hawking theory will notice a piece of the Bekenstein equation

is found in it. I believe this is the equation of state of the universe that describes expansion in terms

of increasing mass-energy, EU, with respect to a universal time parameter, tU. You might object,

“Wait a minute. I thought mass-energy is conserved.” Well, mass-energy is conserved over short

time intervals where time-displacement symmetry is valid. What I discovered, however, is that there is

no time-displacement symmetry over cosmological time scales, and there's a good reason for that.

The other thing I discovered is that the gravitational constant, G, is not really constant after all, and

there's a good reason for that too. So come with me on a short journey to find out what those

reasons are. Because this essay is fairly brief, you may have a bit of trouble following the

derivation of my theory; therefore, I urge you to go over to the website and review Order, Chaos and

the End of Reductionism, especially Appendices W, X and Y, which go into much more detail.

In its most basic form, reductionism is an approach to understanding the nature of complex things

by reducing them to the interactions of their parts, or to simpler or more fundamental things.

Engineers and physicists use reductionism to explain reality. I came to the conclusion that there are

three different classes of interactions in nature:

1. Deterministic, linear, reversible, certain

2. Deterministic, non-linear, irreversible, predictable in the forward direction

3. Non-deterministic, irreversible, unpredictable (probabilistic)

Reductionism is concerned mainly with the first class of interactions; however, these interactions apply only to

the most trivial of situations, such as two bodies orbiting around each other and simple harmonic

motion. The vast majority of interactions in nature are in the second class, commonly referred to as

chaotic interactions. Ironically, it seems that the highly-complex order we observe in the universe

emerges essentially from chaos. Take for example weather patterns, like a hurricane, born from

chaos and yet having an identity and a quasi-stable structure. The Great Red Spot on Jupiter is a

permanent hurricane that has persisted for at least 187 years.

Since reductionism is only capable of examining the simplest and most trivial examples of order, I

chose the title Order, Chaos and the End of Reductionism to reflect the fact that order and chaos


begin where reductionism ends. Another interpretation is that reductionism is at an end as a viable

scientific philosophy going forward. As long as you examine nature through linear, deterministic

and reversible interactions, you are only seeing reality through a tiny keyhole. How sad it is that a

majority of scientists still consider reductionism the preferred default method of doing science.

String theory is touted as the whiz-bang cutting edge of theoretical physics, but I perceive old-

fashioned reductionism at its core.

The third class of interactions are stochastic, random, and completely unpredictable. These

interactions lie at the heart of quantum mechanics. Oddly enough, some extremely brilliant

theoretical physicists (including Albert Einstein, until his death) have denied the very existence of

stochastic interactions, believing that some underlying local hidden variables are involved instead.

I confess to being guilty of thinking that chaotic interactions might be used as substitutes for stochastic

processes, but I was definitely wrong. Experimental violations of Bell's inequality put that idea to

rest, and in the face of such incontrovertible evidence as this I'm amazed there are theoretical

physicists who still cling to determinism.

The core of my thesis is this: Entropy equals information. Entropy has been completely

misunderstood by many leading scientists, who try to label it as “missing information” or “hidden

information” or even “negative information.” This misconception stems from the fact that order

and entropy are indeed opposites. People tend to prefer order over disorder, so they equate entropy

to something very negative and undesirable. On the other hand, people love information – the more

the better. After all, we live in the “information age” with the Internet offering us cool things like

Wikipedia, Facebook, Twitter, and Instagram. So how can something “good” like information

possibly be the same as something so obviously “bad” like entropy? First off, you need to know how

information is defined, which unfortunately most physicists do not. Claude Shannon figured it out

in the 1940s, and it has everything to do with probability and uncertainty. Suppose there are N

possible outcomes of some interaction, each with a certain probability, pr. Shannon concluded that

the amount of information, S, contained in that set of outcomes is as follows.

S = – ∑ pr log2 pr , r = 1, 2, 3, … , N

If an outcome is certain (i.e., if one of the probabilities in the set equals one), then there is zero

information in that set. Suppose I call someone on the phone and inform them it's Saturday. How

much information did I relate to that person if he already knew it was Saturday? The answer is

zero, because there was no uncertainty on his part about the day of the week. But now suppose that

person just woke up from a coma and had no idea what day it was, so all days are equally probable

to him. For N equally-probable outcomes, the above equation reduces to S = log2 N. Stating that

it's Saturday provides log2 7 bits of information to that person. If you notice, S = log2 N is identical

to Boltzmann's definition of thermodynamic entropy, except Boltzmann used the natural logarithm

instead of the base-2 logarithm and he stuck a constant, kB, in front of it: S = kB ln N.
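The Saturday example can be checked directly. Below is a minimal Python sketch of Shannon's formula (my own illustration; the function name `shannon_entropy` is not from the essay):

```python
import math

def shannon_entropy(probs):
    """Shannon information in bits: S = -sum(p * log2(p)); p = 0 terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A certain outcome (one probability equal to 1) carries zero information.
certain = shannon_entropy([1.0])            # 0.0 bits

# Seven equally probable days reduce to S = log2(N) = log2(7) ≈ 2.81 bits.
saturday = shannon_entropy([1/7] * 7)

# Boltzmann's form uses the natural logarithm and the constant kB: S = kB ln N.
k_B = 1.380649e-23                          # J/K
S_thermo = k_B * math.log(7)
```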

Once you come to grips with the fact that entropy = information, then it's apparent that information

cannot exist without uncertainty. So which class of interactions in nature involves uncertainty?

Well, the first class clearly doesn't because all outcomes can be uniquely solved in both forward and

reverse directions. A single planet revolving around a star will stay in that orbit forever unless it is

perturbed by some outside force. You can determine the exact location of that planet billions of

years into the future or billions of years into the past using a simple formula that describes an ellipse

with a time parameter, t.

Can information come from chaos? It might seem that chaos could provide randomness and

uncertainty, but this is not the case. Chaotic processes are still deterministic because there is a

unique relationship between the current state and subsequent states. Thus, every repetition of a

chaotic process will produce exactly the same sequence of events. This is not true in the reverse


direction due to one-to-many relationships between the current state and previous states, rendering

chaotic processes irreversible. Thus, irreversibility alone does not generate true uncertainty, at least

going forward. Chaotic interactions can rearrange bits, and even make them unrecognizable, but

they cannot create new bits. Only the third class of stochastic interactions can introduce the

uncertainty that information requires.
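The distinction between the second and third classes can be demonstrated with the logistic map, a standard example of deterministic chaos (my illustration, not the essay's):

```python
import math

def logistic(x, r=4.0):
    """One step of the logistic map x -> r*x*(1-x)."""
    return r * x * (1.0 - x)

def logistic_orbit(x0, r=4.0, steps=20):
    """Iterate the map; chaotic but fully deterministic for r = 4."""
    xs = [x0]
    for _ in range(steps):
        xs.append(logistic(xs[-1], r))
    return xs

# Determinism: repeating the process from the same state reproduces the sequence bit for bit.
assert logistic_orbit(0.2) == logistic_orbit(0.2)

# Irreversibility: the map is many-to-one in reverse. For r = 4 the state y = 0.64
# has two distinct predecessors, x = 0.2 and x = 0.8, so the past is not unique.
y = 0.64
pre_lo = (1 - math.sqrt(1 - y)) / 2   # 0.2
pre_hi = (1 + math.sqrt(1 - y)) / 2   # 0.8
assert abs(logistic(pre_lo) - y) < 1e-12 and abs(logistic(pre_hi) - y) < 1e-12
```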

Chaos produces fractal patterns, and these patterns are widespread in nature. So at one point in this

investigation, I thought the universe itself might be a colossal fractal. Fractal patterns have

extremely high – or one might even say infinite – levels of complexity that can be generated by very

simple non-linear functions. Fractals have the properties of scale-invariance and self-similarity,

where large-scale features are repeated over and over on smaller scales. Those features are not

necessarily repeated exactly, however. The Mandelbrot set is one of the most widely-known

fractals, having a prominent circular feature that appears over and over again on smaller scales. On

the smallest scales, this circular feature gradually gives way to different features. You can try this

yourself using the interactive Mandelbrot Viewer.
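The escape-time test behind viewers like the one mentioned takes only a few lines; here is a sketch (the iteration cap of 100 is an arbitrary choice of mine):

```python
def in_mandelbrot(c, max_iter=100):
    """Escape-time test: c is (tentatively) in the set if z -> z*z + c stays within |z| <= 2."""
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return False           # orbit escaped; c lies outside the set
    return True                    # never escaped within the iteration budget

# The prominent circular feature mentioned above is the period-2 disk around c = -1.
assert in_mandelbrot(-1 + 0j)      # inside: orbit cycles 0, -1, 0, -1, ...
assert not in_mandelbrot(1 + 1j)   # outside: orbit escapes almost immediately
```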

I used to think the general relativity field equations could only be applied to small-scale systems,

but I was very wrong. What I discovered is that Einstein actually had stumbled on a set of

equations that provides an exact description for the entire universe, and that pattern is only repeated

as an approximation for smaller scales involving weak-field interactions. In other words, the

universe is a fractal having an exact overall solution given by the Schwarzschild equation, but this

equation is not necessarily an exact solution for smaller scales.

A recurrent theme throughout my investigation is that the most important – and perhaps the

only – law of nature is the statement that entropy of isolated systems cannot decrease. This is the

famous second law of thermodynamics, which really should be the zeroth law of the universe

because it underlies causality itself. Since entropy and information are equivalent, this law means

that information cannot be destroyed. Some scientists try to trivialize this law by saying that there's

just a tendency for entropy to increase because it's more likely to increase than to decrease. They

say given enough time (and patience) you'll see an isolated system inevitably repeat some previous

lower-entropy state. I state unequivocally that this is not just unlikely, but it's impossible because it

would be tantamount to destroying information and causing “unhappening” of previous events.

As a corollary to the second law of thermodynamics, I came up with what I call the “post-

reductionist universal law” stated as follows:

“Every change maximizes the total degrees of freedom of the universe.”

The phrase “total degrees of freedom” sounds kind of nice, which is why I chose it. But the

logarithm of total degrees of freedom equals total entropy, so what this really means is that every

change maximizes the entropy of the universe. Not only can entropy never decrease, it must always

increase to the maximum extent possible. Taking this idea to the limit, I postulated we live in a

moment of maximally-increasing entropy, which addresses – and maybe solves – the mystery of

time. What clocks are actually measuring are increases in entropy reflected as a reduction in

curvature of the universe unfolding around them, as explained in the following paragraphs.

Solving the Schwarzschild equation yields R = 2 E G / c² describing a sphere of radius R, where E is

the mass-energy of the system, G is the gravitational parameter, and c is the speed of light.

Maximizing the total degrees of freedom (entropy) of the universe means the universe is in a

permanent state of maximal entropy, so the only way to further increase entropy is through

expansion. The maximum rate of expansion can be attained if R increases at the speed of light by

introducing the concept of universal time, tU, where R = c tU.
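Plugging numbers into R = 2 E G / c² with R = c tU gives a feel for the scales involved. A rough check (my own inputs: tU ≈ 13.8 billion years and today's measured G) lands near the often-quoted ~10⁵³ kg for the mass of the observable universe:

```python
c = 2.998e8               # speed of light, m/s
G = 6.674e-11             # gravitational parameter today, m^3 kg^-1 s^-2
t_U = 13.8e9 * 3.156e7    # universal time in seconds (13.8 billion years)

R = c * t_U               # radius of curvature, R = c * t_U
E = R * c**4 / (2 * G)    # rearranged from R = 2 E G / c^2
M = E / c**2              # equivalent mass, kg

print(f"R ≈ {R:.2e} m, M ≈ {M:.1e} kg")   # M comes out on the order of 1e53 kg
```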

The idea that there could be such a thing as universal time is anathema to physicists. After all, we


are told space and time are relative, not absolute. However, tU isn't the Newtonian notion of

simultaneity across space. Instead, tU marks the progress of universal expansion, and while R has a

dimension of length, it should be thought of as an expanding radius of curvature around a temporal

center, with a surface surrounding the center at a distance R = c tU marking the present moment. No

clock can run ahead of tU because no clock can run ahead of the present moment. A free-falling

body will keep up with tU, except when a force acts on the body causing acceleration and its proper

time to lag behind tU.

Looking at objects at some distance in any direction, we observe them as they were when the universe had a

radius R' < c tU. Those objects will fall behind us in time and will appear to recede from us in space,

resulting in the cosmological red shift. Objects at a distance R = c tU will be receding at the speed

of light and will be at the edge of our horizon. Substituting c tU for R in the Schwarzschild equation

results in E G = ½ c³ tU. This means either the total mass-energy of the universe or the gravitational

parameter must increase over time, or both. As it turns out, the gravitational parameter decreases

over time, being proportional to tU⁻¹, so E must increase in proportion to tU².

If the universe is in a state of maximal entropy, we can apply the Bekenstein equation to it. By

combining the Bekenstein and Szilárd equations, we get the following equation of state.

dEU = (k T c³ / 4 ħ G) dA , where A is the expanding surface area of uniform curvature, 4 π R².

dA = 8 π R dR

dA / dtU = 8 π R dR / dtU = 8 π c R

∴ dEU / dtU = 2 π k T c⁴ R / ħ G = 2 π k T c³ tU / ħ G
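The area-rate step in this chain can be sanity-checked numerically. A quick sketch (the values of t and the step h are arbitrary choices of mine): with R = c tU, the area A = 4 π R² grows at exactly 8 π c R.

```python
import math

c = 2.998e8                     # speed of light, m/s

def area(t):
    """Surface area of uniform curvature: A = 4*pi*R^2 with R = c*t."""
    return 4 * math.pi * (c * t) ** 2

t = 4.3e17                      # an arbitrary universal time, s
h = 1e12                        # finite-difference step, s
dA_numeric = (area(t + h) - area(t - h)) / (2 * h)
dA_formula = 8 * math.pi * c * (c * t)          # dA/dt = 8*pi*c*R

# Central differences are exact for a quadratic, so agreement is limited only by rounding.
assert abs(dA_numeric / dA_formula - 1) < 1e-8
```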

The above equation of state combines the four fundamental constants k, c, ħ, and G (although G is

really a variable, being inversely proportional to tU). The temperature of the universe, T, is

inversely proportional to tU also, so the ratio T / G equals a constant that can be evaluated using the

current temperature of the universe and the measured value of the gravitational constant.

According to the Bekenstein equation, the total entropy expressed in bits is proportional to the area

of uniform curvature, 4 π R² , divided by 4 ln 2 times the Planck area, G ħ / c³. Since the Planck

area is proportional to tU⁻¹, total entropy is proportional to tU³. There must have been a time in the

past when the total entropy of the universe was equal to one bit, which I would guess is the

minimum amount of information that has meaning: the information associated with a coin toss. The

value of tU corresponding to a single bit of entropy would be my idea of “The Beginning.”

Information from the past is encoded into the present moment. Linear and chaotic interactions

transform those bits according to the laws of determinism without any loss of information, obeying

the second law of thermodynamics, a.k.a. the zeroth law of the universe. Meanwhile, stochastic

interactions are laying down new bits of information at an increasing rate across an ever-expanding

surface of uniform curvature corresponding to the present moment.

One of the raging controversies in the scientific community is the “vacuum catastrophe,” referring

to the huge discrepancy between the mass-energy vacuum density inferred from cosmological

arguments and the apparent flatness of space, and the mass-energy vacuum density of virtual particle pairs

predicted by quantum electrodynamics (QED). Using the model presented in this essay, the value of

density, ρ, is found by dividing dEU / dtU by dVU / dtU = 4π R² dR / dtU, with the

assumption that dR / dtU is at the maximum rate, c. The vacuum density is ρ = k T / 2 ħ G tU, and it

decreases over time. Based on the known values of the parameters used in the formula, the vacuum

density is currently 980 kg/m³, a surprisingly large value. However, it's not nearly as outlandish as

the QED value for vacuum density of around 10¹⁰⁶ kg/m³. I'll conclude this essay on the following

page with some bullet items that capture the key points of my Theory of Everything.
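The density formula is easy to evaluate. A sketch with my own choice of inputs (today's CMB temperature for T and tU ≈ 13.8 billion years); it lands within an order of magnitude of the 980 kg/m³ quoted above, and the exact figure is sensitive to the inputs chosen:

```python
k    = 1.380649e-23       # Boltzmann constant, J/K
T    = 2.725              # present temperature of the universe (CMB), K
hbar = 1.0546e-34         # reduced Planck constant, J*s
G    = 6.674e-11          # gravitational parameter today, m^3 kg^-1 s^-2
t_U  = 13.8e9 * 3.156e7   # universal time, s

rho = k * T / (2 * hbar * G * t_U)   # vacuum mass density, kg/m^3
print(f"rho ≈ {rho:.0f} kg/m^3")     # comes out on the order of 10^3 kg/m^3
```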


SUMMARY

• There are three classes of interactions in nature: deterministic, linear, reversible; deterministic, chaotic, irreversible; and stochastic, probabilistic, irreversible.

• Entropy is equivalent to information.

• Information requires uncertainty; thus, only stochastic interactions are capable of producing

information.

• Linear and chaotic deterministic interactions preserve and transform information in causal

space according to the “laws of nature.”

• Causal space has one time dimension, requiring three spatial dimensions because they must

match the number of rotational degrees of freedom.

• A free-falling observer is incapable of measuring any spatial curvature of three-dimensional

space because of rotational symmetry.

• Due to the asymmetry of time, there is a radius of temporal curvature, R, expressed in units

of length, centered on the beginning of time.

• Order emerges from chaotic interactions as fractal-like patterns that repeat on different

spatial and temporal scales.

• The universe is a fractal with the properties of scale-invariance and self-similarity.

• Due to scale-invariance, solutions to the general relativity field equations are exact solutions

for the entire universe and approximate solutions for its subparts.

• The Schwarzschild formula R = 2 E G / c² is an exact formula for a closed system, e.g. the

universe.

• The universe is in a permanent state of maximal entropy and so the Bekenstein equation can

be applied to it. Thus, the universe must expand in order to accommodate more information.

• There exists a universal time parameter, tU, which marks the expansion of the universe.

• The universe expands maximally at a rate dR / dtU that is bounded by the speed of light, c.

• Since tU corresponds to the present moment, proper time of an observer cannot get ahead of

tU. The geodesic paths of free-falling bodies maximize proper time up to the limit of tU.

• Time, having a radius of curvature equal to R, does not have time translation symmetry over

cosmological time periods. Thus, the law of conservation of mass-energy does not apply to

the universe as a whole.

• The quantity of mass-energy in the universe increases in proportion to tU².

• The quantity of entropy-information in the universe increases in proportion to tU³.

• There is an equivalency between mass-energy and entropy-information (“it equals bit”).

• Since mass-energy and entropy-information increase at different rates, they are linked by the

Szilárd equation with a decreasing temperature, T, proportional to tU⁻¹.

• The vacuum density of mass-energy is ρ = k T / 2 ħ G tU, with a present value of 980 kg/m³.


Appendix A – A Picture Is Worth 1,000 Words

The schematic diagram below explains the cosmology of the Theory of Everything.

The universe expands from left to right beginning at a time t' = 0, which can be interpreted as the

“big bang” or whatever initial state is appropriate. The magenta curves indicate “now” surfaces of

uniform curvature having radii of curvature, R', centered on t' = 0. The present radius of curvature

at Here and Now is R = c tU where dR / dtU = c. Looking in any direction, x, y, or z out into space,

we see the universe as it was younger and when R' of the “now” surface of uniform curvature was

smaller. Because the universe was younger at these locations, their times appear to be lagging

behind tU from the vantage point of Here and Now, creating time dilation proportional to the

distances √(x² + y² + z²). This is the reason for the cosmological red shift.

Temperatures, T', shown above the red thermometers, are inversely proportional to time t', so true

temperatures are proportional to distances √(x² + y² + z²). However, because of the cosmological

red shift, all temperatures appear the same from our vantage point of Here and Now. The so-called

cosmic microwave background (CMB) is actually the composite of all temperatures of every era

after those temperatures have been red-shifted in proportion to distances. The CMB isn't just one

red-shifted temperature from a particular era, but all temperatures after they've been red-shifted.

The radius of curvature always expands at the speed of light at a surface of uniform curvature:

dR' / dt' ≡ c. However, the cosmological time dilation slows distant expansion velocities from the

vantage point of Here and Now. This makes objects from previous eras appear to recede away from

Here and Now, as shown by the green arrows. Thus, the true origin of the Hubble constant is the

apparent slowing down of distant recessional velocities from cosmological time dilation.


Appendix B – The Point of Inflection

According to the standard cosmological model (SCM), the universe is currently on the precipice of

something big. Since the big bang, gravity has been slowing the rate of universal expansion – until

now. Dark energy – also known as the cosmological constant – has caused the rate of expansion to

start picking up recently. In the future, the expansion will continue to accelerate, causing a number

of cosmologists to fear that space itself will be torn apart. They call this future event “the big rip.”

In calculus, the point of a curve where the slope stops decreasing and starts increasing is called a

point of inflection. The size of the universe is now at a point of inflection.

On the other hand, according to my Theory of Everything (TOE), the rate of expansion as expressed

by the radius of temporal curvature is, was and always will be equal to the speed of light. The

graphs below compare the SCM and TOE models. “Time Since the Beginning” and “Radius” are

on normalized scales, where 1.0 = tU ≈ 13+ billion years and 1.0 = R = c tU.

The blue curve corresponds to the SCM and the red line corresponds to my TOE. The point of

inflection of the blue curve is occurring at t' = 1.0. The SCM rationale for a rate of expansion that

decreases and then increases is as follows. Whereas gravity puts the brakes on expansion, it's

becoming a non-factor as the universe expands because no additional matter is being added to the

universe. On the other hand, since the density of dark energy is constant, there is more outward

“pressure” to expand as the universe gets larger, and this is just now overwhelming the tendency for

the universe to collapse under gravity.

Doesn't it seem a bit odd that a once-in-a-universe event such as this would wait until just after the

human species had a chance to evolve into intelligent creatures and began to contemplate the

universe? Or could it be that astronomers really don't know how big or how old the universe

actually is? After all, if distances to the “standard candles” used by astronomers to judge distances

are slightly off, then the calculated previous rates of expansion will be off too. The most reliable

“standard candles” are in our immediate vicinity, so if the rate of expansion right around us seems

to be X, then it just might be that the rate of expansion always has been X.

My TOE looks at things differently than SCM: Mass-energy is being added to the universe and it is

proportional to tU2, while entropy (which drives expansion) is proportional to tU3. Temperature,

being proportional to tU – 1, balances the two, keeping the rate of expansion constant. It's not that the

radius of curvature is forced to expand at the speed of light; instead, the speed of light is forced to

equal the rate of expansion in order to prevent travel into the past, which would violate causality.


Appendix C – Some Observations and Experiments

One of the most controversial claims of my TOE is that Newton’s so-called constant, G, is a

variable that decreases over cosmological time, and I think there is some evidence that suggests G is

not constant. Astronomers estimate distances based on apparent luminosities of stars, which

decrease by the square of their true distances from Earth, compared to their intrinsic luminosities.

The intrinsic luminosity increases with the star’s diameter and its surface gravity. If the value of G

were greater in the past, the intrinsic luminosity of a distant star would be greater than that of an identical

nearby star. In other words, if G were greater in the past, distant stars would actually be farther

away from Earth than their apparent luminosities would indicate. The following chart shows

measurements of the cosmological red shift based on the apparent luminosities of distant stars. The

blue circles depict stars with distances computed using “standard candles” based on the assumption

that G is a constant. The red circles depict the same stars assuming greater values of G and higher

intrinsic luminosities in the past, which shifts their true distances toward the right.

Based on standard cosmology, the rate of expansion of the universe is slowing down. This is shown

by the blue line, which bends upward with increasing distance. My TOE claims that the

cosmological red shift must be linear over distance since all observers recede from the Beginning at

the same speed (c) in their own frames of reference. Based on their greater true intrinsic

luminosities, distant stars are shifted from the curved blue line toward the straight red line.1 This

doesn’t prove a thing; however, if G does decrease over cosmological time, then it would explain

why the observed rate of expansion seems to have slowed down even if the true rate were constant.

Astronomers claim that observations of distant supernovae prove that G has remained constant for

billions of years. But this assumes an average star in the past began with the same amount of

material as an average star forming today. But if G were significantly greater in the past, couldn’t

smaller stars then appear much the same as medium-sized Sun-like stars today?

Observations of Mars clearly indicate remnants of rivers, lakes, and possibly seas on the Martian

surface. Although today’s temperatures at the Martian equator can reach 20°C at high noon, the

mean surface temperature is −60°C. These chilly temperatures are partly due to the fact that Mars

doesn’t have much of an atmosphere, but it’s doubtful that a thicker atmosphere would make much

1 This means that distances to far away galaxies are greater than indicated from astronomical observations.


of a difference considering that the “greenhouse effect” from Earth’s relatively thick atmosphere

adds only about 30°C to the Earth’s surface temperature. Current models of stellar evolution

indicate that the Sun was significantly cooler billions of years ago than it is today, but those models

are highly dependent on the surface gravity of a star. So how was it possible for water to flow on

Mars in the distant past? One plausible explanation is that if G were larger in the past than it is

today, the Sun could have been significantly brighter in the past than the current models indicate.

Gravity is an exceptionally weak “force”2 on small scales, so measuring G requires very precise

instruments. The Cavendish torsion balance is the standard instrument for doing these

measurements. A schematic diagram of the torsion balance is shown below.

Two spherical orange weights, each with a mass m, are attached to a rod of length L suspended from

a thin fiber (shown in green). Two movable blue weights, each with a mass M, are placed in close

proximity to the orange weights so that the M-to-m centers are maintained at a fixed distance, a,

apart. The blue and orange spheres attract each other by gravity and produce a torque, τ, in the fiber

(indicated by the green arrows). G is calculated by measuring the torque in the fiber and applying

Newton’s law of gravitation in reverse, as follows.

G = a²τ / (L M m)
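As a sanity check on this relation, here is a minimal Python sketch. The masses and distances are purely illustrative, order-of-magnitude values in the spirit of Cavendish-style apparatus, not figures from any actual experiment; the script generates the torque Newton’s law predicts and then inverts it to recover G.

```python
# Recover G from an idealized Cavendish torsion-balance "measurement".
# All apparatus values below are illustrative assumptions, not real data.
L = 1.8      # rod length between the two small masses, m
m = 0.73     # each small (orange) mass, kg
M = 158.0    # each large (blue) mass, kg
a = 0.23     # center-to-center separation of each M-m pair, m

G_true = 6.674e-11          # m^3 kg^-1 s^-2
F = G_true * M * m / a**2   # attraction on each small mass (Newton's law)
tau = F * L                 # two forces, each on a lever arm of L/2

# Invert the torque relation: G = a^2 * tau / (L * M * m)
G_measured = a**2 * tau / (L * M * m)
print(G_measured)  # recovers 6.674e-11
```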

When measuring any physical parameter in a laboratory, random errors produced by all components

of the instrument must be taken into account. These errors are combined statistically to produce a

range of possible true values around the measured result. For example, if the measured value of X is 1.0 and the combined measurement error is 0.1, we expect the true value to satisfy 0.9 < X < 1.1. When multiple measurements are made using the same instrument, the measurements may vary, but they should fall within the range of statistical error. This isn’t the case with the Cavendish torsion balance; over the years, as measurements have become more precise, researchers have noted anomalous measurements that routinely fall well outside those ranges.
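To make this concrete, here is a short Python sketch of the standard consistency check. The two measurement values and their uncertainties are made up for illustration (roughly the right order of magnitude for modern G experiments), not data from any specific lab.

```python
import math

# Two hypothetical G measurements (m^3 kg^-1 s^-2) with assumed
# 1-sigma uncertainties; values are illustrative only.
g1, sigma1 = 6.6720e-11, 2.0e-15
g2, sigma2 = 6.6750e-11, 2.0e-15

# Independent random errors combine in quadrature.
combined = math.sqrt(sigma1**2 + sigma2**2)

# How many combined standard errors apart are the two results?
n_sigma = abs(g1 - g2) / combined
print(round(n_sigma, 1))  # ~10.6: far outside each other's error range
```

Two results this far apart, if the quoted uncertainties were complete, should essentially never happen by chance; that is the sense in which the torsion-balance measurements are anomalous.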

In a 2015 paper (Click this link for the paper), J. D. Anderson of the Jet Propulsion Laboratory and

three other authors reported that measured values of G seem to be correlated with variations in the length of day (LOD) over a 5.9-year periodic cycle. These results are depicted on the highly schematic chart on the next page.3 The blue dots represent laboratory measurements of G and the

vertical bars indicate instrument error ranges. The solid blue curve represents the LOD over one of

several 5.9-year cycles.

2 I hesitate to use the words force and gravity in the same sentence because the General Theory of Relativity clearly shows that gravity isn’t a force; it’s a distortion of space-time that defines the trajectories of free-falling bodies. The “force” appears when the body is prevented from following that trajectory. The chair I’m sitting in alters my trajectory through space-time by pushing on my butt. That push is often mislabeled as “gravitational force.”

3 The actual chart is copyrighted so it can’t be shown here, but you can see it by following the link to the paper.


As depicted on the chart, most of the blue dots fall outside the instrument-error ranges of other blue

dots and follow the length-of-day curve. Anderson and the other authors are quick to point out that

they don’t believe a change in the Earth’s rotation itself affects the G measurements, nor do they

believe G actually changes. Thus, they are at a loss to explain why measured values of G are

correlated with LOD, although they suspect there is a common driver. Some have suggested

periodic changes in the Earth’s core affect LOD, which is reasonable. But why would changes in

the Earth’s core affect laboratory measurements of G at the surface?

Well, here’s my hypothesis: the drivers behind the variations in measurements of G and LOD are

true variations in G itself. Gravitational mass and inertial mass are two sides of the same coin. If

the true value of G increases, the Earth’s inertia will also increase. Then in order to conserve

angular momentum, the Earth’s speed of rotation must slow down, thus increasing the LOD. At any

rate, Cavendish torsion balance data strongly suggest that G is not a constant but a variable.4

The fact that the period of G correlates with the period of LOD is interesting, although it might just

be a coincidence. But matching amplitudes would really support the hypothesis of a causal

connection between them. From the Anderson et al. paper, the measurements of G over the 5.9-year LOD cycle vary between 6.672×10⁻¹¹ and 6.675×10⁻¹¹ m³ kg⁻¹ s⁻². Unfortunately, different labs made these measurements using different instruments with different statistical errors. At any rate, the measurements deviated ±0.0225% from the mean value.

I looked up the LOD over a 5.9-year period, and the variations deviated about 1.5 ms from the mean sidereal day of 86,164 sec. These LOD variations are only 0.0000017%, compared to the hypothetical 0.0225% variation of G. The angular momentum of a uniform sphere having a mass M and radius R is given by the formula L = ⅖ M R² ω. If angular momentum is conserved, the speed of rotation, ω, is inversely proportional to M, so with G (along with the inertial mass M) truly varying by 0.0225%, the LOD should vary by 19 seconds over a 5.9-year cycle. This is 13,000 times larger than the observed LOD variations, and such a huge discrepancy in LOD observations would seem to show my TOE hypothesis is all wet; however, some relevant facts were left out.
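The arithmetic in this paragraph can be replayed in a few lines of Python, a sketch of the same back-of-envelope estimate under the stated assumptions (a rigid uniform sphere with R held fixed):

```python
# If angular momentum L = (2/5) * M * R**2 * omega is conserved with R
# fixed, omega scales as 1/M, so the day length scales directly with M.
sidereal_day = 86164.0        # mean sidereal day, s
dG_over_G = 0.0225 / 100      # hypothesized fractional change in G (and M)

predicted_dLOD = sidereal_day * dG_over_G
print(predicted_dLOD)         # ~19.4 s predicted LOD swing

observed_dLOD = 1.5e-3        # observed LOD variation, s
print(observed_dLOD / sidereal_day * 100)  # ~0.0000017% of a day
print(predicted_dLOD / observed_dLOD)      # ~13,000x discrepancy
```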

The angular momentum of a sphere depends on R², and the Earth is an oblate spheroid, with its radius bulging out at the equator. Since the Earth is not perfectly rigid, the equatorial bulge would move slightly inward as G increases and the rotation slows down. Such a slight decrease in R might almost – but not quite – cancel the slowing of the rotation due to the increase in inertial mass, M. In other words, a more detailed model of an elastic Earth might show that the observational data actually align with the hypothesis that G changes over time and causes LOD to change in sync with G. This warrants further research, in my opinion.

4 I still don’t have a clue why G would vary over a 5.9-year cycle, although it should be correlated with the local

ambient cosmological temperature. Unfortunately, I don’t know of a way to measure that temperature directly, but

I’d love to know if the LODs of other planets in our solar system also follow the same 5.9-year cycle.


Appendix D – What’s Below the Ontological Basement?

The British physicist Paul Davies famously stated that information occupies the ontological

basement of reality. From my own studies over the past decade, I gradually arrived at the same

conclusion: Entropy (information) underlies everything we call physical “reality” or “universe”

(see two of my other essays, Order, Chaos and the End of Reductionism and Relativity in Easy

Steps). This implies a hierarchical chain, as follows.

Information → Time → Space → Energy → Matter

The last four items in the chain (time, space, energy and matter) form what physicists define as the

material universe in its entirety, but there are a growing number of them, like Davies, who are

beginning to suspect that these are just the four upper floors of an edifice with a basement below

them consisting of information. In fact, some have concluded that the key to a unified theory of

quantum physics and gravity lies in the language of information.

The question that remains is whether information is ontologically complete in and of itself. I’ve

struggled with that question, and lately have come to the conclusion that it must depend on

something else which stands completely apart from time, space, energy and matter. My reasoning is

as follows.

Most of us have an intuitive notion that information must act on a physical object. For example,

flash drives store data using electrical charges applied to billions of tiny transistors. Information

(according to Claude Shannon’s definition) requires uncertainty, so if each of those transistors could

exist only in one possible state, there would be zero uncertainty about the flash drive’s actual

configuration, rendering it useless because it couldn’t store any information. On the other hand, a

functional 8 GB flash drive can store 6.8719×10¹⁰ bits of information, meaning the drive may be configured in any of 2^68,719,000,000 unique configurations, an insane number of possibilities with a huge uncertainty. The flash drive’s information storage function depends entirely on uncertainty.
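The bit count is easy to verify with a quick Python check; the configuration count 2**bits is far too large to print, so the sketch only estimates how many decimal digits it would have:

```python
import math

# "8 GB" in binary units: 8 * 2**30 bytes, 8 bits per byte.
bits = 8 * 2**30 * 8
print(bits)  # 68719476736, i.e. ~6.8719e10 bits

# The drive has 2**bits distinct configurations; its decimal expansion
# alone would run to ~2.07e10 digits, so we only estimate its size.
digits = bits * math.log10(2)
print(digits)
```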

We can envision physical computer hardware coming off the assembly lines without any firmware,

operating system, software, or data. In other words, hardware can exist without containing any

information, but we cannot imagine firmware, an operating system, software, or data existing in

empty space without the physical hardware. Thus, information truly requires a physical substrate to

act upon. The aforementioned Szilárd equation reveals that energy equals information multiplied

by the temperature-dependent proportionality factor, kBT, suggesting that a substrate having

thermodynamic properties5 must exist for the energy ↔ information equivalence to be valid.
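For scale, the Szilárd relation puts a definite price on each bit at a given temperature. A one-line Python check at an assumed room temperature of 300 K:

```python
import math

# Szilard relation: the energy tied to one bit is k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
T = 300.0            # assumed room temperature, K
energy_per_bit = k_B * T * math.log(2)
print(energy_per_bit)  # ~2.87e-21 J per bit
```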

Ontological information relies on a universal substrate I’ll refer to as “Q”.

Q (the universal substrate) → Ontological Information (the physical universe)

The problem is that Q cannot consist of either matter or energy because mass-energy is just

information in a condensed form. This suggests Q must be at least one additional level below

Davies’ ontological basement, distinct from the physical universe itself and existing beyond it. The

metaphor of physical reality represented by a building with information as its ontological basement

is even more appropriate if we represent Q by the soil surrounding and supporting the basement.

The soil can easily exist without the building, but not the other way around. Being outside the

physical universe, Q seems similar to the description of the Deity.6 In any case, whatever Q is, it

cannot be described or accessed as if it were a physical object.

5 In my essay Relativity in Easy Steps, I show how the general relativity field equations can be rearranged to describe

a physical “substance” having energy, entropy and temperature, implying that space-time can be thought of as a

“field” having physical and thermodynamic properties. Unfortunately, space-time cannot serve as the necessary

substrate for information because space and time are both derived from information itself.

6 Except that the conventional western Deity is said to reside “above” the universe instead of “below” it.

