Astronomy Without A Telescope – Dark Denial


A recent cosmological model seeks to get around the sticky issue of dark energy by jury-rigging the Einstein field equation so that the universe naturally expands in an accelerated fashion. In doing so, the model also eliminates the sticky issue of singularities – although this includes eliminating the singularity from which the Big Bang originated. Instead the model proposes that we just live in an eternal universe that kind of oscillates geometrically.

As other commentators have noted, this model hence fails to account for the cosmic microwave background. But hey, apart from that, the model is presented in a very readable paper that tells a good story. I am taking the writer’s word for it that the math works – and even then, as the good Professor Einstein allegedly stated: As far as the laws of mathematics refer to reality, they are not certain, and as far as they are certain, they do not refer to reality.

Like a number of alternate cosmological models, this one also requires the speed of light in a vacuum to vary over the evolution of the universe. It is argued that time is a product of universe expansion – and hence time and distance are mutually derivable – the conversion factor between the two being c – the speed of light. So, an accelerating expansion of the universe is just the result of a change in c – such that a unit of time converts to an increasingly greater distance in space.
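To put that in rough symbols (my paraphrase of the idea, not the paper's own formalism): if a tick of time Δt buys a stretch of space Δd via

Δd = c(t) Δt

then a steadily growing c(t) means the same tick of the clock converts to an ever larger stretch of space – which, to observers like us, looks like accelerating expansion.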

Yes, but…

The speed of light in a vacuum is the closest thing there is to an absolute in general relativity – and is really just a way of saying that electromagnetic and gravitational forces act instantaneously – at least from the frame of reference of a photon (and perhaps a graviton, if such a hypothetical particle exists).

It’s only from subluminal (non-photon) frames of reference that it becomes possible to sit back and observe, indeed even time with a stopwatch, the passage of a photon from point A to point B. Such subluminal frames of reference have only become possible as a consequence of the expansion of the universe, which has left in its wake an intriguingly strange space-time continuum in which we live out our fleetingly brief existences.

As far as a photon is concerned the passage from point A to point B is instantaneous – and it always has been. It was instantaneous around 13.7 billion years ago when the entire universe was much smaller than a breadbox – and it still is now.
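In relativistic terms, this is just the statement that light follows null paths. Along a photon's worldline the spacetime interval vanishes,

ds^2 = c^2 dt^2 - dx^2 = 0

so the proper time elapsed along the path, dτ = ds/c, is zero – whether the trip spans a breadbox or 13.7 billion light years.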

But once you decide that the speed of light is variable, this whole schema unravels. Without an absolute and intrinsic speed for relatively instantaneous information transfer, the actions of fundamental forces must be intimately linked to the particular point of evolution that the universe happens to be at.

For this to work, information about the evolutionary status of the universe must be constantly relayed to all the constituents of the universe – or otherwise those constituents must have their own internal clock that refers to some absolute cosmic time – or those constituents must be influenced by a change in state of an all-pervading luminiferous ether.

In a nutshell, once you start giving up the fundamental constants of general relativity – you really have to give it all up.

The basic Einstein field equation. The left-hand side of the equation describes space-time geometry (of the observable universe, for example) and the right-hand side describes the associated mass-energy responsible for that curvature. If you want to add lambda (which these days we call dark energy) – you add it to the left-hand side components.
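In standard notation, with the constants written out, it reads:

G_{μν} + Λ g_{μν} = (8πG/c^4) T_{μν}

where G_{μν} encodes the space-time curvature, T_{μν} the mass-energy, and Λ is the cosmological constant. Move the Λ term across to the right-hand side and it behaves like a constant vacuum energy – which is the modern 'dark energy' reading of it.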

The cosmological constant, lambda – which these days we call dark energy – was always Einstein’s fudge factor. He introduced it into his nicely balanced field equation to allow the modeling of a static universe – and when it became apparent the universe wasn’t static, he realized it had been a blunder. So, if you don’t like dark energy and you can do the math, this might be a better place to start.

Further reading: Wun-Yi Shu, Cosmological Models with No Big Bang.

26 Replies to “Astronomy Without A Telescope – Dark Denial”

  1. I was under the impression that things like the Oklo natural nuclear reactor pretty much rule out the idea that the natural forces were very much different in the distant past to what they are now.

  2. @ Nexus

    Yes – and I imagine a whole bunch of other scientific observations lead to the same conclusion. Thanks for Oklo – I note from: http://en.wikipedia.org/wiki/Natural_nuclear_fission_reactor
    that “The natural reactor of Oklo has been used to check if the atomic fine-structure constant might have changed over the past 2 billion years… most (but not all) have concluded that nuclear reactions then were much the same as they are today, which implies (the constant) was the same too”.
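    For reference, the constant in question is the fine-structure constant α = e²/4πε₀ħc ≈ 1/137 – since c sits in it, any substantial drift in the speed of light over those 2 billion years (with the other constants held fixed) should have left a mark in the Oklo isotope ratios.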

  3. The 3-sphere geometry implied in this paper is something I focused on; it describes (to me) how the speed of light can be variable. To gain a focus you have to place us (the observers) inside a homeomorphic 3-dimensional manifold. The constant would only be variable to an outside observer, which we are not, so to my way of thinking the constant is always “constant”. The local geometry of space and time is homogeneous to its location in 3-sphere space.

    In this theory, the CMB is often quoted as being poorly explained, yet the Poincaré conjecture (now proven by Grigori Perelman) is often suggested as a solution to the imbalance of WMAP data on larger angular scales – greater than 60° – which cannot be described by current theory.

    http://arxiv.org/abs/astro-ph/0310253

    The problem has been and remains, that this implies that the universe is infinite and there is no need for a big bang.

    The Poincaré dodecahedral space is as valid an outlook for the topology of the universe as the current Picard horn-shaped hyperbolic topology.

    I guess it's time to wait on data from Planck to put more context to this debate.
    Damian

  4. Dang… The equations in the referenced Wun-Yi Shu paper are WAY beyond my simple calculus understanding and remind me of a scene from the epic sci-fi movie from the 1950s called “The Day The Earth Stood Still”. In one scene from that movie the alien, played by Michael Rennie, visits an astrophysicist’s office to introduce himself and, seeing that the good Doctor is not present, modifies some calculations the Doctor has written on a blackboard. The physicist’s secretary sees what he’s done and, after shooing the visitor away, tries to return those calculations to their original form but gives up in exasperation after realizing there was NO WAY to do so…. I feel like the good secretary here somehow?

    BUT, would like to add… that intuition and vision nevertheless remain a factor even if the nomenclature for describing that vision mathematically is clouded or temporarily unclear. During moments of clarity, those calculations are understandable even if only as in a fleeting dream. Gonna have to work on that! Grins and grunts, then puts the chalk down…. ~@; )

  5. Such papers actually help confirm my conjecture: standard cosmology fruitful, non-standard cosmology full of nuts.

    In doing so, the model also eliminates the sticky issue of singularities – although this includes eliminating the singularity from which the Big Bang originated.

    The whole point with GR being an effective theory is that singularities don’t stick. Cf. string theory formulations that remove a bunch of them.

    But the real deal is that a big bang singularity is an untested conjecture. Taking inflation seriously means that there are no such specific singularities, only a local end of inflation. Instead we have semiclassical worldlines emerging out of Planck volumes. (And that is from a simple theorem that even I understand. :-D)

    We know from supernova timing results that the latter are Lorentz invariant, i.e. there are no geometrical singularities on that scale. In string/M theory AFAIU this scale is tentatively identified with the holographic duality that all the variants pivot on, for natural reasons.

    To understand this requires new physics. The only thing we do know is that this is entirely uncoupled from our effective physics, or else holography – or rather the classical bound on entropy (microstate information) – wouldn’t work. It is basically the same phenomenon really, effective microstates vs effective singularities.

    So I’m not worrying about it, except that I do my very best to stay away from a big bang “singularity”. It doesn’t seem likely IMHO.

    The cosmological constant, lambda – which these days we call dark energy – was always Einstein’s fudge factor. He introduced it into his nicely balanced field equation to allow the modeling of a static universe – and when it became apparent the universe wasn’t static, he realized it had been a blunder.

    It is interesting to note that Einstein introduced it on the space-time curvature side because he wanted to balance the geometry. Unfortunately that means a linear term, AFAIU.

    While the natural interpretation today is as a mass-energy associated with the vacuum, as would be expected from field theories. That, and a constant term is natural.

    Not his biggest mistake, which in the end turned out to be the simpler mistake that he thought it was a mistake in the first place. But a funny one.

    Another irony is that the cc helps unbalance the geometry in the standard model. Oh woe!

  6. The Poincaré dodecahedral space is as valid an outlook for the topology of the universe as the Current Picard horn shaped hyperbolic topology.

    Except that the Poincaré conjecture is about topology, not cosmological geometry.

    The local geometry better be homogeneous or our physics wouldn’t be (see GR). But the difference is in the global geometry; naturally enough, standard cosmology builds on that same uniformity (more generally the Copernican principle).

    the imbalance of WMAP data on larger angular scales – greater than 60° – which cannot be described by current theory.

    Um, there is no such anomaly, study the results. From the main WMAP 7 year summary abstract:

    “In this paper we examine potential anomalies and present analyses and assessments of their significance. In most cases we find that claimed anomalies depend on posterior selection of some aspect or subset of the data. …

    We find no significant anomaly with a lack of large angular scale CMB power for the best-fit ΛCDM model.”

    Section 5 makes the explicit claim: “The angular correlation function over the full sky ILC map from Equation (7) is shown in Figure 5. As can be seen, C(θ) lies within the 95% confidence range of the best fit ΛCDM model for all θ, as determined by Monte Carlo simulations. This supports the conclusion that there is no statistically significant lack of large-scale power on the full sky.”

    And then proceeds to explicitly reject all the post-selection model claims that were around at the time.
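    (For reference, the C(θ) being quoted is just the angular power spectrum re-expressed in real space, C(θ) = (1/4π) Σ_l (2l+1) C_l P_l(cos θ), so any claim of “missing large-scale power” is ultimately a statement about the low-l multipoles – and those turn out to be consistent with ΛCDM once the posterior selection is removed.)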

  7. If time traveling actually physically changes Earth’s history, then the big-bang is a 3-D model. Like the traveler who went back into the past and accidentally stepped on a butterfly, which drastically changed Earth – then the big-bang is just a finite 3-D model, and time is a 4-D model, and the 5-D+ models I hope I am not wasting any of my time with here living!

    more generally the Copernican principle

    Speaking of which, it hits me that the big bang singularity breaks that principle, or perhaps a generalized form of it in the face of theories that embed our observable universe (and the rest of the “big bang singularity” volume) in a larger space. It implicitly makes Earth special again on that scene: why would we see a singularity (if the potential phase space is larger)?

    So now that those other theories exist, there is also an explicit (implicit?) reason to stay away from that conjecture.

  9. @ Damian.

    The 3-sphere geometry is AFAIU 🙂 an environment that allows the existence of 4-dimensional hyperspheres and other such hyperobjects – and in this context us poor flatlanders are looked down on by higher dimensional 3-sphere beings who scoff at our naive belief that the speed of light in a vacuum is constant.

    The point of this article is that if you want to go there you can’t just take what’s left of general relativity with you. If the speed of light isn’t constant you have to find another way to account for phenomena that GR also explains – like black holes, atomic clocks running relatively faster in orbit etc.

  10. Even if you have a pretty static eternal oscillating universe, at one point in time there was a start of this universe. It does not change the fact that once there was a singularity, even though it might have been 1000 oscillations ago.

  11. @ Olaf

    Well, that’s a good question for the human mind: which is easier to grasp? A day without a yesterday, or time that has no beginning at all?

    I say: the human mind is not made to imagine either of those things. 😉 Typical for physical relations…..

  12. Grrr, I am not happy! 🙁 This is rubbish. To start, the speed of light is not something which extends cosmologically. The speed of light holds in a local inertial reference frame. It is something which all observers measure in a flat region of spacetime. If space is curved, then this local frame is limited in its spatial extent. You can’t say the invariance of the speed of light extends universally from my frame to a frame on some galaxy with z = 1 or more. In fact such an extension starts to break down for z > 0.3 or more.

    The cosmological constant is not something attached to the Einstein field equation in a willy-nilly manner. It is an integration constant. The Einstein field equation may be derived from the momentum-energy tensor T^{ab}, whose covariant derivative is zero, D_a T^{ab} = 0, and the integration of this gives the field equation with the cosmological constant as the constant of integration.
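    Spelled out: the contracted Bianchi identity gives D_a G^{ab} = 0, and metric compatibility gives D_a g^{ab} = 0, so demanding D_a T^{ab} = 0 only fixes the field equation up to a term proportional to the metric, G^{ab} + Λ g^{ab} = (8πG/c^4) T^{ab} – with Λ entering as a constant of integration rather than something bolted on by hand.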

    Another problem is that the author has G/c^2 in units of velocity, which means c is not in such units. This is a clear error. The whole idea of c = c(t) is simply flawed, for it makes cosmological assumptions about a conversion factor measured in a local frame. The speed of light also reflects the projective structure the Lorentz group is a fibration on. It makes no sense to say there is any measurable consequence of changing this projectivization factor, for the structure is invariant under any such reparameterization of it. Physically this can easily be seen in how the various Planck units will all re-adjust in response to any reassignment of c so as to mask any apparent change.

    This paper is just plain crap.

    LC

  13. Re cyclical universes, there are problems. Actually I think the problems are germane, even if it isn’t mentioned there.

    Seems an everywhere expanding universe is the only sensible long-time solution.

  14. heh, I’m out of my depth here – only so much an artistic mind can comprehend.

    But I will throw a question out there: what do the greater minds here think of the Ricci flow in Perelman’s proof of the Poincaré conjecture, and how it attempts to describe singularities?

    It would be interesting to attempt to explain quantum gravity in terms of a 4-dimensional Euclidean space. Perhaps it might bring some sense to the Calabi–Yau manifolds of string theory. I certainly can’t make any sense out of that.

    Anyway, I like the 3-sphere. It’s neat.

    As for this theory, well, it’s out there; time and observation will tell.
    Doesn’t seem to have gotten a great reception from the brains out here.

    However, I personally have my fingers crossed for the end of singularities, dark matter and dark energy, the fudged cosmological constant, the creationist-like big bang and the nihilism in theoretical cosmology before I die… (one can live in hope)

    Perhaps another hypothetical question.
    What sort of proof would be necessary to knock GR off its current perch?

    I don’t expect that GR is wrong completely, but it is incomplete.

    Perhaps it is theoretical cosmology that is traversing polymorphic space in a 3-sphere? 🙂 hmm.

    Damian

  15. The evolution of a spatial surface is similar to Ricci flows. Fixed points are mapped from event horizons on the Euclidean version, and there are even thermodynamic analogues with the Ricci flows with fixed points.
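    For the record, the flow itself is Hamilton’s equation ∂g_{ij}/∂t = -2R_{ij}, which shrinks the metric where the Ricci curvature is positive; Perelman’s contribution was showing how to carry the flow through the singularities that form (surgery) without losing control of the geometry.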

    Singularities are likely replaced by quantum topological numbers. They are physically quantum fluctuations, or are restricted from infinity by quantum uncertainty. Sorry, but big bang and inflation look pretty real, and they pass the basic tests.

    LC

  16. I actually like this! Anything that makes Dark Energy “go away” must have merit.

    I like the idea of a variable speed of light. Light travels at different speeds through different media, and who knows what the universe was like in the past. The medium could be different, allowing light to “superconduct” (so to speak) through the universe.

  17. Damian:

    What sort of proof would be necessary to knock GR off its current perch?

    I don’t expect that GR is wrong completely, but it is incomplete.

    General Relativity may be “incomplete”, but at least it’s not “sitting on its perch, in the first place, [because] it had been nailed there” — such as some ‘alternative’ cosmology theories out there! 😉

  18. Einstein: “If the speed of light is the least bit affected by the speed of the light source, then my whole theory of relativity and theory of gravity is false.”

    So we would need to observe something in the Universe that perhaps does not account for time dilation? Like the observed light from quasars, perhaps?

    ** On time dilation in quasar light curves, M. R. S. Hawkins,
    http://www.physorg.com/news190027752.html

    So redshift works and is a cornerstone of GR unless the light comes from a quasar.

    Err, Just saying, as an example.

    I’m no scientist and I don’t have any (alternative) theory I wish to champion, just keeping an open mind.

    Damian

  19. @ Damian

    You ask: What sort of proof would be necessary to knock GR off its current perch?

    Not that I am one of the greater minds here, but… the idea that nothing can travel faster than light in a vacuum makes huge amounts of sense, but has arguably already been broken by quantum entanglement – which can allegedly be demonstrated in a laboratory.

    Also, the concept of singularities seems physically implausible. We know that one black hole can be more massive than another – so does this mean the more massive black hole has a singularity with a mass of infinity plus one?

    Nonetheless, I find the idea of a variable speed of light in a vacuum (repeat – in a vacuum) so at odds with the fundamental principles of GR that it is barely worth entertaining. It would suggest even James Clerk Maxwell didn’t get it right with electromagnetism.

    So I think you’re right that GR is incomplete at explaining everything – but what it does, it does very well.

  20. Thanks Steve, I was going to mention Quantum entanglement as an example, but the Quasar Wavelengths mystery seems more pertinent.

    Wun-Yi Shu’s theory does cite this as a rationale for needing an alternative to GR, if I’m not mistaken.

    What it does not do is offer a solution that suggests a Grand Unification Theory.

    However, Garrett Lisi’s theory using the E8 Lie group and this one have some synergy.
    Neither is proven, so this remains a friendly discussion without beer.

  21. Quantum entanglement does not propagate information faster than light. There is nothing which travels in quantum entanglement. I don’t quite have the time for a complete explanation, but no information actually travels anywhere. In a sense quantum mechanics has a representation in space and time, but quantum states are not defined by spacetime coordinates.

    The speed of light is a local invariance principle. It holds on a local frame. A frame at some distance from your frame might have some intervening curvature, which means your lab measurement of the speed of light does not carry to the other frame. This is how one gets particles reaching the speed of light at the event horizon of a black hole, even though an exterior observer sees it frozen on the horizon, and reaching v > c inside the black hole. It is also how we get z > 1 galaxies traveling faster than light beyond the horizon scale. The reason is that space itself is dynamically evolving to drag these galaxies. As I have indicated previously here, it also does work out that photons traveling locally at the speed of light can reach us from a galaxy being frame dragged by the dynamics of space faster than light. It is for the same reason an observer falling through the interior of a black hole can still observe the exterior world, even though they have a v > c relative to a local frame on the outside.
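    In numbers: the recession velocity in an expanding universe is v = H·D, so anything beyond the Hubble radius D = c/H (roughly 14 billion light years at the current epoch) recedes faster than c, without any local frame ever violating special relativity.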

    Garrett Lisi’s use of the E_8 theory is interesting, but has a couple of big problems. One of the problems is he frames the graviton field with other quantum fields in a way that violates the Coleman-Mandula theorem. The only way such framing can be done is with supersymmetry, but this increases the number of particles by a factor of 2, each particle with its supersymmetry partner. Lisi does not do this. The idea is interesting as a sort of trial, but it is in no way a final theory of things.

    LC

  22. @ LBC

    You say: Quantum entanglement does not propagate information faster than light.

    Well, perhaps not – and too bad you don’t quite have the time to talk us through it 😉 I agree the two physics paradigms are effective even though altogether not in agreement on this point. Maybe it’s OK to live with this apparent schism, but maybe it represents a limitation to both paradigms.

  23. Are galactic redshifts quantized? Some astronomers claim to observe highly redshifted distant galaxies appearing in clusters of 2 or 8, self-similar to atomic hydrogen electron orbital shells. Is there any evidence that this is some kind of higher-dimensional link to faster rates of renormalization group flows of the smaller-scale 5th-dimensional objects?
