Astronomy Without A Telescope – Cubic Neutrons

The nature of the highly compressed matter that makes up neutron stars has been the subject of much speculation. For example, it’s been suggested that under extreme gravitational compression the neutrons may collapse into quark matter composed of just strange quarks – which suggests that you should start calling a particularly massive neutron star a strange star.

However, an alternate model suggests that within massive neutron stars – rather than the neutrons collapsing into more fundamental particles, they might just be packed more tightly together by adopting a cubic shape. This might allow such cubic neutrons to be packed into about 75% of the volume that spherical neutrons would normally occupy.
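
As a back-of-envelope check on that figure: the densest possible packing of identical spheres fills about 74% of space, while cubes can tile space completely. A minimal sketch, assuming for illustration that each neutron keeps roughly the same individual volume while changing shape (the actual paper works with quantum field waveforms rather than hard marbles):

    import math

    # Densest packing of identical spheres (face-centred cubic): pi / sqrt(18) ~ 0.74
    sphere_fraction = math.pi / math.sqrt(18)
    cube_fraction = 1.0   # cubes tile space with no gaps

    # Same number of neutrons, same individual volume, repacked as cubes:
    # the star only needs about 74% of the spherically-packed volume.
    volume_ratio = sphere_fraction / cube_fraction
    print(f"Sphere packing fraction: {sphere_fraction:.1%}")
    print(f"Cubic packing needs about {volume_ratio:.0%} of the spherical-packing volume")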

Some rethinking about the internal structure of neutron stars has been driven by the 2010 discovery that the neutron star PSR J1614–2230 has a mass of nearly two solar masses – which is a lot for a neutron star that probably has a diameter of less than 20 kilometres.

PSR J1614–2230, described by some as a ‘superheavy’ neutron star, might seem an ideal candidate for the formation of quark matter – or some other exotic transformation – resulting from the extreme compression of neutron star material. However, calculations suggest that such a significant rearrangement of matter would shrink the star to within the Schwarzschild radius for two solar masses – meaning that PSR J1614–2230 should immediately form a black hole.
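
That argument is easy to sanity-check, since the Schwarzschild radius depends only on mass. A quick sketch with rounded constants:

    # Schwarzschild radius r_s = 2GM/c^2 for a two solar mass object
    G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8         # speed of light, m/s
    M_sun = 1.989e30    # solar mass, kg

    r_s = 2 * G * (2 * M_sun) / c**2
    print(f"Schwarzschild radius for 2 solar masses: {r_s/1000:.1f} km")   # ~5.9 km
    # A ~20 km diameter neutron star is less than a factor of two above this.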

But nope, PSR J1614–2230 is there for all to observe, a superheavy neutron star, which is hence almost certainly composed of nothing more exotic than neutrons throughout, as well as a surface layer of more conventional atomic matter.

Modelling the quantum field waveforms of neutrons under increasing densities suggests a cubic, rather than a spherical, geometry is more likely. Credit: Llanes-Estrada and Navarro.

Nonetheless, stellar-sized black holes can and do form from neutron stars. For example, if a neutron star in a binary system continues drawing mass off its companion star it will eventually reach the Tolman–Oppenheimer–Volkoff limit. This is the ultimate mass limit for neutron stars – similar in concept to the Chandrasekhar limit for white dwarf stars. Once a white dwarf reaches the Chandrasekhar limit of 1.4 solar masses it detonates as a Type Ia supernova. Once a neutron star reaches the Tolman–Oppenheimer–Volkoff mass limit, it becomes a black hole.

Due to our current limited understanding of neutron star physics, no-one is quite sure what the Tolman–Oppenheimer–Volkoff mass limit is, but it is thought to lie somewhere between 1.5 and 3.0 solar masses.

So, PSR J1614–2230 seems likely to be close to this neutron star mass limit, even though it is still composed of neutrons. But there must be some method whereby a neutron star’s mass can be compressed into a smaller volume, otherwise it could never form a black hole. So, there should be some intermediary state whereby a neutron star’s neutrons become progressively compressed into a smaller volume until the Schwarzschild radius for its mass is reached.

Llanes-Estrada and Navarro propose that this problem could be solved if, under extreme gravitational pressure, the neutrons’ geometry became deformed into smaller cubic shapes to allow tighter packing, although the particles still remain as neutrons.

So if it turns out that the universe does not contain strange stars after all, having cubic neutron stars instead would still be agreeably unusual.

Further reading: Llanes-Estrada and Navarro. Cubic neutrons.

Astronomy Without A Telescope – Impact Mitigation

The save-the-Earth rehearsal mission Don Quijote, commissioned by the European Space Agency, is planned to test the potential of a real life-or-death mission to deflect a mass-extinction-inducing asteroid from a collision course with Earth.

Currently at ‘concept’ stage, the Don Quijote Near Earth Asteroid Impact Mitigation Mission has been modelled on a proposed flight to either 2002 AT4 or 1989 ML, both being near-Earth asteroids, though neither represents an obvious collision risk. However, subsequent studies have proposed that Amor 2003 SM84 or even 99942 Apophis may be more suitable targets. After all, 99942 Apophis does carry a marginal (1 in 250,000) risk of an Earth impact in 2036.

Whatever the target, a dual launch of two spacecraft is proposed – an Impactor called Hidalgo (a title Cervantes gave to the original Don Quixote) and an Orbiter called Sancho (who was the Don’s faithful companion).

While the Impactor’s role is self-explanatory, the Orbiter plays a key role in interpreting the impact – the idea being to collect impact momentum and trajectory change data that would then inform future missions, in which the fate of the Earth may really be at stake.

The extent of transfer of momentum from Impactor to asteroid depends on the Impactor’s mass (just over 500 kilograms) and its velocity (about 10 kilometres a second), as well as the composition and density of the asteroid. The greatest momentum change will be achieved if the impact throws up ejecta that achieve escape velocity. If instead the Impactor just buries itself within the asteroid, not that much will be achieved, since its mass will be substantially less than that of any mass-extinction-inducing asteroid. For example, the object that created the Chicxulub crater and wiped out the dinosaurs (yes, alright – except for the birds) is thought to have been in the order of 10 kilometres in diameter.
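
The momentum bookkeeping here can be sketched with a toy calculation. The asteroid mass below is a made-up round number (roughly a few-hundred-metre rubble pile), and beta stands in for the momentum enhancement factor: beta = 1 means the Impactor simply buries itself, while beta > 1 means escaping ejecta carry extra momentum away from the asteroid.

    m_impactor = 500.0        # kg (just over 500 kilograms, per the mission concept)
    v_impactor = 10_000.0     # m/s (about 10 kilometres a second)
    m_asteroid = 1.0e10       # kg, hypothetical value for illustration only

    for beta in (1.0, 2.0, 4.0):
        delta_v = beta * m_impactor * v_impactor / m_asteroid
        print(f"beta = {beta}: velocity change ~ {delta_v * 1000:.2f} mm/s")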

So before the impact, to assist future targeting and required impact velocity calculations, the Orbiter will make a detailed analysis of the target asteroid’s overall mass and its near-surface density and granularity. Then, after the impact, the Orbiter will assess the speed and distribution of the collision ejecta via its Impact Camera.

However, accurately measuring the degree of deflection achieved by the impact represents a substantial challenge for the mission. We will need much better data about the target asteroid’s mass and velocity than we can establish from Earth. So, the Orbiter will do a series of fly-bys and then go into orbit around the asteroid to assess how much the asteroid is affected by the spacecraft’s proximity.

A precise determination of the Orbiter’s distance from the asteroid will be achieved by its Laser Altimeter, while a Radio Science Experiment will precisely determine the Orbiter’s position (and hence the asteroid’s position) relative to the Earth.

Having then established the Orbiter as a reference point, the effect of the collision of the Impactor will be assessed. However, a significant confounding factor is the Yarkovsky effect – the effect of solar heating of the asteroid, which induces the emission of thermal photons and hence generates a tiny amount of thrust. The Yarkovsky effect naturally pushes an asteroid’s orbit outwards if it has a prograde spin (in the direction of its orbit) – or inwards if it has retrograde spin. Hence, the Orbiter will also need a Thermal Infrared Spectrometer to separate the Yarkovsky effect from the effect of the impact.

To estimate the effect of Hidalgo’s collision, the Yarkovsky effect must be accounted for. Heating of an asteroid’s surface by the Sun causes thermal radiation. The net cumulative momentum of that radiation comes from surfaces that have just turned out of the Sun’s light (i.e. ‘dusk’). In asteroids with prograde spin, this will push the asteroid into a higher orbit – i.e. further away from the Sun. But, for asteroids with retrograde rotation, the orbit decays – i.e. towards the Sun.
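
An order-of-magnitude sketch of how tiny that thrust is, for a hypothetical 200 metre asteroid at 1 AU. The absorbed sunlight is re-emitted as thermal photons, so the thrust can never exceed the absorbed power divided by the speed of light, and the real Yarkovsky thrust is only a small asymmetry fraction of that ceiling (the 1% used here is a made-up stand-in for thermal inertia, spin and obliquity effects):

    import math

    solar_constant = 1361.0     # W/m^2 at 1 AU
    c = 3.0e8                   # m/s
    radius = 100.0              # m (hypothetical 200 m diameter asteroid)
    asymmetry_fraction = 0.01   # assumed, purely illustrative

    absorbed_power = solar_constant * math.pi * radius**2    # W (albedo ignored)
    max_thrust = absorbed_power / c                          # N
    yarkovsky_thrust = asymmetry_fraction * max_thrust
    print(f"Maximum possible photon thrust: {max_thrust:.2f} N")
    print(f"Illustrative Yarkovsky thrust: {yarkovsky_thrust * 1000:.1f} mN")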

And of course, given the importance of the Orbiter as a reference point, the effect of solar radiation on it must also be measured. Indeed, we will also need to factor in that this effect will change as the shiny new spacecraft’s highly-reflective surfaces lose their sheen. Highly reflective surfaces will emit radiation, almost immediately, at energy levels (i.e. high momentum) almost equivalent to the incident radiation. However, low albedo surfaces may only release lower energy (i.e. lower momentum) thermal radiation – and will do so more slowly.

To put it another way, a mirror surface makes a much better solar sail than a black surface.
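
That difference is just conservation of photon momentum, and it is easy to put rough numbers on it. A sketch for one square metre of surface in full sunlight at 1 AU, taking the two ideal extremes (real spacecraft surfaces sit somewhere in between):

    solar_constant = 1361.0   # W/m^2 at 1 AU
    c = 3.0e8                 # m/s
    area = 1.0                # m^2

    force_black = solar_constant * area / c          # absorbed photons: momentum delivered once
    force_mirror = 2.0 * solar_constant * area / c   # reflected photons: momentum delivered twice
    print(f"Black surface:  {force_black * 1e6:.1f} micronewtons")
    print(f"Mirror surface: {force_mirror * 1e6:.1f} micronewtons")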

So in a nutshell, the Don Quijote impact mitigation mission will require an Impactor with a Targeting Camera – and an Orbiter with an Impact Observation Camera, a Laser Altimeter, a Radio Science Experiment and a Thermal Infrared Spectrometer – and you should remember to measure the effect of solar radiation pressure on the spacecraft early in the mission, when it’s shiny – and later on, when it’s not.

Further reading: Wolters et al. Measurement requirements for a near-Earth asteroid impact mitigation demonstration mission.

Astronomy Without A Telescope – A Photon’s Point Of View

What would you see at the speed of light?

From a photon’s point of view, it is emitted and then instantaneously reabsorbed. This is true for a photon emitted in the core of the Sun, which might be reabsorbed after crossing a fraction of a millimetre’s distance. And it is equally true for a photon that, from our point of view, has travelled for over 13 billion years after being emitted from the surface of one of the universe’s first stars.

So it seems that not only does a photon not experience the passage of time, it does not experience the passage of distance either. But since you can’t move a massless consciousness at the speed of light in a vacuum, the real point of this thought experiment is to indicate that time and distance are just two apparently different aspects of the same thing.

If we attempt to achieve the speed of light, our clocks will slow relative to our point of origin and we will arrive at our destination quicker than we anticipate we should – as though both the travel time and the distance have contracted.
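
A minimal sketch of that ‘quicker than we anticipate’ effect, using the special-relativistic Lorentz factor and an arbitrary 10 light year trip chosen purely for illustration:

    import math

    def lorentz_gamma(v_over_c):
        return 1.0 / math.sqrt(1.0 - v_over_c**2)

    distance_ly = 10.0
    for v in (0.5, 0.9, 0.99, 0.999):
        gamma = lorentz_gamma(v)
        origin_time = distance_ly / v        # years, as measured at the point of origin
        ship_time = origin_time / gamma      # years, as measured on board
        ship_distance = distance_ly / gamma  # light years, as measured on board
        print(f"v = {v}c: gamma = {gamma:6.2f}, ship clock = {ship_time:6.2f} yr, "
              f"contracted distance = {ship_distance:5.2f} ly")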

Similarly, as we approach the surface of a massive object, our clocks will slow relative to a point of higher altitude – and we will arrive at the surface quicker than we might anticipate, as though time and distance contract progressively as we approach the surface.

Again, time and distance are just two aspects of the same thing, space-time, but we struggle to visualise this. We have evolved to see the world in snapshot moments, perhaps because a failure to scan the environment with every step we take might leave us open to attack by a predator.

Science advocates and skeptics say that we should accept the reality of evolution in the same way that we accept the reality of gravity – but actually this is a terrible analogy. Gravity is not real, it’s just our dumbed-down interpretation of space-time curvature.

If you could include the dimension of time in this picture you might get a rough idea of why things appear to accelerate towards a massive object - even though they do not themselves experience any acceleration.

Astronauts moving at a constant velocity through empty space feel weightless. Put a planet in their line of trajectory and they will continue to feel weightless right up until the moment they collide with its surface.

A person on the surface will watch them steadily accelerate from high altitude until that moment of collision. But such doomed astronauts will not themselves experience any such change to their velocity. After all, if they were accelerating, surely they would be pushed back into their seat as a consequence.

Nonetheless, the observer on the planet’s surface is not suffering from an optical illusion when they perceive a falling spacecraft accelerate. It’s just that they fail to acknowledge their particular context of having evolved on the surface of a massive object, where space-time is all scrunched up.

So they see the spacecraft move from an altitude where distance and time (i.e. space-time) is relatively smooth – down to the surface, where space-time (from the point of view of a high altitude observer) is relatively scrunched up. A surface dweller hence perceives that a falling object is experiencing acceleration and wrongly assumes that there must be a force involved.

As for evolution – there are fossils, vestigial organs and mitochondrial DNA. Get real.

Footnote: If you were falling into a black hole you would still not experience acceleration. However, your physical structure would be required to conform to the extremely scrunched up space-time that you move through – and spaghettification would result.

The Russian Hubble?

The Spektr-R spacecraft. If you are thinking it looks nothing like the Hubble Space Telescope, you'd be right.

This is hardly breaking news, but there’s a new Russian space telescope in town. With a name like an anime character, Spektr R was launched on 18 July 2011 and its 10 metre carbon fibre dish was deployed a week later. It’s a radio telescope and – via a very long baseline interferometry project known as RadioAstron – it will become arguably the world’s biggest radio telescope – and by a very long shot.

Following so closely after the Space Shuttle fleet’s retirement, the media has latched onto the idea that this represents a major step up from the Hubble Space Telescope and a further indication of the USA’s decline from space. But, nah…

Don’t get me wrong, when fully operational RadioAstron will be the biggest ever interferometer and is likely to deliver some great science when it gets up to speed. Well done, Roscosmos. But the various comparisons made between it and Hubble are a little spurious.

RadioAstron’s angular resolution is reported as 7 microarcseconds (or 0.000007 arcseconds) while Hubble’s resolution is generally reported as 0.05 arcseconds – so RadioAstron is reported as having over a thousand times more resolution. Well, sort of – but not really.

Firstly, the 10 metre radio mirror of Spektr R is designed to detect centimetre range wavelength light, while Hubble’s 2.4 metre mirror is capable of detecting wavelengths in the visible light range of 350-790 nanometres (and some non-visible infrared light too).

Angular resolution arises from the relationship between the wavelength of light you are observing and the size of your aperture. So, at the single instrument level Hubble rules supreme in the resolution stakes.
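
The relationship in question is the diffraction limit, roughly 1.22 times the wavelength divided by the aperture for a single filled dish, and roughly the wavelength divided by the baseline for an interferometer. A rough sketch with illustrative round numbers (500 nanometres for Hubble, 1.3 centimetres for Spektr R, and a baseline approaching the spacecraft's quoted 390,000 kilometre apogee), not official specifications:

    import math

    ARCSEC_PER_RAD = 180.0 / math.pi * 3600.0

    def diffraction_limit_arcsec(wavelength_m, aperture_m):
        # Rayleigh criterion for a single filled aperture
        return 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RAD

    print(f"Hubble alone:   {diffraction_limit_arcsec(500e-9, 2.4):.3f} arcseconds")
    print(f"Spektr R alone: {diffraction_limit_arcsec(0.013, 10.0):.0f} arcseconds")

    # RadioAstron as an array: resolution ~ wavelength / baseline
    baseline_m = 3.9e8
    theta_arcsec = 0.013 / baseline_m * ARCSEC_PER_RAD
    print(f"RadioAstron array: {theta_arcsec * 1e6:.0f} microarcseconds")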

The image detail you can gain from arraying radio telescopes. Blobby false colour becomes more detailed blobby false colour (but there's useful science data there). Credit: VSOP.

The resolution assigned to RadioAstron (the telescope array) arises from the ‘virtual’ dish diameter created by Spektr R’s orbit, when arrayed with ground-based radio telescopes – which may eventually include Earth’s largest dish, the 300 metre Arecibo dish, and Earth’s largest steerable dish, the 110 metre Green Bank radio telescope.

Spektr R will orbit the Earth via a highly elliptical orbit with a perigee of 10,000 kilometres and an apogee of 390,000 kilometres – so giving an elliptical orbit with a semi-major axis of 200,000 kilometres. That sounds like one big dish, huh… although it isn’t, really – just virtually.

Don’t get me wrong, there is a huge increase in information to be gained from arraying Spektr R’s one data point with other ground based observatories’ data points. But nonetheless, it is still just information conveyed by radio light – which can’t deliver the level of detail that nanometre wavelength visible light can carry.

That’s why you can usefully create radio telescope arrays, but you can’t gain much value from arraying visible light telescopes (at least not yet). The information conveyed by radio light is spread widely enough so that you can estimate the information it is carrying from just detecting it at two widely spread detectors – and then superimposing that data. The fine detailed information contained in visible light is just too complex to allow this.

So putting RadioAstron up as a contender to the beloved Hubble Space Telescope makes no sense. It is a totally different scientific project that will deliver totally different – and hopefully awesome – scientific data. Ad astra. If we want a step up from Hubble, we need to get the James Webb Space Telescope back into production.

Astronomy Without A Telescope – The Unlikeliness Of Being

The Search for ExtraTerrestrial Intelligence could be a waste of time according to a recent statistical analysis of the likelihood of life arising spontaneously on habitable-zone exoplanets out there in the wider universe (and let's face it - when have predictive statistics ever got it wrong?) Credit: SETI Institute.

History has proved time and again that mathematical modelling is no substitute for a telescope (or other data collection device). Nonetheless, some theoreticians have recently put forward a statistical analysis which suggests that life is probably very rare in the universe – despite the apparent prevalence of habitable-zone exoplanets, being found by the Kepler mission and other exoplanet search techniques.

You would be right to be skeptical, given the Bayesian analysis undertaken is based on our singular experience of abiogenesis – being the origin of life from non-life, here on Earth. Indeed, the seemingly rapid abiogenesis that occurred on Earth soon after its formation is suggested to be the clinching proof that abiogenesis on habitable-zone exoplanets must be rare. Hmm…

Bayes theorem provides a basis for estimating the likelihood that a prior assumption or hypothesis (e.g. that abiogenesis is common on habitable-zone exoplanets) is correct, using whatever evidence is available. Its usage is nicely demonstrated in solving the Monty Hall problem.

Go here for the detail, but in a nutshell:
There are three doors, one with a car behind it and the other two have goats. You announce which door you will pick – knowing that it carries a 1/3 probability of hiding the car. Then Monty Hall, who knows where the car is, opens another door to reveal a goat. So, now you know that door has a zero probability of hiding the car. So, the likelihood of the remaining door hiding the car carries the remaining 2/3 probability of the system, since there was always an absolute 1/1 probability that the car was behind one of the three doors. So, it makes more sense for you to open that remaining door, instead of the first one you picked.

In this story, Monty Hall opening the door with a goat represents new data. It doesn’t allow you to definitively determine where the car is, but it does allow you to recalculate the likelihood that your prior hypothesis (that the car is behind the first door you picked) is correct.
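
If the 2/3 result still seems suspicious, it is easy to check by brute force. A minimal simulation sketch:

    import random

    # Monte Carlo check of the Monty Hall problem: switching wins ~2/3 of the
    # time, sticking wins ~1/3 -- Monty's 'new data' really does change the odds.
    def play(switch, trials=100_000):
        wins = 0
        for _ in range(trials):
            car = random.randrange(3)
            pick = random.randrange(3)
            # Monty opens a door that is neither the pick nor the car
            opened = next(d for d in range(3) if d != pick and d != car)
            if switch:
                pick = next(d for d in range(3) if d != pick and d != opened)
            wins += (pick == car)
        return wins / trials

    print(f"Stick:  {play(switch=False):.3f}")   # ~0.333
    print(f"Switch: {play(switch=True):.3f}")    # ~0.667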

Applying Bayesian analysis to the problem of abiogenesis on habitable-zone exoplanets is a bit of a stretch. Spiegel and Turner argue that the evidence we have available to us – that life began quite soon after the Earth became habitable – contributes nothing to estimating the likelihood that life arises routinely on habitable-zone exoplanets.

They remind us that we need to acknowledge the anthropic nature of the observation we are making. We are here after 3.5 billion years of evolution – which has given us the capacity to gather together the evidence that life began here 3.5 billion years ago, shortly after the Earth first became habitable. But that is only because this is how things unfolded here on Earth. In the absence of more data, the apparent rapidity of abiogenesis here on Earth could just be a fluke.

Stromatolites - which were a fairly early form of life on Earth. Earth became inhabited by such early life shortly after it became habitable. This might seem suggestive that life is somewhat inevitable when the conditions are right. But a statistician is never going to buy such an argument when it's based on a single example.

This is a fair point, but a largely philosophical one. It informs the subsequent six pages of Spiegel and Turner’s Bayesian analysis, but it is not a conclusion of that analysis.

The authors seek to remind us that interviewing one person and finding that she or he likes baked beans does not allow us to conclude that all people like baked beans. Yes, agreed, but that’s just statistics – it’s not really Bayesian statistics.

If we are ever able to closely study an exoplanet that has been in a habitable state for 3.5 billion years and discover that either it has life, or that it does not – that will be equivalent to Monty Hall opening another door.

But for now, we might just be a fluke… or we might not be. What we need is more data.

Further reading: Spiegel and Turner. Life might be rare despite its early emergence on Earth: a Bayesian analysis of the probability of abiogenesis.

Astronomy Without A Telescope – Bubblology

Multiverse hypotheses are all very well, but surely 'when worlds collide' we should be able to determine the existence of the multiverse – but to date.... nup. Credit: cosmology.com

One model of a hypothetical multiverse has, perhaps appropriately, some similarity to a glass of beer. Imagine an eternal false vacuum – that’s a bit like a fluid, though not all that much like a fluid – since it doesn’t have volume, in fact it doesn’t have any spatial dimensions. Then imagine that this eternal false vacuum expands.

This sounds rather contradictory since expansion implies there are spatial dimensions, but a string theorist will assure you that it all happens at the sub-Planck scale, where lots of immeasurable and unknowable things can happen – and after a few more drinks you might be willing to go along with this.

So – next, we introduce bubbles to the false vacuum. The bubbles – which are essentially independent baby universes – are true vacuums and can rationally and reasonably expand since they have four overt dimensions of space-time – albeit they may also have the other immeasurable and unknowable dimensions in common with the encompassing false vacuum.

The bubbles are the reason why it is necessary for the false vacuum to expand, indeed it must expand faster than the bubbles – otherwise an expanding bubble universe could ‘percolate’ – that is, spread throughout the all-encompassing false vacuum – so that your multiverse would just become a universe. And where’s the fun in that?

Anyhow, within such an eternal expanding fluid, bubble universes may nucleate at random points – taking us away from the coffee analogy and back to the beer. In bubblology terms, nucleation is the precursor of inflation. The sub-Planck energy of the non-dimensional false vacuum occasionally suffers a kind of hiccup – perhaps a quantum tunnelling event – making the sub-Planck virtual nothingness commence a slow roll down a potential energy hill (whatever the heck that means).

At a certain point in that slow roll, the energy level shifts from a sub-Planck potential-ness into a supra-Planck actual-ness. This shift from sub-Planck to supra-Planck is thought to be a kind of phase transition from something ephemeral to a new ground state of something lasting and substantial – and that phase transition releases heat, kind of like how the phase transition from water to ice releases latent heat.

And so you get the characteristic production of a gargantuan amount of energy out of nothing, which we denizens of our own bubble universe parochially call the Big Bang – being the energy that drove an exponential cosmic inflation of our own bubble, that exponential inflation lasting until the energy density within the bubble was cool enough to form matter – in an E=mc² kind of way. And so another bubble of persistent somethingness formed within the eternal beer of nothingness.

The light cone of our bubble universe showing the stages of the energy release driving cosmic inflation (reheating), the surface of last scattering (recombination) and the subsequent dissolution of the cosmic fog (reionisation) – cosmic microwave background photons from the surface of last scattering could show signs of a collision with an adjacent bubble universe. Credit: Kleban.

Good story, huh? But, where’s the evidence? Well, there is none, but despite the usual criticisms lobbed at string theorists this is an area where they attempt to offer testable predictions.

Within a multiverse, one or more collisions with another bubble universe are almost inevitable given the beer-mediated timeframe of eternity. Such an event may yet lie in our future, but could equally lie in our past – the fact that we are still here indicating (anthropically) that such a collision may not be fatal.

A collision with another bubble might pass unnoticed if it possessed exactly the same cosmological constant as ours and its contents were roughly equivalent. The bubble wall collision might appear as a blue-shifted circle in the sky – perhaps like the Cold Spot in the cosmic microwave background, although this is most likely the result of a density fluctuation within our own universe.

We could be in trouble if an adjacent universe’s bubble wall pushed inwards on a trajectory towards us – and if it moved at the speed of light we wouldn’t see it until it hit. Even if the wall collision was innocuous, we might be in trouble if the adjacent universe was filled with antimatter. It’s these kinds of factors that determine what we might observe – and whether we might survive such an, albeit hypothetical, event.

Further reading: Kleban. Cosmic bubble collisions.

Astronomy Without A Telescope – Gravitational Waves

An artist's impression of gravitational waves. In reality, a single uniform massive object does not generate gravitational waves. However, a massive binary system in orbital motion could generate dynamic pulses of gravitational energy that might be detected from Earth.

Gravitational waves have some similar properties to light. They move at the same speed in a vacuum – and with a certain frequency and amplitude. Where they differ from light is that they are not scattered or absorbed by matter, in the way that light is.

Thus, it’s likely that primordial gravitational waves, which are speculated to have been produced by the Big Bang, are still out there waiting to be detected and analysed.

Gravitational waves have been indirectly detected via observations of pulsar PSR 1913+16, a member of a binary system, the orbit of which decays at the rate of approximately three millimetres per orbit. The inspiraling of the binary (i.e. the decay of its orbit) can only be explained by an invisible loss of energy, which we presume to be the result of gravitational waves transporting energy away from the system.
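
For a sense of scale, that 3 millimetres per orbit adds up quickly, because the binary's orbital period is only about 7.75 hours. A quick conversion:

    orbital_period_hours = 7.75
    decay_per_orbit_m = 0.003

    orbits_per_year = 365.25 * 24 / orbital_period_hours
    decay_per_year_m = decay_per_orbit_m * orbits_per_year
    print(f"Orbits per year: {orbits_per_year:.0f}")                       # ~1130
    print(f"Orbital decay: about {decay_per_year_m:.1f} metres per year")  # ~3.4 m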

Direct observation of gravitational waves currently escapes us – but seems at least feasible by monitoring the alignment of widely separated test masses. Such monitoring systems are currently in place on Earth, including LIGO, which has test masses separated by up to four kilometres – that separation distance being monitored by lasers designed to detect tiny changes in that distance, which might result from the passage of a gravitational wave initiated from a distant point in the universe.
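
To see just how tiny those changes are expected to be, consider the strain (the fractional change in arm length) such a wave produces. A sketch assuming an illustrative strain of 10^-21, a typical order of magnitude quoted for detectable astrophysical events:

    strain = 1e-21           # dimensionless, illustrative value only
    arm_length_m = 4000.0    # LIGO arm length

    delta_L = strain * arm_length_m
    proton_diameter_m = 1.7e-15
    print(f"Arm length change: {delta_L:.1e} m")
    print(f"That is roughly {delta_L / proton_diameter_m:.0e} proton diameters")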

The passing of a gravitational wave should stretch and contract the Earth. This is not because it strikes the Earth and imparts kinetic energy to it – like an ocean wave hitting land. Instead, the Earth – which sits within space-time – has its geometry altered, so that it continues to fit the momentarily stretched and then contracted space-time within which it sits, as a gravitational wave passes.

The Laser Interferometer Gravitational-Wave Observatory (LIGO) Hanford installation. When you are talking gravitational wave astronomy, big is good. Credit: Caltech.

Gravitational waves are thought to be unaffected by interaction with matter and they move at the speed of light in a vacuum, regardless of whether or not they themselves are in a vacuum. They do lose amplitude (wave height) over distance, but only because their energy is spread over an ever larger region as they travel outwards, not because anything absorbs them. This is similar to the way that a water wave, emanating from the point of impact of a pebble dropped into a pond, loses amplitude as the circle that it forms grows.

Gravitational waves may also decline in frequency (i.e. increase in wavelength) over very large distances, due to the expansion of the universe – in much the same way that the wavelength of light is red-shifted by the expansion of the universe.

Given all this, the exceedingly tiny effects that are expected of the gravitational waves that may routinely pass by Earth create a substantial challenge for detection and measurement – since these tiny space-time fluctuations must be distinguished from any background noise.

The noise background for LIGO includes seismic noise (i.e. intrinsic movements of the Earth), instrument noise (i.e. temperature changes that affect the alignment of the detection equipment) and quantum-level photon shot noise – which arises from the quantum indeterminacy of photon arrival at the detectors.

Kip Thorne, one of the big names in gravitational wave theory and research, has apparently ironed out that last and perhaps most troublesome effect through the application of quantum non-demolition principles – which enable the measurement of something without destroying it, or without collapsing its wave function.

Nonetheless, the need for invoking quantum non-demolition principles is some indication of the exceedingly faint nature of gravitational waves – which have a generally weak signal strength (i.e. small amplitude) and low frequency (i.e. long, in fact very long, wavelength).

Where visible light may be 390 nanometres and radio light may be 3 metres in wavelength – gravitational waves are more in the order of 300 kilometres for an average supernova blast, up to 300,000 kilometres for an inspiraling black hole binary and maybe up to 3 billion light years for the primordial echoes of the Big Bang.
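
Detectors usually describe these waves by frequency rather than wavelength, so here is a quick conversion (frequency = speed of light / wavelength) of the figures above:

    c = 3.0e8                 # m/s
    light_year_m = 9.46e15    # metres in a light year

    for label, wavelength_m in [
        ("supernova blast (~300 km)", 3.0e5),
        ("inspiraling black hole binary (~300,000 km)", 3.0e8),
        ("primordial echoes (~3 billion light years)", 3e9 * light_year_m),
    ]:
        print(f"{label}: ~{c / wavelength_m:.1e} Hz")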

So, there’s a fair way to go with all this at a technological level – although proponents (as proponents are wont to do) say that we are on the verge of our first confirmed observation of a gravitational wave – or otherwise they reckon that we have already collected the data, but don’t fully know how to interpret them yet.

This is the current quest of citizen science users of Einstein@Home – the third most popular BOINC distributed computing project after SETI@Home (spot an alien) and Rosetta@Home (fold a protein).

This article follows a public lecture delivered by Kip Thorne at the Australian National University in July 2011 – where he discussed plans for LIGO Australia and also the animated simulations of black hole collisions described in the paper below – which may provide templates to interpret the waveforms that will be detected in the future by gravitational wave observatories.

Further reading: Owen et al. (including Thorne, K.) Frame-Dragging Vortexes and Tidal Tendexes Attached to Colliding Black Holes: Visualizing the Curvature of Spacetime.

Astronomy Without A Telescope – Granularity

A gamma ray burst offers a rare opportunity to assess the nature of the apparent 'empty space' vacuum that exists between you and it. In GRB 041219A's case, that's 300 million light years of vacuum. Credit: ESA.

The very small wavelength of gamma ray light offers the potential to gain high resolution data about very fine detail – perhaps even detail about the quantum substructure of a vacuum – or in other words, the granularity of empty space.

Quantum physics suggests that a vacuum is anything but empty, with virtual particles regularly popping in and out of existence within Planck instants of time. The proposed particle nature of gravity also requires graviton particles to mediate gravitational interactions. So, to support a theory of quantum gravity we should expect to find evidence of a degree of granularity in the substructure of space-time.

There is a lot of current interest in finding evidence of Lorentz invariance violations – where Lorentz invariance is a fundamental principle of relativity theory – and (amongst other things) requires that the speed of light in a vacuum should always be constant.

Light is slowed when it passes through materials that have a refractive index – like glass or water. However, we don’t expect such properties to be exhibited by a vacuum – except, according to quantum theory, at exceedingly tiny Planck units of scale.

So theoretically, we might expect a light source that broadcasts across all wavelengths – that is, all energy levels – to have the very high energy, very short wavelength portion of its spectrum affected by the vacuum substructure – while the rest of its spectrum isn’t so affected.

There are at least philosophical problems with assigning a structural composition to the vacuum of space, since it then becomes a background reference frame – similar to the hypothetical luminiferous ether, the need for which Einstein dismissed by establishing special relativity.

Nonetheless, theorists hope to unify the current schism between large scale general relativity and small scale quantum physics by establishing an evidence-based theory of quantum gravity. It may be that small scale Lorentz invariance violations will be found to exist, but that such violations will become irrelevant at large scales – perhaps as a result of quantum decoherence.

Quantum decoherence might permit the large scale universe to remain consistent with general relativity, but still be explainable by a unifying quantum gravity theory.

The ESA INTEGRAL gamma ray observatory - devoting a proportion of its observing time to searching for the underlying quantum nature of the cosmos. Credit: ESA

On 19 December 2004, the space-based INTEGRAL gamma ray observatory detected Gamma Ray Burst GRB 041219A, one of the brightest such bursts on record. The radiative output of the gamma ray burst showed indications of polarisation – and we can be confident that any quantum level effects were emphasised by the fact that the burst occurred in a different galaxy and the light from it has travelled through more than 300 million light years of vacuum to reach us.

Whatever extent of polarisation that can be attributed to the substructure of the vacuum, would only be visible in the gamma ray portion of the light spectrum – and it was found that the difference between polarisation of the gamma ray wavelengths and the rest of the spectrum was… well, undetectable.

The authors of a recent paper on the INTEGRAL data claim it achieved resolution down to Planck scales, being 10⁻³⁵ metres. Indeed, INTEGRAL’s observations constrain the possibility of any quantum granularity down to a level of 10⁻⁴⁸ metres or smaller.
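
For reference, the Planck length itself follows from just three fundamental constants. A quick sketch:

    import math

    # Planck length: l_P = sqrt(hbar * G / c^3) ~ 1.6e-35 m
    hbar = 1.055e-34   # reduced Planck constant, J s
    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8        # speed of light, m/s

    l_planck = math.sqrt(hbar * G / c**3)
    print(f"Planck length: {l_planck:.2e} m")
    # The 10^-48 m constraint quoted above sits some 13 orders of magnitude below this.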

Elvis might not have left the building, but the authors claim that this finding should have a major impact on current theoretical options for a quantum gravity theory – sending quite a few theorists back to the drawing board.

Further reading: Laurent et al. Constraints on Lorentz Invariance Violation using INTEGRAL/IBIS observations of GRB041219A.

ESA media release

Astronomy Without A Telescope – Big Rips And Little Rips

The concept of accelerating expansion does get you wondering just how much it can accelerate. Theorists think there still might be a chance of a big crunch, a steady-as-she-goes expansion or a big rip. Or maybe just a little rip?

One of a number of seemingly implausible features of dark energy is that its density is assumed to be constant over time. So, even though the universe expands over time, dark energy does not become diluted, unlike the rest of the contents of the universe.

As the universe expands, it seems that more dark energy appears out of nowhere to sustain the constant dark energy density of the universe. So, as time goes by, dark energy will become an increasingly dominant proportion of the observable universe – remembering that it is already estimated as being 73% of it.

An easy solution to this is to say that dark energy is a feature inherent in the fabric of space-time, so that as the universe expands and the expanse of space-time increases, so dark energy increases and its density remains constant. And this is fine, as long as we then acknowledge that it isn’t really energy – since our otherwise highly reliable three laws of thermodynamics don’t obviously permit energy to behave in such ways.

An easy solution to explain the uniform acceleration of the universe’s expansion is to propose that dark energy has the feature of negative pressure – where negative pressure is a feature inherent in expansion.

Applying this arcane logic to observation, the observed apparent flatness of the universe’s geometry suggests that the ratio of dark energy pressure to dark energy density is approximately 1, or more correctly -1, since we are dealing with a negative pressure. This relationship is known as the equation of state for dark energy.
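
In this language, the equation of state fixes how a component's density dilutes as the universe expands: density scales as the scale factor raised to the power -3(1 + w), where w is the pressure to density ratio just described. A minimal sketch of the bookkeeping:

    # Density scaling for a constant equation of state w: rho ~ a^(-3 * (1 + w))
    def density_scaling(w, a):
        return a ** (-3.0 * (1.0 + w))

    a = 2.0   # the universe doubles in linear scale
    for label, w in [("matter", 0.0), ("radiation", 1.0 / 3.0),
                     ("cosmological constant", -1.0), ("phantom energy", -1.5)]:
        print(f"{label:21s} (w = {w:5.2f}): density scales by {density_scaling(w, a):.3g}")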

In speculating about what might happen in the universe’s future, an easy solution is to assume that dark energy is just whatever it is – and that this ratio of pressure to density will be sustained at -1 indefinitely, whatever the heck that means.

But cosmologists are rarely happy to just leave things there and have speculated on what might happen if the equation of state does not stay at -1.

Three scenarios for a future driven by dark energy - its density declines over time, it stays the same or its density increases, tearing the contents of the universe to bits. If you are of the view that dark energy is just a mathematical artifact that grows as the expanse of space-time increases - then the cosmological constant option is for you.

If dark energy density decreased over time, the acceleration rate of universal expansion would decline and potentially cease if the pressure/density ratio reached -1/3. On the other hand, if dark energy density increased and the pressure/density ratio dropped below -1 (that is, towards -2, or -3 etc), then you get phantom energy scenarios. Phantom energy is a dark energy which has its density increasing over time. And let’s pause here to remember that the Phantom (ghost who walks) is a fictional character.

Anyhow, as the universe expands and we allow phantom energy density to increase, it potentially approaches infinity within a finite period of time, causing a Big Rip, as the universe becomes infinite in scale and all bound structures, all the way down to subatomic particles, are torn apart. At a pressure/density ratio of just -1.5, this scenario could unfold over a mere 22 billion years.
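
The 22 billion year figure follows from a commonly quoted approximation for the time remaining before a Big Rip under a constant phantom equation of state. A sketch using round, purely illustrative cosmological parameters:

    import math

    # Time remaining before a Big Rip (approximate, for constant w < -1):
    #   t_rip - t_now ~ 2 / (3 * |1 + w| * H0 * sqrt(1 - Omega_matter))
    H0 = 70.0 / 3.086e19      # Hubble constant, converted from km/s/Mpc to 1/s
    omega_matter = 0.3
    seconds_per_Gyr = 3.156e16

    w = -1.5
    t_remaining = 2.0 / (3.0 * abs(1.0 + w) * H0 * math.sqrt(1.0 - omega_matter))
    print(f"Time to Big Rip for w = {w}: ~{t_remaining / seconds_per_Gyr:.0f} billion years")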

Frampton et al propose an alternative Little Rip scenario, where the pressure/density ratio is variable over time so that bound structures are still torn apart but the universe does not become infinite in scale.

This might support a cyclic universe model – since it gets you around problems with entropy. A hypothetical Big Bang – Big Crunch cyclic universe has an entropy problem since free energy is lost as everything becomes gravitationally bound – so that you just end up with one huge black hole at the end of the Crunch.

A Little Rip potentially gives you an entropy reboot, since everything is split apart and so can progress from scratch through the long process of being gravitationally bound all over again – generating new stars and galaxies in the process.

Anyhow, Sunday morning – time for a Big Brunch.

Further reading: Frampton et al. The Little Rip.

Astronomy Without A Telescope – Backgrounds

Thousands of galaxies observed by the Herschel Space Observatory through the Lockman hole. Credit: ESA.

You’ve probably heard of the cosmic microwave background, but it doesn’t stop there. The as-yet-undetectable cosmic neutrino background is out there waiting to give us a view into the first seconds after the Big Bang. Then, looking further forward, there are other backgrounds across the electromagnetic spectrum – all of which contribute to what’s called the extragalactic background light, or EBL.

The EBL is the integrated whole of all light that has ever been radiated by all galaxies across all of time. At least, all of time since stars and galaxies first came into being – which was after the dark ages that followed the release of the cosmic microwave background.

The cosmic microwave background was released around 380,000 years after the Big Bang. The dark ages may have then persisted for another 750 million years, until the first stars and the first galaxies formed.

In the current era, the cosmic microwave background is estimated to make up about sixty per cent of the photon density of all background radiation in the visible universe – the remaining forty per cent representing the EBL, that is the radiation contributed by all the stars and galaxies that have appeared since.

This gives some indication of the enormous burst of light that the cosmic microwave background represented, although it has since been red-shifted into almost invisibility over the subsequent 13.7 billion years. The EBL is dominated by optical and infrared backgrounds, the former being starlight and the latter being dust heated by that starlight which emits infrared radiation.

Just like the cosmic microwave background can tell us something about the evolution of the earlier universe, the cosmic infrared background can tell us something about the subsequent evolution of the universe – particularly about the formation of the first galaxies.

The power density of the universe's background radiation plotted over wavelength. The cosmic microwave background, though substantially red-shifted due to its age, still dominates. The remainder, extragalactic background light, is dominated by optical and infrared radiation, which have power densities several orders of magnitude higher than the remaining radiation wavelengths.

The Photodetector Array Camera and Spectrometer (PACS) Evolutionary Probe is a ‘guaranteed time’ project for the Herschel Space Observatory. Guaranteed means there is always a certain amount of telescope time dedicated to this project regardless of other priorities. The PACS Evolutionary Probe project, or just PEP, aims to survey the cosmic infrared background in the relatively dust free regions of the sky that include: the Lockman Hole; the Great Observatories Origins Deep Survey (GOODS) fields; and the Cosmic Evolution Survey (COSMOS) field.

The Herschel PEP project is collecting data to enable determination of rest frame radiation of galaxies out to a redshift of about z = 3, where you are observing galaxies when the universe was about 3 billion years old. Rest frame radiation means making an estimation of the nature of the radiation emitted by those early galaxies before their radiation was red-shifted by the intervening expansion of the universe.
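
The correction itself is the familiar (1 + z) stretch factor: light emitted at one wavelength arrives stretched by (1 + z), so the rest frame value is recovered by dividing the observed wavelength by (1 + z). A one-line example, using Herschel/PACS's 160 micrometre band purely as an illustration:

    z = 3.0
    lambda_observed_um = 160.0                      # far-infrared band, illustrative choice
    lambda_rest_um = lambda_observed_um / (1 + z)
    print(f"Observed at {lambda_observed_um} um -> emitted at {lambda_rest_um} um")   # 40 um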

The data indicate that infrared contributes around half of the total extragalactic background light. But if you just look at the current era of the local universe, infrared only contributes one third. This suggests that more infrared radiation was produced in the distant past, than in the present era.

This may be because earlier galaxies had more dust – while modern galaxies have less. For example, elliptical galaxies have almost no dust and radiate almost no infrared. However, luminous infrared galaxies (LIRGs) radiate strongly in infrared and less so in optical, presumably because they have a high dust content.

Modern era LIRGs may result from galactic mergers which provide a new supply of unbound dust to a galaxy, stimulating new star formation. Nonetheless, these may be roughly analogous to what galaxies in the early universe looked like.

Dustless, elliptical galaxies are probably the evolutionary end-point of a galactic merger, but in the absence of any new material to feed off, these galaxies just contain aging stars.

So it seems that having a growing number of elliptical galaxies in your backyard is a sign that you live in a universe that is losing its fresh, infrared flush of youth.

Further reading: Berta et al. Building the cosmic infrared background brick by brick with Herschel/PEP.