Journal Club – Black Holes Made All The Difference

According to Wikipedia, a journal club is a group of individuals who meet regularly to critically evaluate recent articles in the scientific literature. And of course, the first rule of Journal Club is… don’t talk about Journal Club.

So, without further ado – today’s article is about how turning complex theory into plain English can lead to advances in science.

Today’s article:
Schutz, B. Thoughts about a conceptual framework for relativistic gravity.

This article is a bit on the philosophical side and involves some debatable historical interpretation. For example, it is claimed that Einstein’s general relativity theory, after an initial buzz in the 1920s, sat in the obscurity of backroom physics through the 1930s and up to the mid 1950s. Indeed, as an example of the maxim that you often have to wait for someone to die before the science can move on, it is claimed that only after Einstein’s death in 1955 did something of a revival take place, which then brought relativity physics back into the mainstream.

The author Bernard Schutz can claim some authority here, since his thesis supervisor was Kip Thorne, whose thesis supervisor was John A Wheeler. Wheeler, quoting from his Wikipedia write-up, was an American theoretical physicist who was largely responsible for reviving interest in general relativity in the United States after World War II. And according to Kip Thorne’s Wikipedia write-up, Thorne is one of the world’s leading experts on the astrophysical implications of Einstein’s general theory of relativity. Bernard F Schutz’s Wikipedia write-up just says he is an American physicist, but give him time.

In the article, Einstein is claimed to be partly responsible for keeping general relativity in the boondocks by dismissing some of its more exciting implications such as black holes and gravitational waves. Instead Einstein doggedly pursued his idea of a unified field theory which led relativity science to an apparent dead end.

Wheeler was at Princeton University at the same time as Einstein and is described as a ‘late collaborator’, although much of his earlier work was in quantum physics and he was closely involved in the Manhattan project.

But Wheeler’s later work and teaching was very focused on the implications of the curvaceous space-time geometry of general relativity, which he communicated via plain English heuristic explanations of some of the wilder consequences of that geometry. For example, he was responsible for coining the term black hole as well as the term wormhole. And suddenly general relativity got sexy again. There was an explosion of papers from the 1960s on into the 1990s seeking to grapple with the concept of a black hole – which then reached a fever pitch as astronomical evidence of the existence of black holes began to come in.

Schutz’s essential hypothesis is that it was physicists schooled in quantum mechanics taking a fresh look at relativity theory that made the difference. These were physicists trained in the approach of ‘we have the math, but what does it mean?’ Suddenly people like Wheeler were back engaging in Einstein-like Gedanken (thought) experiments. This turned the math into plain English, so that non-relativist physicists suddenly got what it was about – and wanted a piece of the action.

So… comments? Was Einstein inadvertently responsible for delaying the incorporation of relativity into mainstream physics? Or is this article just about a bunch of quantum physicists trying to stake a claim in the development of ‘the other side’ of physics? It’s a story of rivalry, jealousy and curvaceous sexiness – I welcome suggestions about an even more controversial article for the next edition of Journal Club.

Journal Club – Aberrant Dark Matter

According to Wikipedia, a journal club is a group of individuals who meet regularly to critically evaluate recent articles in the scientific literature. And of course, the first rule of Journal Club is… don’t talk about Journal Club.

So, without further ado – today’s journal article is about dark matter being in the wrong place at the wrong time.

Today’s article:
Jee et al A Study of the Dark Core in A520 with Hubble Space Telescope: The Mystery Deepens.

This time, rather than someone suggesting what the next journal club article would be (like that happens), I thought I would pick a topical scientific paper mentioned in one of Universe Today’s fabulously thought-provoking stories and enlarge on that a bit.

This paper by Jee et al was mentioned in Ray Sanders’ excellent Hubble Spots Mysterious Dark Matter ‘Core’ article on 2 March 2012.

So, some might remember the Bullet Cluster – a seemingly clinching proof of dark matter, where two galactic clusters had collided in the past and what we see post-collision is that most of the mass of each cluster has passed straight through and out the other side. The only material remaining at the collision site is a huge jumbled clump of intergalactic gas.

This means that each galactic cluster, that has since moved on, has been stripped of much of its intergalactic gas. But lo and behold the seemingly empty intergalactic space within each of these stripped galactic clusters continues to distort the background field of view (a phenomenon known as weak gravitational lensing).

This seemed like strong proof that the intergalactic spaces of each cluster must be filled with gravitating, but otherwise invisible, stuff. In other words, dark matter. It makes sense that this dark matter would have passed straight through the collision site because it is only weakly interacting – whereas the gas caught up in the collision was not.

So, a cool finding – and almost identical results were found in the cluster collisions MACS J0025.4-1222, Abell 2744 and a couple of others. But now along comes Abell 520 with a complete counter-example. Two or more galaxy clusters have collided, most of the visible contents have passed straight through, but back at the collision point is an apparent big clump of invisible stuff creating weak gravitational lensing – i.e. dark matter. It is the region labelled 3 in the figure on page 5 of the article.

This finding forces us to ask whether we naively concluded that the Bullet Cluster’s post-collision appearance was easily interpretable – and that its outcome would surely be repeated in any equivalent collision of galaxy clusters.

But in the wake of Abell 520 we may now need to accept that the outcome of a collision between rapidly moving and utterly gargantuan collections of mass is much more complex and unpredictable than we had initially assumed. This doesn’t mean that the dark matter hypothesis has been debunked – it just means that the Bullet Cluster might not have been the clinching proof that we thought it was.

If we subsequently find fifty new Bullet Cluster analogues and no more Abell 520 analogues, we might then assume that Abell 520 is just a weird outlier, which can be dismissed as an unrepresentative anomaly. But with only five or six such collision types known, one of which is Abell 520 – we can’t really call it an outlier at the moment.

So… comments? The authors offer six possible scenarios to explain this finding – got a seventh? Did we jump to conclusions with the Bullet Cluster? Could suggestions for an article for the next edition of Journal Club represent a form of negative energy?

Journal Club – Shaping The Invisible

According to Wikipedia, a journal club is a group of individuals who meet regularly to critically evaluate recent articles in the scientific literature. And of course, the first rule of Journal Club is… don’t talk about Journal Club.

So, without further ado – today’s journal article is about dark matter and how to determine where it is and how dense it is – although still without actually seeing it.

Today’s article:
Chae et al Dark matter density profiles of the halos embedding early-type galaxies: characterizing halo contraction and dark matter annihilation strength.

We can see how the gravitational influence of invisible dark matter is affecting the general morphology of a galaxy and the motion of the stars within that galaxy. These factors can then hint at where the dark matter is and how dense it is.

Traditional thinking positions dark matter in a halo shape around a galaxy – meaning more of it is outward than inward – which helps explain why visible objects in the outer rim of a galaxy seem to orbit the galactic center at about the same speed as inner visible objects. This is contrary to our local Keplerian understanding of orbital mechanics, where close-in Mercury orbits the Sun (which contains over 99% of the solar system’s mass) in 88 days while distant Neptune takes a leisurely 165 years.

We assume galaxies’ relatively flat rotation curves are a result of each galaxy’s total mass (visible and dark) being distributed throughout its structure, rather than concentrated at its center.
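
To make that contrast concrete, here is a minimal Python sketch (using round numbers, not anything from the paper) that checks the Keplerian periods quoted above and then shows why a mass distribution that keeps growing with radius gives a roughly constant orbital speed rather than a Keplerian fall-off.

```python
import numpy as np

# Work in units of AU, years and solar masses, where G = 4 * pi^2 and
# Kepler's third law reduces to T^2 = a^3 / M.
G = 4 * np.pi**2

def keplerian_period(a_au, central_mass=1.0):
    """Orbital period (years) around a central point mass (solar masses)."""
    return np.sqrt(a_au**3 / central_mass)

print(keplerian_period(0.387) * 365.25)   # Mercury: ~88 days
print(keplerian_period(30.07))            # Neptune: ~165 years

# By contrast, if the enclosed mass grows roughly linearly with radius,
# M(r) ~ r (a toy stand-in for a galaxy embedded in an extended halo),
# the circular speed v = sqrt(G * M(r) / r) comes out the same at every
# radius -- a flat rotation curve rather than a Keplerian fall-off.
for r in [5.0, 10.0, 20.0]:               # arbitrary radii
    M_enclosed = 1.0 * r                  # toy linear mass profile
    print(np.sqrt(G * M_enclosed / r))    # constant: 2 * pi in these units
```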

The authors use the term ‘early-type’ galaxy to describe their target population for this research. ‘Early-type’ seems like unnecessary jargon – it refers to the Hubble sequence, about which Hubble explained at some length that he was just putting galaxies in a sequence for ease of classification and did not mean to imply any temporal ordering.

As it happens, our modern understanding is that these ‘early’ types, the elliptical and lenticular galaxies, are actually some of the oldest galaxy forms around. Young galaxies tend to be bright spirals. Over time, these spirals either fade, so you no longer see their spiral arms (lenticulars), or they collide with other galaxies and their ageing stars get jumbled up into random orbits to form big, blobby shapes (ellipticals).

So everywhere you see ‘early-type’ in this article, you can substitute elliptical and lenticular. Jargon stops the general reader from following a specialist writer’s meaning – and you don’t have to write this way to be a scientist.

Anyhow, the researchers conducted a statistical analysis of the estimated stellar mass values and velocity dispersions of star populations within different elliptical and lenticular galaxies. Their objective was to try and get a fix on the distribution of the invisible dark matter that we think all galaxies contain.
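
As an aside, and purely to illustrate the underlying principle rather than the paper’s actual statistical machinery, here is a minimal sketch of how a velocity dispersion and a galaxy size can be turned into a rough dynamical mass and compared with a stellar-mass estimate – the shortfall being what gets attributed to dark matter. The numbers and the coefficient k below are assumptions for illustration only.

```python
G = 4.301e-6  # gravitational constant in kpc * (km/s)^2 per solar mass

def dynamical_mass(sigma_kms, r_eff_kpc, k=5.0):
    """Very rough virial-style estimate M ~ k * sigma^2 * R_e / G.
    The coefficient k depends on the galaxy's structure; 5 is just a
    ballpark choice used here for illustration."""
    return k * sigma_kms**2 * r_eff_kpc / G

# Hypothetical elliptical: velocity dispersion 200 km/s, effective radius 5 kpc
M_dyn = dynamical_mass(200.0, 5.0)   # ~2.3e11 solar masses
M_stellar = 1.0e11                   # hypothetical stellar-mass estimate
print(f"dynamical mass ~ {M_dyn:.1e} Msun, stellar mass ~ {M_stellar:.1e} Msun")
print(f"implied dark (non-stellar) fraction ~ {1 - M_stellar / M_dyn:.0%}")
```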

Their analysis found that dark matter was more concentrated towards the centers of elliptical and lenticular galaxies – and the authors conclude that nearby elliptical and lenticular galaxies might hence be ideal candidates for the identification of gamma ray output from dark matter annihilation.

The last suggestion seems a bit of an intellectual leap. There have been a few reported observations of radiation output of uncertain origin from the centers of galaxies. Dark matter annihilation has been one suggested cause – but you’d think there’s a lot of stuff going on in the center of a galaxy that could offer an alternate explanation.

I could not find in the paper any suggestions as to why ‘halo contraction’ (presumably jargon for ‘dark matter concentration’) occurs in these galaxy types more often than others – which seemed the more obvious point to offer speculation on.

So… comments? Why, when knowing diddly-squat about the particle nature of dark matter, should we assume it possesses the ability to self-annihilate? Is ‘early-type’ unnecessary jargon or entrenched terminology? Is the question ‘does anyone want to suggest an article for the next edition of Journal Club’ just rhetorical?

Journal Club – Theory constraint

According to Wikipedia, a journal club is a group of individuals who meet regularly to critically evaluate recent articles in the scientific literature. And of course, the first rule of Journal Club is… don’t talk about Journal Club.

So, without further ado – today’s journal article is about how new data are limiting the theoretical options available to explain the observed accelerating expansion of the universe.

Today’s article:
Zhang et al Testing modified gravity models with recent cosmological observations.

Theorists can develop some pretty ‘out there’ ideas when using limited data sets. But with the advent of new technologies, or new ways to measure things, or even new things to measure – new data becomes available that then constrains the capacity of various theories to explain what we have measured.

Mind you, when new data conflict with theory, the first question should always be whether the theory is wrong or the data are wrong – and it may take some time to decide which. A case in point is the Gran Sasso faster-than-light neutrino data. This finding conflicts with a range of well-established theories which explain a mountain of other data very well. But to confirm that the neutrino data are wrong, we will need to reproduce the test – perhaps with different equipment under different conditions. This might establish an appropriate level of confidence that the data really are wrong – or otherwise that we need to revise the entire theoretical framework of modern physics.

Zhang et al seek to replicate this sort of evidence-based thinking using Bayesian and also Akaike statistics to test whether the latest available data on the expansion of the universe alters the likelihood of existing theories being able to explain that expansion.

These latest available data include:

  • the SNLS3 SN1a data set (of 472 Type 1a supernovae);
  • the Wilkinson Microwave Anisotropy Probe (WMAP) seven-year observations;
  • baryon acoustic oscillation results from the Sloan Digital Sky Survey Data Release 7; and
  • the latest Hubble constant measurements from the Wide Field Camera 3 on the Hubble Space Telescope.

The authors run a type of chi-squared analysis to see how the standard Lambda Cold Dark Matter (Lambda CDM) model and a total of five different modified gravity (MG) models fit against both the earlier data and now this latest data. Or in their words, ‘we constrain the parameter space of these MG models and compare them with the Lambda CDM model’.
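
For anyone unfamiliar with this style of model comparison, here is a minimal sketch of how best-fit chi-squared values feed into the Akaike and Bayesian information criteria the authors refer to. The numbers below are invented for illustration and are not values from Zhang et al.

```python
import numpy as np

def aic(chi2_min, k):
    """Akaike information criterion for a Gaussian likelihood: chi2_min + 2k,
    where k is the number of free parameters in the model."""
    return chi2_min + 2 * k

def bic(chi2_min, k, n):
    """Bayesian information criterion: chi2_min + k * ln(n), where n is the
    number of data points."""
    return chi2_min + k * np.log(n)

# Purely illustrative numbers -- not the values reported in the paper.
n_data = 500
models = {"LambdaCDM": (520.0, 2), "toy MG model": (530.0, 3)}
for name, (chi2_min, k) in models.items():
    print(f"{name}: AIC = {aic(chi2_min, k):.1f}, BIC = {bic(chi2_min, k, n_data):.1f}")
# The model with the smaller AIC/BIC is preferred; differences of about six
# or more are conventionally read as fairly strong evidence against the
# weaker model.
```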

It turns out that the latest data fit the Lambda CDM model best, fit less well with most MG models, and leave at least one of the MG models ‘strongly disfavored’.

They caveat their findings by noting that this analysis only indicates how things stand currently and yet more new data may change the picture again.

And not surprisingly, the paper concludes by determining that what we really need is more new data. Amen to that.

So… comments? Are Bayesian statistics just a fad or a genuinely smarter way to test a hypothesis? Are the first two paragraphs of the paper’s introduction confusing – since Lambda is traditionally placed on ‘the left side of the Einstein equation’? Does anyone feel constrained to suggest an article for the next edition of Journal Club?

Journal Club – Neutrino Vision

According to Wikipedia, a journal club is a group of individuals who meet regularly to critically evaluate recent articles in the scientific literature. And of course, the first rule of Journal Club is… don’t talk about Journal Club.

So, without further ado – today’s journal article is about the latest findings in neutrino astronomy.

Today’s article:
Gaisser Astrophysical neutrino results.

This paper presents some recent observations from the IceCube neutrino telescope at the South Pole – which actually observes neutrinos from the northern sky – using the Earth to filter out some of the background noise. Cool, huh?

Firstly, a quick recap of neutrino physics. Neutrinos are sub-atomic particles of the lepton variety and are essentially neutrally charged versions of the other leptons – electrons, muons and taus – which all have a negative charge. So, we say that neutrinos come in three flavours – electron neutrinos, muon neutrinos and tau neutrinos.

Neutrinos were initially proposed by Pauli (a proposal later refined by Fermi) to explain how energy could be transported away from a system undergoing beta decay. When solar fusion began to be understood from the 1930s onward, the role of neutrinos became problematic, since only about a third of the neutrinos predicted to be produced by fusion in the Sun could actually be detected – an issue which became known as the solar neutrino problem in the 1960s.

The solar neutrino problem was only resolved when the idea of three oscillating neutrino flavours gained wide acceptance in the late 1990s and all three flavours were finally detected coming from the Sun in 2001 – confirming that solar neutrinos in transit actually oscillate between the three flavours (electron, muon and tau). This means that if your detector is set up to detect only one flavour, you will detect only about one third of all the neutrinos coming from the Sun.

Ten years later, the IceCube neutrino observatory is using our improved understanding of neutrinos to try to detect high energy neutrinos of extragalactic origin. The first challenge is to distinguish atmospheric neutrinos (produced in abundance as cosmic rays strike the atmosphere) from astrophysical neutrinos.

Using what we have learnt from solving the solar neutrino problem, we can be confident that any neutrinos from distant sources have had time to oscillate – and hence should arrive at Earth in approximately equal flavour ratios. Atmospheric neutrinos produced from nearby sources (also known as ‘prompt’ neutrinos) don’t have time to oscillate before being detected.
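
As a rough illustration of that ‘approximately equal ratios’ point, here is a short Python sketch using an approximate (tri-bimaximal) mixing matrix and the textbook 1:2:0 flavour ratio expected from pion and muon decay at a distant source – both standard approximations, not anything taken from Gaisser’s paper.

```python
import numpy as np

# |U|^2 for (approximate) tri-bimaximal mixing: rows are e, mu, tau flavours,
# columns are the three neutrino mass states.
U2 = np.array([[2/3, 1/3, 0.0],
               [1/6, 1/3, 1/2],
               [1/6, 1/3, 1/2]])

# Oscillation-averaged transition probabilities over a long baseline:
# P(alpha -> beta) = sum_i |U_alpha_i|^2 * |U_beta_i|^2
P = U2 @ U2.T

source_ratio = np.array([1/3, 2/3, 0.0])   # e : mu : tau = 1 : 2 : 0 at the source
print(source_ratio @ P)                    # -> roughly [0.333, 0.333, 0.333] at Earth
```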

When looking for point sources of high energy astrophysical neutrinos, IceCube is most sensitive to muon neutrinos – which are detected when the neutrino weakly interacts with an ice molecule – emitting a muon. A high energy muon will then generate Cherenkov radiation – which is what IceCube actually detects. Unfortunately muon neutrinos are also the most common source of cosmic ray induced atmospheric neutrinos, but we are steadily getting better at determining what energy levels represent astrophysical rather than atmospheric neutrinos.
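
As a back-of-the-envelope aside, the Cherenkov condition sets a surprisingly low bar for the muon. A minimal sketch, assuming a refractive index for ice of about 1.31:

```python
import math

n_ice = 1.31            # approximate optical refractive index of ice
m_mu_MeV = 105.66       # muon rest energy in MeV

# Cherenkov light is emitted while the muon travels faster than light does
# in the ice, i.e. beta > 1/n, which translates into a minimum total energy.
beta_min = 1.0 / n_ice
gamma_min = 1.0 / math.sqrt(1.0 - beta_min**2)
print(gamma_min * m_mu_MeV)   # ~160 MeV

# The TeV-scale muons IceCube is interested in sit far above this threshold,
# radiating Cherenkov light along long tracks through the instrumented ice.
```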

So, it’s still early days with this technology – with much of the effort going in to learning how to observe, rather than just observing. But maybe one day we will be observing the cosmic neutrino background – and hence the first second of the Big Bang. One day…

So… comments? Are neutrinos the fundamentally weirdest fundamental particle out there? Could IceCube be used to test the faster-than-light neutrino hypothesis? Want to suggest an article for the next edition of Journal Club?

Journal Club – Transit of Venus

According to Wikipedia, a journal club is a group of individuals who meet regularly to critically evaluate recent articles in the scientific literature. Being Universe Today, if we occasionally stray into critically evaluating each other’s critical evaluations, that’s OK too. And of course, the first rule of Journal Club is… don’t talk about Journal Club.

So, without further ado – today’s journal article under the microscope is about the 2012 transit of Venus.

Today’s article:
Sigismondi Solar diameter with 2012 Venus transit.

The 2012 transit of Venus will proceed for nearly 7 hours over 5 and 6 June (UTC). It’s not likely that we are going to squeeze a huge amount of ground-breaking science out of this event, which was closely monitored by 21st century technology the last time it happened in 2004. But Sigismondi argues that a more exacting observation of this transit should enable us to clean up some of the historical data from previous transits by kind of reverse-engineering some of the inherent inaccuracies that plagued earlier measurements.

The point of such an exercise may become clear by considering a claim made back in 1979 that the Sun was shrinking – based on an analysis of 120 years of Greenwich Observatory solar measurement data. Apparently this finding has since been hijacked to support a young Earth hypothesis – as in if the Sun is shrinking so fast, then how can it possibly be billions or even millions of years old and yada, yada.

Shapiro was able to quickly counter the shrinking-Sun finding in a 1980 publication (in Science), demonstrating that transit of Mercury data going back to 1736 indicated that the solar diameter had remained constant to within 0.3 arcseconds. This was then followed up by Parkinson et al, also in a 1980 publication (in Nature), demonstrating that changes in the Greenwich solar data correlated closely with changes in instrumentation, atmospheric conditions and the people taking the measurements (and thanks to Matt Tiscareno for this story).

Anyhow, Sigismondi outlines how the solar diameter can be measured from the timing of Venus’ contacts with each edge of the Sun – and then discusses a method whereby the ‘notorious’ black drop effect can be eliminated. The black drop effect involves the black disc of Venus seeming to elongate as it approaches the edge of the Sun – an effect which confounded all measurements taken prior to 2004.

Transits of Venus generally happen in pairs separated by 8 years, with either 105.5 or 121.5 years separating the last of the pair and the first of the next. Apparently Kepler was the first person to predict a transit of Venus in 1631 – but he failed to predict that it would not be visible from Europe. So it fell to Jeremiah Horrocks and William Crabtree to make the first scientific observation of a transit 8 years later in December 1639. The next two were in June 1761 and June 1769, the latter famously observed from Tahiti by Lieutenant James T Cook (OK, kidding about the T) and then there were two more in December 1874 and December 1882.
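
That recurrence pattern is easy to check. A minimal sketch that regenerates the dates above from the 8 / 121.5 / 8 / 105.5 year cycle, starting from the (December) 1631 transit Kepler predicted:

```python
# Transits come in pairs 8 years apart, with alternating gaps of 121.5 and
# 105.5 years between pairs. In decimal years, .4 is roughly June and .9 roughly December.
year = 1631.9                     # the December 1631 transit predicted by Kepler
for gap in [8, 121.5, 8, 105.5, 8, 121.5, 8]:
    year += gap
    print(round(year, 1))
# -> 1639.9, 1761.4, 1769.4, 1874.9, 1882.9, 2004.4, 2012.4
```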

Then another 121.5 years passed until June 2004 – now to be followed by this year’s June 2012 transit, being the 7th ever scientifically recorded transit. And BTW here’s an original drawing by James Cook of the June 1769 transit, showing the black drop effect.

So… comments? Is it OK to get a little bit excited about ‘just another’ transit of Venus – since it’s only the 7th we have ever recorded data about? Did you know that the plural of ephemeris (the position of something in the sky) is ephemerides? Want to suggest an article for the next edition of Journal Club?

Journal Club – When White Dwarfs Collide

According to Wikipedia, a journal club is a group of individuals who meet regularly to critically evaluate recent articles in the scientific literature. Being Universe Today, if we occasionally stray into critically evaluating each other’s critical evaluations, that’s OK too. And of course, the first rule of Journal Club is… don’t talk about Journal Club.

So, without further ado – today’s scheduled-for-demolition journal article is about the ongoing problem of figuring out what events precede a Type 1a supernova.

Today’s article:
Dan et al How the merger of two white dwarfs depends on their mass ratio: orbital stability and detonations at contact.

There is growing interest about the nature of the events that precede Type 1a supernovae. We are confident that the progenitor stars of Type 1a supernovae are white dwarfs – but these stars have generally very long lives, making it difficult to identify stars that are potentially on the brink of exploding.

We are also confident that something happens to cause a white dwarf to accumulate extra mass until it reaches its Chandrasekhar limit (around 1.4 solar masses, depending on the star’s spin).
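
For the curious, that ~1.4 solar mass figure drops out of the standard (non-rotating) Chandrasekhar mass formula. A minimal sketch, assuming a carbon/oxygen white dwarf with two nucleons per electron:

```python
import math

# Physical constants (SI units)
hbar  = 1.0546e-34    # J s
c     = 2.9979e8      # m / s
G     = 6.674e-11     # m^3 / (kg s^2)
m_H   = 1.6726e-27    # kg, mass of a hydrogen atom (proton)
M_sun = 1.989e30      # kg

mu_e   = 2.0          # nucleons per electron for a carbon/oxygen white dwarf
omega3 = 2.018        # Lane-Emden constant for an n = 3 polytrope

M_ch = (omega3 * math.sqrt(3 * math.pi) / 2
        * (hbar * c / G)**1.5 / (mu_e * m_H)**2)
print(M_ch / M_sun)   # ~1.4 solar masses
```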

For a long time, it had been assumed that a Type 1a supernova probably arose from a binary star system with a white dwarf and another star that had just evolved into a red giant, its outer layers swelling out into the gravitational influence of the white dwarf. This new material was accreted onto the white dwarf until it hit its Chandrasekhar limit – and then kabloowie.

However, the white-dwarf-red-giant-binary hypothesis is currently falling out of favour. It has always had the problem that any Type 1 supernova has, by definition, almost no hydrogen absorption lines in its light spectrum – which makes sense for a Type 1a supernova arising from a hydrogen-depleted white dwarf – but then what happened to the new material supposedly donated by a red giant partner (which should have been mostly hydrogen)?

Also, the recently discovered Type 1a SN2011fe was observed just as its explosion was commencing, allowing constraints to be placed on the nature of its progenitor system. Apparently there is no way the system could have included something as big as a red giant and so the next most likely cause is the merging (or collision) of two white dwarfs.

Other modelling research has also concluded that the two white dwarf merger scenario may be statistically more likely than the red giant accretion scenario – since the latter requires a lot of Goldilocks parameters (where everything has to be just right for a Type 1a to eventuate).

This latest paper expands the possible scenarios under which a two white dwarf merger could produce a Type 1a supernova – and finds a surprising number of variations with respect to mass, chemistry and the orbital proximities of each star. Of course, it is just modelling, but it does challenge the current assertion at the relevant Wikipedia entry that white dwarf mergers are a second possible, but much less likely, mechanism for Type 1a supernova formation.

So – comments? Anyone want to defend the old red-giant-white-dwarf scenario? Does computer modelling count as a form of evidence? Want to suggest an article for the next edition of Journal Club?

Journal Club: Dark Matter – The Early Years

According to Wikipedia, a journal club is a group of individuals who meet regularly to critically evaluate recent articles in the scientific literature. Being Universe Today, if we occasionally stray into critically evaluating each other’s critical evaluations, that’s OK too. And of course, the first rule of Journal Club is… don’t talk about Journal Club.

So, without further ado – today’s journal article on the dissection table is about using our limited understanding of dark matter to attempt to visualise the cosmic web of the very early universe.

Today’s article:
Visbal et al The Grand Cosmic Web of the First Stars.

So… dark matter, pretty strange stuff huh? You can’t see it – which presumably means it’s transparent. Indeed it seems to be incapable of absorbing or otherwise interacting with light of any wavelength. So dark matter’s presence in the early universe should make it readily distinguishable from conventional matter – which does interact with light and so would have been heated, ionised and pushed around by the radiation pressure of the first stars.

This fundamental difference may lead to a way to visualise the early universe. To recap those early years: first there was the Big Bang, then three minutes later the first atomic nuclei formed, then 380,000 years later the first stable atoms formed. What follows from there is the so-called dark ages – until the first stars began to form from the clumping of cooled hydrogen. And according to the current standard model of Lambda Cold Dark Matter, this clumping primarily took place within gravity wells created by cold (i.e. slow-moving) dark matter.

This period is known as the reionization era, since the radiation of these first stars reheated the intergalactic hydrogen medium and hence re-ionized it (back into a collection of H+ ions and unbound electrons).

While this is all well-established cosmological lore, it is also the case that the radiation of the first stars would have applied a substantial radiation pressure on that early, dense intergalactic medium.

So, the early intergalactic medium would not only be expanding along with the expansion of the universe, it would also be pushed outwards by the radiation of the first stars – meaning there should be a relative velocity difference between that medium and the dark matter of the early universe, since the dark matter would be immune to any radiation pressure effects.

To visualize this relative velocity difference, we can look for neutral hydrogen emissions at a rest wavelength of 21 cm – further red-shifted by cosmic expansion, so that the signals arrive well into the radio spectrum. Radio astronomy observations at these wavelengths offer a window onto the distribution of the very first stars and galaxies – since these are the source of the first ionising radiation, which differentiates the dark matter scaffolding (i.e. the gravity wells that support star and galaxy formation) from the remaining reionized intergalactic medium. And so you get the first signs of the cosmic web when the universe was only 200 million years old.
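
To put rough numbers on that (the redshift here is an assumption, roughly what standard cosmology gives for a universe about 200 million years old):

```python
# The 21 cm hyperfine line of neutral hydrogen is emitted at 1420.4 MHz.
# At an age of ~200 million years the redshift is very roughly z ~ 20,
# so the signal arrives stretched well into the low-frequency radio band.
rest_freq_MHz = 1420.4
rest_wavelength_cm = 21.1
z = 20                                               # assumed redshift

print(rest_freq_MHz / (1 + z))                       # ~68 MHz
print(rest_wavelength_cm * (1 + z) / 100, "metres")  # ~4.4 m
# Low-frequency arrays like LOFAR are built for exactly this part of the spectrum.
```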

Higher resolution views of this early cosmic web of primeval stars, galaxies and galactic clusters are becoming visible through high resolution radio astronomy instruments such as LOFAR – and hopefully one day in the not-too-distant future, the Square Kilometre Array – which will enable visualisation of the early universe in unprecedented detail.

So – comments? Does this fascinating observation of 21 cm absorption lines somehow lack the punch of a pretty Hubble Space Telescope image? Is radio astronomy just not sexy? Want to suggest an article for the next edition of Journal Club?

Journal Club: On Nothing

According to Wikipedia, a journal club is a group of individuals who meet regularly to critically evaluate recent articles in the scientific literature. Being Universe Today, if we occasionally stray into critically evaluating each other’s critical evaluations, that’s OK too. And of course, the first rule of Journal Club is… don’t talk about Journal Club.

So, without further ado – today’s journal article under the spotlight is about nothing.

The premise of the article is that to define nothing we need to look beyond a simple vacuum and think of nothing in terms of what there was before the Big Bang – i.e. really nothing.

For example, you can have a bubble of nothing (no topology, no geometry), a bubble of next to nothing (topology, but no geometry) or a bubble of something (which has topology, geometry and most importantly volume). The universe is a good example of a bubble of something.

The paper walks the reader through a train of logic which ends by defining nothing as ‘anti-de Sitter space as the curvature length approaches zero’. De Sitter space is essentially a ‘vacuum solution’ of Einstein’s field equations – that is, a mathematically modelled universe with a positive cosmological constant. So it expands at an accelerating rate even though it is an empty vacuum. Anti-de Sitter space is a vacuum solution with a negative cosmological constant – so it’s shrinking inward even though it is an empty vacuum. And as its curvature length approaches zero, you get nothing.
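
For reference, the ‘vacuum solution’ being described is just the Einstein field equations with no matter or energy content, leaving only the cosmological constant term – the sign of Lambda then decides between de Sitter and anti-de Sitter space:

```latex
% Vacuum Einstein field equations with a cosmological constant:
\[
  R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda\, g_{\mu\nu} = 0
\]
% \Lambda > 0  ->  de Sitter space (empty, but expanding at an accelerating rate)
% \Lambda < 0  ->  anti-de Sitter space (empty, but contracting inward)
```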

Having so defined nothing, the authors then explore how you might get a universe to spontaneously arise from that nothing – and nope, apparently it can’t be done. Although there are various ways to enable ‘tunnelling’ that can produce quantum fluctuations within an apparent vacuum – you can’t ‘up-tunnel’ from nothing (or at least you can’t up-tunnel from ‘anti-de Sitter space as the curvature length approaches zero’).

The paper acknowledges this is obviously a problem, since here we are. By way of explanation, the authors suggest:

  • that we might get past the problem by appealing to immeasurable extra dimensions (a common strategy in theoretical physics for explaining impossible things without anyone being able to easily prove or disprove it);
  • that their definition of nothing is just plain wrong; or
  • that they (and we) are just not asking the right questions.

Clearly the third explanation is the authors’ favoured one, as they end with the statement: ‘One thing seems clear… to truly understand everything, we must first understand nothing’. Nice.

So – comments? Is appealing to extra dimensions just a way of dodging a need for evidence? Nothing to declare? Want to suggest an article for the next edition of Journal Club?

Today’s article:
Brown and Dahlen On Nothing.

Journal Club – This new Chi b (3P) thingy

According to Wikipedia, a Journal Club is a group of individuals who meet regularly to critically evaluate recent articles in the scientific literature. Since this is Universe Today, if we occasionally stray into critically evaluating each other’s critical evaluations, that’s OK too.

And of course, the first rule of Journal Club is… don’t talk about Journal Club. So, without further ado – today’s journal article is about a new addition to the Standard Model of fundamental particles.

The good folk at the CERN Large Hadron Collider finished off 2011 with some vague murmurings about the Higgs Boson – which might have been kind-of sort-of discovered in the data already, but due to the degree of statistical noise around it, no-one’s willing to call it really found yet.

Since there is probably a Nobel prize in it – this seems like a good decision. It is likely that a one-way-or-the-other conclusion will be possible around this time next year – either because collisions to be run over 2012 reveal some critical new data, or because someone sifting through the mountain of data already produced will finally nail it.

But in the meantime, they did find something in 2011. There is a confirmed Observation of a new chi_b state in radiative transitions to Upsilon(1S) and Upsilon(2S) at the ATLAS experiment – or, in a nutshell… we hit Bottomonium.

In the lexicon of sub-atomic particle physics, the term Quarkonium is used to describe a particle whose constituents comprise a quark and its own anti-quark. So for example you can have Charmonium (a charm quark and a charm anti-quark) and you can have Bottomonium (a bottom quark and a bottom anti-quark).

The new Chi b (3P) particle has been reported as a boson – which is technically correct, since it has integer spin, while fermions (such as quarks and leptons) have half-integer spins. But it’s not an elementary boson like photons, gluons or the (theoretical) Higgs – it’s a composite boson composed of quarks. So, it is perhaps less confusing to consider it a meson (which is a bosonic hadron). Like other mesons, Chi b (3P) is a hadron that would not be commonly found in nature. It just appears briefly in particle accelerator collisions before it decays.

So comments? Has the significance of this new finding been muted because the discoverers thought it would just prompt a lot of bottom jokes? Is Chi_b (3P) the ‘Claytons Higgs’ (the boson you have when you’re not having a Higgs)? Want to suggest an article for the next edition of Journal Club?

Otherwise, have a great 2012.

Today’s article:
The ATLAS collaboration Observation of a new chi_b state in radiative transitions to Upsilon(1S) and Upsilon(2S) at the ATLAS experiment.