Repaired Too Late? Tevatron May Beat LHC in Hunt for Higgs Boson

The CDF detector at Fermilab's Tevatron accelerator (Fermilab)

The Large Hadron Collider (LHC) is billed as the next great particle accelerator that will give us our best chance yet at discovering the elusive exchange particle (or boson) of the Higgs field. The discovery (or not) of the Higgs boson will answer so many questions about our universe, and our understanding of the quantum world could be revolutionized.

But there’s a problem. The LHC isn’t scheduled for restart until September 2009 (a full year after the last attempt) and particle collisions aren’t expected until October. Even then, high energy collisions won’t be likely until 2010, leaving the field wide open for competing accelerator facilities to redouble their efforts at making this historic discovery before the LHC goes online.

The Tevatron, at Fermi National Accelerator Laboratory (Fermilab) in Illinois, is currently the most powerful accelerator in the world and has refined high energy particle collisions so much that scientists are estimating there is a 50% chance of a Higgs boson discovery by the end of 2009…

If this were a USA vs. Europe competition to discover the Higgs particle, the Tevatron would have a clear advantage. Although it’s old (the first configuration was completed in 1984) and set to be superseded by the LHC in 2010, the Tevatron is a proven particle accelerator with an impressive track record. Accelerator techniques and technology have been refined, making high energy hadron collisions routine. However, Fermilab scientists are keen to emphasise that they aren’t trying to beat the LHC in the search for the Higgs boson.

“We’re not racing CERN,” said Fermilab Director Pier Oddone. He points out that there is a lot of collaborative work between Fermilab and CERN, so all scientists, no matter which continent they are on, are working toward a common goal. In reality, I doubt this is the case. When searching for one of the most coveted prizes in modern quantum physics, it’s more a case of ‘every lab for itself.’ Scientists at Fermilab have all but confirmed this, saying they are “working their tails off” analysing data from the Tevatron.

“Indirectly, we’re helping them,” says Dmitri Denisov, spokesman for DZero (one of the Tevatron’s detectors), of his European competition. “They’re definitely feeling the heat and working a little harder.”

For the Standard Model to be complete, the Higgs particle must be found. If it does exist, physicists have put upper and lower bounds on its possible mass: somewhere between 114 and 184 GeV, well within the sensitivity of the Tevatron detectors. Physicists have calculated that the Higgs particle can be created in the Tevatron’s high-energy proton-antiproton collisions, so its discovery should be only a matter of time. They even give the Tevatron a 50:50 chance of a Higgs particle discovery by the New Year.

Last summer, both key particle experiments (CDF and DZero) focused on detecting Higgs particles with a mass of 170 GeV (at this value, a particle would be easier to distinguish from the background noise). However, no Higgs particles were detected. Now physicists will expand the search above and below this value. Therefore, if the Higgs boson exists, it would be convenient if it had a mass as close as possible to 170 GeV. Estimates suggest a 150 GeV Higgs boson could be discovered as early as this summer, well before the LHC has even been repaired. If the mass of the Higgs boson is around the 120 GeV mark, it might take Tevatron scientists until 2010 to verify whether a Higgs boson has been detected.

Source: New Scientist

Stellar Jets Are Born Knotted

Herbig Haro object HH47 (a stellar jet), observed with the Hubble Space Telescope

Some of the most beautiful structures observed in the Universe are the intricate jets of supersonic material speeding away from accreting stars, such as young proto-stars and stellar mass black holes. These jets are composed of highly collimated gas, rapidly accelerated and ejected from circumstellar accretion disks. The in-falling gas from the disks, usually feeding the black hole or hungry young star, is somehow redirected and blown into the interstellar medium (ISM).

Much work is being done to understand how accretion disk material is turned into a rapid outflow, forming an often knotted, clumpy cloud of outflowing gas. The general idea was that the stellar jet is ejected in a steady flow (like a fire hose), only for it to interact with the surrounding ISM, breaking up as it does so. However, a unique collaboration between plasma physicists, astronomers and computational scientists may have uncovered the true nature of these knotted structures. The jets didn’t become knotted; they were born that way…

“The predominant theory says that jets are essentially fire hoses that shoot out matter in a steady stream, and the stream breaks up as it collides with gas and dust in space—but that doesn’t appear to be so after all,” said Adam Frank, professor of astrophysics at the University of Rochester and co-author of the recent publication. According to Frank, the exciting results uncovered by the international collaboration suggest that, far from being a steady stream of gas ejected from the circumstellar accretion disk, the jets are “fired out more like bullets or buckshot.” It is therefore little wonder that the vast stellar jets appear twisted, knotted and highly structured.

Collaboration member Professor Sergey Lebedev and his team at Imperial College London attempted to replicate the physics of a star in the laboratory, and the experiment matched the known physics of stellar jets very well. The pioneering work by Lebedev is being lauded as possibly the “best” astrophysical experiment ever carried out.

Lebedev applied a high-powered pulse of energy to an aluminium disk. Within the first few billionths of a second, the aluminium began to evaporate, generating a small cloud of plasma. This plasma became an accretion disk analogue, a microscopic equivalent of the plasma being dragged into a proto-star. In the centre of the disk, the aluminium eroded completely, creating a hole through which a magnetic field, applied from below the disk, could penetrate.

It would appear that the dynamics of the magnetic field interacting with the plasma accurately depict the observed characteristics of extended stellar jets. At first, the magnetic field pushes the plasma aside around the disk’s hole, but its structure evolves by creating a bubble, then twisting and warping, forming a knot in the plasma jet. Then a very important event occurs: the initial magnetic “bubble” pinches off and is propelled away, and another magnetic bubble forms to continue the process all over again. These dynamics cause packets of plasma to be released in bursts, not in the steady, classical “fire hose” manner.

“We can see these beautiful jets in space, but we have no way to see what the magnetic fields look like,” says Frank. “I can’t go out and stick probes in a star, but here we can get some idea—and it looks like the field is a weird, tangled mess.”

By shrinking this cosmic phenomenon into a laboratory experiment, the investigators have shed some light on the possible mechanism driving the structure of stellar jets. It appears that magnetic processes, not ISM interactions, shape the knotted structure of stellar jets when they are born, not after they have evolved.

Source: EurekAlert

The Journey of Space Exploration: Ex-Astronaut Views on NASA

Why has "one small step for man" turned into "one giant leap backward" for NASA? (NASA)

It reads like the annual progress report from my first year in university. He lacks direction, he’s not motivated and he has filled his time with extra-curricular activities, causing a lack of concentration in lectures. However, it shouldn’t read like an 18-year-old’s passage through the first year of freedom; it should read like a successful, optimistic and inspirational prediction about NASA’s future in space.

What am I referring to? It turns out that the Houston university where President John F. Kennedy gave his historic “We choose to go to the Moon” speech back in 1962 has commissioned a report recommending that NASA give up its quest to return to the Moon and focus more on environmental and energy projects. The reactions of several astronauts from the Mercury, Apollo and Shuttle eras have now been published. The conclusions in the Rice University report may have been controversial, but the reactions of the six ex-astronauts went well beyond that. They summed up the concern and frustration they feel for a space agency they once risked their lives for.

At the end of the day, it all comes down to how we interpret the importance of space exploration. Is it an unnecessary expense, or is it part of scientific endeavour where the technological spin-offs are more important than we think?

John F. Kennedy speaking at Rice University in 1962. How times have changed (NASA)
The article published on the Houston Chronicle website (Chron.com) talks about the “surprising reactions” of the six former astronauts questioned about the recommendation for NASA from Rice University’s James A. Baker III Institute for Public Policy. However, I’d argue that much of what they say is not surprising in the slightest. These men and women were active in the US space agency during some of the most profound and exciting times in spaceflight history; it is little wonder that they may be a little exasperated by the spaceflight problems currently besieging NASA. The suggestion that NASA should give up the Moon for more terrestrial pursuits is a tough pill to swallow, especially for these pioneers of spaceflight.

It is widely accepted that NASA is underfunded, mismanaged and falling short of its promises. Many would argue that this is a symptom of an old, cumbersome government department that has lost its way. Whether this is down to institutional failings, lack of investment or loss of vision, something isn’t right, and the situation is getting worse: we are now faced with a five-year gap in US manned spaceflight capability, forcing NASA to buy Russian Soyuz flights. The Shuttle replacement, the Constellation Program, has been written off by many before it has even carried out its first test launch.

So, from their unique perspective, what do these retired astronauts think of the situation? It turns out that some agree with the report and others are strongly opposed to it, but all voice concern for the future of NASA.

Kathryn Thornton, before a Shuttle mission (NASA)
Walt Cunningham flew aboard Apollo 7 in 1968, the first manned mission of the Apollo Program. At 76, Cunningham sees no urgency in going back to the Moon, but he also believes concerns about global warming are “a great big scam.” His feelings about global warming may be misplaced, but he is acutely aware of the funding issues facing NASA, and he is concerned the agency will “keep sliding downhill” if nothing is done.

Four-time Shuttle astronaut Kathryn Thornton agrees that the agency is underfunded and overstretched, and she is dubious about the Institute’s recommendation that NASA focus all its attention on environmental issues for four years. “I find it hard to believe we would be finished with the energy and environment issues in four years. If you talk about a re-direction, I think you talk about a permanent re-direction,” Thornton added.

Gene Cernan, commander of the 1972 Apollo 17 mission, believes that space exploration is essential to inspire the young and invigorate the educational system, and he is shocked by the Institute’s recommendation to pull back on space exploration. The 74-year-old was the last human to walk on the Moon, and he believes NASA shouldn’t be focused on ways to save the planet; other agencies and businesses can do that.

“It just blows my mind what they would do to an organization like NASA that was designed and built to explore the unknown.” — Gene Cernan

Apollo astronaut Gene Cernan covered with moon dust (NASA)
John Glenn, the first US astronaut to orbit the Earth and a former senator, is appalled at the suggestion of abandoning projects such as the International Space Station. Although Glenn, now 87, agrees with many of the points argued in the report, he said, “We have a $115 billion investment in the most unique laboratory ever put together, and we are cutting out the ability to do research that may have enormous value to everybody right here on the Earth? This is folly.”

Sally Ride, 57, a physicist and the first American woman to fly into space, believes the risky option of extending the life of the Shuttle should be considered so that US manned access to the space station can continue; the greater risk, being frozen out of the outpost, simply is not an option. However, she supports the report’s suggestion that NASA should also focus on finding solutions to climate change. “It will take us awhile to dig ourselves out,” she said. “But the long-term challenge we have is solving the predicament we have put ourselves in with energy and the environment.”

Franklin Chang Diaz, who shares the world record for the most spaceflights (seven), believes that NASA has been given a very bad deal. He agrees with many of the report’s recommendations, not because the space agency should turn its back on space exploration, but because the agency has been put in an impossible situation.

“NASA has moved away from being at the edge of high tech and innovation,” said Chang Diaz. “That’s a predicament NASA has found itself in because it had to carry out a mission to return humans to the moon by a certain time (2020) and within a budget ($17.3 billion for 2008). It’s not possible.”

In Conclusion

This discussion reminds me of a recent debate not about space exploration, but about another science and engineering endeavour here on Earth. The Large Hadron Collider (LHC) has its critics, who argue that this $5 billion piece of kit is not worth the effort, and that the money spent on accelerating particles could be better spent on finding solutions for climate change, or a cure for cancer.

You did NOT just say that! Brian Cox's expression says it all... (still from the BBC's Newsnight program)
In a September 2008 UK televised debate on BBC Newsnight between Sir David King (former Chief Scientific Advisor to the UK government) and particle physicist Professor Brian Cox, King questioned the importance of the science behind the LHC. By his limited reasoning, the LHC was “navel-searching”, “curiosity-driven” research with little bearing on the advancement of mankind. In King’s view, the money would be better spent on finding solutions to known problems, such as climate change. It is fortunate Brian Cox was there to set the record straight.

Prof. Cox explained that the science behind the LHC is “part of a journey” where the technological spin-offs and the knowledge gained from such a complex experiment cannot be predicted before embarking on the endeavour. Indeed, advanced medical technologies are being developed as a result of LHC research; the Internet may be revolutionized by new techniques derived from work at the LHC; even the cooling system for the LHC accelerator electromagnets could be adapted for use in fusion reactors.

The point is that we can never fully foresee what technologies, science or knowledge we may gain from huge experiments such as the LHC, and we certainly don’t know what spin-offs we can derive from the continued advancement of space travel technology. Space exploration can only enhance our knowledge and scientific understanding.

If NASA starts pulling back on endeavours in space, taking a more introverted view of finding specific solutions to particular problems (such as tackling climate change to the detriment of space exploration, as suggested by the Rice University report), we may never fully realise our potential as a race, and many of the problems here on Earth will never be solved…

Sources: Chron.com, Astroengine.com

CERN Aims for LHC Restart in September, First Collisions in October

The Large Hadron Collider at CERN. Credit: CERN/LHC

It may seem that the delay is getting longer and longer for the restart of the LHC after the catastrophic quench in September 2008, but progress is being made. Repair costs are expected to hit the $16 million mark as engineers quickly rebuild the damaged electromagnets and track down any further electrical faults that could jeopardize the future operation of the complex particle accelerator.

According to the European Organization for Nuclear Research (CERN), the Large Hadron Collider will resume operations in September. But the best news is: we could be seeing the first particle collisions only a month later…

If, like me, you were restlessly awaiting the grand LHC “switch-on” on September 10th, 2008, only to be disappointed by the transformer breakdown the following day, then buoyed up by the fact LHC science was still on track, only for your hopes to be completely quenched by the quench that explosively ripped the high-tech magnets from their mounts on September 20th, you’ll probably be wary of getting your hopes up too high. However, allow yourself a little optimism: the LHC repairs are going well, potential faults are being identified and fixed, and replacement parts are falling into place. But there is more good news.

Via Twitter, one of my contacts (@dpodolsky) hinted that he’d heard, by word of mouth, that LHC scientists’ optimism was growing for an October 2009 start to particle collisions. However, as of February 2nd, there was no official word from CERN. Today, the CERN Director General issued a statement.

“The schedule we have now is without a doubt the best for the LHC and for the physicists waiting for data,” Rolf Heuer said. “It is cautious, ensuring that all the necessary work is done on the LHC before we start up, yet it allows physics research to begin this year.”

So, the $5 billion LHC is expected to be restarted in September, and the first experiments will hopefully commence by the end of October 2009. It may be a year later than originally planned, but at least a better idea is forming of when the hunt for the Higgs particle will recommence…

Source: CNET Cutting Edge

More Faults Found in LHC, But No Further Delay to Start-up

The LHC repairs are progressing well (CERN)

In September 2008, the Large Hadron Collider (LHC) suffered a catastrophic quench, triggered by a faulty electrical connection between two of the supercooled magnets in Sectors 3 and 4 of the 27 km-circumference particle accelerator. The “S34 incident” caused tonnes of helium coolant to explosively leak into the LHC tunnel, ripping the heavy electromagnets from their concrete mounts.

Naturally, this was a huge blow for CERN, delaying the first particle collisions by several months. However, the repair work is progressing well, and hopes are high for the commencement of LHC science as early as this summer. Now engineers are working hard to avoid a recurrence of the S34 incident, tracking down similar electrical faults between the accelerator magnets. It seems they have found many more faults than expected…

According to a recently published progress report, the LHC repairs are progressing as planned, but more electrical faults have been discovered in other sections of the accelerator. An electrical short has been blamed for the quench four months ago, only weeks after the first circulation of protons around the LHC at the beginning of September 2008. It is now of paramount importance to isolate any further potential shorts in the complex experiment, and it would appear engineers are doing a good job of tracking them down.

The LHC uses ribbons of superconducting niobium-titanium wire to carry thousands of amps of current to the magnets. The ribbon is connected from electromagnet to electromagnet by splices that are soldered in place. Should one of these splices be weakened by poor soldering, an electrical fault can occur, causing the magnets to lose superconductivity and initiating a quench that rapidly heats the sensitive equipment. Various sections are being re-examined and re-soldered. The good news is that this additional work is not compounding the delay any further.
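
To get a feel for why a single poorly soldered joint matters so much, consider the Ohmic heating in a resistive splice, P = I²R. Here is a minimal sketch of the arithmetic; the current and the splice resistances below are hypothetical values chosen only to illustrate the scaling:

```python
# Ohmic heating in a resistive splice: P = I^2 * R.
# The current and resistance values are hypothetical, for illustration only.
current = 9000.0  # amps -- LHC magnet circuits carry currents of this order

for resistance_nano_ohm in (0.5, 50.0, 220.0):
    resistance = resistance_nano_ohm * 1e-9   # nano-ohms to ohms
    power = current**2 * resistance           # watts dissipated in the joint
    print(f"{resistance_nano_ohm:6.1f} nOhm splice -> {power:5.2f} W of heating")

# Even a few watts deposited into hardware held a couple of degrees above
# absolute zero can be enough to drive a magnet out of superconductivity.
```

The quadratic dependence on current is also why a fault that is harmless at low current can only reveal itself when the machine is pushed toward full energy.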

It has been confirmed that there was a lack of solder on the splice joint. Each sector has more than 2500 splices and a single defective splice can now be identified in situ when the sector is cold. Using this method another magnet showing a similar defect has been identified in sector 6-7. This sector will be warmed and the magnet removed. The warm up of this additional sector can be performed in the shadow of the repair to sector 3-4 and will therefore not add any additional delay to the restart schedule. — CERN

Hopefully we’ll see a second circulation of protons this summer and, according to informal rumours from a contact involved in the LHC science, the first particle collisions could start as early as October 2009. I will listen out for any further official confirmation…

Sources: CERN, Nature.com

After the Storm: Measuring the Structure and Temperature of a Quiescent Neutron Star

Accretion can cause neutron stars to flare violently

So how do you take the temperature of one of the most exotic objects in the Universe? A neutron star (~1.35 to 2.1 solar masses, measuring only 24 km across) is the remnant of a supernova after a large star has died. Although they are not massive enough to become black holes, neutron stars still accrete matter, pulling gas from a binary partner, often undergoing prolonged periods of flaring.
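
To get a feel for how extreme such an object is, a quick back-of-the-envelope calculation of its mean density helps (a minimal sketch using the mass and radius figures quoted above):

```python
# Back-of-the-envelope mean density of a neutron star, using the
# figures quoted above (a typical mass and a 24 km diameter).
import math

M_SUN = 1.989e30           # solar mass, kg
mass = 1.5 * M_SUN         # a typical neutron star mass (~1.35-2.1 M_sun)
radius = 12e3              # 24 km across -> 12 km radius, in metres

volume = (4.0 / 3.0) * math.pi * radius**3
density = mass / volume    # kg per cubic metre

print(f"Mean density: {density:.1e} kg/m^3")   # ~4e17 kg/m^3
# A sugar-cube-sized lump of this material would weigh
# hundreds of millions of tonnes.
```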

Fortunately, we can observe X-ray flares (using instrumentation such as Chandra), but it isn’t the flare itself that can reveal the temperature or structure of a neutron star.

At the AAS conference last week, results from an X-ray observing campaign of MXB 1659-29, a quasi-persistent X-ray transient source (i.e. a neutron star that flares for long periods), revealed some fascinating insights into the physics of neutron stars, showing that as the crust of a neutron star cools, its composition is revealed and the temperature of these exotic supernova remnants can be measured…

During a flare outburst, neutron stars generate X-rays. These X-ray sources can be measured and their evolution tracked. In the case of MXB 1659-29, Ed Cackett (Univ. of Michigan) used data from NASA’s Rossi X-ray Timing Explorer (RXTE) to monitor the cooling of the neutron star crust after an extended period of X-ray flaring. MXB 1659-29 flared for 2.5 years until it “turned off” in September 2001. Since then, the source has been periodically observed to measure the exponential decrease in X-ray emissions.

So why is this important? After a long period of X-ray flaring, the crust of a neutron star will heat up. However, it is thought that the core of the neutron star will remain comparatively cool. When the neutron star stops flaring (as the accretion of gas, feeding the flare, shuts off), the heating source for the crust is lost. During this period of “quiescence” (no flaring), the diminishing X-ray flux from the cooling neutron star crust reveals a huge wealth of information about the characteristics of the neutron star.

The cross section of a neutron star
During quiescence, astronomers observe X-rays emitted from the surface of the neutron star (as opposed to the flares), so direct measurements can be made of the neutron star itself. In his presentation, Cackett examined how the X-ray flux from MXB 1659-29 decreased exponentially and then levelled off at a constant flux. This means the crust cooled rapidly after the flaring, eventually reaching thermal equilibrium with the neutron star core. Therefore, by using this method, the neutron star core temperature can be inferred.
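
In practice, this kind of analysis boils down to fitting the measured fluxes with an exponential decay sitting on a constant floor, F(t) = A exp(−t/τ) + C, where the constant C represents the crust in equilibrium with the core. Here is a minimal sketch of such a fit; the observation times and flux values are invented purely for illustration:

```python
# Fitting an exponential decay plus a constant floor to X-ray flux
# measurements, as described above. All data values here are invented,
# purely for illustration.
import numpy as np
from scipy.optimize import curve_fit

def cooling_curve(t, A, tau, C):
    """Exponential crust cooling on top of a constant core-temperature floor."""
    return A * np.exp(-t / tau) + C

# Hypothetical observation times (days since flaring stopped) and fluxes.
t_obs = np.array([30.0, 90.0, 180.0, 365.0, 550.0, 730.0, 1100.0, 1500.0])
flux = np.array([9.1, 6.8, 4.9, 2.9, 2.0, 1.6, 1.3, 1.2])  # arbitrary units

(A, tau, C), _ = curve_fit(cooling_curve, t_obs, flux, p0=[10.0, 300.0, 1.0])
print(f"e-folding time: {tau:.0f} days, equilibrium flux: {C:.2f}")
# The fitted floor C is the handle on the core temperature.
```

The e-folding time τ tells you how quickly the crust cools, while the floor C is set by the core itself.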

Including the data from another neutron star X-ray transient, KS 1731-260, the cooling rates observed during the onset of quiescence suggest these objects have well-ordered crustal lattices with very few impurities. The rapid temperature decrease (from flare to quiescence) took approximately 1.5 years to reach thermal equilibrium with the neutron star core. Further work will now be carried out using Chandra data so more information about these rapidly spinning exotic objects can be uncovered.

Suddenly, neutron stars became a little less mysterious to me in a 10-minute talk last Tuesday. I love conferences…

NASA Tests New Super-Thin High-Altitude Balloon

Super pressure balloon in flight. Credit: NASA

High altitude balloons are an inexpensive means of getting payloads to the brink of space, where all sorts of great science and astronomy can be done. A prototype balloon that uses material as thin as plastic food wrap was successfully checked out in an 11-day test flight, and this new design may usher in a new era of high altitude flight. NASA and the National Science Foundation sponsored the test, which was launched from McMurdo Station in Antarctica. The balloon reached a float altitude of more than 111,000 feet and maintained it for the entire 11 days of flight. It’s hoped that the super-pressure balloon will ultimately carry large scientific experiments to the edge of space for 100 days or more.

The flight tested the durability and functionality of the scientific balloon’s novel globe-shaped design and the unique lightweight and thin polyethylene film. It launched on December 28, 2008 and returned on January 8, 2009.

“Our balloon development team is very proud of the tremendous success of the test flight and is focused on continued development of this new capability to fly balloons for months at a time in support of scientific investigations,” said David Pierce, chief of the Balloon Program Office at NASA’s Wallops Flight Facility at Wallops Island, Va. “The test flight has demonstrated that 100 day flights of large, heavy payloads is a realistic goal.”

This seven-million-cubic-foot super-pressure balloon is the largest single-cell, super-pressure, fully-sealed balloon ever flown. When development concludes, NASA will have a 22 million-cubic-foot balloon that can carry a one-ton instrument to an altitude of more than 110,000 feet, three to four times higher than passenger planes fly. Ultra-long duration missions using the super-pressure balloon cost considerably less than a satellite, and the scientific instruments flown can be retrieved and launched again, making them ideal very-high-altitude research platforms.

CREAM team. Credit: CREAM

In addition to the super pressure test flight, two additional long-duration balloons were launched from McMurdo during the 2008-2009 campaign. The University of Maryland’s Cosmic Ray Energetics and Mass, or CREAM IV, experiment launched December 19, 2008, and landed January 6, 2009. The CREAM investigation was used to directly measure high energy cosmic-ray particles arriving at Earth after originating from distant supernova explosions elsewhere in the Milky Way galaxy. The payload for this experiment was refurbished from an earlier flight. The team released data and their findings from their first flight in August 2008.

The University of Hawaii Manoa’s Antarctic Impulsive Transient Antenna launched December 21, 2008, and is still aloft. Its radio telescope is searching for indirect evidence of extremely high-energy neutrino particles possibly coming from outside our Milky Way galaxy.

Source: NASA

Could Quark Stars Explain Magnetars’ Strong Magnetic Fields?

The magnetic field surrounding the mysterious magnetar (NASA)

Magnetars are the violent, exotic cousins of the well-known neutron star. They emit excessive amounts of gamma-rays and X-rays, and possess extremely powerful magnetic fields. Neutron stars also have very strong magnetic fields (although weak when compared with magnetars), conserving the magnetic field of the parent star before it exploded as a supernova. However, the huge magnetic field strength inferred from observations of magnetars is a mystery. Where do magnetars get their strong magnetic fields? According to new research, the answer could lie in the even more mysterious quark star…

It is well known that neutron stars have very strong magnetic fields. Neutron stars, born from supernovae, preserve the angular momentum and magnetism of the parent star. Therefore, neutron stars are extremely magnetic, often rapidly spinning bodies, ejecting powerful streams of radiation from their poles (seen from Earth as a pulsar should the collimated radiation sweep through our field of view). Sometimes, neutron stars don’t behave as they should, ejecting copious amounts of X-rays and gamma-rays, exhibiting a very powerful magnetic field. These strange, violent entities are known as magnetars. As they are a fairly recent discovery, scientists are working hard to understand what magnetars are and how they acquired their strong magnetic field.

Denis Leahy, from the University of Calgary, Canada, presented a study on magnetars at a January 6th session of this week’s AAS meeting in Long Beach, revealing that the hypothetical “quark star” could explain what we are seeing. Quark stars are thought to be the next stage up from neutron stars; as gravitational forces overwhelm the structure of the neutron degenerate matter, quark matter (or strange matter) is the result. However, the formation of a quark star may have an important side effect. Colour ferromagnetism in colour-flavour locked quark matter (the densest form of quark matter) could be a viable mechanism for generating the immensely powerful magnetic flux observed in magnetars. Therefore, magnetars may be the consequence of very compressed quark matter.

These results were arrived at by computer simulation, so how can we observe the effect of a quark star — or the “quark star phase” of a magnetar — in a supernova remnant? According to Leahy, the transition from neutron star to quark star could occur from days to thousands of years after the supernova event, depending on the conditions of the neutron star. And what would we see when this transition occurs? There should be a secondary flash of radiation from the neutron star after the supernova, due to the liberation of energy as the neutron structure collapses, possibly providing astronomers with an opportunity to “see” a magnetar being “switched on”. Leahy also calculates that 1 in 10 supernovae should produce a magnetar remnant, so we have a pretty good chance of spotting the mechanism in action.

Studying Black Holes Using a PlayStation 3

Binary waves from black holes. Image Credit: K. Thorne (Caltech), T. Carnahan (NASA GSFC)

If you’re a PlayStation 3 fan, or if you just received one as a holiday gift, you may be able to do more with the system than just gaming. A group of gravity researchers have networked 16 PlayStation 3s together to create a type of supercomputer that is helping them estimate properties of the gravitational waves produced by the merger of two black holes. The research team, from the University of Alabama in Huntsville and the University of Massachusetts, Dartmouth, calls their configuration the Gravity Grid, and they say the Sony PlayStation 3 has a number of unique features that make it particularly suited for scientific computation. Equally important, the raw computing power per dollar provided by the PS3 is significantly higher than anything else on the market today.

PlayStation 3s have also been used by the Folding@Home project to harness the PS3’s technology to help study how proteins fold in the human body and how they sometimes fold incorrectly. This helps research into several diseases such as Parkinson’s, Alzheimer’s, cystic fibrosis, and even mad cow disease.

Front view of the cluster of PS3's. Credit: GravityGrid

The PS3 uses a powerful new processor called the Cell Broadband Engine to run its highly realistic games, and can connect to the Internet so gamers can download new programs and take each other on.

The PlayStation 3 cluster used by the gravity research team can solve some astrophysical problems, such as ones involving many calculations but low memory usage, equaling the speed of a rented supercomputer.

“If we had rented computing time from a supercomputer center it would have cost us about $5,000 to run our [black hole] simulation one time. For this project we ran our simulation several dozens of times to test different parameters and circumstances,” study author Lior Burko told Inside Science News Service.

One of the unique features of the PS3 is that it is an open platform, allowing different system software to be run on it. Its special processor has a main CPU (called the PPU) and six special compute engines (called SPUs) available for raw computation. Moreover, each SPU performs vector operations, meaning it can compute on multiple data elements in a single step.
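
As a rough analogy for what computing on multiple data in a single step buys you, here is a minimal sketch in Python/NumPy. It is purely illustrative: the real Gravity Grid code targets the Cell’s SPUs directly, not NumPy.

```python
# A rough analogy for SPU-style vector (SIMD) processing: one operation
# expressed over whole arrays at once, rather than element by element.
# Purely illustrative -- the actual Gravity Grid code runs on the Cell's SPUs.
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Scalar style: one multiply-add per loop iteration.
out_scalar = np.empty(n)
for i in range(n):
    out_scalar[i] = 2.0 * a[i] + b[i]

# Vector style: the same multiply-add over the whole arrays in one go,
# letting the hardware process many elements per instruction.
out_vector = 2.0 * a + b

assert np.allclose(out_scalar, out_vector)
```

The vectorized version is dramatically faster, which is exactly the kind of win that suits the calculation-heavy, low-memory problems the team describes.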

But the low cost is especially attractive to university researchers. The Gravity Grid team received a partial donation from Sony and uses “stock” PS3s for the cluster, with no hardware modifications, networked together using inexpensive equipment.

Gravitational waves are “ripples” in space-time that travel at the speed of light. They were theoretically predicted by Einstein’s general relativity, but have never been directly observed. Other research is being done in this area by the newly constructed NSF LIGO laboratory and various other such observatories in Europe and Asia. ESA and NASA also have a mission planned for the near future – the LISA mission – that will attempt to detect these waves. To learn more about these waves and the recent attempts to observe them, please visit the LISA mission website.

More information on the PS3 Gravity Grid.

Sources: USA Today, Gravity Grid

Nominations: The Universe Today Top 10 Scientific Endeavours of 2008

2008 has been a landmark year for space science and physics endeavour. We’ve peered deep into the cosmos and fitted new pieces into some of the most intriguing universal puzzles. We’ve explored other planets with technology we wouldn’t have recognised a decade ago. We’ve assembled some of the most complex experiments to test theories of the very small and the very big. 2008 has built strong foundations for the future of the exploration of the Universe in so many ways…

This week, Time Magazine published the top 10 “Scientific Discoveries” of 2008. Technically, as many readers pointed out, a few of the entries are not “discoveries” but “achievements”. Even so, space exploration and physics dominated, with the #1 slot going to the LHC and #2 to the Phoenix Mars Lander (#4 and #6 went to the Chinese spacewalk and exoplanets respectively). After reading the superb suggestion put forward by Astrofiend (thanks!), it was the push I needed to create a Universe Today version of a “Top 10” for 2008 (I’d love to do a top 20, but I have to find some time for Christmas shopping).

This top ten will focus on the last 12 months of Universe Today articles, so take a journey through the year’s events in space science and physics to find your favourite scientific endeavour of 2008. If you can’t find the article, just leave the name of the specific endeavour and we’ll do the rest. Please leave all nominations in the comments box below…

You have one week to get your nominations in (so your deadline is December 19th), and I’ll compile the list of winners, hopefully in time for Christmas. The nominations will be judged not only on popularity, but also by your unbiased Universe Today team…

So, get nominating! You have 7 days…