Happy 10th Birthday, Chandra X-Ray Observatory!


Ten years ago, on July 23, 1999, NASA’s Chandra X-ray Observatory was deployed into orbit by the space shuttle Columbia. Far exceeding its intended 5-year life span, Chandra has demonstrated an unrivaled ability to create high-resolution X-ray images, and has enabled astronomers to investigate phenomena as diverse as comets, black holes, dark matter and dark energy.

“Chandra’s discoveries are truly astonishing and have made dramatic changes to our understanding of the universe and its constituents,” said Martin Weisskopf, Chandra project scientist at NASA’s Marshall Space Flight Center in Huntsville, Ala.

The science generated by Chandra — both on its own and in conjunction with other telescopes in space and on the ground — has had a widespread, transformative impact on 21st-century astrophysics. Chandra has provided the strongest evidence yet that dark matter must exist. It has independently confirmed the existence of dark energy and made spectacular images of titanic explosions produced by matter swirling toward supermassive black holes.

To commemorate the 10th anniversary of Chandra, three new versions of classic Chandra images will be released during the next three months. These images, the first of which was released today, provide new data and a more complete view of objects that Chandra observed in earlier stages of its mission. The image being released today is of the spectacular supernova remnant E0102-72.

“The Great Observatories program — of which Chandra is a major part — shows how astronomers need as many tools as possible to tackle the big questions out there,” said Ed Weiler, associate administrator of NASA’s Science Mission Directorate at NASA Headquarters in Washington. NASA’s other “Great Observatories” are the Hubble Space Telescope, Compton Gamma-Ray Observatory and Spitzer Space Telescope.

The next image will be released in August to mark the anniversary of Chandra’s “first light,” when the observatory first opened up and gathered light on its detectors. The third image will be released during the “Chandra’s First Decade of Discovery” symposium in Boston, which begins Sept. 22.

“I am extremely proud of the tremendous team of people who worked so hard to make Chandra a success,” said Harvey Tananbaum, director of the Chandra X-ray Center at the Smithsonian Astrophysical Observatory in Cambridge, Mass. “It has taken partners at NASA, industry and academia to make Chandra the crown jewel of high-energy astrophysics.”

Tananbaum and Nobel Prize winner Riccardo Giacconi originally proposed Chandra to NASA in 1976. Unlike the Hubble Space Telescope, Chandra is in a highly elliptical orbit that takes it almost one third of the way to the moon, and was not designed to be serviced after it was deployed.

The Chandra X-ray Observatory was named after the great Indian-born American astrophysicist Subrahmanyan Chandrasekhar, who served on the faculty at the University of Chicago for almost 60 years, winning the 1983 Nobel Prize in Physics for his work on explaining the structure and evolution of stars.

A Table-Top Test of General Relativity?


Even Albert Einstein might have been impressed. His theory of general relativity, which describes how the gravity of a massive object, such as a star, can curve space and time, has been used to predict small shifts in the orbit of Mercury, gravitational lensing by galaxies and black holes, and the existence of gravitational waves.  Now, new research shows it may soon be possible to study the effects of general relativity in bench-top laboratory experiments.

Xiang Zhang, a faculty scientist with the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and professor at the University of California, Berkeley, led a study showing that the interactions of light and matter with spacetime, as predicted by general relativity, can be studied using the new breed of artificial optical materials that feature extraordinary abilities to bend light and other forms of electromagnetic radiation.

“We propose a link between the newly emerged field of artificial optical materials to that of celestial mechanics, thus opening a new possibility to investigate astronomical phenomena in a table-top laboratory setting,” says Zhang. “We have introduced a new class of specially designed optical media that can mimic the periodic, quasi-periodic and chaotic motions observed in celestial objects that have been subjected to complex gravitational fields.”

Zhang, a principal investigator with Berkeley Lab’s Materials Sciences Division and director of UC Berkeley’s Nano-scale Science and Engineering Center, has been one of the pioneers in the creation of artificial optical materials. Last year, he and his research group made headlines when they fashioned unique metamaterials – composites of metals and dielectrics – that were able to bend light backwards, a property known as negative refraction that is unprecedented in nature. More recently, he and his group fashioned a “carpet cloak” from nanostructured silicon that concealed the presence of objects placed under it from optical detection. These efforts not only suggested that true invisibility materials are within reach, Zhang said, but also represented a major step towards transformation optics that would “open the door to manipulating light at will.”

Now he and his research group have demonstrated that a new class of metamaterials called “continuous-index photon traps” or CIPTs can serve as broadband and radiation-free “perfect” optical cavities. CIPTs can control, slow and trap light in a manner similar to such celestial phenomena as black holes and gravitational lenses. This equivalence between the motion of stars in curved spacetime and the propagation of light in optical metamaterials engineered in a laboratory is referred to as the “optical-mechanical analogy.”

Zhang says that such specially designed metamaterials can be valuable tools for studying the motion of massive celestial bodies in gravitational potentials under a controlled laboratory environment. Observations of such celestial phenomena by astronomers are often impractical because of the long time scales of the interactions on an astronomical scale.

“If we twist our optical metamaterial space into new coordinates, the light that travels in straight lines in real space will be curved in the twisted space of our transformational optics,” says Zhang. “This is very similar to what happens to starlight when it moves through a gravitational potential and experiences curved spacetime. This analogy between classical electromagnetism and general relativity may enable us to use optical metamaterials to study relativistic phenomena such as gravitational lensing.”

In their demonstration studies, the team used a composite structure of air and the semiconductor gallium indium arsenide phosphide (GaInAsP). This material operates in the infrared spectral range and features a high refractive index with low absorption.

In their paper, Zhang and his coauthors cite as a particularly intriguing prospect for applying artificial optical materials to the optical-mechanical analogy the study of the phenomenon known as chaos. The onset of chaos in dynamic systems is one of the most fascinating problems in science and is observed in areas as diverse as molecular motion, population dynamics and optics. In particular, a planet around a star can undergo chaotic motion if a perturbation, such as another large planet, is present. However, because of the large spatial distances between the celestial bodies, and the long periods involved in the study of their dynamics, the direct observation of chaotic planetary motion has been a challenge. The use of the optical-mechanical analogy may enable such studies to be accomplished on demand in a bench-top laboratory setting.

“Unlike astronomers, we will not have to wait 100 years to get experimental results,” Zhang says.

The paper titled “Mimicking Celestial Mechanics in Metamaterials” is now available on-line in the journal Nature Physics.

Source: Lawrence Berkeley National Lab

Solved: Mystery of Gamma Ray Distribution in the Milky Way

A team of astrophysicists has solved the mystery of the distribution of gamma rays in our Milky Way galaxy.  While some researchers thought the distribution suggested a form of undetectable “dark matter”, the team from the University of California, San Diego, proposed an explanation based on standard physical models of the galaxy.

In two separate scientific papers, the most recent of which appears in the July 10 issue of the journal Physical Review Letters, the astrophysicists show that this distribution of gamma rays can be explained by the way “antimatter positrons” from the radioactive decay of elements, created by massive star explosions in the galaxy, propagate through the galaxy. That means, the scientists said, the observed distribution of gamma rays is not evidence for dark matter.

“There is no great mystery,” said Richard Lingenfelter, a research scientist at UC San Diego’s Center for Astrophysics and Space Sciences who conducted the studies with Richard Rothschild, a research scientist also at UCSD, and James Higdon, a physics professor at the Claremont Colleges. “The observed distribution of gamma rays is in fact quite consistent with the standard picture.”

Over the past five years, gamma ray measurements from the European satellite INTEGRAL have perplexed astronomers, leading some to argue that a “great mystery” existed because the distribution of these gamma rays across different parts of the Milky Way galaxy was not as expected.

To explain the source of this mystery, some astronomers had hypothesized the existence of various forms of dark matter, which astronomers suspect exists—from the unusual gravitational effects on visible matter such as stars and galaxies—but have not yet found.

What is known for certain is that our galaxy—and others—are filled with tiny subatomic particles known as positrons, the antimatter counterpart of typical, everyday electrons. When an electron and positron encounter each other in space, the two particles annihilate and their energy is released as gamma rays. That is, the electron and positron disappear and two or three gamma rays appear.
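The energy of those gamma rays follows directly from the rest mass of the annihilating pair: in two-photon annihilation at rest, each photon carries the electron rest energy, the well-known 511 keV line that INTEGRAL measures. A back-of-envelope sketch, with rounded physical constants:

```python
# Energy released when an electron and positron annihilate at rest,
# shared equally by two gamma rays: E = m_e * c**2 per photon.
m_e = 9.109e-31   # electron mass, kg
c = 2.998e8       # speed of light, m/s
eV = 1.602e-19    # joules per electron-volt

energy_joules = m_e * c**2
energy_keV = energy_joules / eV / 1e3
print(f"Each photon carries about {energy_keV:.0f} keV")  # ~511 keV
```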

“These positrons are born at nearly the speed of light, and travel thousands of light years before they slow down enough in dense clouds of gas to have a chance of joining with an electron to annihilate in a dance of death,” explains Higdon. “Their slowing down occurs from the drag of other particles during their journey through space. Their journey is also impeded by the many fluctuations in the galactic magnetic field that scatter them back and forth as they move along. All of this must be taken into account in calculating the average distance the positrons would travel from their birthplaces in supernova explosions.”

“Some positrons head towards the center of the Galaxy, some towards the outer reaches of the Milky Way known as the galactic halo, and some are caught in the spiral arms,” said Rothschild. “While calculating this in detail is still far beyond the fastest supercomputers, we were able to use what we know about how electrons travel throughout the solar system and what can be inferred about their travel elsewhere to estimate how their anti-matter counterparts permeate the galaxy.”

The scientists calculated that most of the gamma rays should be concentrated in the inner regions of the galaxy, just as was observed by the satellite data, the team reported in a paper published last month in the Astrophysical Journal.

“The observed distribution of gamma rays is consistent with the standard picture where the source of positrons is the radioactive decay of isotopes of nickel, titanium and aluminum produced in supernova explosions of stars more massive than the Sun,” said Rothschild.

In their companion paper in this week’s issue of Physical Review Letters, the scientists point out that a basic assumption of one of the more exotic explanations for the purported mystery—dark matter decays or annihilations—is flawed, because it assumes that the positrons annihilate very close to the exploding stars from which they originated.

“We clearly demonstrated this was not the case, and that the distribution of the gamma rays observed by the gamma ray satellite was not a detection or indication of a ‘dark matter signal’,” said Lingenfelter.

Source: UC San Diego

Australian Astronomers Reveal Image of A Cosmic “Blue Whale”


Astronomers at Australia’s Commonwealth Scientific and Industrial Research Organisation (CSIRO) have revealed the hidden face of an enormous galaxy called Centaurus A, which emits a radio glow covering an area 200 times bigger than the full Moon.

The galaxy’s radio waves have been painstakingly transformed into a highly detailed image, which is being unveiled to the public for the first time.

Centaurus A lies 14 million light-years away, in the southern constellation Centaurus, and houses a monster black hole 50 million times the mass of the Sun.

The galaxy’s black hole generates jets of radio-emitting particles that billow millions of light years out into space.

The spectacular sight is invisible to the naked eye.

“If your eyes could see radio waves you would look up in the sky and see the radio glow from this galaxy covering an area 200 times bigger than the full Moon,” said the lead scientist for the project, Dr Ilana Feain of CSIRO’s Australia Telescope National Facility (ATNF).

“Only a small percentage of galaxies are of this kind. They’re like the blue whales of space – huge and rare.”

Seen at radio wavelengths, Centaurus A is so big and bright that no one else had ever tried making such an image.

“This is the most detailed radio image ever made of Centaurus A, and indeed of any galaxy that produces radio jets,” said Dr Lewis Ball, Acting Director of the ATNF.

“Few other groups in the world have the skills and the facilities to make such an image, and we were the first to try.”

Dr Feain and her team used CSIRO’s Australia Telescope Compact Array telescope near Narrabri, NSW, to observe the galaxy for more than 1200 hours, over several years.

This produced 406 individual images, which were ‘mosaiced’ together to make one large image.

Dr Feain combined the Compact Array data and data taken from CSIRO’s Parkes radio telescope.

Processing the image – combining the data, taking out the effects of radio interference, and adjusting the dynamic range – took a further 10,000 hours.

Astronomers will use the image to help them understand how black holes and radio jets interact with a galaxy’s stars and dust, and how the galaxy has evolved over time.

Centaurus A is the closest of the galaxies with a supermassive black hole producing radio jets, which makes it the easiest to study.

Astronomers are interested in studying more of these rare, massive galaxies to determine the role black holes play in galaxy formation and growth.

Centaurus A was one of the first cosmic radio sources known outside our own Galaxy.

The (visible) galaxy was discovered and recorded at Parramatta Observatory near Sydney in 1826. It was later catalogued as NGC 5128.

As a radio source, Centaurus A was discovered from Dover Heights in Sydney by CSIRO scientists in 1947.

The CSIRO image of Centaurus A was presented on Friday July 3 at an international conference, The Many Faces of Centaurus A, at the Mint in Sydney.

Source: CSIRO

NASA IBEX Spacecraft Detects Neutral Hydrogen Bouncing Off Moon

NASA's Interstellar Boundary Explorer has made the first detection of neutral atoms coming from the Moon (background image). The color-coded data toward the bottom shows the neutral particles and geometry measured at the Moon on Dec. 3, 2008.


NASA’s Interstellar Boundary Explorer (IBEX) spacecraft has made the first observations of fast hydrogen atoms coming from the moon, following decades of speculation and searching for their existence. Launched last October, IBEX’s mission is to image and map the dynamic interactions caused by the hot solar wind slamming into the cold expanse of space. But as the IBEX team commissioned the spacecraft, they discovered a stream of neutral hydrogen atoms created by the solar wind scattering off the moon’s surface.

The detector that made the discovery, called IBEX-Hi, was designed and built by the Southwest Research Institute and Los Alamos National Laboratory to measure particles moving at speeds of 0.5 million to 2.5 million miles an hour.

“Just after we got IBEX-Hi turned on, the moon happened to pass right through its field of view, and there they were,” says Dr. David J. McComas, IBEX principal investigator and assistant vice president of the SwRI Space Science and Engineering Division, where the IBEX-Hi particle detector was primarily built. “The instrument lit up with a clear signal of the neutral atoms being detected as they backscattered from the moon.”

The solar wind, the supersonic stream of charged particles that flows out from the sun, moves out into space in every direction at speeds of about a million mph. The Earth’s strong magnetic field shields our planet from the solar wind. The moon, with its relatively weak magnetic field, has no such protection, causing the solar wind to slam onto the moon’s sunward side.

From its vantage point in high earth orbit, IBEX sees about half of the moon — one quarter of it is dark and faces the nightside (away from the sun), while the other quarter faces the dayside (toward the sun). Solar wind particles impact only the dayside, where most of them are embedded in the lunar surface, while some scatter off in different directions. The scattered ones mostly become neutral atoms in this reflection process by picking up electrons from the lunar surface.

The IBEX team estimates that only about 10 percent of the solar wind ions reflect off the sunward side of the moon as neutral atoms, while the remaining 90 percent are embedded in the lunar surface. Characteristics of the lunar surface, such as dust, craters and rocks, play a role in determining what fraction of particles become embedded, what fraction scatter off as neutral atoms, and the directions in which they travel.

McComas says the results also shed light on the “recycling” process undertaken by particles throughout the solar system and beyond. The solar wind and other charged particles impact dust and larger objects as they travel through space, where they backscatter and are reprocessed as neutral atoms. These atoms can travel long distances before they are stripped of their electrons and become ions and the complicated process begins again.

The combined scattering and neutralization processes now observed at the moon have implications for interactions with objects across the solar system, such as asteroids, Kuiper Belt objects and other moons. The plasma-surface interactions occurring within protostellar nebulae (the clouds of gas and dust from which stars and planets form), as well as around exoplanets, planets orbiting other stars, can also be inferred.

IBEX’s primary mission is to observe and map the complex interactions occurring at the edge of the solar system, where the million miles per hour solar wind runs into the interstellar material from the rest of the galaxy. The spacecraft carries the most sensitive neutral atom detectors ever flown in space, enabling researchers to not only measure particle energy, but also to make precise images of where they are coming from.

And the spacecraft is just getting started.  Towards the end of the summer, the team will release the spacecraft’s first all-sky map showing the energetic processes occurring at the edge of the solar system. The team will not comment until the image is complete, but McComas hints, “It doesn’t look like any of the models.”

The research was published recently in the journal Geophysical Research Letters.

Source: Southwest Research Institute

New Sky Survey To Catch Exploding Stars In The Act


An innovative new sky survey called the Palomar Transient Factory (PTF) will use a 48-inch telescope together with the U.S. Department of Energy’s (DOE’s) National Energy Research Scientific Computing Center (NERSC) to discover relatively rare and fleeting cosmic events like supernovae and gamma-ray bursts. The survey is already in progress: during the commissioning phase alone it uncovered more than 40 supernovae, and astronomers expect to discover thousands more each year.

“This survey is a trailblazer in many ways – it is the first project dedicated solely to finding transient events, and as part of this mission we’ve worked with NERSC to develop an automated system that will sift through terabytes of astronomical data every night to find interesting events, and have secured time on some of the world’s most powerful ground-based telescopes to conduct immediate follow up observations as events are identified,” says Shrinivas Kulkarni, a professor of astronomy and planetary science at the California Institute of Technology (Caltech), and Director of Caltech Optical Observatories. He is also principal investigator of the PTF survey.

“This truly novel survey combines the power of a wide-field telescope, a high-resolution camera, and high-performance network and computing, as well as the ability to conduct rapid follow-up observations with telescopes around the globe for the first time,” says Peter Nugent, a computational staff scientist in Berkeley Lab’s Computational Research Division (CRD) and the NERSC Analytics Group. Nugent is also the Real-time Transient Detection Lead for the PTF project.

Every night the PTF camera – a 100-megapixel machine mounted on the 48-inch Samuel Oschin Telescope at Palomar Observatory in Southern California – will automatically snap pictures of the sky, then send those images to NERSC for archiving via a high-speed network provided by DOE’s Energy Sciences Network (ESnet) and the National Science Foundation’s (NSF’s) High Performance Wireless Research and Education Network (HPWREN).

At NERSC, computers running machine-learning algorithms in the Real-time Transient Detection pipeline scour the PTF observations for “transient” sources, cosmic objects that change in brightness or position, by comparing the new observations with all of the data collected from previous nights. Within minutes after an interesting event is discovered, machines at NERSC will send its coordinates to Palomar’s 60-inch telescope for follow-up observations.
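The detection step can be pictured with a heavily simplified sketch of difference imaging. This is illustrative only: the real PTF pipeline also registers images, matches point-spread functions, and applies machine-learning classifiers to reject artifacts.

```python
import numpy as np

def find_transients(reference, new_image, n_sigma=5.0):
    """Toy difference imaging: flag pixels that brightened
    significantly between a reference image and a new exposure."""
    diff = new_image - reference
    noise = np.std(diff)                       # crude noise estimate
    ys, xs = np.where(diff > n_sigma * noise)  # significant brightenings
    return list(zip(ys.tolist(), xs.tolist()))

# Simulated 100x100 patch of sky with one new source
rng = np.random.default_rng(42)
ref = rng.normal(100.0, 1.0, (100, 100))       # reference night
new = ref + rng.normal(0.0, 1.0, (100, 100))   # later night, fresh noise
new[40, 60] += 50.0                            # a "supernova" appears
print(find_transients(ref, new))               # the injected pixel is flagged
```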

“We are currently uncovering one event every 12 minutes. This project will be keeping the astronomical community busy for quite a while,” says Kulkarni.

The primary targets of the sky survey are Type Ia and Type II supernovae.

Because they are relatively uniform in brightness, Type Ia supernovae act as cosmic lighthouses, helping astronomers judge the distance scale of the universe. Many astronomers participating in the PTF survey are specifically searching for these phenomena.
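The “cosmic lighthouse” reasoning can be made concrete with the distance modulus, m − M = 5 log₁₀(d / 10 pc). A short sketch (the peak absolute magnitude of about −19.3 for a Type Ia is a commonly used textbook value assumed here, not a figure from the article):

```python
import math

def distance_parsecs(apparent_mag, absolute_mag):
    """Invert the distance modulus m - M = 5 * log10(d / 10 pc)."""
    return 10 ** ((apparent_mag - absolute_mag) / 5 + 1)

# If a Type Ia supernova (assumed peak M ~ -19.3) is observed
# at apparent magnitude +15.7, its distance follows directly:
d_pc = distance_parsecs(15.7, -19.3)
print(f"{d_pc / 1e6:.0f} Mpc")  # (15.7 + 19.3)/5 + 1 = 8 -> 10^8 pc = 100 Mpc
```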

And Type II supernovae, the kind caused by the explosion of a massive star that has run out of fuel, blast heavy elements into interstellar space, where they eventually form new stars and planets.

“These tools are extremely valuable because they not only help us identify supernovae, they uncover them while the star is in the act of exploding,” says Robert Quimby of Caltech, who is the software lead for the PTF program. “This gives us valuable information about how cosmic dust is spread across the universe.”

“It is very exciting to find so many supernovae, so early in the project. It’s like we’ve just turned on the spigot and are now waiting for the fire hose to blast,” says Quimby.

Source: Lawrence Berkeley National Labs

Astronomers Predict Birth of a New Star


A computer simulation of the dark nebula Barnard 68 suggests the cloud will collapse into a brand new star relatively soon… at least on an astronomical time scale.

Astrophysicist João Alves, director of the Calar Alto Observatory in Spain, and his colleague Andreas Bürkert from the University of Munich believe the dark cloud Barnard 68 will inevitably collapse and give rise to a new star, according to an article published in the April 2009 issue of The Astrophysical Journal.

Barnard 68 (B68) is a dark nebula about 400 light years away in the constellation of Ophiuchus. Such nebulae are interstellar clouds of dust and gas located within the Milky Way which block out the light of the stars and other objects behind them.

Most astronomers believe stars form from giant gas clouds which collapse under their own gravity until high density and temperatures lead to nuclear fusion.  Although many details of the process are still not understood, the new study may be able to shed some light on this.

Alves and Bürkert suggest the collision of two gas clouds could be the mechanism that activates the birth of a star. They suggest Barnard 68 is already in an initial unstable state and that it will collapse “soon” – within some 200,000 years.

Images show B68 is a cold gas cloud with a mass equivalent to that of two suns.  But there’s a smaller cloud just 1/10 as massive getting close enough to collide with the larger cloud.

In order to prove their theory, the two astrophysicists have simulated the scenario in a supercomputer at the University of Munich. They modelled two globules separated by one light year, with masses and speeds similar to those of Barnard 68 and its “small” companion. By using a numerical algorithm, the researchers showed how these two virtual gas clouds evolved over time.

The results showed that the smaller globule penetrated the larger one after around 1.7 million years at a speed of 370 metres per second. The model also showed that the stability of the initial situation declined over time. At the moment when the two globules merged, enormous densities were generated, making the system collapse and creating the ideal conditions for the formation of a star.
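For a rough sense of why a timescale of this order is plausible, the free-fall time of a uniform cloud, t_ff = √(3π / (32 G ρ)), can be evaluated with B68-like numbers. The ~0.06 parsec radius used below is an assumed literature-style value, not a figure from the article:

```python
import math

# Free-fall (gravitational collapse) timescale of a uniform cloud:
#   t_ff = sqrt(3*pi / (32 * G * rho))
G = 6.674e-11                  # gravitational constant, m^3 kg^-1 s^-2
M = 2 * 1.989e30               # two solar masses, kg (as in the article)
R = 0.06 * 3.086e16            # assumed ~0.06 pc radius, in meters
rho = M / (4 / 3 * math.pi * R**3)   # mean density, kg/m^3
t_ff = math.sqrt(3 * math.pi / (32 * G * rho))
print(f"{t_ff / 3.156e7:.0f} years")  # on the order of 10^5 years
```

With these inputs the free-fall time comes out near 200,000 years, consistent with the collapse estimate quoted above.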

The researchers varied the physical parameters of the globules until they worked out the circumstances in which the merger of two gas clouds will lead to their subsequent collapse. According to Bürkert and Alves’ calculations, a new star system will form from B68 within 200,000 years.

Source: FECYT – Spanish Foundation for Science and Technology

The Strange Case of Supernova SN2008ha

Image of SN2008ha in the galaxy UGC 12682 in Pegasus.


Almost immediately after it was discovered last November by 14-year-old Caroline Moore of the Puckett Observatory Supernova Search Team, professional astronomers knew supernova SN2008ha was a strange one. The spectra of the blast showed no signs of hydrogen, which meant it must be a Type Ia supernova caused by the explosion of a white dwarf accreting matter in a binary star system. But if so, why was it some 50 times fainter than other supernovae of its type?

Now in a controversial new paper in the journal Nature, astronomers from Queen’s University Belfast have proposed a new explanation of this supernova.  The researchers, led by Dr. Stefano Valenti, suggest that even though the explosion contained no hydrogen, SN2008ha could be a Type II supernova, the kind caused by the core collapse of a massive star.

Valenti and his colleagues argue that, despite the lack of hydrogen, the spectrum of SN2008ha more closely resembles Type II supernovae. They cite the absence of lines from ionized silicon as evidence that SN2008ha is not a Type Ia. And they cite other supernovae that exhibited similar characteristics, which they say might be less extreme examples of hydrogen-deficient Type II supernovae.

“SN2008ha is the most extreme example of a group of supernovae that show similar properties,” said Dr. Valenti. “Up until now the community had thought that they were from the explosion of white dwarfs, which we call Type Ia supernovae. But we think SN2008ha doesn’t quite fit this picture and appears physically related to massive stars.”

But if SN2008ha is a Type II supernova, where did the hydrogen go? The answer might be mass loss. Some stars are so massive and luminous that they lose their outer hydrogen layers in strong outflowing stellar winds. And because they’re so massive, their cores collapse into a black hole without transferring energy to the outer layers of the star, which may explain the low luminosity of the explosion.

“The implications are quite important. If this is a massive star explosion, then it is the first one that might fit the theoretical models of massive stars that lose their outer layers through their huge luminosity pressure and then, perhaps, collapse to black holes with a whimper”, said Dr. Valenti.

Professor Stephen Smartt from Queen’s added: “This is still quite controversial; we have put this idea forward and it certainly needs to be taken seriously.”

Dr. Valenti’s team is keen to use new deep, time-resolved surveys of the Universe to find more of these objects and test their ideas. One such experiment is the first of the Pan-STARRS telescopes, which began surveying the sky in the last month.

Source:  Queen’s University Belfast

Original Paper:  Nature

So Where Is ET, Anyway?

While having lunch with colleagues at Los Alamos National Labs in 1950, physicist Enrico Fermi mused about the likelihood of intelligent life existing elsewhere in the Universe.  Fermi, one of the most astute scientists of his day, thought the size and age of the Universe means many advanced civilizations should have already colonized the galaxy, just as humans colonized and explored the Earth.   But if such galaxy-wide extraterrestrial civilizations exist, he wondered, where are they?

Some believe this problem, called the Fermi Paradox, means advanced extraterrestrial societies are rare or nonexistent.  Others suggest they must destroy themselves before they move on to the stars.

But this week, Jacob D. Haqq-Misra and Seth D. Baum at Penn State University proposed another solution to the Fermi Paradox: that extraterrestrial civilizations haven’t colonized the galaxy because the exponential growth of a civilization required to do so is unsustainable.

The researchers call their idea the “Sustainability Solution”.  It states: “The absence of ETI (extra-terrestrial intelligence) observation can be explained by the possibility that exponential or other faster growth is not a sustainable development pattern for intelligent civilizations.”

The researchers base their conclusions on a study of civilizations on Earth.  Historically, rapid growth of societies means rapid resource depletion and environmental degradation, usually with dire results.  They cite the example of Easter Island, where resource depletion likely caused a collapse of the local population.  And they conclude that while there are examples of sustainable growth like the !Kung San people of the Kalahari Desert, exponential growth in population and spatial expansion of a society is almost always linked to unsustainable growth and eventual collapse.

This principle has implications for our current global civilization.  Since Earth’s resources are finite and it receives solar radiation at a constant rate, human civilization cannot sustain an indefinite, exponential growth.  But even if we survive and advance as a civilization, we may have trouble colonizing the galaxy should we ever decide to do so.  And if this limitation applies to us, it may apply to other civilizations as well.
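The core of this argument, that exponential growth cannot continue against a fixed resource base, can be illustrated with a toy comparison of exponential growth against logistic growth with a carrying capacity. All parameter values below are arbitrary illustrations:

```python
# Exponential vs. logistic growth: both start identically, but the
# logistic curve saturates at the carrying capacity K while the
# exponential curve grows without bound.
def project(years, rate=0.02, K=10.0):
    exp_pop, log_pop = 1.0, 1.0
    for _ in range(years):
        exp_pop += rate * exp_pop                      # unbounded growth
        log_pop += rate * log_pop * (1 - log_pop / K)  # capped near K
    return exp_pop, log_pop

exp_pop, log_pop = project(500)
print(f"exponential: {exp_pop:.0f}, logistic: {log_pop:.2f}")
```

After 500 steps at a 2% growth rate the exponential population has grown by a factor of tens of thousands, while the logistic population has leveled off just under the carrying capacity of 10.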

But the Sustainability Solution doesn’t mean ET is not out there. Slower-growth extraterrestrial societies might still communicate by radio or other wavelengths, so current SETI programs still make sense. And extraterrestrial life may produce chemical biomarkers in planetary atmospheres, leaving spectroscopic signatures detectable with upcoming generations of Earth- and space-based planet-hunting telescopes.

The Sustainability Solution also allows that advanced civilizations may indeed colonize the galaxy, then collapse as resources are consumed at an unsustainable rate.

And some civilizations may send small messenger probes to other stars, which suggests a search for extraterrestrial artifacts (SETA) within our own solar system might be just as fruitful as radio-based SETI.  Searches might involve radio or visible detection of extraterrestrial probes orbiting the sun.  Or artifacts may even be embedded within planets or moons of our solar system, just like the giant black monoliths in Arthur C. Clarke’s 2001: A Space Odyssey.

In any case, the discovery of artifacts from a slow-growth extraterrestrial civilization would be an example of “sustainable development” on a galactic scale.

You can read the original article here.

What If There Is Only One Universe?

When it comes to universes, perhaps one is enough after all.

Many theories in physics and cosmology require the existence of alternate, or parallel, universes.  But Dr. Lee Smolin of the Perimeter Institute for Theoretical Physics in Waterloo, Canada, explains the flaws of theories that suggest our universe is just one of many, and which also perpetuate the notion that time does not exist.  Smolin, author of the bestselling science book ‘The Trouble with Physics’ and a founding member of the Perimeter Institute, explains his views in the June issue of Physics World.

Smolin explains how theories describing a myriad of possible universes, or a “multiverse”, with many dimensions and particles and forces have become more popular in the last few years. However, through his work with the Brazilian philosopher Roberto Mangabeira Unger, Smolin believes that multiverse theories, which imply that time is not a fundamental concept, are “profoundly mistaken”.

Smolin says a timeless multiverse means our laws of physics can’t be determined from experiment.  And he explains the unclear connection between fundamental laws, which are unique and applicable universally, and effective laws, which hold based on what we can actually observe.

Smolin suggests new principles that rethink the notion of physical law to apply to a single universe.  These principles say there is only one universe; that all that is real is real in a moment, as part of a succession of moments; and that everything real in each moment is a process of change leading to future moments. As he explains, “If there is just one universe, there is no reason for a separation into laws and initial conditions, as we want a law to explain just one history of one universe.”

He hopes these principles will bring a fresh adventure in science.

If we accept there is only one universe and that time is a fundamental property of nature, then this opens up the possibility that the laws of physics evolve with time. As Smolin writes, “The notion of transcending our time-bound experiences in order to discover truths that hold timelessly is an unrealizable fantasy. When science succeeds, we do nothing of the sort; what we physicists really do is discover laws that hold in the universe we experience within time. This, I would claim, should be enough; anything beyond that is more a religious urge for transcendence than science.”

Source: Institute of Physics