Oops! In a happy accident, Comet Lovejoy just happened to be in the field of view of the 570-megapixel Dark Energy Camera, the world’s most powerful digital camera. One member of the observing team said it was a “shock” to see Comet Lovejoy pop up on the display in the control room.
“It reminds us that before we can look out beyond our Galaxy to the far reaches of the Universe, we need to watch out for celestial objects that are much closer to home!” wrote the team on the Dark Energy Detectives blog.
On December 27, 2014, while the Dark Energy Survey was scanning the southern sky, comet C/2014 Q2 entered the camera’s view. Each of the rectangular shapes above represents one of the camera’s 62 individual CCDs.
At the time this image was taken, the comet was passing about 82 million km (51 million miles) from Earth. That’s a short distance for the Dark Energy Camera, which is sensitive to light up to 8 billion light years away. The comet’s center is likely made of rock and ice and is roughly 5 km (3 miles) across. The visible coma of the comet is a cloud of gas and dust about 640,000 km (400,000 miles) in diameter.
The Dark Energy Survey (DES) is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision.
The camera just finished its third six-month-long season of observations and won’t be observing again until this fall.
You can download higher resolution versions of this image here.
Cosmologists are intellectual time travelers. Looking back over billions of years, these scientists are able to trace the evolution of our Universe in astonishing detail. 13.8 billion years ago, the Big Bang occurred. Fractions of a second later, the fledgling Universe expanded exponentially during an incredibly brief period of time called inflation. Over the ensuing eons, our cosmos has grown to such an enormous size that we can no longer see the other side of it.
But how can this be? If light’s velocity marks a cosmic speed limit, how can there possibly be regions of spacetime whose photons are forever out of our reach? And even if there are, how do we know that they exist at all?
The Expanding Universe
Like everything else in physics, our Universe strives to exist in the lowest possible energy state. But around 10⁻³⁶ seconds after the Big Bang, inflationary cosmologists believe, the cosmos found itself resting instead at a “false vacuum energy” – a low point that wasn’t really a low point. Seeking the true nadir of vacuum energy, over a minute fraction of a moment, the Universe is thought to have ballooned by a factor of 10⁵⁰.
Since that time, our Universe has continued to expand, but at a much slower pace. We see evidence of this expansion in the light from distant objects. As photons emitted by a star or galaxy propagate across the Universe, the stretching of space causes them to lose energy. Once the photons reach us, their wavelengths have been redshifted in accordance with the distance they have traveled.
This is why cosmologists speak of redshift as a function of distance in both space and time. The light from these distant objects has been traveling for so long that, when we finally see it, we are seeing the objects as they were billions of years ago.
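The relationship can be made concrete with a short sketch. The spectral line and the observed wavelength below are illustrative assumptions, not measurements from the article:

```python
# Redshift relates emitted and observed wavelengths: 1 + z = lam_obs / lam_emit.
# The spectral line and numbers below are illustrative assumptions.

def redshift(lam_emit_nm, lam_obs_nm):
    """Return the redshift z implied by an emitted and an observed wavelength."""
    return lam_obs_nm / lam_emit_nm - 1.0

# Hydrogen-alpha is emitted at 656.3 nm; suppose we receive it at 787.6 nm.
z = redshift(656.3, 787.6)
print(f"z = {z:.3f}")  # the wavelength has been stretched by about 20%
```

Larger z means the light has traveled longer through expanding space, which is why redshift doubles as a lookback-time indicator.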
The Hubble Volume
Redshifted light allows us to see objects like galaxies as they existed in the distant past; but we cannot see all events that occurred in our Universe during its history. Because our cosmos is expanding, the light from some objects is simply too far away ever to reach us.
The physics of that boundary relies, in part, on a chunk of surrounding spacetime called the Hubble volume. Here on Earth, we define the Hubble volume by measuring something called the Hubble parameter (H0), a value that relates the apparent recession speed of distant objects to their distance. It was first calculated in 1929, when Edwin Hubble discovered that faraway galaxies appeared to be receding from us at speeds proportional to their distance, as inferred from the redshift of their light.
Dividing the speed of light by H0 gives the Hubble radius, and the sphere of that radius is the Hubble volume. This spherical bubble encloses a region where all objects move away from a central observer at speeds less than the speed of light. Correspondingly, all objects outside of the Hubble volume move away from the center faster than the speed of light.
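That division can be sketched in a few lines. The value of H0 below is an assumed round number (~70 km/s/Mpc); measured values vary by a few percent:

```python
# Hubble radius = c / H0, converted to billions of light-years.
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble parameter, km/s per megaparsec (assumed)
LY_PER_MPC = 3.2616e6  # light-years in one megaparsec

hubble_radius_mpc = C_KM_S / H0
hubble_radius_gly = hubble_radius_mpc * LY_PER_MPC / 1e9
print(f"Hubble radius ~ {hubble_radius_gly:.1f} billion light-years")
```

The Hubble volume is simply the sphere of that radius centered on the observer.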
Yes, “faster than the speed of light.” How is this possible?
The Magic of Relativity
The answer has to do with the difference between special relativity and general relativity. Special relativity requires what is called an “inertial reference frame” – more simply, a backdrop. According to this theory, the speed of light is the same when compared in all inertial reference frames. Whether an observer is sitting still on a park bench on planet Earth or zooming past Neptune in a futuristic high-velocity rocketship, the speed of light is always the same. A photon always travels away from the observer at 300,000,000 meters per second, and he or she will never catch up.
General relativity, however, describes the fabric of spacetime itself. In this theory, there is no inertial reference frame. Spacetime is not expanding with respect to anything outside of itself, so the speed of light as a limit on its velocity doesn’t apply. Yes, galaxies outside of our Hubble sphere are receding from us faster than the speed of light. But the galaxies themselves aren’t breaking any cosmic speed limits. To an observer within one of those galaxies, nothing violates special relativity at all. It is the space between us and those galaxies that is rapidly stretching.
The Observable Universe
Now for the next bombshell: The Hubble volume is not the same thing as the observable Universe.
To understand this, consider that as the Universe gets older, distant light has more time to reach our detectors here on Earth. We can see objects that have accelerated beyond our current Hubble volume because the light we see today was emitted when they were within it.
Strictly speaking, our observable Universe coincides with something called the particle horizon. The particle horizon marks the distance to the farthest light that we can possibly see at this moment in time – photons that have had enough time to either remain within, or catch up to, our gently expanding Hubble sphere.
And just what is this distance? A little more than 46 billion light years in every direction – giving our observable Universe a diameter of approximately 93 billion light years, or more than 500 billion trillion miles.
(A quick note: the particle horizon is not the same thing as the cosmological event horizon. The particle horizon encompasses all the events in the past that we can currently see. The cosmological event horizon, on the other hand, defines a distance within which a future observer will be able to see the then-ancient light our little corner of spacetime is emitting today.
In other words, the particle horizon deals with the distance to past objects whose ancient light we can see today; the cosmological event horizon deals with the distance that our present-day light will be able to travel as faraway regions of the Universe accelerate away from us.)
Thanks to the expansion of the Universe, there are regions of the cosmos that we will never see, even if we could wait an infinite amount of time for their light to reach us. But what about those areas just beyond the reaches of our present-day Hubble volume? If that sphere is also expanding, will we ever be able to see those boundary objects?
This depends on which region is expanding faster – the Hubble volume or the parts of the Universe just outside of it. And the answer to that question depends on two things: 1) whether H0 is increasing or decreasing, and 2) whether the Universe is accelerating or decelerating. These two rates are intimately related, but they are not the same.
In fact, cosmologists believe that we are actually living at a time when H0 is decreasing; but because of dark energy, the velocity of the Universe’s expansion is increasing.
That may sound counterintuitive, but the two are compatible: as H0 shrinks, the Hubble volume grows, yet distant galaxies are being carried away from us faster still. At this moment in time, cosmologists believe that the Universe’s expansion will outpace the more modest growth of the Hubble volume.
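A toy Friedmann-equation sketch makes the distinction concrete. The density parameters are assumed round values and units are chosen so that H0 = 1; this illustrates the trend, not a precision calculation:

```python
import math

H0, OMEGA_M, OMEGA_L = 1.0, 0.3, 0.7  # assumed round values, flat universe

def H(a):
    """Hubble parameter as a function of scale factor a (a = 1 today)."""
    return H0 * math.sqrt(OMEGA_M / a**3 + OMEGA_L)

def expansion_speed(a):
    """Rate of change of the scale factor: a_dot = a * H(a)."""
    return a * H(a)

# Between today (a = 1) and a future epoch (a = 2):
print(H(2.0) < H(1.0))                              # True: H is decreasing
print(expansion_speed(2.0) > expansion_speed(1.0))  # True: expansion accelerates
```

Once dark energy dominates, H levels off at a constant while the scale factor, and hence the expansion speed a·H, keeps growing.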
So even though our Hubble volume is expanding, the influence of dark energy appears to provide a hard limit to the ever-increasing observable Universe.
Our Earthly Limitations
Cosmologists seem to have a good handle on deep questions like what our observable Universe will someday look like and how the expansion of the cosmos will change. But ultimately, scientists can only theorize the answers to questions about the future based on their present-day understanding of the Universe. Cosmological timescales are so unimaginably long that it is impossible to say much of anything concrete about how the Universe will behave in the future. Today’s models fit the current data remarkably well, but the truth is that none of us will live long enough to see whether the predictions truly match all of the outcomes.
Disappointing? Sure. But totally worth the effort to help our puny brains consider such mind-boggling science – a reality that, as usual, is just plain stranger than fiction.
Sometimes when you stare at something long enough, you begin to see things. This is not the case with optical sensors and telescopes. Sure, there is noise from electronics, but it’s random and traceable. Stargazing with a telescope and camera is ideal for staring at the same patches of real estate for very long and repeated periods. This is the method used by the Dark Energy Survey (DES), and with less than one percent of the target area surveyed, astronomers are already discovering previously unknown objects in the outer Solar System.
The Dark Energy Survey is a five-year collaborative effort that is observing supernovae to better understand the structure and expansion of the universe. But in the meantime, transient objects much nearer to home are passing through its fields of view. Trans-Neptunian Objects (TNOs), small icy worlds beyond the planet Neptune, are being discovered. A new scientific paper, released as part of this year’s American Astronomical Society gathering in Seattle, Washington, discusses these newly discovered TNOs. The lead authors are two undergraduate students from Carleton College of Northfield, Minnesota, participating in a University of Michigan program.
The Palomar Sky Survey (POSS-1, POSS-2), the Sloan Digital Sky Survey, and every other sky survey have mapped not just the static, nearly unchanging night sky, but also transient events such as passing asteroids, comets, or novae. The Dark Energy Survey is examining the night sky to map the structure and expansion of the Universe. As part of the five-year survey, DES is observing ten select three-square-degree fields for Type Ia supernovae on a weekly basis. As the survey proceeds, it is getting more than anticipated: previously unknown trans-Neptunian objects. Once again, a deep-sky survey is revealing more about our local environment – objects in the farther reaches of our Solar System.
DES is an optical imaging survey in search of supernovae that can be used as standard candles to measure the expansion of the universe. This expansion is dependent on the interaction of matter and the more elusive exotic components of our Universe – dark energy and dark matter. The five-year survey is necessary to achieve a level of temporal detail and a sufficient number of supernova events from which to draw conclusions.
In the meantime, the young researchers of Carleton College – Ross Jennings and Zhilu Zhang – are discovering transients inside our Solar System. Led by Professor David Gerdes of the University of Michigan, the researchers started with a list of nearly 100,000 observations of individual transients. Differencing software and trajectory analysis helped identify those objects that were trans-Neptunian rather than asteroids of the inner Solar System.
While asteroids residing in the inner Solar System pass quickly through such small fields, trans-Neptunian objects (TNOs) orbit the Sun much more slowly. Pluto, for example, at an approximate distance of 40 A.U. from the Sun – similar to Eris, presently the largest known TNO – has an apparent motion of about 27 arc seconds per day, although for half of each year the Earth’s own orbital motion slows and even reverses that apparent motion. Twenty-seven arc seconds is approximately 1/60th the width of a full Moon. So, from one night to the next, a TNO can travel as much as 100 pixels across the field of view of the DES survey detectors, since each pixel spans 0.27 arc seconds.
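The pixel arithmetic in that paragraph checks out in one line:

```python
# A TNO moving ~27 arcseconds/day on a detector with 0.27-arcsecond pixels
# crosses about 100 pixels from one night to the next.
motion_arcsec_per_day = 27.0  # apparent motion quoted in the article
pixel_scale_arcsec = 0.27     # DECam pixel scale quoted in the article

pixels_per_day = motion_arcsec_per_day / pixel_scale_arcsec
print(f"~{pixels_per_day:.0f} pixels per day")
```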
The scientific sensor array, DECam, is mounted on the 4-meter (13-foot) Victor M. Blanco Telescope at Cerro Tololo Inter-American Observatory (CTIO) in Chile. It is an array of 62 back-illuminated CCDs of 2048×4096 pixels each, totaling 520 megapixels; altogether the camera weighs 20 tons.
With a little over 2 years of observations, the young astronomers stated, “Our analysis revealed sixteen previously unknown outer solar system objects, including one Neptune Trojan, several objects in mean motion resonances with Neptune, and a distant scattered disk object whose 1200-year orbital period is among the 50 longest known.”
“So far we’ve examined less than one percent of the area that DES will eventually cover,” says Dr. Gerdes. “No other survey has searched for TNOs with this combination of area and depth. We could discover something really unusual.”
What does it all mean? It is further confirmation that the outer Solar System is chock-full of small rocky-icy bodies. There are other examples of recent discoveries, such as the search for a TNO target for the New Horizons mission. As New Horizons has been approaching Pluto, the team turned to the Hubble Space Telescope to find a TNO to fly by after the dwarf planet. Hubble made short work of the search, finding three that the probe could reach. However, the demand for Hubble time does not allow long-term searches for TNOs. A survey such as DES will serve to uncover many thousands more objects in the outer Solar System. As Dr. Michael Brown of Caltech has stated, there is a fair likelihood that a Mars- or Earth-sized object will be discovered beyond Neptune in the Oort Cloud.
Imagine a single mission that would allow you to explore the Milky Way and beyond, investigating cosmic chemistry, hunting planets, mapping galactic structure, probing dark energy and analyzing the expansion of the wider Universe. Enter the Sloan Digital Sky Survey, a massive scientific collaboration that enables one thousand astronomers from 51 institutions around the world to do just that.
At Tuesday’s AAS briefing in Seattle, researchers announced the public release of data collected by the project’s latest incarnation, SDSS-III. This data release, termed “DR12,” represents the survey’s largest and most detailed collection of measurements yet: 2,000 nights’ worth of brand-new information about nearly 500 million stars and galaxies.
One component of SDSS is exploring dark energy by “listening” for baryon acoustic oscillation signals imprinted on the early Universe, and the team also shared a new animated “fly-through” of the Universe that was created using SDSS data.
The SDSS-III collaboration is based around the powerful 2.5-meter Sloan Foundation Telescope at the Apache Point Observatory in New Mexico. The project itself consists of four component surveys: BOSS, APOGEE, MARVELS, and SEGUE. Each of these surveys applies different instrumentation to the parent telescope in order to accomplish its own unique goal.
BOSS (the Baryon Oscillation Spectroscopic Survey) visualizes the way that sound waves produced by interacting matter in the early Universe are reflected in the large-scale structure of our cosmos. These ancient imprints, which date back to the first 500,000 years after the Big Bang, are especially evident in high-redshift objects like luminous-red galaxies and quasars. Three-dimensional models created from BOSS observations will allow astronomers to track the expansion of the Universe over a span of 9 billion years, a feat that, later this year, will pave the way for rigorous assessment of current theories regarding dark energy.
At the press briefing, Daniel Eisenstein of the Harvard-Smithsonian Center for Astrophysics explained that BOSS requires huge volumes of data and that 1.4 million galaxies have been mapped so far. He indicated that the data analyzed to date strongly confirm dark energy’s existence.
APOGEE (the Apache Point Observatory Galactic Evolution Experiment) employs a sophisticated, near-infrared spectrograph to pierce through thick dust and gather light from 100,000 distant red giants. By analyzing the spectral lines that appear in this light, scientists can identify the signatures of 15 different chemical elements that make up the faraway stars – observations that will help researchers piece together the stellar history of our galaxy.
MARVELS (the Multi-Object APO Radial Velocity Exoplanet Large-Area Survey) identifies minuscule wobbles in the orbits of stars, movements that betray the gravitational influence of orbiting planets. The technology itself is unprecedented. “MARVELS is the first large-scale survey to measure these tiny motions for dozens of stars simultaneously,” explained the project’s principal investigator Jian Ge, “which means we can probe and characterize the full population of giant planets in ways that weren’t possible before.”
At the press briefing, Ge said that MARVELS observed 5,500 stars repeatedly, looking for giant exoplanets around these stars. So far, the data has revealed 51 giant planet candidates as well as 38 brown dwarf candidates. Ge added that more will be found with better data processing.
SEGUE (the Sloan Extension for Galactic Understanding and Exploration) rounds out the quartet by analyzing visible light from 250,000 stars in the outer reaches of our galaxy. Fittingly, this survey’s observations “segue” nicely into work being done by other projects within SDSS-III. Constance Rockosi, leader of the SEGUE portion of SDSS-III, recapped the importance of her project’s observations of our outer galaxy: “In combination with the much more detailed view of the inner galaxy from APOGEE, we’re getting a truly holistic picture of the Milky Way.”
One of the most exceptional attributes of SDSS-III is its universality; that is, every byte of juicy information contained in DR12 will be made freely available to professionals, amateurs, and lay public alike. This philosophy enables interested parties from all walks of life to contribute to the advancement of astronomy in whatever capacity they are able.
As momentous as the release of DR12 is for today’s astronomers, however, there is still much more work to be done. “Crossing the DR12 finish line is a huge accomplishment by hundreds of people,” said Daniel Eisenstein, director of the SDSS-III collaboration, “But it’s a big universe out there, so there is plenty more to observe.”
DR12 includes observations made by SDSS-III between July 2008 and June 2014. The project’s successor, SDSS-IV, began its run in July 2014 and will continue observing for six more years.
Here is the video animation of the fly-through of the Universe:
The Cosmic Microwave Background (CMB) radiation is one of the greatest discoveries of modern cosmology. Astrophysicist George Smoot once likened its existence to “seeing the face of God.” In recent years, however, scientists have begun to question some of the attributes of the CMB. Peculiar patterns have emerged in the images taken by satellites such as WMAP and Planck – and they aren’t going away. Now, in a paper published in the December 1 issue of The Astronomical Journal, one scientist argues that the existence of these patterns may not only imply new physics, but also a revolution in our understanding of the entire Universe.
Let’s recap. Thanks to a blistering ambient temperature, the early Universe was blanketed in a haze for its first 380,000 years of life. During this time, photons relentlessly bombarded the protons and electrons created in the Big Bang, preventing them from combining to form stable atoms. All of this scattering also caused the photons’ energy to manifest as a diffuse glow. The CMB that cosmologists see today is the relic of this glow, now stretched to longer, microwave wavelengths due to the expansion of the Universe.
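A rough Wien’s-law sketch shows the scale of that stretching; the temperatures are round textbook values, not figures from the article:

```python
# Thermal glow peaks at a wavelength inversely proportional to temperature
# (Wien's displacement law). Recombination era: ~3,000 K; CMB today: ~2.7 K.
WIEN_B = 2.898e-3  # Wien displacement constant, meter-kelvins

def peak_wavelength_m(temperature_k):
    return WIEN_B / temperature_k

then = peak_wavelength_m(3000.0)  # ~1 micron: near-infrared
now = peak_wavelength_m(2.725)    # ~1 millimeter: microwave
print(f"stretched by a factor of ~{now / then:.0f}")
```

The factor of roughly 1,100 is exactly the redshift cosmologists quote for the surface of last scattering.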
As any fan of the WMAP and Planck images will tell you, the hallmarks of the CMB are the so-called anisotropies, small regions of overdensity and underdensity that give the picture its characteristic mottled appearance. These hot and cold spots are thought to be the result of tiny quantum fluctuations born at the beginning of the Universe and magnified exponentially during inflation.
Given the type of inflation that cosmologists believe occurred in the very early Universe, the distribution of these anisotropies in the CMB should be consistent with a random Gaussian field. But both WMAP and Planck have confirmed the existence of certain oddities in the fog: a large “cold spot,” strange alignments of the quadrupole and octupole moments, and, of course, Stephen Hawking’s initials.
In his new paper, Fulvio Melia of the University of Arizona argues that these types of patterns (Dr. Hawking’s signature notwithstanding) reveal a problem with the standard inflationary picture, or so-called ΛCDM cosmology. According to his calculations, inflation should have left a much more random assortment of anisotropies than the one that scientists see in the WMAP and Planck data. In fact, the probability of these particular anomalies lining up the way they do in the CMB images is only about 0.005% for a ΛCDM Universe.
Melia posits that the anomalous patterns in the CMB can be better explained by a new type of cosmology in which no inflation occurred. He calls this model the R(h)=ct Universe, where c is the speed of light, t is the age of the cosmos, and R(h) is the Hubble radius – the distance beyond which light will never reach Earth. (This equation makes intuitive sense: Light, traveling at light speed (c) for 13.7 billion years (t), should travel an equivalent number of light-years. In fact, current estimates of the Hubble radius put its value at about 13.4 billion light-years, which is remarkably close to the more tightly constrained value of the Universe’s age.)
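The coincidence is easy to check numerically. Both the Hubble parameter and the age below are assumed round values, so treat the ratio as indicative only:

```python
# Compare the Hubble radius c/H0 with the light-travel distance c*t.
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # km/s per megaparsec (assumed round value)
LY_PER_MPC = 3.2616e6  # light-years per megaparsec
AGE_GYR = 13.8         # assumed age of the Universe, billions of years

hubble_radius_gly = C_KM_S / H0 * LY_PER_MPC / 1e9  # ~14 billion light-years
light_travel_gly = AGE_GYR                          # c*t, in billions of ly

ratio = hubble_radius_gly / light_travel_gly
print(f"R_h / (c*t) = {ratio:.2f}")  # close to 1, as R_h = ct predicts
```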
R(h)=ct holds true for both the standard cosmological scenario and Melia’s model, with one crucial difference: in ΛCDM cosmology, this equation only works for the current age of the Universe. That is, at any time in the distant past or future, the Universe would have obeyed a different law. Scientists explain this odd coincidence by positing that the Universe first underwent inflation, then decelerated, and finally accelerated again to its present rate.
Melia hopes that his model, a Universe that requires no inflation, will provide an alternative explanation that does not rely on such fine-tuning. He calculates that, in a R(h)=ct Universe, the probability of seeing the types of strange patterns that have been observed in the CMB by WMAP and Planck is 7–10%, compared with a figure 1000 times lower for the standard model.
So, could this new way of looking at the cosmos be a death knell for ΛCDM? Probably not. Melia himself cites a few less earth-shattering explanations for the anomalous signals in the CMB, including foreground noise, statistical biases, and instrumental errors. Incidentally, the Planck satellite is scheduled to release its latest image of the CMB this week at a conference in Italy. If these new results show the same patterns of polarity that previous observations did, cosmologists will have to look into each possible explanation, including Melia’s theory, more intensively.
Since the early 20th century, scientists and physicists have been burdened with explaining how and why the Universe appears to be expanding at an accelerating rate. For decades, the most widely accepted explanation has been that the cosmos is permeated by a mysterious force known as “dark energy”. In addition to being responsible for cosmic acceleration, this energy is also thought to make up 68.3% of the Universe’s total mass-energy content.
Much like dark matter, the existence of this invisible force is inferred from observable phenomena and from its fit with our current models of cosmology, rather than from direct evidence. Scientists must instead rely on indirect observations, watching how fast cosmic objects (specifically Type Ia supernovae) recede from us as the universe expands.
This process would be extremely tedious for scientists – like those who work for the Dark Energy Survey (DES) – were it not for the new algorithms developed collaboratively by researchers at Lawrence Berkeley National Laboratory and UC Berkeley.
“Our algorithm can classify a detection of a supernova candidate in about 0.01 seconds, whereas an experienced human scanner can take several seconds,” said Danny Goldstein, a UC Berkeley graduate student who developed the code to automate the process of supernova discovery on DES images.
Currently in its second season, the DES takes nightly pictures of the Southern Sky with DECam – a 570-megapixel camera that is mounted on the Victor M. Blanco telescope at Cerro Tololo Interamerican Observatory (CTIO) in the Chilean Andes. Every night, the camera generates between 100 Gigabytes (GB) and 1 Terabyte (TB) of imaging data, which is sent to the National Center for Supercomputing Applications (NCSA) and DOE’s Fermilab in Illinois for initial processing and archiving.
Object recognition programs developed at the National Energy Research Scientific Computing Center (NERSC) and implemented at NCSA then comb through the images in search of possible detections of Type Ia supernovae. These powerful explosions occur in binary star systems where one star is a white dwarf, which accretes material from a companion star until it reaches a critical mass and explodes in a Type Ia supernova.
“These explosions are remarkable because they can be used as cosmic distance indicators to within 3-10 percent accuracy,” says Goldstein.
Distance is important because the farther away an object is located in space, the further back in time we see it. By tracking Type Ia supernovae at different distances, researchers can measure cosmic expansion throughout the universe’s history. This allows them to put constraints on how fast the universe is expanding and maybe even provide other clues about the nature of dark energy.
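A hedged sketch of the underlying trick: if a Type Ia supernova’s intrinsic brightness is roughly known (an absolute magnitude near -19.3 is a commonly quoted fiducial, not a figure from the article), the apparent magnitude we measure yields the distance through the distance modulus:

```python
# Distance modulus: m - M = 5 * log10(d / 10 pc), solved here for d.
M_TYPE_IA = -19.3  # assumed fiducial absolute magnitude of a Type Ia

def distance_parsecs(m_apparent, M=M_TYPE_IA):
    return 10 ** ((m_apparent - M + 5) / 5)

# A hypothetical supernova observed at apparent magnitude 24:
d_pc = distance_parsecs(24.0)
print(f"~{d_pc / 1e9:.1f} billion parsecs away")
```

In practice cosmologists also correct the magnitudes for light-curve shape, color, and dust before applying this relation.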
“Scientifically, it’s a really exciting time because several groups around the world are trying to precisely measure Type Ia supernovae in order to constrain and understand the dark energy that is driving the accelerated expansion of the universe,” says Goldstein, who is also a student researcher in Berkeley Lab’s Computational Cosmology Center (C3).
The DES begins its search for Type Ia explosions by uncovering changes in the night sky, which is where the image subtraction pipeline developed and implemented by researchers in the DES supernova working group comes in. The pipeline subtracts images that contain known cosmic objects from new images that are exposed nightly at CTIO.
Each night, the pipeline produces between 10,000 and a few hundred thousand detections of supernova candidates that need to be validated.
“Historically, trained astronomers would sit at the computer for hours, look at these dots, and offer opinions about whether they had the characteristics of a supernova, or whether they were caused by spurious effects that masquerade as supernovae in the data. This process seems straightforward until you realize that the number of candidates that need to be classified each night is prohibitively large and only one in a few hundred is a real supernova of any type,” says Goldstein. “This process is extremely tedious and time-intensive. It also puts a lot of pressure on the supernova working group to process and scan data fast, which is hard work.”
To simplify the task of vetting candidates, Goldstein developed a code that uses the machine learning technique “Random Forest” to vet detections of supernova candidates automatically and in real time. The technique employs an ensemble of decision trees to automatically ask the types of questions that astronomers would typically consider when classifying supernova candidates.
At the end of the process, each detection of a candidate is given a score based on the fraction of decision trees that considered it to have the characteristics of a detection of a supernova. The closer the classification score is to one, the stronger the candidate. Goldstein notes that in preliminary tests, the classification pipeline achieved 96 percent overall accuracy.
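The scoring idea can be illustrated with a hand-rolled miniature; this stands in for, and is far simpler than, the actual DES pipeline, and the feature names and thresholds are made up:

```python
# Each "tree" votes on whether a detection looks like a real supernova;
# the candidate's score is the fraction of trees voting yes.

def score(candidate, trees):
    votes = [tree(candidate) for tree in trees]
    return sum(votes) / len(votes)

# Toy stand-in trees: single-threshold rules on hypothetical features.
trees = [
    lambda c: c["roundness"] > 0.8,     # real point sources look round
    lambda c: c["flux_snr"] > 5.0,      # detected well above the noise floor
    lambda c: not c["near_bad_pixel"],  # artifacts cluster near bad pixels
    lambda c: c["psf_match"] > 0.7,     # shape matches the point-spread function
]

candidate = {"roundness": 0.9, "flux_snr": 8.2,
             "near_bad_pixel": False, "psf_match": 0.9}
print(score(candidate, trees))  # 1.0: every tree voted "supernova"
```

In a real random forest the trees are learned from labeled training data and each sees a randomized subset of the features, which is what makes the ensemble robust.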
“When you do subtraction alone you get far too many ‘false-positives’ — instrumental or software artifacts that show up as potential supernova candidates — for humans to sift through,” says Rollin Thomas, of Berkeley Lab’s C3, who was Goldstein’s collaborator.
He notes that with the classifier, researchers can quickly and accurately strain out the artifacts from supernova candidates. “This means that instead of having 20 scientists from the supernova working group continually sift through thousands of candidates every night, you can just appoint one person to look at maybe a few hundred strong candidates,” says Thomas. “This significantly speeds up our workflow and allows us to identify supernovae in real-time, which is crucial for conducting follow-up observations.”
“Using about 60 cores on a supercomputer we can classify 200,000 detections in about 20 minutes, including time for database interaction and feature extraction,” says Goldstein.
Goldstein and Thomas note that the next step in this work is to add a second level of machine learning to the pipeline to improve the classification accuracy. This extra layer would take into account how the object was classified in previous observations as it determines the probability that the candidate is “real.” The researchers and their colleagues are currently working on different approaches to achieve this capability.
It’s well past the Fourth of July, but you can still easily find fireworks in the sky if you look around. The Chandra X-Ray Observatory has been doing just that for the past 15 years, revealing what the universe looks like in these short, high-energy wavelengths that are invisible to human eyes.
“Chandra changed the way we do astronomy. It showed that precision observation of the X-rays from cosmic sources is critical to understanding what is going on,” stated Paul Hertz, NASA’s Astrophysics Division director, in a press release. “We’re fortunate we’ve had 15 years – so far – to use Chandra to advance our understanding of stars, galaxies, black holes, dark energy, and the origin of the elements necessary for life.”
The telescope launched into space in 1999 aboard the space shuttle and currently works at an altitude as high as 86,500 miles (139,000 kilometers). It is named after Indian-American astrophysicist Subrahmanyan Chandrasekhar; the name “Chandra” also means “moon” or “luminous” in Sanskrit.
And there’s more to come. You can learn more about Chandra’s greatest discoveries and its future in this Google+ Hangout, which will start at 3 p.m. EDT (7 p.m. UTC) at this link.
When it comes to accuracy, everyone strives for a hundred percent, but measuring cosmic distances leaves a bit more to chance. Just days ago, researchers from the Baryon Oscillation Spectroscopic Survey (BOSS) announced to the world that they have been able to measure the distance to galaxies located more than six billion light-years away to a confidence level of just one percent. If this announcement doesn’t seem exciting, then consider what it means for other studies. These new measurements help constrain the properties of the ubiquitous “dark energy” – the driver of the Universe’s accelerating expansion.
“There are not many things in our daily lives that we know to one-percent accuracy,” said David Schlegel, a physicist at Lawrence Berkeley National Laboratory (LBNL) and the principal investigator of BOSS. “I now know the size of the universe better than I know the size of my house.”
The research team’s findings were presented at the meeting of the American Astronomical Society by Harvard University astronomer Daniel Eisenstein, the director of the Sloan Digital Sky Survey III (SDSS-III), the worldwide organization which includes BOSS. They are detailed in a series of articles submitted to journals by the BOSS collaboration last month, all of which are now available as online preprints.
“Determining distance is a fundamental challenge of astronomy,” said Eisenstein. “You see something in the sky — how far away is it? Once you know how far away it is, learning everything else about it is suddenly much easier.”
When it comes to measuring distances in space, astronomers have employed many methods. Distances to the planets can be measured using radar, but that technique has its constraints, and going further into space requires less direct methods. Even though these have proven amazingly accurate, there is still an uncertainty involved – one expressed as a percentage. For example, if you measure the distance to an object 200 miles away to within 2 miles of its true value, you have measured with an accuracy of 1%. Cosmically speaking, just a few hundred stars and a handful of star clusters are actually close enough to have their distances measured that accurately. They reside within the Milky Way, just a few thousand light-years away. BOSS takes it to the extreme… its measurements go well beyond our galactic boundaries, more than a million times further, and map the Universe with unparalleled accuracy.
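The percentage-accuracy arithmetic in the example above is just the measurement uncertainty divided by the distance (the function name below is mine, for illustration):

```python
# The article's accuracy example as arithmetic: a measurement within
# 2 miles of the true value for an object 200 miles away is 1% accurate.
def percent_accuracy(uncertainty, distance):
    return 100.0 * uncertainty / distance

print(percent_accuracy(2, 200))  # 1.0 (percent)
```

The same ratio is what BOSS achieves, except the “200 miles” is more than six billion light-years.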
Thanks to these new, highly-accurate distance measurements, BOSS astronomers are making headway in the field of dark energy. “We don’t yet understand what dark energy is,” explained Eisenstein, “but we can measure its properties. Then, we compare those values to what we expect them to be, given our current understanding of the universe. The better our measurements, the more we can learn.”
Just how is it done? To achieve a one-percent measurement at six billion light years isn’t as easy as measuring a solar system object, or even one contained within our galaxy. That’s where BOSS comes into play. It’s the largest of the four projects that make up the Sloan Digital Sky Survey III (SDSS-III), and was built to take advantage of this technique: measuring the so-called “baryon acoustic oscillations” (BAOs), subtle periodic ripples in the distribution of galaxies in the cosmos. These ripples are the signature of pressure waves which once cruised the early Universe at a time when things were so hot and dense that photons marched along with baryons – the stuff which makes up the nuclei of atoms. Since the physical size of the ripple is known, mapping where galaxies cluster on the sky reveals how far away they are.
“With these galaxy measurements, nature has given us a beautiful ruler,” said Ashley Ross, an astronomer from the University of Portsmouth. “The ruler happens to be half a billion light-years long, so we can use it to measure distances precisely, even from very far away.”
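Ross’s half-billion-light-year ruler works like any standard ruler: the angle it subtends on the sky gives its distance. Here is a toy version of that idea, using the small-angle approximation and ignoring the subtleties of cosmological distance measures — the function and numbers are illustrative assumptions, not BOSS’s actual analysis:

```python
import math

# Toy standard-ruler distance: a ruler of known physical size, seen to
# subtend a given angle on the sky, must lie at distance = size / angle
# (small-angle approximation; real cosmology distinguishes several
# distance measures that this sketch glosses over).
def ruler_distance(ruler_size_gly, angle_deg):
    angle_rad = math.radians(angle_deg)
    return ruler_size_gly / angle_rad

# A 0.5-billion-light-year ruler subtending roughly 4.77 degrees sits
# about 6 billion light-years away:
print(round(ruler_distance(0.5, 4.77), 1))
```

Measuring that angle very precisely across a map of over a million galaxies is what pushes the distance uncertainty down to the one-percent level.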
Using its specialized instrumentation which can make detailed measurements of a thousand galaxies at a time, BOSS took on a huge challenge – mapping the location of more than a million galaxies. “On a clear night when everything goes perfectly, we can add more than 8000 galaxies and quasars to the map,” said Kaike Pan, who leads the team of observers at the SDSS-III’s Sloan Foundation 2.5-meter Telescope at Apache Point Observatory in New Mexico.
Although the BOSS research team presented its early galaxy maps and initial BAO measurements a year ago, this new data covers twice as much territory and gives an even more accurate measurement – including distances to nearby galaxies. “Making these measurements at two different distances allows us to see how the expansion of the universe has changed over time, which will help us understand why it is accelerating,” explained University of Portsmouth astronomer Rita Tojeiro, who co-chairs the BOSS galaxy clustering working group along with Jeremy Tinker of New York University.
Also doing a similar study is Mariana Vargas-Magana, a postdoctoral researcher at Carnegie Mellon University. To enable even more accuracy, she’s looking into any subtle effects which could influence the BOSS measurements. “When you’re trying to reach one percent, you have to be paranoid about everything that could go even slightly wrong,” said Vargas-Magana — for example, slight differences in how galaxies were identified could have thrown off the entire measurement of their distribution, so different parts of the sky had to be checked carefully. “Fortunately,” Vargas-Magana said, “there are plenty of careful people on our team to check our assumptions. By the time all of them are satisfied, we are sure we didn’t miss anything.”
As of the present, these new BOSS findings appear consistent with the simplest form of dark energy – a cosmological constant, unchanging throughout the history of the Universe. According to the news release, this “cosmological constant” is one of just six numbers required to create a model which coincides with the scale and structure of the Universe. Schlegel compares this six-number model to a pane of glass, which is pinned in place by bolts that represent different measurements of the history of the Universe. “BOSS now has one of the tightest of those bolts, and we just gave it another half-turn,” said Schlegel. “Each time you ratchet up the tension and the glass doesn’t break, that’s a success of the model.”
We keep saying dark matter is so very hard to find. Astronomers say they can see its effects — such as gravitational lensing, the bending of light from distant galaxies as it passes a massive foreground galaxy. But defining what the heck that matter is, is proving elusive. And considering it makes up most of the universe’s matter, it would be great to know what dark matter looks like.
A new experiment — billed as the most sensitive dark matter detector in the world — spent three months searching for evidence of weakly interacting massive particles (WIMPs), which may be the basis of dark matter. So far, nothing, but researchers emphasized they have only just started work.
“Now that we understand the instrument and its backgrounds, we will continue to take data, testing for more and more elusive candidates for dark matter,” stated physicist Dan McKinsey of Yale University, who is one of the collaborators on the Large Underground Xenon (LUX) detector.
LUX operates a mile (1.6 kilometers) beneath the Earth in the state-owned Sanford Underground Research Facility, which is located in South Dakota. The underground location is perfect for this kind of work because there is little interference from cosmic ray particles.
“At the heart of the experiment is a six-foot-tall titanium tank filled with almost a third of a ton of liquid xenon, cooled to minus 150 degrees Fahrenheit. If a WIMP strikes a xenon atom it recoils from other xenon atoms and emits photons (light) and electrons. The electrons are drawn upward by an electrical field and interact with a thin layer of xenon gas at the top of the tank, releasing more photons,” stated the Lawrence Berkeley National Laboratory, which leads operations at Sanford.
“Light detectors in the top and bottom of the tank are each capable of detecting a single photon, so the locations of the two photon signals – one at the collision point, the other at the top of the tank – can be pinpointed to within a few millimeters. The energy of the interaction can be precisely measured from the brightness of the signals.”
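The drift-time trick in the description above can be sketched as a toy calculation: the delay between the flash at the collision point and the flash at the top of the tank tells you how deep the event occurred. The function name, drift speed, and numbers below are illustrative assumptions, not LUX’s actual reconstruction code or calibrated values:

```python
# Toy event-depth reconstruction in the spirit of the LUX description:
# electrons freed at the collision point drift upward at a roughly
# constant speed, so the delay between the prompt flash and the
# top-of-tank flash encodes the event's depth. The drift speed here is
# a made-up illustrative number, not the detector's real value.
def event_depth_cm(drift_time_us, drift_speed_cm_per_us=0.15):
    """Depth below the liquid surface, from the delay between the
    collision-point flash and the gas-layer flash at the top."""
    return drift_time_us * drift_speed_cm_per_us

print(event_depth_cm(200.0))  # 200 microseconds of drift -> 30.0 cm deep
```

Combined with the millimeter-level horizontal position from the light detectors, this is how an interaction gets localized inside the tank.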
LUX’s sensitivity for low-mass WIMPs is more than 20 times better than other detectors. That said, the detector was unable to confirm possible hints of WIMPs found in other experiments.
“Three candidate low-mass WIMP events recently reported in ultra-cold silicon detectors would have produced more than 1,600 events in LUX’s much larger detector, or one every 80 minutes in the recent run,” the laboratory added.
Don’t touch that dial yet, however. LUX plans to do more searching in the next two years. Also, the Sanford Lab is proposing an even more sensitive LUX-ZEPLIN experiment that would be 1,000 times more sensitive than LUX. No word yet on when LUX-ZEPLIN will get off the ground, however.
Atoms, string theory, dark matter, dark energy… there’s an awful lot about the Universe that might make sense on paper (to physicists, anyway) but is extremely difficult to detect and measure, at least with the technology available today. But at the core of science is observation, and what’s been observed of the Universe so far strongly indicates an overwhelming amount of… stuff… that cannot be observed. But just because it can’t be seen doesn’t mean it’s not there; on the contrary, it’s what we can’t see that actually makes up the majority of the Universe.
If this doesn’t make sense, that’s okay — they’re all pretty complex concepts. So in order to help non-scientists (who, like dark energy, make up most of the population) get a better grasp of what all this “dark” stuff is about, CERN scientist and spokesperson James Gillies has teamed up with TED-Ed animators to visually explain some of the Universe’s darkest secrets. Check it out above (and see more space science lessons from TED-Ed here.)
Because everything’s easier to understand with animation!