Forget Black Holes, How Do You Find A Wormhole?

An artist's impression of what it would look like inside a wormhole. Pretty. (credit: Space.com)

Finding a black hole is an easy task… compared with searching for a wormhole. Suspected black holes have a massive gravitational effect on planets, stars and even galaxies, generating radiation, producing jets and accretion disks. Black holes will even bend light through gravitational lensing. Now, try finding a wormhole… Any ideas? Well, a Russian researcher thinks he has found an answer, but a highly sensitive radio telescope plus a truckload of patience (I’d imagine) is needed to find a special wormhole signature…

A wormhole connecting two points within spacetime.
Wormholes are a valid consequence of Einstein's general relativistic view of the universe. A wormhole, in theory, acts as a shortcut or tunnel through space and time. There are several variations on the same theme (wormholes may link different universes; they may link two separate locations in the same universe; they may even link black and white holes together), but the physics is similar: a wormhole creates a link between two locations in space-time, bypassing normal three-dimensional travel through space. It is also theorized that matter can travel through some wormholes, fuelling sci-fi stories such as the film Stargate or Star Trek: Deep Space Nine. If wormholes do exist, however, it is highly unlikely that you'll find a handy key to open the mouth of one in your back yard. They are likely to be very elusive, and you'd probably need some specialist equipment to travel through them (although this will be virtually impossible).

Alexander Shatskiy, from the Lebedev Physical Institute in Moscow, has an idea of how these wormholes might be observed. For a start, they can be distinguished from black holes, as wormhole mouths do not have an event horizon. Secondly, if matter can travel through wormholes, light certainly can, but the light emitted will have a characteristic angular intensity distribution. If we were viewing a wormhole's mouth, we would see a circle, resembling a bubble, with intense light radiating from the inside "rim". Looking toward the center, we would notice the light sharply dim. At the center we would see no light at all; instead we would see right through the mouth of the wormhole, with stars (from our side of the universe) shining straight through.

To have any chance of observing a wormhole mouth, sufficiently advanced radio interferometers would be required to look deep into the extreme environments of galactic cores and distinguish this exotic cosmic ghost from its black hole counterpart.

However, just because wormholes are possible does not mean they actually exist. They could simply be the mathematical leftovers of general relativity. And even if they do exist, they are likely to be highly unstable, so any chance of traveling through time and space would be short-lived. Besides, the radiation passing through would be extremely blueshifted, so expect to burn up very quickly. Don't pack your bags quite yet…

Source: arXiv publication

When Do Asteroids Turn Dangerous?


One of the most spectacular sights in the night sky is a fireball; a rock from space impacts the atmosphere and blazes a trail that can last seconds or even minutes. These burn up harmlessly, but when do they turn dangerous? When do asteroids get large enough that they can actually get through the atmosphere and cause some destruction here on the ground?

During an invited talk at the Meteoroids 2007 conference held in Barcelona, Spain, Clark R. Chapman from the Southwest Research Institute delivered a presentation on how to define the line between a harmless explosion in the sky and an impact that causes destruction here on the ground. The paper, entitled Meteoroids, Meteors, and the Near-Earth Object Impact Hazard, was later published in the journal Earth, Moon and Planets.

Originally, researchers focused their efforts on the largest asteroids: the objects 2 km (1.2 miles) and above. These are the space rocks that could cause wide-scale devastation across the planet, affecting the climate and leading to the deaths of hundreds of millions of people. It was calculated that an individual might have a 1-in-25,000 chance of dying in an asteroid impact.

Now that the Spaceguard Survey has discovered 75% of the asteroids 1 km and larger, your chances of dying have dropped to about 1-in-720,000, roughly the same as the chance of dying in a fireworks accident or on an amusement park ride.

According to Chapman, astronomers are now shifting their focus from the largest impacts, like the one that wiped out the dinosaurs 65 million years ago, to smaller but still dangerous space rocks. Consider, for example, the rock that detonated in the air above Tunguska, Siberia in 1908: that object was probably only 20-100 metres (65-325 feet) across.

And yet, it leveled the forest for thousands of square kilometres and would have caused immense destruction if it had hit a populated area.

A new survey, informally called the Spaceguard Two Survey, will begin soon with the goal of finding 90% of the near-Earth asteroids larger than 140 metres (460 feet) within the next 15 years.

There are many variables that go into calculating the resulting destruction from an impact. You have to consider the object's velocity, whether it's a metallic or rocky asteroid, and whether or not it's fragmented. A rough sense of the energies involved can be had from the back-of-the-envelope calculation sketched below.
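The following is a minimal illustrative sketch, not the hazard model from Chapman's paper: it simply converts an assumed size, density and entry speed into kinetic energy, expressed in megatons of TNT.

```python
# Illustrative only: a back-of-the-envelope impact-energy estimate, not the
# hazard model used in Chapman's paper. All input values are assumptions.
import math

def impact_energy_megatons(diameter_m, density_kg_m3=3000.0, velocity_m_s=20000.0):
    """Kinetic energy of a spherical impactor, in megatons of TNT equivalent."""
    radius = diameter_m / 2.0
    mass = density_kg_m3 * (4.0 / 3.0) * math.pi * radius**3   # kg
    energy_joules = 0.5 * mass * velocity_m_s**2               # E = 1/2 m v^2
    return energy_joules / 4.184e15                            # 1 Mt TNT = 4.184e15 J

# A ~50 m rocky asteroid arriving at a typical ~20 km/s works out to roughly 9-10 Mt,
# in the same ballpark as common estimates for the Tunguska airburst.
print(f"{impact_energy_megatons(50):.0f} Mt TNT equivalent")
```

Doubling the diameter multiplies the energy by eight, which is why the line between "look up and enjoy the show" and "evacuate the region" is crossed over a fairly narrow range of sizes.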

What should the response be of national and international emergency management officials to a prediction that a 35 m NEA will strike a populated country a decade in the future? Following current interpretations, we would simply tell people near ground-zero to stay inside and not look directly at the high-altitude explosion. But if objects of that size could cause Tunguska-like damage, we might not only evacuate people for 100 km surrounding ground-zero but we would certainly consider a space mission to move or blow-up the threatening NEA.

Originally, researchers thought that Tunguska-level events happened only once in 4,000 years, but they might be more common, perhaps occurring as often as once in 700 years. And even smaller, more common asteroids could still cause destruction on the ground, perhaps as often as once in 200 years.

If the Spaceguard Two Survey gets going, it should locate most of the larger asteroids, and even around 50% of the Tunguska-sized impactors. It could also end up tracking 1-2 million objects around 30 metres across.

And if one of those rocks is on a collision course with Earth, governments and space agencies will be able to work out an evacuation or prevention strategy.

Or at least encourage people to avert their eyes.

Original Source: SWRI

Podcast: Cosmic Rays


We’re going to return to a long series of episodes we like to call: Radiation that Can Turn You Into a Superhero. This time we’re going to look at cosmic rays, which everyone knows made the Fantastic Four. These high-energy particles are streaming from the Sun and even intergalactic space, and do a wonderful job of destroying our DNA, giving us radiation sickness, and maybe (hopefully!) turning us into superheroes.

Click here to download the episode

Cosmic Rays – Show notes and transcript

Or subscribe to: astronomycast.com/podcast.xml with your podcatching software.

Using GPS Could Improve Tsunami Warning Systems


When there is a tsunami coming towards your home, you want to know about it as far in advance as possible. An early warning about such a disaster could save countless lives, and using Global Positioning System information may just be the way to speed up our reaction time in the future.

The traditional tsunami warning system relies on measuring the magnitude of the earthquake that causes the tsunami. This method is not always reliable, though, as accurately calculating the power of the resulting ocean waves can take hours or days.

For example, the 2005 Nias quake near Indonesia was estimated to produce a tsunami of about the same size as the one from the powerful 2004 Indian Ocean quake, which destroyed cities in portions of Indonesia, India and Thailand and killed more than 225,000 people. The 2005 tsunami came nowhere near those proportions. There were five false tsunami alarms between 2005 and 2007, which can reduce the effectiveness of the warnings in the eyes of the public.

In a study published in the December issue of Geophysical Research Letters, researcher Y. Tony Song of NASA's Jet Propulsion Laboratory in Pasadena, California, showed that using GPS data from coastal areas near the epicenter of the quake could help determine the scale of a tsunami more accurately and more quickly.

Here’s how it would potentially work: data from seismometers near the earthquake’s epicenter is first registered, as in the traditional system. After that, GPS data of the seafloor displacement is factored in, which gives a more complete picture of the extent and power of the earthquake. The size of the predicted tsunami is then quickly calculated and given a number between 1 and 10 – 1 being the lowest – much like the Richter scale. This information could then be passed through the tsunami warning system to evacuate people to safety.
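To make the idea concrete, here is a minimal toy sketch. It is not Song's actual JPL model; the scoring formula and every number in it are invented for illustration, showing only the general notion of turning GPS-derived seafloor displacement into a quick 1-to-10 severity number.

```python
# Toy illustration only: not the actual JPL/Song algorithm. It just shows the idea
# of folding GPS-derived seafloor displacement into a fast 1-10 severity score.
import math

def tsunami_scale(vertical_uplift_m, uplifted_area_km2):
    """Crude severity score based on the volume of water displaced by the seafloor."""
    displaced_volume_km3 = (vertical_uplift_m / 1000.0) * uplifted_area_km2
    # Logarithmic scoring, so each step up roughly means 10x more displaced water.
    score = 5 + math.log10(max(displaced_volume_km3, 1e-6))
    return int(max(1, min(10, round(score))))

# Hypothetical event: 5 m of uplift over a 100 km x 20 km patch of seafloor.
print(tsunami_scale(vertical_uplift_m=5.0, uplifted_area_km2=2000.0))   # -> 6
```

The real advantage of the GPS approach is the input, not the arithmetic: seafloor displacement tells you how much water was actually moved, which a seismic magnitude alone cannot.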

GPS data helps create a 3-dimensional model of the tsunami by giving details about the horizontal and vertical displacement of the seafloor, and this data can be sent and analyzed in minutes from coastal GPS stations. Song’s methods have accurately modeled three previous tsunamis: one in Alaska in 1964, the Indian Ocean tsunami in 2004, and the 2005 Nias tsunami.

Source: JPL Press Release

Flying Telescope Passes Its First Stage of Tests


Telescopes on the ground – while having all sorts of good qualities – have the disadvantage of peering through the whole of the atmosphere when looking at the stars. Space-based telescopes like Hubble are an effective way around this, but launching a telescope into space and maintaining it is not exactly cheap. What about something in between the two?

This is where SOFIA (Stratospheric Observatory for Infrared Astronomy) flies in. SOFIA is a converted 747SP airliner that used to carry passengers for United Airlines and Pan Am, but now carries only one voyager: an infrared telescope.

SOFIA recently completed the first phase of flight tests to determine its structural integrity, aerodynamics and handling abilities. This first series of tests was done with the door through which the telescope will peer closed; open-door testing will begin in late 2008.

What makes SOFIA valuable is its ability to fly high in the stratosphere for observations, at around 41,000 feet (12.5 km). This puts it above most of the atmosphere between the ground and space, which blurs incoming light with turbulence and absorbs some wavelengths of infrared light almost completely.

Cloudy nights, normally the bane of observational astronomy, will not impede SOFIA. Other advantages are that scientists will be able to install specialized instruments for specific observations, and the observatory can fly to anywhere in the world.

The telescope is 10 feet across and weighs around 19 tons. It will look through a 16-foot-high door in the fuselage to study planetary atmospheres, star formation and comets in the infrared spectrum.

During this stage of testing, the ability of the telescope to compensate for the motion and vibrations of the airplane was checked. After the first open-door tests are run this year, the mobile observatory will begin making observations in 2009, and will be completely operational in 2014.

SOFIA is a cooperation between NASA, which will maintain the plane, and the German Aerospace Center, which built and will maintain the telescope.

Source: NASA Press Release

A Possible Answer to Flyby Anomalies

Artist's impression of the Galileo spacecraft above Earth. Galileo went on to spend seven years (1995–2003) orbiting Jupiter. Credit: NASA

Strange things are happening to our robotic space explorers. Best known as the "Pioneer effect" (the unexpected and sudden alterations to the trajectories of Pioneer 10 and Pioneer 11, measured as they continue their journey into the outer solar system), similar anomalies are being seen in flybys by modern space probes. During Earth flybys, Galileo, Rosetta, NEAR and Cassini have all experienced a sudden boost in speed. After ruling out all conventional explanations, including leakage of fuel and velocity measurement error, a new study suggests the answer may lie in a bizarre characteristic of universal physics…

Planetary flybys are an essential aid to interplanetary missions, letting spacecraft gain energy as they accelerate on their merry way to their destinations. Gravity assists are calculated so accurately by mission scientists that the time of arrival can be predicted down to the minute. Considering most missions take years to complete, this degree of accuracy is amazing, but essential.

So, when Galileo completed a gravity assist past Earth on December 8, 1990, to speed it toward Jupiter, you can imagine NASA's surprise on finding that Galileo had accelerated suddenly, and for no apparent reason. The boost was tiny, but through the use of the Deep Space Network, extremely accurate measurements of the speeding craft could be made: Galileo had gained an unexplained 3.9 mm/s.

This isn't an isolated case. The space probes NEAR, Cassini-Huygens and Rosetta all experienced unexplained boosts of 13 mm/s, 0.11 mm/s and 2 mm/s respectively during their Earth flybys. Once technical faults, observational errors, radiation pressure, magnetic instabilities and electrical charge build-up had been ruled out, attention began to turn to more exotic explanations.

A recent study by M. E. McCulloch suggests that "Unruh radiation" may be the culprit. The Unruh effect, put simply, says that an accelerating body sees a faint bath of thermal radiation, with a wavelength that shrinks as the acceleration grows. At very low accelerations, the wavelength becomes so long that a whole wavelength would be larger than the observable Universe (roughly the Hubble Distance), so such waves could have no effect on the body. However, should the accelerating body (i.e. Galileo being accelerated by Earth's gravity during the 1990 flyby) creep above an acceleration threshold, the Unruh radiation drops to wavelengths smaller than the Hubble Distance, causing a tiny but measurable "boost" to its velocity.
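As a rough order-of-magnitude check (my own sketch, not the calculation in McCulloch's paper), the standard Unruh temperature formula plus Wien's displacement law give the acceleration below which the peak Unruh wavelength would no longer fit inside an assumed Hubble distance:

```python
# Rough order-of-magnitude sketch, not McCulloch's exact calculation: below what
# acceleration does the peak Unruh wavelength grow beyond the Hubble distance?
import math

hbar = 1.055e-34          # J s
k_B  = 1.381e-23          # J/K
c    = 2.998e8            # m/s
wien = 2.898e-3           # m K, Wien displacement constant
hubble_distance = 1.3e26  # m, roughly c/H0 (assumed value)

def peak_unruh_wavelength(a):
    """Peak thermal wavelength (m) seen by a body accelerating at a (m/s^2)."""
    T_unruh = hbar * a / (2 * math.pi * c * k_B)   # Unruh temperature
    return wien / T_unruh                          # Wien's displacement law

# Acceleration at which the peak wavelength just equals the Hubble distance:
a_threshold = wien * 2 * math.pi * c * k_B / (hbar * hubble_distance)
print(f"threshold acceleration ~ {a_threshold:.1e} m/s^2")
print(f"check: wavelength at that acceleration ~ {peak_unruh_wavelength(a_threshold):.1e} m")
```

That threshold works out to a few billionths of a metre per second squared, which illustrates just how feeble an acceleration has to be before any such Hubble-scale effect could come into play, and why it would be so hard to test in a laboratory.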

Although complex, this theory is very interesting, and it is a reminder that even though we can calculate the arrival time of space probes down to the nearest minute, the Universe will continue to throw up perplexing issues for a long time yet.

Sources: arXiv Blog, arXiv abstract and paper download

Engineering, Budget Problems for NASA’s New Spacecraft


NASA has discovered a potentially dangerous problem with the first stage of the Ares I rocket that will launch the new Orion crew capsule to the space station and to the moon. Engineers are concerned that during the first few minutes of flight, the rocket could shake violently, possibly causing significant damage to the entire launch stack. Meanwhile, nasaspaceflight.com reports that a budget review of the Constellation program found a short-term deficit of $700 million that will likely delay test flights and development of the yet-to-be-built rockets.

The shaking problem is called thrust oscillation, and is typical in solid rocket motors. The phenomenon is characterized by increased acceleration pulses during the latter part of first-stage flight. Depending on the amplitude of these pulses, the impact on the vehicle structure and astronauts may be quite significant.

The Associated Press reported that NASA discovered the problem in the fall of 2007, but did not discuss the problem publicly until January 18, 2008 after the AP filed a Freedom of Information Act request and Keith Cowing of NASAWatch.com submitted detailed engineering questions regarding the oscillations.

In the response given to both NASAWatch and AP, NASA said they are working to understand how the thrust oscillation may impact the entire stack – the Ares first stage, upper stage and the Orion crew vehicle — and to determine how to minimize the impact. They have brought in experts from within NASA and outside industry to review the issues and to determine if lessons learned from previous launch vehicles will help solve the problems. NASA said they are studying multiple systems to identify all possible scenarios.

“This is a development project like Apollo. I hope no one was so ill-informed as to believe that we would be able to develop a system to replace the shuttle without facing any challenges in doing so,” NASA Administrator Mike Griffin said in a separate statement to the Associated Press. “NASA has an excellent track record of resolving technical challenges. We’re confident we’ll solve this one as well.”

The first stage is a single, five-segment reusable solid rocket booster derived from the Space Shuttle solid rocket motors developed and produced by ATK Launch Systems.

The Ares I rocket is the core of the new space transportation system that will carry crewed missions back to the moon, and possibly on to Mars. The rocket may also use its 29-ton payload capacity to deliver resources and supplies to the International Space Station.

As for the budget shortfall, Ares program managers have proposed a re-aligned development and test flight schedule in an attempt to protect Orion's debut mission to the ISS in 2015.

The reason for the changes relates to additional costs associated with the challenges of Ares I’s development, creating a shortfall of funds for the financial year period 2008 to 2010.

Among numerous changes, a test flight of the Ares I originally scheduled for 2012 has been delayed by a year, while test flights with the Orion crew vehicle will possibly be delayed by between three and nine months. The Ares V's lunar mission debut will now be an unmanned fly-by, according to nasaspaceflight.com.

Original News Sources: Associated Press, nasaspaceflight.com

Scaled Composites at Fault for Fatal Explosion


Safety inspectors in California have cited Scaled Composites as being at fault for the explosion that killed three employees at the Mojave Air and Space Port. The explosion occurred in July 2007 and stunned the X-Prize-winning company, which now faces a maximum fine of $25,310.

Burt Rutan's Scaled Composites has been going from strength to strength in recent years. In 2004 the company successfully launched SpaceShipOne to an altitude of 114 km, claiming the $10 million Ansari X-Prize. Since this historic win, Rutan has formed a powerful alliance with British businessman Richard Branson's Virgin Galactic. Branson is currently heading the construction of the world's first commercial spaceport in the New Mexico desert, with SpaceShipOne's successor, the larger SpaceShipTwo, as the principal craft to take six people into space. Carried beneath the WhiteKnightTwo aircraft, SpaceShipTwo is designed for a conventional take-off until the pair reach approximately 15 km in altitude. At this point the pair separate, allowing SpaceShipTwo to fire its single hybrid rocket engine and accelerate to 2,500 mph. Weightlessness will be achieved, giving space tourists an idea of what it feels like to be in low Earth orbit.

Tragically, three personnel working at the Scaled Composites site in the Mojave Desert last July were killed in an accident during tests involving a rocket propellant. The report from safety officials states that the company failed to provide "effective information and training of the health and physical hazards associated with nitrous oxide," the oxidizer used in the rocket motors. Since the incident, safety measures have been stepped up, and it is hoped that this sad event won't delay the 2009 launch of the first Virgin Galactic space tourism flights.

Source: Space.com

Most Advanced Ion Engines For 2013 BepiColombo Mission to Mercury


British scientists have been given the green light to begin development of the most advanced ion engines ever used in space travel. Set for launch in 2013, the European/Japanese BepiColombo mission to Mercury will be propelled to the Solar System's innermost planet by advanced ion engines with an efficiency equivalent to 17.8 million miles per gallon. This is one very cheap spaceship to fly!

We are currently being dazzled and amazed by the sheer detail of the images being transmitted from NASA's MESSENGER mission's flyby of the tiny planet Mercury. While we watch and wait for MESSENGER to eventually establish an orbit (insertion should occur in the spring of 2011), UK scientists, working with ESA and Astrium (Europe's largest space contractor), are hard at work designing the engines for the next big mission to the inner Solar System: BepiColombo. The mission consists of two orbiters: the Mercury Planetary Orbiter (MPO), to carry out mapping tasks over the planet, and the Mercury Magnetospheric Orbiter (MMO), to characterize the planet's mysterious magnetosphere. The two craft will travel as one for the six-year journey to Mercury, but will separate at orbital insertion.
ESA's planned BepiColombo orbital configuration around Mercury (credit: ESA)
Although BepiColombo will use the gravitational pull of the Moon, Earth, Venus and then Mercury to reach its destination, a large amount of energy is still required to slow the craft down as it falls sunward. Without an engine to thrust against BepiColombo's descent into the huge gravitational pull of the Sun, the mission would be doomed to overshoot Mercury and fall to a fiery end. This is where the ion engines come in.
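To get a feel for how much braking is needed, here is a rough vis-viva estimate (my own back-of-the-envelope sketch, not an ESA mission figure), comparing the speed of an idealized Earth-to-Mercury transfer orbit at Mercury's distance with Mercury's own orbital speed:

```python
# Rough back-of-the-envelope estimate, not an ESA mission figure. It compares the
# speed of an idealized Earth-to-Mercury transfer orbit at Mercury's distance with
# Mercury's own (roughly circular) orbital speed, to show how much braking is needed.
import math

GM_SUN = 1.327e20        # m^3/s^2, gravitational parameter of the Sun
AU = 1.496e11            # m
r_earth, r_mercury = 1.0 * AU, 0.387 * AU
a_transfer = (r_earth + r_mercury) / 2.0   # semi-major axis of a Hohmann-style transfer

# Vis-viva equation: v^2 = GM * (2/r - 1/a)
v_arrival = math.sqrt(GM_SUN * (2.0 / r_mercury - 1.0 / a_transfer))
v_mercury = math.sqrt(GM_SUN / r_mercury)  # circular orbital speed at Mercury's distance

print(f"arrival speed   ~ {v_arrival / 1000:.1f} km/s")   # ~57 km/s
print(f"Mercury's speed ~ {v_mercury / 1000:.1f} km/s")   # ~48 km/s
print(f"braking needed  ~ {(v_arrival - v_mercury) / 1000:.1f} km/s (ignoring Mercury's own gravity)")
```

Nearly 10 km/s of braking, before the gravity assists whittle it down, is why the mission needs either an enormous load of chemical propellant or a very efficient engine that can thrust for months on end.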

Ion engines have been used in space missions before (such as the SMART-1 mission to the Moon in 2003), but the new generation engines currently undergoing development for the next Mercury mission will be far more efficient while providing sufficient thrust. Better efficiency means less fuel. Less fuel means less mass and volume, saving on launch cost and allowing more room for scientific instrumentation.

Ion engines work by channeling electrically charged particles (ions) through an electric field, accelerating the ions to high velocities. Each particle has a mass (albeit tiny), so each particle also carries momentum when fired from the engine. Shoot enough particles out of the engine and you produce a thrust the spacecraft can use to accelerate or (in the case of BepiColombo) slow down. Ion engines do have a drawback: although they are fuel efficient, the thrust is small, so missions can take longer to complete; time must be allowed for the long-term thrust to have an effect on the velocity of the spacecraft. However, this shortcoming won't deter space scientists from using the new technology, as the pros definitely outweigh the cons.
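For a feel for the numbers, here is a minimal sketch using assumed, generic values (not the BepiColombo engine specification): thrust is just mass flow rate times exhaust velocity, and the Tsiolkovsky rocket equation shows why a high exhaust velocity buys a large velocity change from very little propellant.

```python
# Illustrative numbers only, not the actual BepiColombo engine specification.
# Thrust is (mass flow rate) x (exhaust velocity); the payoff of an ion engine is
# the very high exhaust velocity, which means very little propellant is needed.
import math

exhaust_velocity = 30000.0   # m/s, typical of a gridded ion thruster (assumed)
mass_flow_rate   = 5e-6      # kg/s of propellant (assumed)

thrust = mass_flow_rate * exhaust_velocity           # F = mdot * v_e
print(f"thrust ~ {thrust * 1000:.0f} mN")            # ~150 mN, a very gentle push

# Tsiolkovsky rocket equation: even a small propellant fraction buys a lot of
# delta-v when the exhaust velocity is this high.
dry_mass, propellant = 4000.0, 400.0                 # kg (assumed)
delta_v = exhaust_velocity * math.log((dry_mass + propellant) / dry_mass)
print(f"delta-v ~ {delta_v:.0f} m/s")                # ~2.9 km/s
```

A chemical engine with an exhaust velocity around a tenth of this would need roughly ten times the propellant fraction for the same velocity change, which is exactly the launch-mass saving the mission designers are after.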

So, we can now look forward to over a decade of exploration by MESSENGER and BepiColombo of Mercury, one of the most uncharted and mysterious planets orbiting the Sun.

Source: Telegraph.co.uk

2007 was Tied for the Second Hottest Year on Record


You weren't imagining things: 2007 really was an unusually hot year. In fact, it was tied with 1998 for the second hottest year on record. All in all, the 8 warmest years on record have occurred since 1998, and the 14 warmest since 1990. This mini-record was announced by NASA climatologists this week.

Researchers from NASA's Goddard Institute for Space Studies used temperature data from weather stations on land, satellite measurements of sea surface temperature since 1982, and data from ships for earlier years.

“As we predicted last year, 2007 was warmer than 2006, continuing the strong warming trend of the past 30 years that has been confidently attributed to the effect of increasing human-made greenhouse gases,” said James Hansen, director of NASA GISS.

The most pronounced warming occurred up in the Arctic and high-latitude regions of the planet, where vast regions of ice melted away. In fact, the Northwest Passage opened up for the first time, and scientists are predicting that the region could be ice free in the summer in less than a decade.

The lower ice levels in the Arctic provide more open water and reduce the amount of sunlight reflected back into space. This is expected to increase the rate of warming.

Let’s hope 2008 isn’t so hot.

Original Source: NASA News Release