Going Up? Top Floor, Space Elevator Games 2009

BREAKING NEWS: LaserMotive successfully qualified for the $900,000 prize! Their official speed was 3.72 m/s. Way to go! See more below.

Though it’s unlikely that anyone will be pressing an elevator button labeled ‘Space’ on one of the competitors’ vehicles at the 2009 Space Elevator Games, there is hope that a winner will walk away with the $1.1 million prize. Three teams will compete to see if any can send a laser-powered vehicle up a thin but strong ribbon 1 km (0.6 miles) into the sky.

This is the 5th year of the games, which started in 2005. The games are part of NASA’s Centennial Challenges program, which awards monetary prizes in an attempt to spur new technologies. This is a busy week for the program; as we covered earlier today, the Northrop Grumman Lunar Lander Challenge, also part of the Centennial Challenges program, announced two winners.

To win the $1.1 million prize, one of the teams must propel its vehicle 1 km (0.6 miles) into the sky at an average of at least 5 m/s (16.4 ft/s). A second-place prize of $900,000 will be awarded to any team that can cover the same 1 km at an average of at least 2 m/s (6.6 ft/s). The games this year will run from November 4th-6th, with each team getting the chance to launch its laser-powered vehicle during a pre-determined 45-minute window on each day of the competition. The event takes place at NASA’s Dryden Flight Research Center at Edwards Air Force Base near Mojave, California.
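
Since each prize is defined by a minimum average speed over a fixed 1 km course, the rules translate directly into maximum climb times. A quick back-of-the-envelope check in Python (course length and speeds are the figures from the article):

```python
# Prize thresholds for the 2009 Space Elevator Games: the climber must
# cover 1 km of ribbon, so each minimum average speed implies a
# maximum allowed climb time.
COURSE_M = 1000  # ribbon length in meters

def max_climb_time(min_avg_speed_mps):
    """Longest climb (in seconds) that still meets a minimum average speed."""
    return COURSE_M / min_avg_speed_mps

print(max_climb_time(5.0))   # first prize:  200.0 s (~3.3 minutes)
print(max_climb_time(2.0))   # second prize: 500.0 s (~8.3 minutes)

# LaserMotive's official 3.72 m/s average corresponds to roughly 269 s:
print(COURSE_M / 3.72)
```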

Three teams have qualified to enter this year’s event: the Kansas City Space Pirates, LaserMotive, and the University of Saskatchewan Space Design Team (USST). The entire event will be broadcast live on Ustream, and updates will be provided on the official site.

For each test, a helicopter brings the elevator up the cable to a fixed starting point. The team is then given a go to calibrate their laser, and start beaming power to the craft. Each elevator uses small wheels to grip the ribbon, which is held aloft by a balloon tethered by three guy wires.

For a taste of what these elevators look like, check out this video:

Here’s a breakdown of what happened so far today: The Kansas City Space Pirates gave it three tries. On the first attempt, their elevator failed to take off. After fixing the problem, they were able to get the craft to move, but it then stalled. On the third try it started to climb the ribbon, but they were unable to keep the laser locked on the elevator to power it; it couldn’t climb the full 1 km to the top of the ribbon and was brought back down.

LaserMotive had much better luck, despite a no-go on their initial attempt. Their elevator was lifted to the start by the helicopter, but failed to move despite repeated lasing attempts. After bringing it down for a tweak or two, the elevator was again placed at the start. This time it took off, covering the first 300 m (985 ft) in a little under a minute, on pace for the 5 m/s goal. The speed tapered off towards the top, but they bumped up against the 1 km mark at approximately 4 minutes, making them the first to successfully claim the minimum 2 m/s prize! While watching the live feed of this fantastic feat, I overheard a transmission from LaserMotive saying, “This is LaserMotive requesting permission to breathe.”

USST will not launch today, as there are no more open windows during which the intense lasers used to power the elevators can be fired without risk of hitting satellites passing overhead. They will go tomorrow, November 5th, at 7am PST. Be sure to check back with us at Universe Today for more coverage, or head over to the official site for live streaming.

Source: Physorg, Space Games Live Feed

Mars Explorers May Use AI to Become ‘Cyborg Astrobiologists’

Ever heard of a ‘Cyborg Astrobiologist’? Probably not. But I bet you’ll want to be one after learning that future exploration of Mars (and other planets, for that matter) may employ artificial intelligence integrated into spacesuits to enhance astronauts’ ability to take scientific data while exploring. The AI assistance could help future astronauts recognize differences in their surroundings as being due to the presence of life. Does this sound like something 50 years away? Well, a prototype model has already been tested, and it has shown the principle behind this idea to be sound.

University of Chicago geoscientist Patrick McGuire and his team have developed the basic systems needed for such a spacesuit, using mostly off-the-shelf technology. The system uses a Hopfield neural network to analyze data taken in by either a camera phone or a microscope. The AI system employs a ‘novelty detection algorithm’ which analyzes images from either imaging device and is able to identify features in images that are out of place.

The Hopfield system compares patterns against ones it has already seen, and learns from this process to correctly identify novel patterns that could be of interest. The full prototype spacesuit has a wearable computer that houses the AI system, which uses Bluetooth to receive data from a cell phone camera or is connected to a USB digital microscope.
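
McGuire’s actual detector works on image data and differs in its details, but the general Hopfield idea can be sketched in a few lines of Python. In this toy version (the 8-element ±1 “texture codes” are made-up stand-ins for real image features), stored patterns are fixed points of the network: an input that recall barely changes is familiar, while one that recall substantially rewrites is flagged as novel.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian weight matrix built from a set of +/-1 pattern vectors."""
    n = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / n

def recall(W, state, steps=10):
    """Synchronous updates until the state settles into an attractor."""
    for _ in range(steps):
        new = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state

def is_novel(W, pattern, threshold=0.2):
    """Flag a pattern as novel if recall rewrites a large fraction of it."""
    settled = recall(W, pattern)
    return bool(np.mean(settled != pattern) > threshold)

# Two memorized "familiar" patterns (hypothetical texture codes):
stored = np.array([
    [ 1,  1,  1,  1, -1, -1, -1, -1],
    [ 1, -1,  1, -1,  1, -1,  1, -1],
])
W = train_hopfield(stored)

print(is_novel(W, stored[0]))                               # False: a fixed point
print(is_novel(W, np.array([1, 1, 1, 1, -1, -1, 1, 1])))    # True: recall pulls it away
```

A memorized pattern passes through recall unchanged, so it scores zero novelty; the unfamiliar input gets dragged toward the nearest stored memory, flipping enough of its elements to cross the threshold.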

The system was tested at the Mars Desert Research Station (MDRS) in the San Rafael Swell of Utah, which is maintained by the Mars Society. The MDRS sits in a semi-arid desert with “greenish, grey or light gray mudstone, limestone, siltstone and sandstone, partially inter-bedded by white sandstone layers”. For the last two weeks of February 2009, two members of McGuire’s team tested the wearable technology, which successfully learned to identify patches of lichen against a background of rock, and to identify different color patterns that signified different rock formations.

Another test, conducted in September of 2005 at Rivas Vaciamadrid in Spain, utilized a USB digital microscope to image rocks with lichen on them. As you can see in the image below, the AI system was able to identify the spores of the lichen, which are about 1 mm in diameter, as uncommon.

The Hopfield AI system was able to successfully identify lichen spores imaged by a digital microscope as a novel feature on rock formations in Rivas Vaciamadrid, Spain. Image Credit: Patrick McGuire, arXiv:0910.5454

There are still some bugs to be worked out, though: the system detected cast shadows, produced by rough terrain or a low-standing Sun, as novel features, the researchers wrote in their paper, “The Cyborg Astrobiologist: Testing a Novelty-Detection Algorithm on Two Mobile Exploration Systems at Rivas Vaciamadrid in Spain and at the Mars Desert Research Station in Utah”, available on arXiv. The researchers also tested a head-mounted digital microscope display, but opted for a tripod instead due to the blurriness caused by the head movement of the researcher wearing the suit.

Though it may be a while until there are any Martian astronauts utilizing such a system – let alone Martian astronauts with the title of ‘Cyborg Astrobiologist’ – the combination of the AI with imaging systems could start to prove very useful on future orbital surveyors of Mars. Additionally, these systems could be used to collect and analyze data outside of the visible light spectrum, which could be incredibly useful for both robotic and human explorers.

Source: Physorg, Arxiv

WiFi in Space Coming Soon?

Although current astronauts are Twittering and blogging from space, it’s a cumbersome process, as the ISS, shuttle and Soyuz do not have internet access. Instead, they have to downlink their information to mission control, where someone posts it to the web. But if future commercial space travelers or astronauts living on the Moon want to blog, Tweet and share their experiences in real time, will it be possible? Well, a group of engineers is working on applying the same wireless systems that keep our mobile phones, laptops and other devices connected to the web to a new generation of networked space hardware. They say that wireless technologies will likely be an important part of future space exploration, not only for human communication but for the transfer of data and commands.

The Wireless Working Group (WWG) of the Consultative Committee for Space Data Services (CCSDS) is a group of engineers that coordinates wireless research among global space agencies and promotes interoperability of spacecraft data systems.

Multiple microsensors like this one could be scattered across planetary surfaces to gather more information than a single lander could provide. The microsensors would then configure a wireless network to assemble data for its relay back to Earth. Credit: ESA

They say that wireless sensor nodes placed throughout a spacecraft might function as a networked nervous system, yielding a wealth of currently inaccessible structural or environmental data to mission controllers. Similar nodes scattered across a planetary surface would generate a much higher scientific return than a single lander could, configuring a network to combine their findings for relaying to Earth.
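
The scattered-sensor idea above can be sketched at a toy level: nodes sample locally, pool their readings over a (here, simulated) wireless hop, and one combined packet goes back to Earth. This is not any real CCSDS protocol, and every name below is made up for illustration.

```python
# Toy sketch of scattered surface sensors pooling data for a single
# downlink, rather than each node talking to Earth individually.
from dataclasses import dataclass, field

@dataclass
class SensorNode:
    node_id: str
    readings: list = field(default_factory=list)

    def sample(self, value):
        # Tag each measurement with its origin so the merged packet
        # preserves where on the surface it was taken.
        self.readings.append((self.node_id, value))

def assemble_for_relay(nodes):
    """Combine every node's readings into one downlink packet."""
    packet = []
    for node in nodes:
        packet.extend(node.readings)
    return packet

nodes = [SensorNode(f"node-{i}") for i in range(3)]
for i, node in enumerate(nodes):
    node.sample(20.0 + i)  # e.g. a local temperature reading

downlink = assemble_for_relay(nodes)
print(downlink)  # three tagged readings in one packet
```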

And establishing ‘plug and play’ wireless networking between multiple spacecraft could enable the seamless transfer of data and commands. This would work for formation-flying satellite constellations and orbiter-lander-rover combinations, but proximity networks could be set up by any spacecraft within signal range, as easily as a laptop plugs into a WiFi network.

Of course, the technology is still being developed and having WiFi in space isn’t going to happen anytime soon, but engineers say the underlying technologies are already with us, in the protocols delivering wireless connectivity to homes, offices and public places.

“This research is an example of us ‘spinning in’ technology developed elsewhere into the space sector,” said ESA data handling engineer Jean-François Dufour, who is part of the CCSDS. “Commercial wireless protocols such as the IEEE 802.11 family of standards for computer WiFi or sensor networking standards such as IEEE 802.15.4 are already available so we are assessing how they might transfer to the space environment.”

Source: ESA

Manned Solar Plane Will Attempt Flight Around the World

A man who circled the globe in a balloon in 1999 has a new global adventure planned. Bertrand Piccard has unveiled a prototype of a solar-powered plane he hopes to fly around the world. Until now, only unmanned solar airplanes have been flown, but Piccard’s HB-SIA would be manned. The glider-like plane has solar panels covering its wings; the prototype’s wingspan reaches 61 m, while the entire vehicle weighs only 1,500 kg. The first tests of the plane will aim to prove it can fly at night. Piccard says he wants to demonstrate the potential of renewable energies.

Piccard just unveiled the prototype, and he hopes to attempt a flight across the Atlantic by 2012.

Solar and battery technology is only now maturing enough to enable solar flight. In 2007 the UK defence company QinetiQ flew an unmanned aerial vehicle called the Zephyr for 54 continuous hours during tests.

The HB-SIA. Credit: BBC

But Piccard and his company, Solar Impulse, are working on what they believe to be a breakthrough design, using super-efficient solar cells, batteries, motors and propellers to get it through the dark hours, and composite materials to keep it extremely light.

Although the vehicle is expected to be capable of flying non-stop around the globe, Piccard will in fact make five long hops, sharing flying duties with project partner Andre Borschberg.

“The aeroplane could do it theoretically non-stop – but not the pilot,” Piccard told the BBC. “We should fly at roughly 25 knots and that would make it between 20 and 25 days to go around the world, which is too much for a pilot who has to steer the plane. In a balloon you can sleep, because it stays in the air even if you sleep. We believe the maximum for one pilot is five days.”

More info at Solar Impulse. And just for your interest, here’s an article about the biggest plane in the world.

Source: BBC

5 Spinoffs from the Hubble Space Telescope

As we wait (impatiently) for the Hubble Space Telescope to return to action following its repair and updating by the STS-125 astronauts, it is easy to think about how Hubble has impacted society. Hubble has become a household name, bringing astronomy to the masses with its dramatic images of the cosmos. It has also changed our understanding of the universe. But there are more ways that HST has impacted the world. Various technologies developed for the famous orbiting telescope have helped create or improve several different medical and scientific tools. Here are five technology spinoffs from Hubble:

Micro-Endoscope for Medical Diagnosis:

Micro-endoscope. Credit: NASA

The same technology that enhances HST’s images is now helping physicians perform micro-invasive arthroscopic surgery with more accurate diagnoses. Hubble technology helped improve the micro-endoscope, a surgical tool that enables surgeons to view what is happening inside the body on a screen, eliminating the need for a more invasive diagnostic procedure. This saves time and money, and lessens the discomfort patients experience.

CCDs Enable Clearer, More Efficient Biopsies

A biopsy from HST CCD technology. Credit: NASA

Charge coupled devices (CCDs) used on the HST to convert light into electronic files—such as a distant star’s light directly into digital images—have been adapted to improve imaging and optics here on Earth. When scientists realized that existing CCD technology could not meet scientific requirements for the Hubble’s needs, NASA worked with an industry partner to develop a new, more advanced CCD. The industry partner then applied many of the NASA-driven enhancements to the manufacture of CCDs for digital mammography biopsy techniques, using CCDs to image breast tissue more clearly and efficiently. This allows doctors to analyze the tissue by stereotactic biopsy, which requires a needle rather than surgery.

Mirror Technology Increases Semiconductor Productivity, Performance

Hubble mirror technology helps semiconductors. Credit: NASA

The semiconductor industry has benefitted from the ultra-precise mirror technology that gives the HST its full optical vision and telescopic power. This technological contribution helped improve optics manufacturing in microlithography—a method for printing tiny circuitry, such as in computer chips. The system uses molecular films that absorb and scatter incoming light, enabling superior precision and, consequently, higher productivity and better performance. This translates into better-made and potentially less costly computer circuitry and semiconductors.

Software Enhances Other Observatories

Hubble software used by other observatories. Credit: NASA

With the help of a software suite created by a NASA industry partner in 1995, students and astronomers were able to operate a telescope at the Mount Wilson Observatory Institute via the Internet. The software is still widely used for various astronomy applications; using CCD technology, it locates, identifies, and acquires images of deep-sky objects, allowing a user to control computer-driven telescopes and CCD cameras.

Optics Tool Sharpens Record-Breaking Ice Skates

Hubble technology helps Olympic skaters. Credit: NASA

Current Olympic record-holding speed skater Chris Witty raced her way to a gold medal in the 1,000-meter at the 2002 Salt Lake City Winter Olympics. Witty and other American short- and long-track speed skaters used a blade-sharpening tool designed with the help of NASA Goddard Space Flight Center and technology from HST. NASA had met with the U.S. Olympic Committee and helped to develop a new tool for sharpening speed skates, inspired by principles used to create optics for the HST. Speed skates sharpened with this new instrument demonstrated a marked improvement over conventionally sharpened skates.

More information on other NASA Spinoffs.

NASA Creates a New NEBULA: Cloud Computing Project

NASA has developed a new cloud computing project based on open source components that provides high capacity computing, storage, and networking. Called NEBULA, the space agency said the cloud project could be used in support of space missions, as well as for education, public outreach and input, and collaborations. NASA said NEBULA is a more open Web strategy designed to give the public greater participation in the space program.

Currently, the NEBULA cloud is being used to host a website, Nebula.nasa.gov.

On that site, NASA says the “fully-integrated nature of the NEBULA components provides for extremely rapid development of policy-compliant and secure web applications, fosters and encourages code reuse, and improves the coherence and cohesiveness of NASA’s collaborative web applications.” It integrates open source components into a seamless, self-service platform.

“Built from the ground up around principles of transparency and public collaboration, Nebula is also an open source project,” according to NASA.

NASA describes Nebula as a combination of infrastructure, platform, and software as a service, and the space agency has created an IT architecture in support of that. An article in Information Week says the components include the Eucalyptus software developed at the University of California at Santa Barbara, the Lustre file system deployed on 64-bit storage nodes, the Django Web application framework, the SOLR indexing and search engine, and an integrated development environment. Nebula will be compatible with Amazon Web Services, which means AWS-compatible tools will work with it and Nebula virtual servers can run on Amazon’s Elastic Compute Cloud.

In a paper, Chris Kemp, CIO of NASA’s Ames Research Center, says NEBULA could be used for an overhaul of NASA’s many websites, consolidating them into a “single facility” with a Web application framework that would include templates for user-generated blogs, wikis, and other content.

Kemp wrote that such an approach would support the public’s desire to be more actively engaged with NASA and its space missions.

Sources: NEBULA, Information Week

Dextre vs. HAL

As Endeavour departs from the International Space Station on Monday, the space shuttle crew leaves behind a two-armed robot, the Special Purpose Dexterous Manipulator (SPDM), which the astronauts affectionately refer to as Dextre. Any reference to robots in space brings to mind other famous, albeit fictitious, machines that have interacted with humans on board a spacecraft. And, with the recent passing of science fiction writer Arthur C. Clarke, one famous machine named HAL particularly comes to mind, especially when you factor in that Dextre is what’s called a “telemanipulator.” Any chance the space station crew needs to worry about the robot lurking right outside their hatch?

Endeavour crewmember Rick Linnehan said, don’t worry, there is no comparison between Dextre and HAL, the famous malfunctioning computer who killed astronauts in the 1968 movie “2001: A Space Odyssey.”

“I’m a big Arthur C. Clarke fan and I have to tell you Dextre just isn’t as smart as HAL,” said Linnehan in a news conference from the ISS on Sunday. “He’s built to be brawn, not brains, and he’s going to serve a big purpose up here in terms of moving a lot of hardware around.”

HAL 9000. Image credit: Wikipedia

Dextre, the two-armed, $200-million robot, will reduce the amount of time astronauts must spend outside the space station, and could eliminate the need for up to a dozen spacewalks a year, said Daniel Rey, head of the Canadian technical team that prepared Dextre for his mission on board the space station.

“He will free up astronauts so they can do more science and more research rather than maintenance,” said Rey. Dextre will perform exterior construction and tasks like changing batteries and handling experiments outside the space station. Dextre also comes equipped with a tool holster which allows the robot to change equipment as needed “like any good handyman.”

Rey also concurred that the 3.7-meter Dextre can’t be compared to HAL. “He doesn’t have an artificial intelligence… he can be remote controlled from the ground or from the space station.” Dextre will be able to manipulate items “from the size of a phone book to a phone booth,” Rey added.

As for HAL, in the movie “2001: A Space Odyssey,” he maintains all systems on an interplanetary voyage, plays chess, and has a special penchant for lip reading. Those capabilities just aren’t in Dextre’s database. However, HAL was programmed with the objective to ensure mission success. That’s one area where HAL and Dex do have something in common.

Original News Source: NASA TV and the Canadian Press