Underwater Robot Captures its First Sample 500 Meters Below the Surface of the Ocean

The Woods Hole Oceanographic Institution (WHOI) says its underwater robot has just completed the first-ever automated underwater sampling operation. The robot, called Nereid Under Ice (NUI), collected the sample off the coast of Greece. WHOI is developing Nereid in association with NASA’s Planetary Science and Technology from Analog Research (PSTAR) program.

Continue reading “Underwater Robot Captures its First Sample 500 Meters Below the Surface of the Ocean”

Cool Photo of Canadarm2 With its Dextre Hand. Oh and the Earth. That’s Nice Too.

Check out this image of the Canadian Space Agency’s (CSA) Canadarm2 on the International Space Station. The CSA’s Dextre is attached to one end of the arm. Canadarm2 played a vital role in assembling the ISS, while Dextre helps maintain the station, freeing astronauts from routine yet dangerous spacewalks and allowing them to focus on science.

Continue reading “Cool Photo of Canadarm2 With its Dextre Hand. Oh and the Earth. That’s Nice Too.”

Aquatic Rover Drives on the Underside of the Ice in Antarctica

Not all rovers are designed to roam around on the surface of other worlds like Mars. One rover, at least, is aquatic; a necessary development if we’re going to explore Enceladus, Europa, and the Solar System’s other watery worlds. This rover is called the Buoyant Rover for Under-Ice Exploration, or BRUIE.

Continue reading “Aquatic Rover Drives on the Underside of the Ice in Antarctica”

Shape-shifting Robots Like These Could Be Just What We Need to Explore Titan

When it comes to space exploration, it’s robots that do most of the work. That trend will continue as we send missions to the surfaces of worlds farther and farther out in the Solar System. But to be effective in the challenging environments we need to explore—like Saturn’s moon Titan—robots will have to become more capable.

A new robot NASA is developing could be the next step in robotic exploration.

Continue reading “Shape-shifting Robots Like These Could Be Just What We Need to Explore Titan”

Astronaut Scott Tingle Was Able To Control A Ground-Based Robot… From Space.

If something called “Project METERON” sounds to you like a sinister project involving astronauts, robots, the International Space Station, and artificial intelligence, I don’t blame you. Because that’s what it is (except for the sinister part). In fact, the METERON project (Multi-Purpose End-to-End Robotic Operation Network) is not sinister at all, but a friendly collaboration between the European Space Agency (ESA) and the German Aerospace Center (DLR).

The idea behind the project is to place an artificially intelligent robot here on Earth under the direct control of an astronaut 400 km above the Earth, and to get the two to work together.

“Artificial intelligence allows the robot to perform many tasks independently, making us less susceptible to communication delays that would make continuous control more difficult at such a great distance.” – Neil Lii, DLR Project Manager.

On March 2nd, engineers at the DLR Institute of Robotics and Mechatronics set up the robot, called Justin, in a simulated Martian environment. Justin was given a simulated task to carry out, with as few instructions as possible. The maintenance of solar panels was the chosen task, since they’re common on landers and rovers, and since Mars can get kind of dusty.

Justin is a pretty cool-looking robot. Image: German Aerospace Center (DLR) (CC-BY 3.0)

The first test of the METERON Project was done in August. But this latest test was more demanding for both the robot and the astronaut issuing the commands. The pair had worked together before, but since then, Justin was programmed with more abstract commands that the operator could choose from.

American astronaut Scott Tingle issued commands to Justin from a tablet aboard the ISS, which also displayed what Justin was seeing. The human-robot team had practiced together before, but this test was designed to push the pair into more challenging tasks. Tingle had no advance knowledge of the tasks in the test, nor of Justin’s new capabilities. On board the ISS, Tingle quickly realized that the panels in the simulation down on Earth were dusty. They were also not pointed in the optimal direction.

This was a new situation for Tingle and for Justin, and Tingle had to choose from a range of commands on the tablet. The team on the ground monitored his choices. The level of complexity meant that Justin couldn’t just perform the task and report it completed; Tingle and the robot also had to estimate how clean the panels were after cleaning.

“Our team closely observed how the astronaut accomplished these tasks, without being aware of these problems in advance and without any knowledge of the robot’s new capabilities,” says DLR engineer Daniel Leidner.

Streaks of dust or sand on NASA’s Mars rover Opportunity show what can happen to solar panels on the red planet. For any more permanent structures we may put on Mars, an artificially intelligent maintenance robot under the control of an astronaut in orbit could be the perfect way to keep those panels clean and working. Credits: NASA/JPL-Caltech

The next test will take place in Summer 2018 and will push the system even further. Justin will have an even more complex task before him, in this case selecting a component on behalf of the astronaut and installing it on the solar panels. The German ESA astronaut Alexander Gerst will be the operator.

If the whole point of this is not immediately clear to you, think Mars exploration. We have rovers and landers working on the surface of Mars to study the planet in increasing detail. And one day, humans will visit the planet. But right now, we’re restricted to surface craft being controlled from Earth.

What METERON and other endeavours like it are doing is developing robots that can do our work for us. But they’ll be smart robots that don’t need to be told every little thing. They’re given a task and they go about doing it. And the humans issuing the commands could be in orbit around Mars, rather than being exposed to all the risks on the surface.

“Artificial intelligence allows the robot to perform many tasks independently, making us less susceptible to communication delays that would make continuous control more difficult at such a great distance,” explained Neil Lii, DLR Project Manager. “And we also reduce the workload of the astronaut, who can transfer tasks to the robot.” To do this, however, astronauts and robots must cooperate seamlessly and also complement one another.
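To make that division of labour concrete, here is a minimal Python sketch of task-level commanding over a delayed link. Everything in it is invented for illustration: the SemiAutonomousRobot class, the task names, and the 0.6-second round-trip delay are assumptions, not DLR’s actual METERON software. The point it shows is that the link only carries one abstract command down and one status report back up per task, instead of a continuous stream of low-level control inputs.

```python
# A toy sketch of task-level commanding, as opposed to continuous teleoperation.
# Hypothetical names and numbers throughout -- this is not DLR's METERON code.

import time
from dataclasses import dataclass

ROUND_TRIP_DELAY_S = 0.6  # assumed orbit-to-ground round-trip latency (illustrative)


@dataclass
class TaskReport:
    task: str
    completed: bool
    panel_cleanliness: float  # the robot's own estimate, 0.0 (dirty) to 1.0 (clean)


class SemiAutonomousRobot:
    """Stand-in for a ground robot like Justin: it receives one abstract
    command and works out the low-level steps on its own."""

    def __init__(self) -> None:
        self.cleanliness = 0.3  # the simulated panels start out dusty

    def execute(self, task: str) -> TaskReport:
        # The robot plans and acts locally, so no operator input is needed
        # while the task runs -- only the command and the report cross the link.
        if task == "clean_panel":
            self.cleanliness = min(1.0, self.cleanliness + 0.6)
        elif task == "inspect_panel":
            pass  # sensing only; cleanliness is unchanged
        return TaskReport(task, True, round(self.cleanliness, 2))


def operator_issues(task: str, robot: SemiAutonomousRobot) -> TaskReport:
    """One command down, one report up: the delay is paid per task,
    not per joint motion as it would be with continuous control."""
    time.sleep(ROUND_TRIP_DELAY_S / 2)  # command travels to the ground
    report = robot.execute(task)
    time.sleep(ROUND_TRIP_DELAY_S / 2)  # report travels back to orbit
    return report


if __name__ == "__main__":
    justin_stand_in = SemiAutonomousRobot()
    for chosen in ("inspect_panel", "clean_panel", "inspect_panel"):
        print(operator_issues(chosen, justin_stand_in))
```

A real supervised-autonomy system adds planning, safety checks, and a far richer operator interface, but the traffic pattern is the same, which is why the approach tolerates latency so well.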

These two images from the camera on NASA’s Mars Global Surveyor show the effect a global dust storm has on Mars. On the left is a normal view of the planet; on the right, Mars is obscured by haze from a dust storm. Image: NASA/JPL/MSSS

That’s why these tests are important. Getting the astronaut and the robot to perform well together is critical.

“This is a significant step closer to a manned planetary mission with robotic support,” says Alin Albu-Schäffer, head of the DLR Institute of Robotics and Mechatronics. It’s expensive and risky to maintain a human presence on the surface of Mars. Why risk human life to perform tasks like cleaning solar panels?

“The astronaut would therefore not be exposed to the risk of landing, and we could use more robotic assistants to build and maintain infrastructure, for example, with limited human resources.” In this scenario, the robot would no longer simply be the extended arm of the astronaut: “It would be more like a partner on the ground.”

This Video of a Cyborg Quadruped Will Have You Gasping in Terror

This is both wonderful and terrifying. A DARPA-funded four-legged robot named WildCat is being developed by a company called Boston Dynamics (whose tagline is “Changing Your Idea of What Robots Can Do”). They’ve previously developed Atlas, a humanoid capable of walking across multiple kinds of terrain, and the scarily fast Cheetah, which set a new land-speed record for legged robots. But WildCat is a brand-new robot created to run fast on all types of terrain, and so far its top speed has been about 16 mph (26 km/h) on flat ground using both bounding and galloping gaits.

The video, released yesterday, shows WildCat’s best performance so far. Don’t let the sound fool you — yes, it does sound like a weed-whacker. But as soon as it rises up off its haunches, you know you’re doomed.

I’ve been trying to figure out what sci-fi equivalent might describe it best: the Terminator’s pet? A lethal, non-fuzzy Daggit from Battlestar Galactica? An AT-AT Walker on speed?

At any rate … Yikes!

New Meme Puts Space Robots in Their Place

Oh, those space robots. They don’t always do what we want them to do, but we love them anyway. If you need a fun diversion in your day, a new Tumblr site has arisen to call out the robots who have made mistakes. Called “Shaming Robots,” it started innocently with an image of the engineering model of the Curiosity rover blaming the engineering model of the Opportunity rover for messing up JPL’s Mars Yard. There are now pages of shamed robots (both space and Earth-based). Submit your own if you have a robot you’d like to shame. You can also follow the fun discussion on Twitter at the hashtag #robotshaming.

Astronomer Alex Parker started the ‘Robot Shaming’ meme with this image of the engineering model of Curiosity at JPL.

New Amazingly Life-like Android Better Than Star Trek’s Data

Even though the Star Trek character “Data” was played by a human, this new android might be more life-like. Watch the video, and I think you’ll agree that it is hard to tell (at first) that this is a robot. It’s called Geminoid DK, built by the Intelligent Robotics lab at Osaka University and designed by Professor Hiroshi Ishiguro. Just as Data was modeled after his creator, Doctor Noonien Soong, the Geminoid DK is created in the likeness of Professor Henrik Scharfe of Aalborg University in Denmark. Not sure if it can whistle or if it remembers every fact to which it is exposed, but Geminoid DK has a better hairdo (and beard) than Data, and it can smile.


“All of the movements and expressions of Geminoid DK are remote controlled by an operator with a computer, who uses a motion-capture system that tracks facial expressions and head movements. Turn your head and the Geminoid does the same; move your mouth and the android follows suit,” IEEE Spectrum reports.
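As a rough illustration of that mapping, here is a short, hypothetical Python sketch. The joint names, angle limits, and clamping helper are all invented for this example and are not Geminoid DK’s real control interface; it only shows the general idea of turning tracked operator motion into bounded actuator commands.

```python
# A hypothetical sketch of motion-capture teleoperation: operator pose in,
# bounded actuator commands out. Not Geminoid DK's actual software.

from dataclasses import dataclass


@dataclass
class OperatorPose:
    head_yaw_deg: float    # from the motion-capture system tracking the operator
    head_pitch_deg: float
    mouth_open: float      # 0.0 closed .. 1.0 fully open


def clamp(value: float, low: float, high: float) -> float:
    """Keep a command inside the android's (assumed) safe joint limits."""
    return max(low, min(high, value))


def to_android_commands(pose: OperatorPose) -> dict:
    """Mirror the operator: turn your head and the android turns its head;
    open your mouth and it follows suit, within the assumed limits."""
    return {
        "neck_yaw": clamp(pose.head_yaw_deg, -45.0, 45.0),
        "neck_pitch": clamp(pose.head_pitch_deg, -20.0, 20.0),
        "jaw": clamp(pose.mouth_open, 0.0, 1.0),
    }


if __name__ == "__main__":
    # The operator turns their head 60 degrees; the command is clamped to 45.
    print(to_android_commands(OperatorPose(head_yaw_deg=60.0,
                                           head_pitch_deg=-5.0,
                                           mouth_open=0.4)))
```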

The Geminoid is going to be used for researching “emotional affordances” in human-robot interaction, the novel notion of “blended presence,” as well as cultural differences (from different continents) in the perception of robots.

This is the third in a series of life-like robots built by Ishiguro – the first was made to look like Ishiguro himself, the second resembled a young Japanese model. Ishiguro and Scharfe are working together on this latest robot project.


For more info see the Geminoid DK website.

Source: IEEE Spectrum via EarthSky Blog