By Thor’s Mighty Helmet!

Image of the Thor's Helmet nebula (NGC 2359) Credit: R. Barrena (IAC) and D. López

Going to see the new Avengers movie this weekend, either for the first or fortieth time? You may not see much of Thor’s helmet in the film (as he opts for more of a “Point Break” look) but astronomers using the Isaac Newton Group of telescopes on the Canary Islands have succeeded in spotting it… in this super image of the Thor’s Helmet nebula!

Named for its resemblance to the famous horned Viking headgear (seen here tipped on its side), the Thor’s Helmet nebula is a structure sculpted by a Wolf-Rayet star: powerful pre-supernova stellar winds from the star seen near the center are blowing the gas of the bluish “helmet” outwards into space.

The colors of the image above, acquired with the ING’s Isaac Newton Telescope, correspond to light emitted at hydrogen-alpha, doubly-ionised oxygen and singly-ionised sulfur wavelengths.

Super-sized for the thunder god himself, Thor’s Helmet measures about 30 light-years across. It’s located in the constellation Canis Major, approximately 15,000 light-years from Earth. (You’d think Thor would have left his favorite accessory in a more convenient location… I suspect Loki may be behind this.)

Astronomers, assemble!

Read more about this and see other images from the ING telescopes here.

The Isaac Newton Group of Telescopes (ING) is owned by the Science and Technology Facilities Council (STFC) of the United Kingdom, and it is operated jointly with the Nederlandse Organisatie voor Wetenschappelijk Onderzoek (NWO) of the Netherlands and the Instituto de Astrofísica de Canarias (IAC) of Spain. The telescopes are located in the Spanish Observatorio del Roque de los Muchachos on La Palma, Canary Islands, which is operated by the Instituto de Astrofísica de Canarias (IAC).

Grab a seat for the Celestial Lights show!

Ole's cameras capture shimmering sheets of aurora over the Arctic. (© Ole C. Salomonsen)


Painstakingly assembled from over 150,000 digital photos taken over the course of eight months, this stunning time-lapse video of aurora-filled Arctic skies is the latest creation by photo/video artist Ole C. Salomonsen. Take a moment, turn up the sound, sit back and enjoy the show!

This is Ole’s second video project. The footage was shot on location in parts of Norway, Finland and Sweden from September 2011 to April 2012, and shows the glorious effects that the Sun’s increasing activity has had on our planet’s upper atmosphere.

Ole writes on his Vimeo page:

The video is a merge of two parts; the first part contains some wilder and more aggressive auroras, as well as a few Milky Way sequences, so the auroras there appear fast either because they really are or because of the time-lapsed motion of the Milky Way and stars. Still, some of the straight-up shots are very close to real-time speed; although auroras mostly are slower, she can also be FAST!

The second part has some more slow and majestic auroras, where I have focused more on composition and foreground. The music should give you a clear indication of where you are.

The music was provided by Norwegian composer Kai-Anders Ryan.

Ole’s “hectic” aurora season is coming to a close now that the Sun is rising above the horizon in the Arctic Circle, and he figured that it was a good time to release the video. It will also be available on 4K Digital Cinema on request.

“Hope you like the video, and that by watching it you are able to understand my fascination and awe for this beautiful celestial phenomenon,” says Ole.

You can follow Ole’s work on Facebook at facebook.com/arcticlightphoto, and check out his website here.

Video © Ole C. Salomonsen. Music by Kai-Anders Ryan.

How to Avoid ‘Bad Astrophotography:’ Advice from Thierry Legault

A collection of different satellites...or are they? Credit: Thierry Legault

Take a look at the collection of images above. All are high resolution astrophotos of different artificial satellites, taken by renowned astrophotographer Thierry Legault, using one of his 10″ telescopes and a simple webcam. The images have been sharpened and enlarged so that it’s easy to see small structures on the satellites such as antennas or solar panels.

Like this one, which is surely the Soyuz, with solar panels on each side:

Could this object be a Soyuz spacecraft? Credit: Thierry Legault

These are pretty awesome images….

…except Thierry and I are not telling the truth.

These images are not of satellites, but are all pictures of the star Vega.

What you have just seen is an example of what Legault calls “Bad Astrophotography,” a phrase Legault uses in homage to Phil Plait and his Bad Astronomy blog. Basically, this means that because of image artifacts or over-processing you can be fooled – intentionally or unintentionally — into seeing something that is not really there.

“In any raw image there is noise and if you process this image too strongly, the noise appears and some processing can transform the noise into something that looks like detail – but it is not detail,” said Legault.

So just like the images that have been touted as the Bigfoot on Mars, or even blurry pictures of supposed UFOs, sometimes astrophotos can look like something they are not.

“Many people are not aware that an image is not reality — it is a transformation of reality,” Legault told Universe Today, “and any image that is taken under difficult conditions or close to the resolution limits of the telescope, the image is less and less reliable or reflects less and less the reality.”

Many things can cause problems in astrophotography:

  • atmospheric turbulence, which can distort images and even create false details or make real ones disappear
  • the unavoidable shaking of the telescope due to manual tracking, especially in satellite imaging
  • noise, the variation of brightness or color in images caused by the sensor and circuitry of a digital camera, or by the diffraction of light in the telescope

These problems may be hard to avoid, depending on your equipment and level of skill. So what should an astrophotographer do?

“The solution for these issues is to be careful with processing,” Legault explained. “I’ve often said the best, most skilled person in imaging processing is not the one that knows all the possibilities of processing, but the person that knows when to stop processing an image.”

Overprocessing

Over-processing, such as multiple smoothing, sharpening and enlargement operations, or layer transformations and combinations in Photoshop can create false details in images.
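
To see how easily this happens, here is a minimal Python sketch (my own illustration, not Legault’s workflow) that runs pure random noise through a few rounds of smoothing, unsharp masking and enlargement; the clumps that survive can easily be mistaken for real structure:

```python
# A minimal sketch (illustration only, not Legault's workflow): repeated
# smoothing, unsharp masking and enlargement applied to pure noise.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

rng = np.random.default_rng(0)
frame = rng.normal(0.0, 1.0, (64, 64))    # pure noise: nothing real in here

for _ in range(3):
    smoothed = gaussian_filter(frame, sigma=1.5)                                     # "noise reduction"
    sharpened = smoothed + 2.0 * (smoothed - gaussian_filter(smoothed, sigma=3.0))   # unsharp mask
    frame = zoom(sharpened, 2.0, order=3)                                            # enlargement

# After a few rounds, the surviving low-frequency clumps look like structure
# even though the input contained no signal at all.
print(frame.shape)   # (512, 512)
```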

The issues with the lead image in this article of all the “satellites” — the structures and the different colors you see — are mainly caused by atmospheric turbulence and noise in the raw images, combined with effects from the color sensor in the camera.

Atmospheric Turbulence

Think of how, when you look with the naked eye at a star that is low on the horizon, you see twinkling and sometimes even changes in color; atmospheric turbulence can definitely have an effect on colors.

The star Vega again, in a series of images put together as an animation: it appears to be a satellite in flight, showing variations in size and apparent rotation. But it is not. Credit: Thierry Legault.

“When you observe a star through a telescope at high magnification, it can become even more distorted,” Legault said. “You have spikes, distortions and changes in shape, and a star that is supposed to be a point or a disk, unfortunately, by turbulence is transformed into something that is completely distorted and can take many shapes.”

Equipment issues

Additionally, Legault said, combining those distortions with an effect from the camera’s color sensor, known as a Bayer sensor, can cause additional issues.

“For the sensor, you have pixels by groups of four: one red, one blue and two green in square,” Legault said, “and you can easily imagine that if the object is very small, such as a very small star, the light can fall on a red pixel and then the image can become red. Then the image of the star is distorted and you have some spikes that fall on a different color pixel.”
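
As a rough illustration of that RGGB layout, here is a toy Python sketch (not tied to any particular camera) showing how a star image only a pixel or so wide can land almost entirely on a single colour photosite:

```python
# A toy RGGB Bayer mosaic: each 2x2 block has one red, two green and one
# blue photosite. A star image only a pixel wide can land mostly on one
# colour, so naive demosaicing gives it a colour cast.
import numpy as np

h, w = 8, 8
bayer = np.empty((h, w), dtype="<U1")
bayer[0::2, 0::2] = "R"
bayer[0::2, 1::2] = "G"
bayer[1::2, 0::2] = "G"
bayer[1::2, 1::2] = "B"

star_row, star_col = 2, 2    # a tiny star falling on a single photosite
print("star lands on an", bayer[star_row, star_col], "photosite")
# -> R: most of the star's light is recorded as red, and any turbulence
#    spike that spills onto a neighbouring G or B pixel adds a false colour.
```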

And then the processing does the rest, transforming turbulence and camera artifacts into details that may look real, Legault said.

Legault recalled an amateur who, a few years ago, published an image of Saturn’s moon Titan.

“The image contained surface details and a sharp disk edge,” he said, “and looked quite convincing. But we all know that Titan is covered with an opaque and uniform atmosphere, and surface details can’t be seen. The details were actually only artifacts created from noise or other image defects by over-processing a poor resolution image with multiple upsizing, downsizing, sharpening and smoothing operations.”

What’s an amateur astrophotographer to do?

So, with more and more people doing astrophotography these days, how can they make sure that what they think they are seeing is real?

“There are solutions like combining raw images,” Legault said. “When you combine 10 or 20 or 100 raw images, you can decrease the noise and the image is more reliable and less distorted by turbulence.”
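
Here is a minimal Python sketch of that stacking idea (a simplified illustration, not Legault’s actual pipeline): averaging many frames leaves the real signal in place while the random background noise falls roughly as the square root of the number of frames.

```python
# A minimal frame-stacking sketch: averaging N noisy frames keeps the real
# signal while the random background noise drops roughly as sqrt(N).
import numpy as np

rng = np.random.default_rng(1)
signal = np.zeros((32, 32))
signal[16, 16] = 10.0        # one "real" detail

def noisy_frame():
    return signal + rng.normal(0.0, 2.0, signal.shape)

single = noisy_frame()
stack = np.mean([noisy_frame() for _ in range(100)], axis=0)

background = signal == 0     # everywhere except the real detail
print("background noise, 1 frame:    %.2f" % single[background].std())   # ~2.0
print("background noise, 100 frames: %.2f" % stack[background].std())    # ~0.2
```

Only features that survive across the whole stack, like the bright central pixel here, deserve to be trusted as real.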

For example, take a look at the images of the space shuttle Discovery below. The two left images are consecutive single frames, processed by smoothing (noise reduction), sharpening (wavelets) and enlarged 3 times.

The space shuttle Discovery imaged in orbit. Credit and copyright: Thierry Legault

The first and second images, although blurry, seem to show lots of very small details. But when they are compared with each other, or with a combination of the 27 best images of the series (on the right), only the larger structures turn out to be common to all of them.

“The bright line marked A is not real, it is an artifact likely caused by turbulence,” Legault said, “and if it were an image of the space station taken during an EVA, I could perhaps claim that this detail is an astronaut, but I would be wrong. The double dark spot marked B could be taken for windows on top of the cockpit of Discovery. But it is not real; if it were an image of the Space Station, I could claim that it’s the windows of the Cupola, but again I would be wrong. In C, the two parallel lines of the payload bay door are common to both images, but a comparison with the right image, which contains only real details, shows that they are not real and that they are probably a processing artifact.”

One of the drawbacks of color sensors is that there is more noise in the image, so the image is less reliable than with black and white sensors. This is the reason that deep sky cameras often use black and white sensors. And so for imaging satellites like the International Space Station, Legault uses a black and white camera.

“It is more reliable, and you don’t need a color camera because the space station is colorless, except for the solar panels,” Legault said. “In addition, the monochrome sensor is much more sensitive to light, by 3 or 4 times. More sensitive means you have less noise.”

Logical advice

Legault’s main advice is just to be logical about what you are seeing in both raw and processed images.

“You need to look at the whole image, the consistency of the whole image, and not just one detail,” he said. “If I take an image that I say has detail on Jupiter’s satellites and on the same image I cannot even see the great red spot on Jupiter, it doesn’t work – that is not possible. The image must have an overall consistency and include details of an object larger than the one that we are interested in. So, if we see an image where someone is supposed to have an astronaut and a module of the space station, and a larger module is not visible or is completely distorted, there is a problem.”

On March 7, 2011 the robotic arm on space shuttle Discovery is used for a last inspection of the protection tiles before landing on the STS-133 mission. Image credit and copyright: Thierry Legault

Another piece of advice is to compare your image to another image taken by someone else — another amateur astrophotographer, a professional or even a space agency.

“If you have a photo of the space shuttle or the space station, for example, you can compare it to a real photo and see if all the details are there,” Legault said.

And if you still have questions about what you are seeing on your own images, Legault also suggests posting your images on astronomy forums so you can get the analysis and insights of other amateur astrophotographers.

“So, there are solutions to make sure that details are real details,” Legault said, “and as you get used to observing raw images and processed images, it will become easier to understand if everything is real, if just a part is real, or if almost nothing is real.”

But Legault’s main advice is not to over-process your images. “Many amateurs take amazing, sharp images using gentle and reasonable processing, so that there are no artifacts.”

For more information and advice from Thierry Legault, see his website, especially the technical pages. Legault has written a detailed article for the March issue of Sky & Telescope on how to image the International Space Station.

You can also read our article on Legault’s astrophotography, published on March 1, 2012.

ISS Caught Between the Moon and New York City

The ISS passes across the face of a daytime Moon. © Alan Friedman.

Now as the theme from Arthur plays in your head you can enjoy this GIF animation of the ISS passing across the face of a daytime Moon, photographed by Alan Friedman from his location in upstate New York.

I know it’s crazy, but it’s true.

Alan captured these images at 10:30 a.m. EST back on September 2, 2007, and slowed down the animation a bit; in real-time the event lasted less than half a second. (Click the image for an even larger version.)

Atmospheric distortion creates the “wobbly” appearance of the Moon.

Alan Friedman is a talented photographer, printer (and avid vintage hat collector) living in Buffalo, NY. His images of the Sun in hydrogen alpha light are second-to-none and have been featured on many astronomy websites. When he’s not taking amazing photos of objects in the sky he creates beautiful hand-silkscreened greeting cards at his company Great Arrow Graphics.

See more of Alan’s astrophotography on his website, Averted Imagination.

Image © Alan Friedman. All rights reserved.

_____________

NOTE: Although this article previously stated that the images were taken Jan. 12, 2012, they were actually captured in September 2007 and re-posted on Jan. 13 of this year. Alan states that he’s since learned how to judge exposure so the ISS doesn’t appear as a streak, but personally he likes (as do I) how this one came out.

Let’s see… September 2007… that would have been Expedition 15!

Deep Blue Astrophotography – Imaging Galactic Shells

NGC7600 is an elliptical galaxy at a distance of around 50 Mpc. This image shows an interleaved system of shells that are described in Astronomical Journal Letters here. These types of structures around elliptical galaxies were first revealed by Malin & Carter in 1980. This deep image of NGC7600 shows faint features not previously seen. Credit: Ken Crawford

As a professional astronomy journalist, I read a lot of science papers. It wasn’t all that long ago that I was studying galaxy groups, with the topics of dark matter and dwarf galaxies in particular. Imagine my surprise when I learned that two of my friends, who are highly noted astrophotographers, have been hard at work doing some deep blue science. If you aren’t familiar with the achievements of Ken Crawford and R. Jay Gabany, you soon will be. Step inside here and let us tell you why “it matters”…

According to Ken’s reports, Cold Dark Matter (or CDM) is a theory that most of the material in the Universe cannot be seen (dark) and that it moves very slowly (cold). It is the leading theory that helps explain the formation of galaxies, galaxy groups and even the current known structure of the universe. One of the problems with the theory is that it predicts large numbers of small satellite galaxies called dwarf galaxies. These small galaxies are about one-thousandth the mass of our Milky Way, but the problem is that they are not observed in anything like the predicted numbers. If this theory is correct, then where are all of the dwarf galaxies that should be there?

Enter professional star stream hunter Dr. David Martinez-Delgado. David is the principal investigator of the Stellar Tidal Stream Survey at the Max-Planck Institute in Heidelberg, Germany. He believes the reason we do not see large numbers of dwarf galaxies is that they are absorbed (eaten) by larger galaxies as part of galaxy formation. If this is correct, then we should find remnants of these mergers in observations. These remnants would show up as trails of dwarf galaxy debris made up mostly of stars. These debris trails are called star streams.

“The main aim of our project is to check if the frequency of streams around Milky Way-like galaxies in the local universe is consistent with CDM models similar to that of the movie.” clarifies Dr. Martinez-Delgado. “However, the tidal destruction of galaxies is not enough to solve the missing satellite problem of the CDM cosmology. So far, the best explanation given is that some dark matter halos are not able to form stars inside; that is, our Galaxy would be surrounded by a few hundred pure dark matter satellites.”

Enter the professional team of star stream hunters. The international team of professional astronomers led by Dr. David Martinez-Delgado has identified enormous star streams on the periphery of nearby spiral galaxies. With deep images, he has shown the process of galactic cannibalism believed to be occurring between the Milky Way and the Sagittarius dwarf galaxy. This is in our own back yard! Part of the work is using computer modeling to show how larger galaxies merge with and absorb the smaller ones.

This image has been inverted and contrast enhanced to help display the faint shell features and debris fragments. The farthest fragment is 140 kpc in projection from the center of the galaxy. Credit: Ken Crawford

“Our observational approach is based on deep color-magnitude diagrams that provide accurate distances, surface brightness, and the properties of the stellar population of the studied region of this tidal stream.” says Dr. Martinez-Delgado (et al). “These detections are also strong observational evidence that the tidal stream discovered by the Sloan Digital Sky Survey is tidally stripped material from the Sagittarius dwarf and support the idea that the tidal stream completely enwraps the Milky Way in an almost polar orbit. We also confirm these detections by running numerical simulations of the Sagittarius dwarf plus the Milky Way. This model reproduces the present position and velocity of the Sagittarius main body and presents a long tidal stream formed by tidal interaction with the Milky Way potential.”

Enter the team of amateurs led by R. Jay Gabany. David recruited a small group of amateur astrophotographers to help search for and detect these stellar fossils and their cosmic dance around nearby galaxies, thus showing why there are so few dwarf galaxies to be found.

“Our observations have led to the discovery of six previously undetected, gigantic, stellar structures in the halos of several galaxies that are likely associated with debris from satellites that were tidally disrupted far in the distant past. In addition, we also confirmed several enormous stellar structures previously reported in the literature, but never before interpreted as being tidal streams.” says the team. “Our collection of galaxies presents an assortment of tidal phenomena exhibiting strikingly diverse morphological characteristics. In addition to identifying great circular features that resemble the Sagittarius stream surrounding the Milky Way, our observations have uncovered enormous structures that extend tens of kiloparsecs into the halos of their host’s central spiral. We have also found remote shells, giant clouds of debris within galactic halos, jet-like features emerging from galactic disks and large-scale, diffuse structures that are almost certainly related to the remnants of ancient, already thoroughly disrupted satellites. Together with these remains of possibly long defunct companions, our survey also captured surviving satellites caught in the act of tidal disruption. Some of these display long tails extending away from the progenitor satellite very similar to the predictions forecasted by cosmological simulations.”

The 0.5-meter Ritchey-Chretien telescope of the Blackbird Observatory is situated at 7,300 ft (2,225 meters) elevation under spectacularly clear and dark skies in the south central Sacramento Mountains of New Mexico, near Mayhill. Photo credit: R. Wodaski

Can you imagine how exciting it is to be part of deep blue science? It is one thing to be a good astrophotographer – even to be an exceptional astrophotographer – but to have your images and processing be of such high quality that they contribute to true astronomical research would be an incredible honor. Just ask Ken Crawford…

“Several years ago I was asked to become part of this team and have made several contributions to the survey. I am excited to announce that my latest contribution has resulted in a professional letter that has recently been accepted by the Astronomical Journal.” comments Ken. “There are a few things that make this very special. One is that Carlos Frenk, the director of the Institute for Computational Cosmology at Durham University (UK), and his team found that my image of galaxy NGC7600 was similar enough to help validate their computer model (simulation) of how larger galaxies form by absorbing satellite dwarf galaxies, and why we do not see large numbers of dwarf galaxies today.”

Dr. Carlos Frenk has been featured on several television shows, including on the Science and Discovery channels, to explain and show some of these amazing simulations. He is the director of the Institute for Computational Cosmology at Durham University (UK) and was one of the winners of the 2011 Cosmology Prize of The Peter and Patricia Gruber Foundation.

“The cold dark matter model has become the leading theoretical picture for the formation of structure in the Universe. This model, together with the theory of cosmic inflation, makes a clear prediction for the initial conditions for structure formation and predicts that structures grow hierarchically through gravitational instability.” says Frenk (et al). “Testing this model requires that the precise measurements delivered by galaxy surveys can be compared to robust and equally precise theoretical calculations.”

The Rancho Del Sol Observatory is located in the foothills of northern California's Sierra Mountains, approximately one hour north of Sacramento. It houses a 0.5 meter Ritchey-Chretien Telescope. Credit: Ken Crawford

And it requires very precise observations. According to the team, this pilot survey was conducted with three privately owned observatories equipped with modest sized telescopes located in the USA and Australia. Each observing site features very dark, clear skies with seeing that is routinely at, and often below, 1.5 arcseconds. These telescopes are manufactured by RC Optical Systems and follow a classic Ritchey-Chretien design. The observatories are commanded with on-site computers that allow remote operation and control from any global location with broadband web access. Each observatory uses proven, widely available remote desktop control software, and robotic orchestration of all observatory and instrument functions, including multiple target acquisition and data runs, is performed using available scripting software.

A wide-field instrument was also employed for those galaxies with an extended angular size. For this purpose, they selected the Astro Physics Starfire 160EDF6, a short focal length (f/7) 16 cm aperture refractor that provides a FOV of 73.7 × 110.6 arcmin. But it’s more than just taking a photograph. The astrophotographer needs to completely understand what needs to be drawn out of the exposure. It’s more than just taking a “pretty picture”… it’s what matters.
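
As a quick sanity check of that quoted field of view, the numbers follow directly from the focal length and the sensor size; the sketch below assumes a full-frame 24 × 36 mm sensor, which is my assumption rather than a detail given by the team:

```python
# Back-of-the-envelope check of the quoted 73.7 x 110.6 arcmin field of view.
# Assumption (mine): a full-frame 24 x 36 mm sensor behind the Starfire
# 160EDF6, whose focal length is 160 mm aperture x f/7 = 1120 mm.
import math

focal_length_mm = 160 * 7            # 1120 mm
sensor_sides_mm = (24.0, 36.0)       # assumed full-frame sensor dimensions

for side in sensor_sides_mm:
    fov_arcmin = math.degrees(math.atan(side / focal_length_mm)) * 60
    print("%.0f mm side -> %.1f arcmin" % (side, fov_arcmin))
# -> roughly 73.7 and 110.5 arcmin, close to the figures quoted above.
```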

The formation of shell galaxies in the cold dark matter universe from Kenneth Crawford on Vimeo.

“The galaxy I want to show you has some special features called ‘shells’. I had to image very deep to detect these structures and carefully process them so you can see the delicate structures within.” explains Crawford. “The galaxy name is NGC7600 and these shell structures have not been captured as well in this galaxy before. The movie above shows my image of NGC7600 blending into the simulation at about the point when the shells start to form. The movie below shows the complete simulation.”

“What is groundbreaking is that the simulation uses the cold dark matter theory, modeling the dark matter halos of the galaxies, and as you can see, it is pretty convincing.” concludes Crawford. “So now you all know why we do not observe lots of dwarf galaxies in the Universe.”

But, we can observe some very incredible science done by some very incredible friends. It’s what matters…

For Further Reading: Tracing Out the Northern Tidal Stream of the Sagittarius Dwarf Spheroidal Galaxy, Stellar Tidal Streams in Spiral Galaxies of the Local Volume, Carlos Frenk, Simulations of the formation, evolution and clustering of galaxies and quasars, The formation of shell galaxies similar to NGC 7600 in the cold dark matter cosmogony, Star Stream Survey Images By Ken Crawford and be sure to check out the zoomable Full Size Image of NGC 7600 done by Ken Crawford. We thank you all so much for sharing your work with us!

Make iPhone Astrophotography Easier With The AstroClip!

The AstroClip™ is a simple mount that attaches your iPhone onto any telescope. Credit: Matthew Geyster.

They say necessity is the mother of invention, and if you’ve ever tried to take a picture through a telescope with your iPhone you’ll understand the necessity behind this invention: the AstroClip, an ingenious bit of injection-molded awesomeness that mounts an iPhone 4 onto any standard 1.25″ telescope eyepiece, keeping it stable and centered with the camera lens. I think this is a great idea and would certainly get one… that is, if it actually becomes a reality.

The AstroClip is designed to be very minimal while still being fully functional.

Invented by Boston designer Matthew Geyster, the AstroClip (patent pending) is still in the development stage, awaiting the funding to go into production. Injection molding is a “simple but very expensive” process, and in order to get the AstroClip produced Geyster has put his project up on Kickstarter, a web site that lets people pitch their great ideas that need funding and gives them a timeline to gather pledges.

If the AstroClip project can accumulate $15,000 in pledges by September 3, it will go into production. At the time of this writing there are 38 days left until then and it’s only 10% toward its goal. I’m hoping that drawing some more attention to this cool idea will help it along!

By becoming a “backer” you can pledge in several denomination categories, ranging from $1 or more to $500 or more. Each category above $25 comes with a “reward” of some sort… these are all listed on the project page.

I think Matthew has a great concept here. The camera on the iPhone 4 is very good and could take some great shots of the Moon and other astronomical objects, were it to just have a secure mount on a telescope eyepiece. I’ve tried to do it without a mount before and really, it’s not easy.

Moon image taken with an iPhone and AstroClip

“The AstroClip is designed to be very minimal, while still being fully functional. The clip is very simple and rigid to hold your iPhone 4 steady and securely for the perfect shot. I also added the three adjustment screws that look like they’re meant to be on a telescope. With the simplicity and functionality of the AstroClip you will be taking great photos of outer space in no time at all.”

– Matthew Geyster

Honestly, I have no connection personally with this project or with Matthew… I just think this is something that would be very popular with iPhone users and astronomy enthusiasts. (I don’t even have a telescope… the light pollution in my city is pretty bad.) I just liked the idea so much I wanted to help support it however I could, and Universe Today seemed the perfect place to call attention to it!

If it proceeds, the AstroClip will be entirely produced in the USA. Check it out on Kickstarter by clicking the image above or visit theastroclip.com.

Best of luck to a great idea!

All images and video © Matthew Geyster. Universe Today is not endorsing or otherwise officially supporting this project, all opinions of awesomeness are my own and all product claims are made by the product designer.

_______________________

Jason Major is a graphic designer, photo enthusiast and space blogger. Visit his website Lights in the Dark and follow him on Twitter @JPMajor or on Facebook for the most up-to-date astronomy news and images!

Spying on Spy Satellites with Thierry Legault

Ground-based images of three different classified satellites: the X-37B, USA-186 Keyhole, and the LaCrosse 3. Credit: Thierry Legault and Emmanuel Rietsch

Shhhh! Don’t tell anyone, but we’ve got pictures….. ground-based pictures of secret spy satellites in Earth orbit. We’re not revealing our sources, but … oh wait, I guess we might as well tell you. Even if we didn’t reveal our source, you’d probably guess that astrophotographer extraordinaire Thierry Legault — who has been sharing his wonderfully detailed ground-based images of the space shuttle and International Space Station with Universe Today — has been working on capturing other satellites in orbit as well. Legault and his partner in imaging crime, Emmanuel Rietsch, have tackled the difficult task of tracking down spy satellites and then following them with a telescope. For imaging the shuttle and ISS, they developed their own design of a motorized mount outfitted with a computer program so it can slowly and precisely rotate in order to track and follow an object in Earth orbit with a telescope and video camera. Now they are able to image even smaller objects.

Above are images they were able to capture of three different spy satellites, including the X-37B spaceplane. More images and videos are available at Legault’s website.

Thierry Legault with his customized satellite tracking system. Photo courtesy Thierry Legault.

Since October 2010, Legault has been using the autoguided mount, with the help of a DMK 31AF03 Firewire video camera mounted on the finder (FL 200 mm) and of the software Videos Sky, created by Rietsch and then modified by Rietsch and Legault for fast tracking with the Takahashi EM400 mount.

The X-37B spaceplane now in orbit is the second of the two Orbital Test Vehicles flown by the US Air Force, launched on March 5, 2011. Reportedly, it will conduct experiments and tests for close to nine months and then autonomously de-orbit and land. Legault and Rietsch were able to image the spaceplane in late May of this year with fairly good results.

“I tried to get help to identify the real orientation of the X-37B,” Legault told Universe Today via Skype today, “but unlike the Keyhole and Lacrosse satellites, it’s not easy considering its complex shape with several wings.”

And the Air Force isn’t telling.

“Keyhole-class” (KH) reconnaissance satellites have been used for more than 30 years and are typically used to take overhead photos for military missions. Some of the Keyhole satellites resemble the Hubble Space Telescope, but instead of looking out into space, they look back at Earth. A similar type of spy satellite is the Lacrosse, a radar-imaging satellite.

But even with the tracking system, getting images of small satellites is not easy. “Despite this high-performance tracking system and hours of training on airplanes passing in the sky, keeping the spacecraft inside a sensor a few millimeters across at a focal length of 5000 mm and a speed over 1°/s needs a lot of concentration and training,” said Legault on his website.

The autoguiding and acquisition are done via a laptop with two hard drives (one of which is a Solid State Drive, made with flash memory), enabling a tracking precision of about one arcminute.
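
A little arithmetic shows why this pointing task is so demanding; the sketch below uses an assumed 5 mm sensor width to stand in for the “few millimeters” Legault mentions:

```python
# Why manual satellite tracking is hard: a sensor a few millimetres across
# behind 5000 mm of focal length sees only a few arcminutes of sky, while
# the target moves at over 1 degree per second.
import math

focal_length_mm = 5000.0
sensor_width_mm = 5.0          # assumed value for "a few millimeters"
target_rate_deg_s = 1.0

fov_deg = math.degrees(sensor_width_mm / focal_length_mm)
print("sensor field of view: %.1f arcmin" % (fov_deg * 60))               # ~3.4 arcmin
print("untracked crossing time: %.2f s" % (fov_deg / target_rate_deg_s))  # ~0.06 s
```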

For security reasons, the sighting times for spy satellites are not published on an official website, as NASA does for the shuttle and ISS. But with a bit of digging, Legault said, others can try their luck at spotting these secret satellites.

“Orbital data are in the Calsky database,” Legault told UT, “therefore their passages are forecast just as for the ISS. Generally, orbits are determined by amateurs, some of whom specialize in this activity, especially Kevin Fetter (and data are exchanged on the Seesat mailing list, owned by Ted Molczan).”

Legault is well-known for his images of the shuttle and ISS transiting the sun, but he said the accuracy of orbital data for the spy satellites is not sufficient for capturing a solar transit – and besides, these satellites are much smaller than the ISS and would appear as a small dark dot, at best.

“But for nighttime passages the data is sufficient,” Legault said. “Generally they are not visible with the naked eye or barely (except during flares), but they are easily visible with a finder.”

See more information and videos — including a view of what the tracker sees — on Legault’s website.

You can follow Universe Today senior editor Nancy Atkinson on Twitter: @Nancy_A. Follow Universe Today for the latest space and astronomy news on Twitter @universetoday and on Facebook.

Thierry Legault’s Incredible Ground-Based Views of Endeavour’s Final Flight

Four views of Endeavour docked to the ISS on May 29, 2011. Credit: Thierry Legault and Emmanuel Rietsch

Award-winning French astrophotographer Thierry Legault traveled through Germany, France and Spain during Endeavour’s final mission to find clear skies and good seeing to capture the shuttle’s voyage to the International Space Station. While he told us it wasn’t easy, the results are incredible! The visible detail of the shuttle and parts of the International Space Station is absolutely amazing. You can see the newly installed Alpha Magnetic Spectrometer in one shot, as well as the open payload bay doors on Endeavour in another. The video Legault shot is available on his website, and he has unique 3-D versions as well.

Below are some of his trademark views of transits of the Sun by ISS and Endeavour, with one showing the shuttle just before it docked to the station.

Solar transit taken on May 18th from Essen, Germany through thick clouds showing Endeavour a few minutes before docking to the ISS. Transit duration was 0.7 seconds. Credit: Thierry Legault.

Legault told us he was chasing the shuttle and the station from different parts of Europe; however, because of weather problems (clouds and turbulence) he was not very happy with the results. But this image is stunning anyway, even though clouds dimmed the available light by more than 100 times, Legault said. What is perhaps most amazing is that the transit time for this pass in front of the Sun was just 0.7 seconds!

Here’s a less cloudy view taken on May 25:

A close-up view of Endeavour and the ISS transiting the sun on May 25th from France. Transit duration was 0.5 seconds. Credit: Thierry Legault.

And the full view for reference. This transit was only a half second!

Solar transit taken on May 25th from France (Orleans), showing Endeavour docked to the ISS. Credit: Thierry Legault.

Series of transits taken on May 20, 22 and 23, 2011 from different areas of France, showing variations in the orientation of the ISS with Endeavour docked. On May 23, the ISS passes beside a sunspot which is larger than the Earth. Credit: Thierry Legault

All transit images were taken with a Takahashi TOA-150 6″ apochromatic refractor (focal length 2400mm and 3600mm) on an EM-400 mount with a Baader Herschel wedge, using a Nikon D3X at 1/8000s and ISO 100, shooting continuously at 5 frames per second for 5 seconds.
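
With those numbers, a rough calculation shows just how few frames actually catch the shuttle and station during a sub-second transit:

```python
# Frames captured during a sub-second transit at the quoted settings:
# continuous shooting at 5 frames per second for 5 seconds.
frame_rate_fps = 5
burst_length_s = 5
transit_duration_s = 0.7       # the May 18 pass; the May 25 pass lasted 0.5 s

print("frames in the burst:", frame_rate_fps * burst_length_s)                  # 25
print("frames during the transit: about", frame_rate_fps * transit_duration_s)  # ~3.5
```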

Frames from videos taken from Spain (May 31) and France (June 1) 90 minutes before the deorbit burn. Credit: Thierry Legault and Emmanuel Rietsch.

Here are frames from videos taken by Legault and fellow astrophotographer Emmanuel Rietsch just prior to the deorbit burn for landing on June 1. The video of these shots, as well as more images, are also available on Legault’s website.

Thanks to Thierry for sending Universe Today these amazing images and allowing us to post them!

Dazzling Timelapse: Canary Skies

Tenerife, in the Canary Islands, is home to several telescopes, and at 2,000 meters above sea level it claims one of the best skies on the planet. This incredibly stunning timelapse video from astrophotographer Daniel Lopez captures the nocturnal and crepuscular beauty of the island, showing the natural movement of the Earth, stars, clouds, Sun and Moon. Lopez worked for over a year to capture all possible shades and landscapes, pulling out all the stops by using several different timelapse techniques. Lopez promises more videos are coming, as he says this is the first in a series to capture the beauty of each of the Canary Islands.

Find more information at Lopez’s website, and see more videos at his Vimeo page.

Photopic Sky Survey

The Photopic Sky Survey, the largest true-colour image of the night sky ever created (well, it is when you follow the link to the original 360° rotatable image anyway). Credit: Risinger/Photopic Sky Survey.

The Photopic Sky Survey, the largest true-color all-sky survey – along with a constellation and star name overlay option – is available here.

For more detail on how it was created read on…

Nick Risinger decided to take a little break from work and embark on a journey of 45,000 miles by air and 15,000 by land – along with his Dad, brother and a carload of astrophotography gear – to capture the biggest true color picture of the universe ever. As you do…

The long journey was required because he was trying to snap the whole universe from the surface of a rotating planetary body in a solar orbit – and with a tilted axis, no less. So what might be seen in the northern hemisphere isn’t always visible from the south. Likewise with the seasons: what may be overhead in the summer is below the horizon in the winter.

On top of that, there are issues of light pollution and weather to contend with – so you can’t just stop anywhere and snap away at the sky. Nonetheless, with a navigational computer to ensure accuracy, and over the course of one year, Risinger broke the sky down into 624 areas (each 12 degrees wide) and captured each portion through 60 exposures. Four short, medium and long shots were taken with each of six cameras to help reduce noise, satellite trails and other inaccuracies.
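
Taken together, those numbers give a sense of the project’s scale; here is a quick back-of-the-envelope sketch (the overlap interpretation at the end is mine, not Risinger’s):

```python
# The scale of the survey, from the numbers given above: 624 fields,
# each roughly 12 degrees wide, with 60 exposures per field.
fields = 624
exposures_per_field = 60
field_width_deg = 12.0
whole_sky_sq_deg = 41_253      # area of the full celestial sphere

print("total exposures:", fields * exposures_per_field)                     # 37,440
print("nominal tile area: %.0f sq deg" % (fields * field_width_deg ** 2))   # ~89,856
print("whole sky:", whole_sky_sq_deg, "sq deg")
# The nominal tile area is more than twice the sky's area, which suggests
# generous overlap between neighbouring fields.
```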

Nick Risinger preparing an array of cameras in Colorado to shoot part of the five gigapixel Photopic Sky Survey image. Credit: Risinger/Photopic Sky Survey.

Further reading: Photopic Sky Survey home page (includes a description of the hardware and software used).