Bringing Space to the Masses: Q&A with Planetary Resources’ Chris Lewicki

Chris Lewicki in the clean room. His role as flight director for the two MER rovers and surface operations manager for the Phoenix mission required an intimate knowledge of all the spacecraft systems. Image courtesy Chris Lewicki.

Chris Lewicki is the President and Chief Engineer for one of the most pioneering and audacious companies in the world today. Planetary Resources was founded in 2008 by two leading space advocates: Peter Diamandis, Chairman and CEO of the X-Prize Foundation, and Eric Anderson, a pioneer in the field of space tourism. From the earliest days of the company, in turning to Lewicki, Anderson and Diamandis have gained scientific and management expertise that reaches far beyond low Earth orbit.

Chris is a recipient of two NASA Exceptional Achievement Medals and has an asteroid named in his honour, 13609 Lewicki. He holds bachelor’s and master’s degrees in Aerospace Engineering from the University of Arizona.

In this exclusive interview with Nick Howes, Lewicki gives us a feel for what lies behind Planetary Resources’ most compelling step yet in their quest to bring space to the masses.

Chris Lewicki is the President and Chief Engineer for Planetary Resources, Inc. Image courtesy Planetary Resources.

Nick Howes – So Chris, what first inspired you to get into astronomy and space science?

Chris Lewicki – So, I guess it wasn’t a person as most would say, but a mission that got me started on this road. Even before college, and you have to remember I grew up in dairy country in Northern Wisconsin, where we didn’t really have much in the way of space. I wanted to do something interesting, and found I was good at math. When I saw the Voyager 2 spacecraft flyby of Neptune and Triton, I thought “wow this is it,” and wanted to work at JPL pretty much from that moment onwards, thinking that this was a “really special place.”

Voyager 2’s encounter with Neptune. Credit: NASA

NH – At college were you determined to work for someone like NASA, and was your time at Blastoff a good stepping stone into this?

CL – I think it really did start even before college, like I said, from the Voyager 2 encounter and all the subsequent missions JPL was involved in; this was kind of the goal. Ahead of JPL, though, was my first encounter with Peter (Diamandis) and Eric (Anderson) when we worked on starport.com, where I was a web developer. Prior to that I’d had a spell at the Goddard Space Flight Center, but with Eric and Peter we really did form a bond. Starport didn’t last too long though, as it was at the time of the dotcom boom and bubble, but it taught me some valuable lessons in those months.

Then I took up a position at JPL, but as you probably know, not everything they do is mission design and planning, and while it is an amazing place, I wanted to get my hands on some real mission stuff, so moved on after just under a year.

Then came Blastoff which kind of set a lot of the wheels in motion for ideas relating to the Google Lunar X-Prize. We had a lot of fun there designing rovers and exploratory missions to the Moon, lots of great people with great ideas.

I was then at a small satellites conference in Utah, when a representative of JPL came up to me after my talk, gave me his business card and effectively said I should come and do an interview for them. Peter and Eric didn’t really want me to go, but I told them “I really have to go off and learn how to build rockets.” Thus began the real journey working with NASA on some of the most exciting missions in recent history.

NH – How thrilling was it being the flight director for two of the most successful missions in NASA’s history?

A view of the Flight Control room at the Jet Propulsion Laboratory during the landing of the Mars Exploration Rover Spirit, with Chris in the Flight Director hotseat. Credit: NASA/JPL.

CL – Thrilling really doesn’t come close to covering it. There I was, 29 years old, thinking “should I really be doing this?” but then realising “yes, I can do this,” sitting at the flight director’s desk for two of NASA’s most audacious missions, Spirit and Opportunity. It was my role to get them safely down on the surface, and boy did we test those missions.

The simulators were so realistic; we’d been running so many different scenarios for years prior to the actual EDL phase, now known as the “7 minutes of terror”. It really doesn’t feel quite real though when it’s actually happening; you just know it is because the room is full of TV cameras, and you have that extra notion in the back of your mind saying it’s not a sim this time. The telemetry in the simulations was so close to the real data, just a few variations. It kind of showed how much testing and planning went into those missions, and how it all paid off.

NH – With Phoenix you’d obviously experienced the sadness of the loss of Polar Lander beforehand; did that teach you any valuable lessons which you have now carried forward to your role at Planetary Resources?

CL – Phoenix started with a failure review, but that’s what I think is so important about engineering and indeed life in general. You have to fail to understand how to make things better. During that design review we figured out a dozen more ways things could have gone wrong with Mars Polar Lander, and implemented the changes for Phoenix. You have to plan for failure so much with missions of this type, and it’s an exhilarating but in some ways stressful ride, one where, after Phoenix, I felt I needed to pass the mantle on for Curiosity.

NH – On the topic of Planetary Resources, when did you start to think about being part of a company of this magnitude?

Artist’s concept of the ARKYD spacecraft by an asteroid. Credit: Planetary Resources.

CL – Well, working with Peter and Eric again was mooted as long ago as 2008, the company’s ideas being formulated then, when it was called Arkyd Astronautics, a name which stuck with us until 2012. Eric and Peter approached me about possibly coming back. As I said, I’d pretty much resigned myself to not working on Curiosity, and having to put myself through all of the phases associated with that landing, and there’s a quote which many people believe comes from Mark Twain, but is really from H. Jackson Brown Jr., that basically says:

“Twenty years from now you will be more disappointed by the things that you didn’t do than by the ones you did do. So throw off the bowlines. Sail away from the safe harbor. Catch the trade winds in your sails. Explore. Dream. Discover.” I decided to throw off the bowlines and set sail with Planetary Resources.

NH – How do you see the relationship between a company like Planetary Resources and the major space agencies? Do you see yourselves as complementing them or competing?

CL – Complementing, totally. NASA has over 50 years of incredible exploration, missions, research, development and insight, and a great future ahead of it too. With NASA recently transferring some of its low Earth orbit operations into the commercial sector, we feel that this is really a great time to be in this industry, with our goal of being at the forefront of the types of science and commercial operations that the business sector can excel in, leaving NASA to focus on the amazing deep space missions, like landing on Europa or going back to Titan, missions which only the large government agencies can really pull off at this time.

NH – The Arkyd has to be one of the most staggering Kickstarter success stories ever, raising around $800,000 in a week. Did you imagine that putting a space telescope into orbit, available for all, would garner so much enthusiasm?

Artist’s concept of the ARKYD telescope in space. Credit: Planetary Resources.

CL – Staggering again doesn’t really do it justice. This is the biggest space-based Kickstarter in their history, and as it’s listed in the photography category, it’s the biggest photographic Kickstarter ever too. We have many more surprises planned which I can’t go into now, but in setting the $1 million minimum bar to “test the water” with public interest in a space telescope, we’ve not so much exceeded expectations as absolutely reached what we felt was possible. From talking to people ahead of the launch, and just seeing their reaction (author’s note: I was one of those people, and my reaction was jaw-dropping), we knew we had something really special. We felt the idea of the space selfie was a cornerstone of what we wanted to achieve: opening up space to everyone, not just the real die-hard space enthusiasts.

NH – With the huge initial success of the Arkyd project, do you see any scope for a flotilla of space telescopes for the public, much like say the LCOGT or iTelescope networks are on Earth?

CL – Possibly in the future. You yourself know from your work with the Las Cumbres and Faulkes network and the iTelescope network that having a suite of telescopes around the planet has huge benefits when it comes to observations and science. At present, as you know, we have the plan for one telescope for public use.

The Arkyd 100 will be utilising our Arkyd technologies, which we’ll also be using to examine near-Earth asteroids. If you think that in the last 100 years the Hales and Lowells of this world were all private individuals sponsoring and building amazing instruments for space exploration, it’s really just a natural progression from this. We’re partnering closely with the Planetary Society on this, as they have goals and interests in common with us, and also with National Geographic. We feel this really does open up space to a whole new group of people, and it’s apparent from the phenomenal interest we’ve had on Kickstarter, and the thousands of people who’ve pledged their support, that this vision was right.

NH – Planetary Resources has some huge goals in terms of asteroids in future, but you seem to have a very balanced and phased scientific plan to study and then proceed to the larger scale operations. Does this come from your science background?

CL – As I said, I grew up in dairy country in Wisconsin, where I really had to make my own opportunities to be a part of this industry; there was no space there. That said, I have been an advocate of space pretty much all my life, and yes, I guess my scientific background and experience working at JPL have come to bear at Planetary Resources. We have a solid plan in terms of risk management with our “swarm” mentality of sending up lots of spacecraft, so even if one or more fails, we’ll still be able to get valuable science data. The way I see it, lots of people have big ideas and set up companies with them, but then, after the initial investment dries up, the ideas may still be big and there, but there is no way to pursue them.

We’ve all come from companies which have seen this kind of mindset in the past, and now, whilst we love employing students and college graduates who have big ideas and take chances, we have a long-term, sustainable plan, and yes, we’re taking a steady approach to this, so that we can guarantee that our investors get a return on what they have supported.

NH – Can you give us a timeline for what Planetary Resources aim to achieve?

CL – Our first test launch will be as early as 2014, and then in 2015 we’ll start with the space telescopes using the Arkyd technology. By 2017 we hope to be identifying, and on our way to classifying, potentially interesting NEO targets for future mining. By the early 2020s the aim is to be doing extraction from asteroids and starting sample return missions.

NH – You were, and from all I have read still remain, passionate about student involvement with SEDS and similar groups. What could you say to younger people inspired by what you’re doing to encourage them to get into the space industry?

CL – Tough one, but I’d say that when looking at the people you admire, always remember that they are not superhuman, they are like you and me; having goals, taking chances and being determined is a great way to look forward. The SEDS movement played a big part in my early life, and I would encourage any student to get involved in that for sure.

NH – In conclusion, what would be your ultimate goal as a pioneer of the new frontier in space exploration?

CL – Our ultimate goal is to be the developer of the economic engine that makes space exploration commercially viable. Once we have established that, we can then look at more detailed exploration of space, with tourism, scientific missions, and extending our reach out even further. I’ve already been a part of placing three missions on the surface of Mars, so nothing really is beyond our reach.

Nick’s closing comments:

I first met Chris at the Spacefest V conference in Tucson, where he gave me a preview of the Arkyd space telescope. There is no doubt in my mind, after meeting him, that he and the team at Planetary Resources will succeed in their mission. A quite brilliant individual, but humble with it, someone you can spend hours talking to and come away feeling truly inspired. During this interview we talked for what seemed like hours, and Chris said I could have written a book with the answers he gave. I hope this article gives you some taste of the person behind the missions which, at the new frontier of exploration, much like the prospectors in the Gold Rush, are charting new and unknown, yet hugely exciting territories. As the old saying goes, and possibly more aptly than ever: watch this space.

You can find out more about the ARKYD project at the Planetary Resources website.

Behind the Scenes of SOFIA – The World’s Most Remarkable Observatory

The side of the SOFIA aircraft shows its joint roots, a collaboration between NASA and German scientists. Credit: Nick Howes


One of the most remarkable observatories in the world does its work not on a mountaintop, not in space, but 45,000 feet high on a Boeing 747. Nick Howes took a look around this unique airliner as it made its first landing in Europe.

SOFIA (Stratospheric Observatory for Infrared Astronomy) came from an idea first mooted in the mid-1980s. Imagine, said scientists, using a Boeing 747 to carry a large telescope into the stratosphere, where absorption of infrared light by atmospheric water molecules is dramatically reduced, even in comparison with the highest ground-based observatories. By 1996 that idea had taken a step closer to reality when the SOFIA project was formally agreed between NASA (who fund 80 percent of the cost of the 330 million dollar mission, an amount comparable to a single modest space mission) and the German Aerospace Centre (DLR, who fund the other 20 percent). Research and development began in earnest using a highly modified Boeing 747SP named the ‘Clipper Lindbergh’ after the famous American pilot, where the ‘SP’ stands for ‘Special Performance’.

Maiden test flights were flown in 2007, with SOFIA operating out of NASA’s Dryden Flight Research Center at Edwards Air Force Base on Rogers Dry Lake in California – a nice, dry location that helps with the instrumentation and aircraft operations.

This scale model shows the telescope position and how the aircraft design works around it. Credit: Nick Howes.

As the plane paid a visit to the European Space Agency’s astronaut training centre in Cologne, Germany, I was given a rare opportunity to look around this magnificent aircraft as part of a European Space ‘Tweetup’ (a Twitter meeting). What was immediately noticeable was the plane’s shorter length compared to the ones you usually fly on, which enables the aircraft to stay in the air for longer, a crucial aspect for its most important passenger, the 2.7-metre SOFIA telescope. Its Hubble Space Telescope-sized primary mirror is aluminium coated and bounces light to a 0.4-metre secondary, all in an open cage framework that literally pokes out of the side of the aircraft.

As we have seen, the rationale for placing a multi-tonne telescope on an aircraft is that by doing so it is possible to escape most of the absorption effects of our atmosphere. Observations in the infrared are largely impossible for ground-based instruments at or near sea level and only partially possible even on high mountaintops. Water vapour in our troposphere (the lower layer of the atmosphere) absorbs so much of the infrared light that traditionally the only way to beat this was to send up a spacecraft. SOFIA can fill a niche by doing nearly the same job but at far less risk and with a far longer life-span. The aircraft has sophisticated infrared monitoring cameras to check its own output, and water vapour monitoring to measure what little absorption is occurring.

The Sofia Telescope resides behind the multi tonne frame and control mechanism. Credit: Nick Howes.

The 2.7-metre mirror (although actually only 2.5 metres is used in practice) uses a glass ceramic composite that is highly thermally tolerant, which is vital given the harsh conditions that the aircraft puts the isolated telescope through. If one imagines the difficulty amateur astronomers have some nights with telescope stability in blustery conditions, spare a thought for SOFIA, whose huge f/19.9 Cassegrain reflecting telescope has to deal with an open door to 800 kilometres per hour (500 miles per hour) winds. Nominally some operations will occur at 39,000 feet (approximately 11,880 metres) rather than the possible ceiling of 45,000 feet (13,700 metres), because while the higher altitude provides slightly better conditions in terms of lack of absorption (still above 99 percent of the water vapour that causes most of the problems), the extra fuel needed means that observation times are reduced significantly, making the 39,000 feet altitude operationally better in some instances to collect more data. The aircraft uses a cleverly designed air intake system to funnel and channel the airflow and turbulence away from the open telescope window, and speaking to the pilots and scientists, they all agreed that there was no effect caused by any output from the aircraft engines either.

Staying cool

The cameras and electronics on all infrared observatories have to be maintained at very low temperatures to avoid thermal noise from them spilling into the image, but SOFIA has an ace up its sleeve. Unlike a space mission (with the exception of the servicing missions to the Hubble Space Telescope that each cost $1.5 billion including the price of launching a space shuttle), SOFIA has the advantage of being able to replace or repair instruments or replenish its coolant, allowing an estimated life-span of at least 20 years, far longer than any space-based infrared mission that runs out of coolant after a few years.

Meanwhile the telescope and its cradle are a feat of engineering. The telescope is pretty much fixed in azimuth, with only a three-degree play to compensate for the aircraft, but it doesn’t need to move in that direction as the aircraft, piloted by some of NASA’s finest, performs that duty for it. It can work within a 20–60 degree altitude range during science operations. It’s all been engineered to tolerances that make the jaw drop. The bearing sphere, for example, is polished to an accuracy of less than ten microns, and the laser gyros provide angular increments of 0.0008 arcseconds. Isolated from the main aircraft by a series of pressurised rubber bumpers, which are altitude compensated, the telescope is almost completely free from the main bulk of the 747, which houses the computers and racks that not only operate the telescope but provide the base station for any observational scientists flying with the plane.

PI in the Sky

The science principal investigators get to sit in relative comfort close to the telescope. Credit: Nick Howes.

The Principal Investigator station is located around the mid-point of the aircraft, several metres from the telescope but enclosed within the plane (unpressurised at 45,000 feet, the crew and scientists could not survive). Here, for ten or more hours at a time, scientists can gather data once the door opens and the telescope is pointing at the target of choice, with the pilots following a precise flight path to maintain the instrument pointing accuracy and to best avoid the possibility of turbulence. Whilst ground-based telescopes can respond quickly to events such as a new supernova, SOFIA is more regimented in its science operations and, with proposal cycles running six months to a year, one has to plan quite accurately how best to observe an object.

Forecasting the future

Science operations started in 2010 with FORCAST (the Faint Object infraRed CAmera for the SOFIA Telescope) and continued into 2011 with the GREAT (German Receiver for Astronomy at Terahertz Frequencies) instrument. FORCAST is a mid/far-infrared instrument with two cameras working between five and forty microns (in tandem they can work between 10–25 microns) with a 3.2 arcminute field of view. It saw first light on Jupiter and the galaxy Messier 82, but will be working on imaging the galactic centre, star formation in spiral and active galaxies, and molecular clouds. One of its primary science goals is enabling scientists to accurately determine dust temperatures and more detail on the morphology of star-forming regions down to better than three-arcsecond resolution (depending on the wavelength the instrument works at). Alongside this, FORCAST is also able to perform grism (i.e. a grating prism) spectroscopy, to get more detailed information on the composition of objects under view. There is no adaptive optics system, but it doesn’t need one for the types of operations it’s doing.

FORCAST and GREAT are just two of the ‘basic’ science operation instruments, which also include Echelle spectrographs, far-infrared spectrometers and high-resolution wideband cameras, but already the science team are working on new instruments for the next phase of operations. Instrumentation switch-over, whilst complex, is relatively quick (comparable to the time it takes to switch instruments on larger ground observatories), and can be achieved in readiness for observations, which the plane aims to conduct up to 160 times per year. And whilst there are no firm plans to build a sister ship for SOFIA, there have been discussions among scientists about putting a larger telescope on an Airbus A380.

A model of the telescope shows its unique control and movement mechanism as well as the optical tube assembly. Credit: Nick Howes.

Sky Outreach

With a planned science ambassador programme involving teachers flying on the aircraft to do research, SOFIA’s public profile is going to grow. The science output and possibilities from instruments that are constantly evolving, serviceable and improvable every time the aircraft lands are immeasurable in comparison to space missions. Journalists had only recently been afforded the opportunity to visit this remarkable aircraft, and it was a privilege and honour to be one of the first people to see it up close. To that end I wish to thank ESA and NASA for the invitation and the chance to see something so unique.

RoboScopes – Real Armchair Astronomy

The Faulkes Telescope. Credit: Faulkes Telescope/LCOGT


Using and getting the most out of robotic astronomy

Whilst nothing in the field of amateur astronomy beats the feeling of being outside looking up at the stars, the inclement weather many of us have to face at various times of year, combined with the task of setting up and then packing away equipment on a nightly basis, can be a drag. Those of us fortunate enough to have observatories don’t face that latter issue, but still face the weather and usually the limits of our own equipment and skies.

Another option to consider is using a robotic telescope. From the comfort of your home you can make incredible observations, take outstanding astrophotos, and even make key contributions to science!

The appeal of robotic telescopes to many amateur astronomers comes down to three factors. The first is that the equipment on offer is generally vastly superior to that which the amateur has in their home observatory. Many of the commercial robotic telescope systems have large-format mono CCD cameras connected to high-precision, computer-controlled mounts with superb optics on top; typically these setups start in the $20,000–$30,000 price bracket and can run up into the millions of dollars.

A look at the Faulkes Telescope South inside. Credit: Faulkes Telescope/LCOGT

These are usually combined with well-defined, fluid workflows which guide even a novice user through operating the scope and acquiring images, automatically handling such things as dark and flat fields. This makes the learning curve much gentler for many people as well, with many of the scopes specifically geared towards early grade-school students.

Screenshot of the Faulkes Telescope realtime interface. Credit: Faulkes Telescope/LCOGT

The second factor is geographic location. Many of the robotic sites are located in places where average rainfall is a lot lower than somewhere like the UK or the north-eastern United States, with places like New Mexico and Chile in particular offering almost completely clear, dry skies year round. Robotic scopes tend to see more sky than most amateur setups, and as they are controlled over the Internet, you don’t even have to get cold outside in the depths of winter. The beauty of the geographic aspect is that in some cases you can do your astronomy during the daytime, as the scopes may be on the other side of the world.

iTelescope systems are located all over the globe. Credit: iTelescope project

The third is ease of use, as nothing more than a reasonably decent laptop and a solid broadband connection is required. The only thing you need worry about is your internet connection dropping, not your equipment failing to work. Scopes like the Faulkes or Liverpool Telescopes, ones I use a lot, can easily be controlled from something as modest as a netbook or even an Android device, iPad or iPhone. The issue of CPU horsepower usually comes down to the image processing after you have taken your pictures.

Software applications like the brilliant Maxim DL by Diffraction Limited, which is commonly used for image post-processing in amateur and even professional astronomy, handle the FITS file data which robotic scopes will deliver. This is commonly the format images are saved in at professional observatories, and the same applies with many home amateur setups and robotic telescopes. This software requires a reasonably fast PC to work efficiently, as does the other stalwart of the imaging community, Adobe Photoshop. There are some superb and free applications which can be used instead of these two bastions of the imaging fraternity, like the excellent DeepSkyStacker and IRIS, along with the interestingly named GIMP, a free variant on the Photoshop theme.
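If you prefer free, scriptable tools, the same FITS files can also be opened directly in Python. Here is a minimal sketch, assuming the astropy and matplotlib packages are installed and using a hypothetical file name, that reads a frame and displays it with a simple contrast cut:

```python
# A minimal look at a FITS file delivered by a robotic telescope.
# Assumes astropy and matplotlib; "m51_r_001.fits" is a hypothetical file name.
from astropy.io import fits
import matplotlib.pyplot as plt
import numpy as np

with fits.open("m51_r_001.fits") as hdul:
    hdul.info()                        # list the header/data units in the file
    header = hdul[0].header            # metadata: exposure time, filter, pointing, etc.
    data = hdul[0].data.astype(float)  # the image itself as a 2D array of counts

# Display with a percentile cut so faint detail is visible alongside bright stars
lo, hi = np.percentile(data, (1, 99))
plt.imshow(data, cmap="gray", vmin=lo, vmax=hi, origin="lower")
plt.colorbar(label="counts")
plt.show()
```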

Some people may say just handling image data or a telescope over the internet detracts from real astronomy, but it’s how professional astronomers work day in day out, usually just doing data reduction from telescopes located on the other side of the world. Professionals can wait years to get telescope time, and even then rather than actually being a part of the imaging process, will submit imaging runs to observatories, and wait for the data to roll in. (If anyone wants to argue this fact…just say “Try doing eyepiece astronomy with the Hubble”)

The process of using and imaging with a robotic telescope still requires a level of skill and dedication to guarantee a good night of observing, be it for pretty pictures or real science or both.

Location Location Location

The location of a robotic telescope is critical: if you want to image some of the wonders of the Southern Hemisphere, which those of us in the UK or North America will never see from home, you’ll need to pick a suitably located scope. Time of day is also important for access, unless the scope system allows an offline queue management approach, whereby you schedule it to do your observations for you and just wait for the results. Some telescopes use a real-time interface, where you literally control the scope live from your computer, typically through a web browser. So depending on where in the world the scope is, you may be at work, or it may be a very unhealthy hour of the night, before you can access your telescope; it’s worth considering this when you decide which robotic system you wish to be a part of.

Telescopes like the twin Faulkes 2-metre scopes, based atop a mountain on the Hawaiian island of Maui and at Siding Spring, Australia, next to the world-famous Anglo-Australian Observatory, operate during usual school hours in the UK, which means night time in the locations where the scopes live. This is perfect for children in western Europe who wish to use research-grade professional technology from the classroom, though the Faulkes scopes are also used by schools and researchers in Hawaii.

The type of scope and camera you choose will ultimately also determine what it is you image. Some robotic scopes are configured with wide-field, large-format CCDs connected to fast, low focal ratio telescopes. These are perfect for creating large sky vistas encompassing nebulae and larger galaxies like Messier 31 in Andromeda. For imaging competitions like Astronomy Photographer of the Year, these wide-field scopes are perfect for the beautiful skyscapes they can create.

A scope like the Faulkes Telescope North, even though it has a huge 2-metre mirror (almost the same size as the one on the Hubble Space Telescope), is configured for smaller fields of view, literally only around 10 arcminutes, which will nicely fit in objects like Messier 51, the Whirlpool Galaxy, but would take many separate images to cover something like the full Moon (if Faulkes North were set up for that, which it’s not). Its advantage is aperture size and immense CCD sensitivity. Typically our team is able to image a magnitude +23 moving object (comet or asteroid) in under a minute, even using a red filter!

A field of view like that of the twin Faulkes scopes, which are owned and operated by LCOGT, is perfect for smaller deep sky objects and my own interests, which are comets and asteroids. Many other research projects, such as exoplanets and the study of variable stars, are conducted using these telescopes. Many schools start out imaging nebulae, smaller galaxies and globular clusters, with our aim at the Faulkes Telescope Project office being to quickly get students moving on to more science-based work, whilst keeping it fun. For imagers, mosaic approaches are possible to create larger fields, but this obviously takes up more imaging and telescope slew time.

Each robotic system has its own learning curve, and each can suffer from technical or weather-related difficulties, like any complex piece of machinery or electronic system. Knowing a bit about the imaging process to begin with, and sitting in on others’ observing sessions on things like Slooh, all helps. Also make sure you know your target’s coordinates (usually given in right ascension and declination) and its size on the sky; some systems also have a “guided tour mode” with named objects. Be ready to move the scope onto the target as quickly as possible to start imaging. With the commercial robotic scopes, time really is money.

Global Rent-A-Scope interface

Magazines like Astronomy Now in the UK, as well as Astronomy and Sky & Telescope in the United States and Australia, are excellent resources for finding out more, as they regularly feature robotic imaging and scopes in their articles. Online forums like cloudynights.com and stargazerslounge.com also have thousands of active members, many of whom regularly use robotic scopes and can give advice on imaging and use, and there are dedicated groups for robotic astronomy like the Online Astronomical Society. Search engines will also turn up useful information on what is available.

To get access, most of the robotic scopes require a simple sign-up process, after which the user can either have limited free access, usually as an introductory offer, or just start to pay for time. The scopes come in various sizes and qualities of camera; the better they are, usually the more you pay. For education and school users as well as astronomical societies, the Faulkes Telescope (for schools) and the Bradford Robotic Telescope both offer free access, as does the NASA-funded MicroObservatory project. Commercial operators like iTelescope, Slooh and Lightbuckets provide a range of telescopes and imaging options, with a wide variety of price models from casual to research-grade instrumentation and facilities.

So what about my own use of Robotic Telescopes?

Personally I use mainly the Faulkes North and South scopes, as well as the Liverpool Telescope on La Palma. I have worked with the Faulkes Telescope Project team for a few years now, and it’s a real honour to have such access to research-grade instrumentation. Our team also use the iTelescope network when objects are difficult to obtain using the Faulkes or Liverpool scopes, though with smaller apertures we’re more limited in our target choice when it comes to very faint asteroid or comet-type objects.

After having been invited to meetings in an advisory capacity for Faulkes, late in 2011 I was appointed pro-am program manager, co-ordinating projects with amateurs and other research groups. With regards to public outreach, I have presented my work at conferences and outreach events for Faulkes, and we’re about to embark on a new and exciting project with the European Space Agency, for whom I also work as a science writer.

My use of the Faulkes and Liverpool scopes is primarily for comet recovery, measurement (dust/coma photometry, and embarking on spectroscopy) and detection work, those icy solar system interlopers being my key interest. In this area, I co-discovered the splitting of Comet C/2007 Q3 in 2010, and worked closely with the amateur observing program managed by NASA for comet 103P, where my images were featured in National Geographic, The Times and on BBC Television, and were also used by NASA at their press conference for the 103P pre-encounter event at JPL.

The 2-metre mirrors have huge light grasp and can reach very faint magnitudes in very little time. When attempting to find new comets or recover orbits of existing ones, being able to image a moving target at magnitude 23 in under 30 seconds is a real boon. I am also fortunate to work alongside two exceptional people in Italy, Giovanni Sostero and Ernesto Guido; we maintain a blog of our work, and I am part of the CARA research group working on comet coma and dust measurements, with our work appearing in professional research journals such as the Astrophysical Journal Letters and Icarus.

The Imaging Process

When taking the image itself, the process really starts before you have access to the scope. Knowing the field of view and what it is you want to achieve is critical, as is knowing the capabilities of the scope and camera in question and, importantly, whether or not the object you want to image is visible from the location at the time you’ll be using it.

First thing I would do if starting out again is look through the archives of the telescope, which are usually freely available, and see what others have imaged, how they have imaged in terms of filters, exposure times etc, and then match that against your own targets.

Ideally, given that in many cases time will be costly, make sure that your choice of scope and camera will in fact image what you want it to, and that if you’re aiming for a faint deep sky object with tenuous nebulosity, you don’t pick a night with a bright Moon in the sky; even with narrowband filters this can hamper the final image quality. Remember that others may also want to use the same telescopes, so plan ahead and book early. When the Moon is bright, many of the commercial robotic scope vendors offer discounted rates, which is great if you’re imaging something like globular clusters, which aren’t as affected by the moonlight (as, say, a nebula would be).

Forward planning is usually essential: knowing that your object is visible and not too close to any horizon limits which the scope may impose, and ideally picking objects as high up as possible, or rising, to give you plenty of imaging time. Once that’s all done, the scope’s imaging process depends on which one you choose, but with something like Faulkes it’s as simple as selecting the target and field of view, slewing the scope, setting the filter and exposure time, and then waiting for the image to come in.
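For those who like to script the planning step, here is a rough sketch of the kind of visibility and Moon check described above, using the astropy library; the site coordinates, target and date are hypothetical, chosen purely for illustration:

```python
# A rough planning check: is the target well above the horizon, and how far is it from the Moon?
# Site, target and time are hypothetical, for illustration only.
import astropy.units as u
from astropy.coordinates import SkyCoord, EarthLocation, AltAz, get_body
from astropy.time import Time

site = EarthLocation(lat=20.7 * u.deg, lon=-156.3 * u.deg, height=3050 * u.m)
time = Time("2013-07-15 10:00:00")      # UTC
target = SkyCoord.from_name("M51")      # resolves the name online; or give RA/Dec directly

altaz = target.transform_to(AltAz(obstime=time, location=site))
moon = get_body("moon", time, location=site)

print(f"Target altitude: {altaz.alt.deg:.1f} deg")
print(f"Moon separation: {moon.separation(target).deg:.1f} deg")
```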

The number of shots taken depends on the time you have. Usually when imaging a comet using Faulkes I will try to take between 10 and 15 images to detect the motion and give me enough good signal for the scientific data reduction which follows. Always remember, though, that you’re usually working with vastly superior equipment than you have at home, and the time it takes to image an object will be far less with a 2-metre telescope than with your home setup. A good example is that a full-colour, high-resolution, narrowband image of something like the Eagle Nebula can be obtained in a matter of minutes on Faulkes, something which would usually take hours on a typical backyard telescope.

For imaging a non-moving target, the more shots in full colour or with your chosen filter (Hydrogen-alpha being a commonly used one with Faulkes for nebulae) you can get, the better. When imaging in colour, the three filters on the telescope itself are grouped into an RGB set, so you don’t need to set up each colour band. I’d usually add a luminance layer from H-alpha if it’s an emission nebula, or maybe a few more red images for luminance if it’s not. Once the imaging run is complete, the data is usually placed on a server for you to collect, and then after downloading the FITS files you combine the images using Maxim (or other suitable software) and then move on to something like Photoshop to make the final colour image. The more images you take, the better the quality of the signal against the background noise, and hence a smoother and more polished final shot.
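To illustrate what that combine step is doing under the hood, here is a minimal Python sketch, assuming the numpy and astropy packages and hypothetical file names, that median-combines a set of calibrated sub-frames taken through one filter:

```python
# Median-combine a set of calibrated sub-frames for one filter.
# Assumes astropy and numpy; the file pattern and names are hypothetical.
import glob
import numpy as np
from astropy.io import fits

files = sorted(glob.glob("m51_halpha_*.fits"))
frames = [fits.getdata(f).astype(float) for f in files]

# A median combine rejects outliers such as satellite trails and cosmic-ray hits;
# a straight mean gives slightly better signal-to-noise if the frames are clean.
stacked = np.median(np.stack(frames), axis=0)

fits.writeto("m51_halpha_stacked.fits", stacked.astype(np.float32), overwrite=True)
```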

Between shots the only thing that will usually change will be the filter, unless you are tracking a moving target, and possibly the exposure time, as some filters take less time to gather the requisite amount of light. For example, with an H-alpha/OIII/SII image you typically expose for a lot longer in SII, as the emission from many objects is weaker in this band, whereas many deep sky nebulae emit strongly in H-alpha.

The Image Itself

NGC 6302 taken by Thomas Mills High School with the Faulkes Telescope

As with any imaging of deep sky objects, don’t be afraid to throw away poor-quality sub-frames (the shorter exposures which, when stacked, go to make up the final long exposure). These could be affected by cloud, satellite trails or any number of factors, such as the autoguider on the telescope not working correctly. Keep the good shots, and use those to get as good a raw stacked data frame as you can. Then it’s all down to post-processing tools in products like Maxim, Photoshop or GIMP, where you’d adjust the colours, levels and curves, and possibly use plug-ins to sharpen the focus or reduce noise. If it’s pure science you’re interested in, you’ll probably skip most of those steps and just want good, calibrated image data (bias and dark subtracted, flat field corrected).
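If you do want to run the calibration yourself rather than rely on the observatory’s pipeline, the standard CCD reduction boils down to a couple of array operations. A minimal sketch, again assuming numpy and astropy and hypothetical master calibration frames:

```python
# Basic CCD calibration: subtract bias and dark, divide by a normalised flat field.
# Assumes astropy and numpy; the master frames are hypothetical (many robotic
# scopes apply this step for you and deliver already-calibrated data).
import numpy as np
from astropy.io import fits

raw  = fits.getdata("target_raw.fits").astype(float)
bias = fits.getdata("master_bias.fits").astype(float)
dark = fits.getdata("master_dark.fits").astype(float)   # bias-subtracted, scaled to the raw exposure time
flat = fits.getdata("master_flat.fits").astype(float)

flat_norm = (flat - bias) / np.median(flat - bias)       # corrects vignetting and pixel-to-pixel response
calibrated = (raw - bias - dark) / flat_norm

fits.writeto("target_calibrated.fits", calibrated.astype(np.float32), overwrite=True)
```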

The processing side is very important when taking shots for aesthetic value. It seems obvious, but many people can overdo the image processing, lessening the impact and/or value of the original data. Usually most amateur imagers spend more time on processing than on actual imaging, though this does vary; it can be anywhere from hours to literally days of tweaks. Typically, when processing an image taken robotically, the dark and flat field calibration have already been done. The first thing I do is access the datasets as FITS files and bring those into Maxim DL. Here I will combine the frames and adjust the histogram on the image, possibly running multiple iterations of a deconvolution algorithm if the star points are not as tight as they should be (maybe due to seeing issues that night).

Once the images are tightened up and then stretched, I will save them out as FITS files and, using the free FITS Liberator application, bring them into Photoshop. Here, additional noise reduction and contrast, level and curve adjustments will be made on each channel; running a set of actions known as Noel’s actions (a suite of superb actions by Noel Carboni, one of the world’s foremost imaging experts) can also enhance the final individual red, green and blue channels (and the combined colour one).
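The stretch itself need not be done in Maxim or Photoshop. As a rough illustration, here is a small Python sketch with an asinh stretch standing in for the histogram and curves adjustments described above (file names are hypothetical):

```python
# A simple non-linear (asinh) stretch, standing in for the histogram/curves
# adjustments described in the text. Assumes numpy and astropy; file names are hypothetical.
import numpy as np
from astropy.io import fits

data = fits.getdata("m51_halpha_stacked.fits").astype(float)

# Clip to a percentile range, scale to 0..1, then stretch so faint nebulosity
# becomes visible without burning out the bright core.
lo, hi = np.percentile(data, (0.5, 99.9))
scaled = np.clip((data - lo) / (hi - lo), 0.0, 1.0)
stretched = np.arcsinh(10.0 * scaled) / np.arcsinh(10.0)

fits.writeto("m51_halpha_stretched.fits", stretched.astype(np.float32), overwrite=True)
```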

Then I will composite the images using layers into a final colour shot, adjusting it for colour balance and contrast, and possibly running a focus-enhancement plug-in and further noise reduction. Finally I publish the results via Flickr, Facebook or Twitter and/or submit them to magazines, journals or scientific research papers, depending on the final aim and goals.

Serendipity can be a wonderful thing

I got into this quite by accident myself. In March 2010, I saw a posting on a newsgroup that Comet C/2007 Q3, a magnitude 12–14 object at the time, was passing near a galaxy and would make an interesting wide-field side-by-side shot. That weekend, using my own observatory, I imaged the comet over several nights, and noticed a distinct change in the tail and brightness of the comet over two nights in particular.

Comet C/2007 Q3. Credit: Nick Howes

A member of the BAA (British Astronomical Association), seeing my images, then asked if I would submit them for publication. I decided, however, to investigate this brightening a bit further, and as I had access to the Faulkes that week, decided to point the 2-metre scope at this comet to see if anything unusual was taking place. The first images came in, and after loading them into Maxim DL and adjusting the histogram, I immediately noticed that a small fuzzy blob appeared to be tracking the comet’s movement just behind it. I measured the separation as only a few arcseconds, and after staring at it for a few minutes, decided that the comet may have fragmented.

I contacted Faulkes Telescope control, who put me in touch with the BAA comet section director, who kindly logged this observation the same day. I then contacted Astronomy Now magazine, who leapt on the story and images and immediately went to press with it on their website. Over the following days the media furore was quite literally incredible.

Interviews with national newspapers and BBC Radio, plus coverage on the BBC’s Sky at Night television show, the Discovery Channel, and radio in Hawaii and Ethiopia, were just a few of the ways the story was picked up. The news went global that an amateur had made a major astronomical discovery from his desk using a robotic scope. This then led on to me working with members of the AOP project and the NASA/University of Maryland EPOXI mission team on imaging and obtaining light curve data for comet 103P late in 2010, which again led to articles and images in National Geographic and The Times, and even to my images being used by NASA in their press briefings, alongside images from the Hubble Space Telescope. Subscription requests to the Faulkes Telescope Project as a result of my discoveries went up by hundreds of percent, from all over the world.

In summary

Robotic telescopes can be fun, and they can lead to amazing things. This past year, a work experience student I mentored with the Faulkes Telescope Project imaged several fields we’d assigned to her, in which our team then found dozens of new, un-catalogued asteroids, and she also managed to image a comet fragmenting. Taking pretty pictures is fun, but the buzz for me comes with the real scientific research I am now engaged in, and it’s a pathway I aim to stay on, probably for the rest of my astronomical lifetime. For students and people who don’t have the ability to own a telescope due to financial or location constraints, it’s a fantastic way to do real astronomy, using real equipment, and I hope, in reading this, you’re encouraged to give these fantastic robotic telescopes a try.