Want to skip the wall of text? Try our intro video.

MoonDiff Background

MoonDiff serves up "before-and-after" pictures of the Moon taken from two orbiting spacecraft: one that visited the Moon in 1966-67, and one that arrived in 2009 and is still flying. Volunteers are invited to compare those pictures to discover the differences. Their discoveries teach us about the Moon's dynamic surface, and contribute to lunar exploration.

The big idea

There’s a lot happening at the surface of Earth’s Moon. Space rocks hit and blast new craters. Rocks break off and roll down hills. Spaceships land. Spaceships crash. Planetary scientists are still working on important questions, like:

  • What’s the meteorite impact cratering rate? How much space rock hits the Moon per unit time?
  • Where are the missing spacecraft crash sites? On this table of landing sites and crash sites, we’re still looking for all of the things in the red rows.
  • What kind of geologic changes are happening right now on the lunar surface?

One way to keep track of changes on the Moon is to compare images collected by the various cameras that have orbited it since the first orbiter arrived in 1966. (Another way is to watch, from Earth, for impact flashes, an approach with obvious limitations.) From 2009 until today (February 2023), the Lunar Reconnaissance Orbiter (LRO) has continuously delivered amazing imagery, with resolution as high as half a meter ground sampling distance. To create before-and-after imagery spanning many years, the LRO team has been able to automatically compare certain images taken with similar lighting from similar perspectives, and has found 222 impact craters that appeared during the 14 years that LRO has flown.

We can look back in time much further than 2009, but it’ll take some elbow grease, and that’s where MoonDiff comes in. We have images covering 99% of the Moon from the five 1966-1967 Lunar Orbiter (LO, not to be confused with LRO) missions. LO photographed selected areas in high resolution, as high as 2 m ground sampling distance. A far cry from LRO’s digital cameras, the LO cameras exposed film, developed it onboard, and then used an analog scanner to read it and transmit it back to Earth, where the radio signals were recorded on analog magnetic tape. Starting in 2008, a coalition of volunteer enthusiasts and private companies called the Lunar Orbiter Image Recovery Project (LOIRP) worked with NASA to digitize the images, rescuing them from their degrading magnetic tapes.

The LRO team was able to use algorithms to automatically compare their before-and-after images, but that won’t work with LO images. The LRO-LRO pairs share geometry and lighting, and benefit from 2009-era camera and avionics technology. Even given all that, they still had humans go through the automatic detections to remove false positives and classify the detected changes.

So, we need brains 🧟. MoonDiff seeks to compare the 60s-era LO images to recent LRO images. This means comparing images taken from different angles and in different lighting. Additionally, the LO images have relatively poor spatial control. Doing change detection between image pairs like that is beyond the capabilities of today’s cleverest software. So, MoonDiff wields the best available tool for the job: the human brain’s vision system.

Image preparation

The MoonDiff team of scientists and programmers has to do quite a bit of prep work on the image pairs before they’re ready for the community of MoonDiffers to do the comparison work. To begin with, we focused on the areas where the highest-resolution Lunar Orbiter data is available, shown in red here:

Footprints of the high-resolution Lunar Orbiter Images shown on top of a global mosaic of the moon.

We’ve been preparing images from these clusters of high-res LO images one cluster at a time. Although we’ve been iterating on these methods, here’s how it worked for our initial image set (34 pairs), which we’re calling MoonDiffImgSet1:

  1. Select pairs. We choose one cluster of the high-resolution LO images to work on at a time (between 9 and 16 LO images). Using image footprints and metadata from the Orbital Data Explorer and some Python code (not yet in the MoonDiff repository, but it will be soon), we find all of the LRO Narrow Angle Camera images that overlap the LO images. We then rank them using a “goodness metric” we created, which combines three factors:
    1. The size of the overlap area
    2. The difference in phase angle
    3. The difference in solar incidence angle

Finally, one of us manually chooses a set of LRO-LO pairs from that list that together cover the LO cluster area at least once. For each pair, we produce a finder plot like the one below:

A diagram of Lunar Orbiter footprints and corresponding Lunar Reconnaissance Orbiter footprints, showing overlap.

This image shows a cluster of LO 24” camera image footprints in light red, with LRO NAC image footprints in light blue. The particular pair in question (LO image 5117_HIGH_RES_2 with LRO NAC image M1114191668LE) is shown in bolded red and blue.
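To make the ranking concrete, here is a minimal Python sketch of how a metric combining those three factors could look. The field names and weights are hypothetical illustrations; the actual metric lives in the not-yet-published MoonDiff code mentioned above.

```python
# Hypothetical sketch of a pair-ranking "goodness metric": bigger overlap
# is better, and smaller differences in phase angle and solar incidence
# angle (i.e., more similar viewing and lighting geometry) are better.
# Field names and weights are illustrative, not the actual MoonDiff values.

def goodness(pair, w_overlap=1.0, w_phase=1.0, w_incidence=1.0):
    """Score one candidate LO/LRO image pair (higher is better)."""
    overlap_km2 = pair["overlap_km2"]
    d_phase = abs(pair["lo_phase_deg"] - pair["lro_phase_deg"])
    d_incidence = abs(pair["lo_incidence_deg"] - pair["lro_incidence_deg"])
    # Reward overlap area; penalize geometry/lighting mismatch.
    return (w_overlap * overlap_km2
            - w_phase * d_phase
            - w_incidence * d_incidence)

def rank_pairs(pairs):
    """Return the candidate pairs sorted best-first by the metric."""
    return sorted(pairs, key=goodness, reverse=True)
```

From a list ranked this way, a human still makes the final selection, as described above.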

  2. Retrieve and project raw images. Once we know which pairs we want to work with, we download Engineering Data Record (EDR) image files from the Planetary Data System Geosciences Node for LO and from the LROC PDS archive for LRO, and use USGS ISIS to turn them into map-projected geospatial images. We project everything to IAU2000:30110 with 1m pixel spacing. LRO images are projected onto a lunar elevation model based on stereo reconstructions from the LRO Wide Angle Camera. Due to position uncertainty, we do not use an elevation model for LO images; for that projection we model the Moon as a sphere.
  3. Coregister the images. At this point, we have pairs of map-projected imagery, but they don’t line up. Typically, if you locate a feature in the LRO image, the same feature will appear several kilometers away in the corresponding LO image. We try to fix this:
    1. Using ISIS qview, a MoonDiff committee member manually picks two pairs of corresponding points. Using those points, we translate and scale the LO image to roughly match the LRO image.
    2. Using algorithms in the Planetary Orbital Mosaicking and Mapping Toolset (POMM), we find corresponding features within the image pair, and warp the LO image to match the LRO image. This attempts to remove internal distortions within the LO image, both from imprecision in the spacecraft positioning and from the reconstruction process in which strips of tape were combined by hand. We roughly assess the quality of the coregistration with a root-mean-square pixel offset value, which is greater than 10 pixels for some difficult pairs but less than 1 pixel for most.
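As a rough illustration of the first coregistration step and the quality check, here is a Python sketch: estimating a uniform scale and translation from two hand-picked tie points, then computing an RMS pixel offset over control points. The real work is done with ISIS and POMM; the function names here are ours, and the real warp is far more sophisticated than scale-plus-translation.

```python
import math

def fit_scale_translation(lo_pts, lro_pts):
    """From two corresponding point pairs [(x, y), (x, y)], estimate a
    uniform scale s and translation (tx, ty) so that lro ~= s * lo + t.
    (No rotation or shear terms in this simple sketch.)"""
    (x1, y1), (x2, y2) = lo_pts
    (X1, Y1), (X2, Y2) = lro_pts
    # Scale: ratio of the distance between the two points in each image.
    s = math.hypot(X2 - X1, Y2 - Y1) / math.hypot(x2 - x1, y2 - y1)
    # Translation: map the first LO point exactly onto its LRO match.
    tx, ty = X1 - s * x1, Y1 - s * y1
    return s, tx, ty

def rms_offset(lo_pts, lro_pts, s, tx, ty):
    """Root-mean-square residual (in pixels) after applying the transform,
    the same flavor of number used above to judge coregistration quality."""
    sq = [(s * x + tx - X) ** 2 + (s * y + ty - Y) ** 2
          for (x, y), (X, Y) in zip(lo_pts, lro_pts)]
    return math.sqrt(sum(sq) / len(sq))
```

In practice the RMS is computed over many matched features, and values under a pixel indicate a good fit, while some difficult pairs stay above 10 pixels.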

Change detection and review

Here’s where the public gets involved. Using the MoonDiff Comparer web app, people from around the world visually review our coregistered “before and after” image pairs. They can compare the images side-by-side, or by blinking or fading between the images. Our moon sleuths draw a polygon around areas where they see that something has changed between the two images.

Lunar experts review each identified change using the MoonDiff Reviewer web app. Discoveries will be written up as scientific journal papers. Anyone involved in the process will be credited, either as an author, or in acknowledgements.

Source code, issue tracker, and forum

The MoonDiff web app is developed in the open here. You can get involved by submitting an issue with a bug report or feature request. If you'd like to implement a feature or fix a bug yourself, please submit a pull request.

We also have a community forum for discussing MoonDiff and related topics. Additionally, you can submit comments on any image pair from within the MoonDiff comparer.

MoonDiff committee

These are the people who manage and develop MoonDiff:
  • Heather Lethcoe

    Portrait photo of Heather Lethcoe
    I work at JPL as a data engineer and GIS analyst. Over the years I have learned image coregistration techniques that I have applied to many different datasets for several of the bodies in our solar system. With this experience I was able to help the MoonDiff team coregister the older Lunar Orbiter images and the newer Lunar Reconnaissance Orbiter images to each other, so that we can compare any differences we might observe between the two sets of imagery.
  • Emily Law

    Portrait photo of Emily Law
    Emily serves as the Solar System Treks Project (SSTP) manager and a key staff member of the Chief Data and Analytics Office at JPL. SSTP offers a suite of web-based Trek portals enabling users to explore a growing number of planetary bodies. One of the notable Trek portals is Moon Trek, which serves as the parent project of MoonDiff.
  • Brian Day

    Portrait photo of Brian Day
    Brian sets up and manages MoonDiff's many collaborations with educational and volunteer groups. He is the staff scientist at NASA’s Solar System Exploration Research Virtual Institute (SSERVI). Part of that role includes serving as the science lead and the institute-level project manager for the NASA Solar System Treks Project. He previously served as E/PO and Citizen Science Lead for the LCROSS and LADEE lunar missions.
  • Jim Green

    Portrait photo of Jim Green
    Jim has an advisory role on the MoonDiff Committee. He worked at NASA for 42 years before retiring in December 2022. He received his Ph.D. in Physics from the University of Iowa in 1979 and worked at Marshall Space Flight Center, Goddard Space Flight Center, and NASA Headquarters. During his long career at NASA, Jim served as NASA’s Chief Scientist and was the longest-serving director of the Planetary Science Division, with overall programmatic responsibility for the New Horizons spacecraft flyby of Pluto, the Juno mission to Jupiter, and the landing of the Curiosity rover on Mars, to name a few. Jim has received the NASA Exceptional Achievement Medal for the New Horizons flyby of the Pluto system and NASA’s highest honor, the Distinguished Service Medal. He has written over 125 scientific articles in refereed journals and over 80 technical and popular articles.
  • Richard Kim

    Portrait photo of Richard Kim
    Richard Kim is a senior Science Application and Data Interaction Engineer at the NASA Jet Propulsion Laboratory (JPL). In his 19 years at JPL, he has played a pivotal role in developing a number of GIS applications for NASA projects including the Solar System Treks, the Lunar Mapping and Modeling Portal, and the Physical Oceanography Distributed Active Archive Center. He also leads software development for the NASA Deep Space Network Complex Event Processing project.
  • Dave Williams

    Portrait photo of Dave Williams
    Dave Williams works with the Planetary Data System to make sure MoonDiff discoveries and coregistrations are stored for the future. He is the acting head of the NASA Space Science Data Coordinated Archive, one of NASA’s deep archives for spacecraft data. It holds NASA’s largest collection of digital and hard copy lunar data. He specializes in restoration and archiving of lunar and planetary data, and particularly restoration of the data returned from the Apollo missions.
  • Aaron Curtis

    Portrait photo of Aaron Curtis
    Aaron leads MoonDiff and does most of the web app development. He completed a PhD studying the geochemistry and geophysics of ice caves on Mt Erebus, joined JPL to build the world's first ice climbing robot, and now operates the Curiosity rover and Ingenuity helicopter. He's fascinated by lunar imagery and geology.
  • Others

    [Other committee members will be added soon]

Citizen science

MoonDiff is made possible by NASA’s Citizen Science Seed Funding Program (proposal 21-CSSFP21-0016).

NASA funds tons of other great citizen science projects. Check them out here.