More stories

  • MIT researchers remotely map crops, field by field

    Crop maps help scientists and policymakers track global food supplies and estimate how they might shift with climate change and growing populations. But getting accurate maps of the types of crops that are grown from farm to farm often requires on-the-ground surveys that only a handful of countries have the resources to maintain.

    Now, MIT engineers have developed a method to quickly and accurately label and map crop types without requiring in-person assessments of every single farm. The team’s method uses a combination of Google Street View images, machine learning, and satellite data to automatically determine the crops grown throughout a region, from one fraction of an acre to the next. 

    The researchers used the technique to automatically generate the first nationwide crop map of Thailand — a smallholder country where small, independent farms make up the predominant form of agriculture. The team created a border-to-border map of Thailand’s four major crops — rice, cassava, sugarcane, and maize — and determined which of the four types was grown, at every 10 meters, and without gaps, across the entire country. The resulting map achieved an accuracy of 93 percent, which the researchers say is comparable to on-the-ground mapping efforts in high-income, big-farm countries.

    The team is now applying its mapping technique to other countries, such as India, where small farms sustain most of the population but the types of crops grown from farm to farm have historically been poorly recorded.

    “It’s a longstanding gap in knowledge about what is grown around the world,” says Sherrie Wang, the d’Arbeloff Career Development Assistant Professor in MIT’s Department of Mechanical Engineering, and the Institute for Data, Systems, and Society (IDSS). “The final goal is to understand agricultural outcomes like yield, and how to farm more sustainably. One of the key preliminary steps is to map what is even being grown — the more granularly you can map, the more questions you can answer.”

    Wang, along with MIT graduate student Jordi Laguarta Soler and Thomas Friedel of the agtech company PEAT GmbH, will present a paper detailing their mapping method later this month at the AAAI Conference on Artificial Intelligence.

    Ground truth

    Smallholder farms are often run by a single family or farmer who subsists on the crops and livestock they raise. It’s estimated that smallholder farms support two-thirds of the world’s rural population and produce 80 percent of the world’s food. Keeping tabs on what is grown and where is essential to tracking and forecasting food supplies around the world. But the majority of these small farms are in low- to middle-income countries, where few resources are devoted to keeping track of individual farms’ crop types and yields.

    Crop mapping efforts are mainly carried out in high-income regions such as the United States and Europe, where government agricultural agencies oversee crop surveys and send assessors to farms to label crops from field to field. These “ground truth” labels are then fed into machine-learning models that learn the connections between the ground labels of actual crops and satellite signals of the same fields. The models then label and map wider swaths of farmland that assessors don’t cover but that satellites do.

    “What’s lacking in low- and middle-income countries is this ground label that we can associate with satellite signals,” Laguarta Soler says. “Getting these ground truths to train a model in the first place has been limited in most of the world.”

    The team realized that, while many developing countries do not have the resources to maintain crop surveys, they could potentially use another source of ground data: roadside imagery, captured by services such as Google Street View and Mapillary, which send cars throughout a region to take continuous 360-degree images with dashcams and rooftop cameras.

    In recent years, such services have begun to cover low- and middle-income countries. While the goal of these services is not specifically to capture images of crops, the MIT team saw that they could search the roadside imagery to identify crops.

    Cropped image

    In their new study, the researchers worked with Google Street View (GSV) images taken throughout Thailand — a country that the service has recently imaged fairly thoroughly, and which consists predominantly of smallholder farms.

    Starting with over 200,000 GSV images randomly sampled across Thailand, the team filtered out images that depicted buildings, trees, and general vegetation; about 81,000 of the images were crop-related. They set aside 2,000 of these and sent them to an agronomist, who determined and labeled each crop type by eye. They then trained a convolutional neural network to automatically generate crop labels for the other 79,000 images, using various training methods, including iNaturalist, a web-based crowdsourced biodiversity database, and GPT-4V, a “multimodal large language model” that enables a user to input an image and ask the model to identify what the image is depicting. For each of the 81,000 images, the model generated a label of one of the four crops the image was likely depicting: rice, maize, sugarcane, or cassava.
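
    The article leaves the training details out, but the core classification step can be pictured with a minimal sketch: fine-tuning a stock convolutional network on expert-labeled roadside images to predict one of the four crop classes. The model choice (ResNet-18), directory layout, and hyperparameters below are illustrative assumptions, and the sketch omits the iNaturalist and GPT-4V components described above.

    ```python
    # Minimal sketch (not the authors' code): fine-tune a standard CNN to label
    # roadside images as rice, maize, sugarcane, or cassava. The model, paths,
    # and hyperparameters are illustrative assumptions.
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, models, transforms

    CLASSES = ["rice", "maize", "sugarcane", "cassava"]

    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    # Assumes images are stored as street_view/train/<class_name>/<image>.jpg
    train_set = datasets.ImageFolder("street_view/train", transform=preprocess)
    loader = DataLoader(train_set, batch_size=32, shuffle=True)

    model = models.resnet18(weights="IMAGENET1K_V1")
    model.fc = nn.Linear(model.fc.in_features, len(CLASSES))  # 4-way output head

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()

    model.train()
    for epoch in range(5):
        for images, labels in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()
    ```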

    The researchers then paired each labeled image with the corresponding satellite data taken of the same location throughout a single growing season. These satellite data include measurements across multiple wavelengths, such as a location’s greenness and its reflectivity (which can be a sign of water). 

    “Each type of crop has a certain signature across these different bands, which changes throughout a growing season,” Laguarta Soler notes.

    The team trained a second model to make associations between a location’s satellite data and its corresponding crop label. They then used this model to process satellite data taken of the rest of the country, where crop labels were not generated or available. From the associations that the model learned, it then assigned crop labels across Thailand, generating a country-wide map of crop types, at a resolution of 10 square meters.
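
    The article does not name this second model, so the sketch below uses a random forest purely as a stand-in: each training sample is a location’s multi-band satellite time series flattened into a feature vector, the target is the crop label inferred from the street-level imagery, and the fitted model is then applied to the remaining pixels. The array shapes and placeholder data are assumptions for illustration only.

    ```python
    # Illustrative sketch only: the article does not name the second model, so a
    # random forest stands in. Features are a location's satellite band values
    # sampled across a growing season; labels come from the street-view model.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    n_dates, n_bands = 24, 6                   # assumed time steps and bands

    # Placeholder training data; the real pipeline would use ~81,000 labeled
    # locations, with labels produced by the roadside-image classifier.
    X_train = rng.random((5_000, n_dates * n_bands))
    y_train = rng.integers(0, 4, size=5_000)   # 0=rice, 1=maize, 2=sugarcane, 3=cassava

    clf = RandomForestClassifier(n_estimators=100, n_jobs=-1)
    clf.fit(X_train, y_train)

    # Apply the fitted model to the satellite time series of every unlabeled
    # 10 m pixel to assemble the nationwide crop map.
    X_pixels = rng.random((100_000, n_dates * n_bands))
    crop_map = clf.predict(X_pixels)
    ```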

    This first-of-its-kind crop map included locations corresponding to the 2,000 GSV images that the researchers had originally set aside and that had been labeled by the agronomist. These human-labeled images were used to validate the map’s labels: when the team checked whether the map’s labels matched these expert, “gold standard” labels, they agreed 93 percent of the time.

    “In the U.S., we’re also looking at over 90 percent accuracy, whereas with previous work in India, we’ve only seen 75 percent because ground labels are limited,” Wang says. “Now we can create these labels in a cheap and automated way.”

    The researchers are moving to map crops across India, where roadside images via Google Street View and other services have recently become available.

    “There are over 150 million smallholder farmers in India,” Wang says. “India is covered in agriculture, almost wall-to-wall farms, but very small farms, and historically it’s been very difficult to create maps of India because there are very sparse ground labels.”

    The team is working to generate crop maps in India, which could be used to inform policies having to do with assessing and bolstering yields, as global temperatures and populations rise.

    “What would be interesting would be to create these maps over time,” Wang says. “Then you could start to see trends, and we can try to relate those things to anything like changes in climate and policies.”

  • Communications system achieves fastest laser link from space yet

    In May 2022, the TeraByte InfraRed Delivery (TBIRD) payload onboard a small CubeSat satellite was launched into orbit 300 miles above Earth’s surface. Since then, TBIRD has delivered terabytes of data at record-breaking rates of up to 100 gigabits per second — 100 times faster than the fastest internet speeds in most cities — via an optical communication link to a ground-based receiver in California. This data rate is more than 1,000 times higher than that of the radio-frequency links traditionally used for satellite communication and the highest ever achieved by a laser link from space to ground. And these record-setting speeds were all made possible by a communications payload roughly the size of a tissue box.

    MIT Lincoln Laboratory conceptualized the TBIRD mission in 2014 as a means of providing unprecedented capability to science missions at low cost. Science instruments in space today routinely generate more data than can be returned to Earth over typical space-to-ground communications links. With small, low-cost space and ground terminals, TBIRD can enable scientists from around the world to fully take advantage of laser communications to downlink all the data they could ever dream of.

    Designed and built at Lincoln Laboratory, the TBIRD communications payload was integrated onto a CubeSat manufactured by Terran Orbital as part of NASA’s Pathfinder Technology Demonstrator program. NASA Ames Research Center established this program to develop a CubeSat bus (the “vehicle” that powers and steers the payload) for bringing science and technology demonstrators into orbit more quickly and inexpensively. Weighing approximately 25 pounds and about the size of two stacked cereal boxes, the CubeSat was launched into low-Earth orbit (LEO) aboard SpaceX’s Transporter-5 rideshare mission from Cape Canaveral Space Force Station in Florida in May 2022. The optical ground station is located in Table Mountain, California, where most weather takes place below the mountain’s summit, making this part of the sky relatively clear for laser communication. This ground station leverages the one-meter telescope and adaptive optics (to correct for distortions caused by atmospheric turbulence) at the NASA Jet Propulsion Laboratory Optical Communications Telescope Laboratory, with Lincoln Laboratory providing the TBIRD-specific ground communications hardware.

    “We’ve demonstrated a higher data rate than ever before in a smaller package than ever before,” says Jade Wang, the laboratory’s program manager for the TBIRD payload and ground communications and assistant leader of the Optical and Quantum Communications Technology Group. “While sending data from space using lasers may sound futuristic, the same technical concept is behind the fiber-optic internet we use every day. The difference is that the laser transmissions are taking place in the open atmosphere, rather than in contained fibers.”

    From radio waves to laser light

    Whether video conferencing, gaming, or streaming movies in high definition, you are using high-data-rate links that run across optical fibers made of glass (or sometimes plastic). About the diameter of a strand of human hair, these fibers are bundled into cables, which transmit data via fast-traveling pulses of light from a laser or other source. Fiber-optic communications are paramount to the internet age, in which large amounts of data must be quickly and reliably distributed across the globe every day.

    For satellites, however, a high-speed internet based on laser communications does not yet exist. Since the beginning of spaceflight in the 1950s, missions have relied on radio frequencies to send data to and from space. Compared to radio waves, the infrared light employed in laser communications has a much higher frequency (or shorter wavelength), which allows more data to be packed into each transmission. Laser communications will enable scientists to send 100 to 1,000 times more data than today’s radio-frequency systems — akin to our terrestrial switch from dial-up to high-speed internet.
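
    To make the frequency gap concrete, assume a typical telecom-band laser near 1550 nm (a common choice for fiber-derived hardware; the article does not give TBIRD’s wavelength):

    $$ f = \frac{c}{\lambda} = \frac{3 \times 10^{8}\ \mathrm{m/s}}{1.55 \times 10^{-6}\ \mathrm{m}} \approx 1.9 \times 10^{14}\ \mathrm{Hz} \approx 190\ \mathrm{THz} $$

    versus roughly $3 \times 10^{10}$ Hz (30 GHz) for a Ka-band radio link: a carrier several thousand times higher in frequency, with correspondingly more bandwidth available to carry data.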

    From Earth observation to space exploration, many science missions will benefit from this speedup, especially as instrument capabilities advance to capture larger troves of high-resolution data, experiments involve more remote control, and spacecraft voyage further from Earth into deep space.  

    However, laser-based space communication comes with several engineering challenges. Unlike radio waves, laser light forms a narrow beam. For successful data transmission, this narrow beam must be pointed precisely toward a receiver (e.g., telescope) located on the ground. And though laser light can travel long distances in space, laser beams can be distorted because of atmospheric effects and weather conditions. This distortion causes the beam to experience power loss, which can result in data loss.

    For the past 40 years, Lincoln Laboratory has been tackling these and related challenges through various programs. At this point, these challenges have been reliably solved, and laser communications is rapidly becoming widely adopted. Industry has begun deploying LEO cross-links based on laser communications, with the intent to enhance the existing terrestrial backbone, as well as to provide a potential internet backbone to serve users in rural locations. Last year, NASA launched the Laser Communications Relay Demonstration (LCRD), a two-way optical communications system based on a laboratory design. In upcoming missions, a laboratory-developed laser communications terminal will be launched to the International Space Station, where the terminal will “talk” to LCRD, and will support Artemis II, a crewed mission that will fly by the moon in advance of a future crewed lunar landing.

    “With the expanding interest and development in space-based laser communications, Lincoln Laboratory continues to push the envelope of what is possible,” says Wang. “TBIRD heralds a new approach with the potential to further increase data rate capabilities; shrink size, weight, and power; and reduce lasercom mission costs.”

    One way that TBIRD aims to reduce these costs is by utilizing commercial off-the-shelf components originally developed for terrestrial fiber-optic networks. However, terrestrial components are not designed to survive the rigors of space, and their operation can be impacted by atmospheric effects. With TBIRD, the laboratory developed solutions to both challenges.

    Commercial components adapted for space

    The TBIRD payload integrates three key commercial off-the-shelf components: a high-rate optical modem, a large high-speed storage drive, and an optical signal amplifier.

    All these hardware components underwent shock and vibration, thermal-vacuum, and radiation testing to inform how the hardware might fare in space, where it would be subject to powerful forces, extreme temperatures, and high radiation levels. When the team first tested the amplifier through a thermal test simulating the space environment, the fibers melted. As Wang explains, in vacuum, no atmosphere exists, so heat gets trapped and cannot be released by convection. The team worked with the vendor to modify the amplifier to release heat through conduction instead.

    To deal with data loss from atmospheric effects, the laboratory developed its own version of Automatic Repeat Request (ARQ), a protocol for controlling errors in data transmission over a communications link. With ARQ, the receiver (in this case, the ground terminal) alerts the sender (satellite) through a low-rate uplink signal to re-transmit any block of data (frame) that has been lost or damaged.

    “If the signal drops out, data can be re-transmitted, but if done inefficiently — meaning you spend all your time sending repeat data instead of new data — you can lose a lot of throughput,” explains TBIRD system engineer Curt Schieler, a technical staff member in Wang’s group. “With our ARQ protocol, the receiver tells the payload which frames it received correctly, so the payload knows which ones to re-transmit.”
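
    As a toy illustration of the idea (this is not Lincoln Laboratory’s protocol; the frame numbering, report format, and sizes are invented), the ground side reports which frame numbers arrived intact over the low-rate uplink, and the payload re-queues only the missing ones:

    ```python
    # Toy selective-repeat ARQ sketch (illustrative only, not the TBIRD protocol).
    # The satellite keeps every frame buffered until the ground acknowledges it.

    def ground_report(received_ok: set[int], highest_sent: int) -> set[int]:
        """Ground terminal: report which frame numbers up to highest_sent are missing."""
        return {n for n in range(highest_sent + 1) if n not in received_ok}

    def satellite_retransmit(buffer: dict[int, bytes], missing: set[int]) -> list[bytes]:
        """Satellite: re-queue only the frames the ground says it never got."""
        return [buffer[n] for n in sorted(missing) if n in buffer]

    # Example: frames 0..9 sent; frames 3 and 7 are lost to an atmospheric fade.
    buffer = {n: f"frame-{n}".encode() for n in range(10)}
    received_ok = set(range(10)) - {3, 7}

    missing = ground_report(received_ok, highest_sent=9)
    resend_queue = satellite_retransmit(buffer, missing)
    print(sorted(missing), [f.decode() for f in resend_queue])
    # -> [3, 7] ['frame-3', 'frame-7']
    ```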

    Another new aspect of TBIRD is its lack of a gimbal, a mechanism for pointing the narrow laser beam. Instead, TBIRD relies on a laboratory-developed error-signaling concept for precision body pointing of the spacecraft. Error signals are provided to the CubeSat bus so it knows exactly how to point the body of the entire satellite toward the ground station. Without a gimbal, the payload can be miniaturized even further.
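
    Schematically, and only as an assumption about how such an error-signaling loop could look (the article does not detail TBIRD’s design; the sensor model and gain below are invented), the payload measures how far the ground station appears from its optical boresight and hands that offset to the bus as an attitude correction:

    ```python
    # Schematic sketch only: a proportional correction loop for body pointing.
    # The gain, units, and beacon-offset sensor are illustrative assumptions,
    # not the TBIRD design.

    def pointing_correction(offset_az_deg: float, offset_el_deg: float,
                            gain: float = 0.5) -> tuple[float, float]:
        """Turn the measured beacon offset into an attitude correction command
        that the CubeSat bus applies to the whole spacecraft body."""
        return (-gain * offset_az_deg, -gain * offset_el_deg)

    # Example: the beacon appears 0.04 deg right and 0.01 deg above boresight.
    print(pointing_correction(0.04, 0.01))   # -> (-0.02, -0.005)
    ```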

    “We intended to demonstrate a low-cost technology capable of quickly downlinking a large volume of data from LEO to Earth, in support of science missions,” says Wang. “In just a few weeks of operations, we have already accomplished this goal, achieving unprecedented transmission rates of up to 100 gigabits per second. Next, we plan to exercise additional features of the TBIRD system, including increasing rates to 200 gigabits per second, enabling the downlink of more than 2 terabytes of data — equivalent to 1,000 high-definition movies — in a single five-minute pass over a ground station.”

    Lincoln Laboratory developed the TBIRD mission and technology in partnership with NASA Goddard Space Flight Center.

  • Using seismology for groundwater management

    As climate change increases the number of extreme weather events, such as megadroughts, groundwater management is key for sustaining water supply. But current groundwater monitoring tools are either costly or insufficient for deeper aquifers, limiting our ability to monitor and practice sustainable management in populated areas.

    Now, a new paper published in Nature Communications bridges seismology and hydrology with a pilot application that uses seismometers as a cost-effective way to monitor and map groundwater fluctuations.

    “Our measurements are independent from and complementary to traditional observations,” says Shujuan Mao PhD ’21, lead author on the paper. “It provides a new way to dictate groundwater management and evaluate the impact of human activity on shaping underground hydrologic systems.”

    Mao, currently a Thompson Postdoctoral Fellow in the Geophysics department at Stanford University, conducted most of the research during her PhD in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). Other contributors to the paper include EAPS department chair and Schlumberger Professor of Earth and Planetary Sciences Robert van der Hilst, as well as Michel Campillo and Albanne Lecointre from the Institut des Sciences de la Terre in France.

    While a few different methods are currently used to measure groundwater, they all come with notable drawbacks. Wells that measure hydraulic head, drilled through the ground and into the aquifers, are expensive and can only give limited information at the specific locations where they’re placed. Noninvasive techniques based on satellite or airborne sensing lack the sensitivity and resolution needed to observe deeper depths.

    Mao proposes using seismometers, instruments that measure ground vibrations such as the waves produced by earthquakes. From their recordings, researchers can estimate seismic velocity, the propagation speed of seismic waves. Seismic velocity is sensitive to the mechanical state of rocks, or the way rocks respond to their physical environment, and can therefore tell us a lot about them.

    Seismic velocity has long been used to characterize property changes in rocks in laboratory-scale analyses, but only recently have scientists been able to measure it continuously in realistic-scale geological settings. For aquifer monitoring, Mao and her team associate the seismic velocity with the hydraulic property, or the water content, of the rocks.

    Seismic velocity measurements make use of ambient seismic fields, or background noise, recorded by seismometers. “The Earth’s surface is always vibrating, whether due to ocean waves, winds, or human activities,” she explains. “Most of the time those vibrations are really small and are considered ‘noise’ by traditional seismologists. But in recent years scientists have shown that the continuous noise records in fact contain a wealth of information about the properties and structures of the Earth’s interior.”

    To extract useful information from the noise records, Mao and her team used a technique called seismic interferometry, which analyzes wave interference to calculate the seismic velocity of the medium the waves pass through. For their pilot application, Mao and her team applied this analysis to basins in the Metropolitan Los Angeles region, an area suffering from worsening drought and a growing population.
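
    Seismic interferometry is commonly implemented by cross-correlating long ambient-noise records from pairs of stations: the peak of the cross-correlation gives the travel time of waves between them, and tracking how that travel time shifts over months reveals velocity changes. The sketch below uses synthetic data and illustrative parameters, not the study’s actual processing chain.

    ```python
    # Minimal seismic-interferometry sketch (synthetic data, not the study's code).
    # Cross-correlating ambient noise recorded at two stations recovers the travel
    # time of waves between them; tracking how that travel time stretches over
    # months is what reveals seismic-velocity (and hence groundwater) changes.
    import numpy as np

    fs = 20.0                        # sampling rate in Hz (assumed)
    t = np.arange(0, 600, 1 / fs)    # ten minutes of synthetic records

    rng = np.random.default_rng(1)
    source = rng.standard_normal(t.size)      # shared ambient-noise wavefield
    lag_s = 2.0                               # true inter-station travel time (s)
    lag_n = int(lag_s * fs)

    station_a = source + 0.5 * rng.standard_normal(t.size)
    station_b = np.roll(source, lag_n) + 0.5 * rng.standard_normal(t.size)

    xcorr = np.correlate(station_b, station_a, mode="full")
    lags = (np.arange(xcorr.size) - (t.size - 1)) / fs
    estimated_travel_time = lags[np.argmax(xcorr)]
    print(f"estimated travel time: {estimated_travel_time:.2f} s")   # ~2.00 s
    ```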

    By doing this, Mao and her team were able to see how the aquifers changed physically over time at high resolution. Their seismic velocity measurements verified measurements taken by hydraulic heads over the last 20 years, and the images matched satellite data very well. They could also see differences in how groundwater storage changed between counties that use different water pumping practices, which is important for developing water management protocols.

    Mao also calls using the seismometers a “buy-one get-one free” deal, since seismometers are already in use for earthquake and tectonic studies not just across California, but worldwide, and could help “avoid the expensive cost of drilling and maintaining dedicated groundwater monitoring wells,” she says.

    Mao emphasizes that this study is just the beginning of exploring possible applications of seismic noise interferometry. The method can be used to monitor other near-surface systems, such as geothermal or volcanic systems, and Mao is currently applying it to oil and gas fields. But in places like California, which is currently experiencing megadrought conditions and relies on groundwater for a large portion of its water needs, this kind of information is key for sustainable water management.

    “It’s really important, especially now, to characterize these changes in groundwater storage so that we can promote data-informed policymaking to help them thrive under increasing water stress,” she says.

    This study was funded, in part, by the European Research Council, with additional support from the Thompson Fellowship at Stanford University.

  • Improving predictions of sea level rise for the next century

    When we think of climate change, one of the most dramatic images that comes to mind is the loss of glacial ice. As the Earth warms, these enormous rivers of ice become a casualty of the rising temperatures. But as ice sheets retreat, they also become an important contributor to one of the more dangerous outcomes of climate change: sea-level rise. At MIT, an interdisciplinary team of scientists is determined to improve sea level rise predictions for the next century, in part by taking a closer look at the physics of ice sheets.

    Last month, two research proposals on the topic, led by Brent Minchew, the Cecil and Ida Green Career Development Professor in the Department of Earth, Atmospheric and Planetary Sciences (EAPS), were announced as finalists in the MIT Climate Grand Challenges initiative. Launched in July 2020, Climate Grand Challenges fielded almost 100 project proposals from collaborators across the Institute who heeded the bold charge: to develop research and innovations that will deliver game-changing advances in the world’s efforts to address the climate challenge.

    As finalists, Minchew and his collaborators from the departments of Urban Studies and Planning, Economics, Civil and Environmental Engineering, the Haystack Observatory, and external partners, received $100,000 to develop their research plans. A subset of the 27 proposals tapped as finalists will be announced next month, making up a portfolio of multiyear “flagship” projects receiving additional funding and support.

    One goal of both Minchew proposals is to more fully understand the most fundamental processes that govern rapid changes in glacial ice, and to use that understanding to build next-generation models that are more predictive of ice sheet behavior as they respond to, and influence, climate change.

    “We need to develop more accurate and computationally efficient models that provide testable projections of sea-level rise over the coming decades. To do so quickly, we want to make better and more frequent observations and learn the physics of ice sheets from these data,” says Minchew. “For example, how much stress do you have to apply to ice before it breaks?”

    Currently, Minchew’s Glacier Dynamics and Remote Sensing group uses satellites to observe the ice sheets on Greenland and Antarctica primarily with interferometric synthetic aperture radar (InSAR). But the data are often collected over long intervals of time, which only gives them “before and after” snapshots of big events. By taking more frequent measurements on shorter time scales, such as hours or days, they can get a more detailed picture of what is happening in the ice.
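
    For reference, the basic InSAR measurement behind those snapshots converts the phase difference between two radar acquisitions into line-of-sight surface displacement (up to a sign convention):

    $$ d_{\mathrm{LOS}} = \frac{\lambda}{4\pi}\,\Delta\phi $$

    so for a radar wavelength of about 5.6 cm (a typical C-band value, used here only as an illustration; the article does not name the sensor), one full phase cycle corresponds to roughly 2.8 cm of motion toward or away from the satellite. Acquiring interferograms hours or days apart, rather than weeks apart, is what would let such motions be tracked as they unfold.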

    “Many of the key unknowns in our projections of what ice sheets are going to look like in the future, and how they’re going to evolve, involve the dynamics of glaciers, or our understanding of how the flow speed and the resistances to flow are related,” says Minchew.

    At the heart of the two proposals is the creation of SACOS, the Stratospheric Airborne Climate Observatory System. The group envisions developing solar-powered drones that can fly in the stratosphere for months at a time, taking more frequent measurements using a new lightweight, low-power radar and other high-resolution instrumentation. They also propose air-dropping sensors directly onto the ice, equipped with seismometers and GPS trackers to measure high-frequency vibrations in the ice and pinpoint the motions of its flow.

    How glaciers contribute to sea level rise

    Current climate models predict an increase in sea levels over the next century, but by just how much is still unclear. Estimates range anywhere from 20 centimeters to two meters, which is a large difference when it comes to enacting policy or mitigation. Minchew points out that response measures will be different depending on which end of the scale the rise falls toward. If it’s closer to 20 centimeters, coastal barriers can be built to protect low-lying areas. But with a larger rise, such measures become too expensive and inefficient to be viable, as entire portions of cities and millions of people would have to be relocated.

    “If we’re looking at a future where we could get more than a meter of sea level rise by the end of the century, then we need to know about that sooner rather than later so that we can start to plan and to do our best to prepare for that scenario,” he says.

    There are two ways glaciers and ice sheets contribute to rising sea levels: direct melting of the ice and accelerated transport of ice to the oceans. In Antarctica, warming waters melt the margins of the ice sheets, which tends to reduce the resistive stresses and allow ice to flow more quickly to the ocean. This thinning can also cause the ice shelves to be more prone to fracture, facilitating the calving of icebergs — events which sometimes cause even further acceleration of ice flow.

    Using data collected by SACOS, Minchew and his group can better understand what material properties in the ice allow for fracturing and calving of icebergs, and build a more complete picture of how ice sheets respond to climate forces. 

    “What I want is to reduce and quantify the uncertainties in projections of sea level rise out to the year 2100,” he says.

    From that more complete picture, the team — which also includes economists, engineers, and urban planning specialists — can work on developing predictive models and methods to help communities and governments estimate the costs associated with sea level rise, develop sound infrastructure strategies, and spur engineering innovation.

    Understanding glacier dynamics

    More frequent radar measurements and the collection of higher-resolution seismic and GPS data will allow Minchew and the team to develop a better understanding of the broad category of glacier dynamics — including calving, a process that is important in setting the rate of sea level rise and that is currently not well understood.

    “Some of what we’re doing is quite similar to what seismologists do,” he says. “They measure seismic waves following an earthquake, or a volcanic eruption, or things of this nature and use those observations to better understand the mechanisms that govern these phenomena.”

    Air-droppable sensors will help them collect information about ice sheet movement, but this method comes with drawbacks — like installation and maintenance, which are difficult to do out on a massive ice sheet that is moving and melting. Also, the instruments can each only take measurements at a single location. Minchew equates it to a bobber in water: all it can tell you is how the bobber moves as the waves disturb it.

    But by also taking continuous radar measurements from the air, Minchew’s team can collect observations both in space and in time. Instead of just watching the bobber in the water, they can effectively make a movie of the waves propagating out, as well as visualize processes like iceberg calving happening in multiple dimensions.

    Once the bobbers are in place and the movies recorded, the next step is developing machine learning algorithms to help analyze all the new data being collected. While this data-driven kind of discovery has been a hot topic in other fields, this is the first time it has been applied to glacier research.

    “We’ve developed this new methodology to ingest this huge amount of data,” he says, “and from that create an entirely new way of analyzing the system to answer these fundamental and critically important questions.”

  • Understanding air pollution from space

    Climate change and air pollution are interlocking crises that threaten human health. Reducing emissions of some air pollutants can help achieve climate goals, and some climate mitigation efforts can in turn improve air quality.

    One part of MIT Professor Arlene Fiore’s research program is to investigate the fundamental science of air pollutants — how long they persist and how they move through our environment to affect air quality.

    “We need to understand the conditions under which pollutants, such as ozone, form. How much ozone is formed locally and how much is transported long distances?” says Fiore, who notes that Asian air pollution can be transported across the Pacific Ocean to North America. “We need to think about processes spanning local to global dimensions.”

    Fiore, the Peter H. Stone and Paola Malanotte Stone Professor in Earth, Atmospheric and Planetary Sciences, analyzes data from on-the-ground readings and from satellites, along with models, to better understand the chemistry and behavior of air pollutants — which ultimately can inform mitigation strategies and policy setting.

    A global concern

    At the United Nations’ most recent climate change conference, COP26, air quality management was a topic discussed over two days of presentations.

    “Breathing is vital. It’s life. But for the vast majority of people on this planet right now, the air that they breathe is not giving life, but cutting it short,” said Sarah Vogel, senior vice president for health at the Environmental Defense Fund, at the COP26 session.

    “We need to confront this twin challenge now through both a climate and clean air lens, of targeting those pollutants that both warm the air and harm our health.”

    Earlier this year, the World Health Organization (WHO) updated the global air quality guidelines it had issued 15 years earlier for six key pollutants, including ozone (O3), nitrogen dioxide (NO2), sulfur dioxide (SO2), and carbon monoxide (CO). The new guidelines are more stringent, based on what the WHO stated is the “quality and quantity of evidence” of how these pollutants affect human health. The WHO estimates that roughly 7 million premature deaths each year are attributable to the joint effects of air pollution.

    “We’ve had all these health-motivated reductions of aerosol and ozone precursor emissions. What are the implications for the climate system, both locally but also around the globe? How does air quality respond to climate change? We study these two-way interactions between air pollution and the climate system,” says Fiore.

    But fundamental science is still required to understand how gases, such as ozone and nitrogen dioxide, linger and move throughout the troposphere — the lowermost layer of our atmosphere, containing the air we breathe.

    “We care about ozone in the air we’re breathing where we live at the Earth’s surface,” says Fiore. “Ozone reacts with biological tissue, and can be damaging to plants and human lungs. Even if you’re a healthy adult, if you’re out running hard during an ozone smog event, you might feel an extra weight on your lungs.”

    Telltale signs from space

    Ozone is not emitted directly, but instead forms through chemical reactions catalyzed by radiation from the sun interacting with nitrogen oxides — pollutants released in large part from burning fossil fuels — and volatile organic compounds. However, current satellite instruments cannot sense ground-level ozone.

    “We can’t retrieve surface- or even near-surface ozone from space,” says Fiore of the satellite data, “although the anticipated launch of a new instrument looks promising for new advances in retrieving lower-tropospheric ozone.” Instead, scientists can look at signatures from other gas emissions to get a sense of ozone formation. “Nitrogen dioxide and formaldehyde are a heavy focus of our research because they serve as proxies for two of the key ingredients that go on to form ozone in the atmosphere.”
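
    One common way to turn these two proxies into a statement about ozone chemistry is the ratio of formaldehyde to nitrogen dioxide columns, which indicates whether ozone production at a location is limited by nitrogen oxides or by volatile organic compounds. The sketch below is only illustrative; the cutoff values are placeholders, not thresholds drawn from Fiore’s work.

    ```python
    # Illustrative sketch: classify ozone-production chemistry from satellite
    # columns of formaldehyde (HCHO) and nitrogen dioxide (NO2). The ratio
    # thresholds below are placeholders, not values used in the study.
    import numpy as np

    def ozone_regime(hcho_column: np.ndarray, no2_column: np.ndarray,
                     low: float = 1.0, high: float = 2.0) -> np.ndarray:
        """Label each pixel as NOx-limited (cut NOx to reduce ozone),
        VOC-limited (cut VOCs), or transitional, from the HCHO/NO2 ratio."""
        ratio = hcho_column / no2_column
        return np.where(ratio > high, "NOx-limited",
               np.where(ratio < low, "VOC-limited", "transitional"))

    # Example with made-up column densities (arbitrary but consistent units):
    hcho = np.array([8.0, 3.0, 1.0])
    no2  = np.array([2.0, 2.0, 2.0])
    print(ozone_regime(hcho, no2))   # ['NOx-limited' 'transitional' 'VOC-limited']
    ```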

    To understand ozone formation via these precursor pollutants, scientists have gathered data for more than two decades using spectrometer instruments aboard satellites that measure sunlight in ultraviolet and visible wavelengths that interact with these pollutants in the Earth’s atmosphere — known as solar backscatter radiation.

    Satellites, such as NASA’s Aura, carry instruments like the Ozone Monitoring Instrument (OMI). OMI, along with European-launched satellites such as the Global Ozone Monitoring Experiment (GOME) and the Scanning Imaging Absorption spectroMeter for Atmospheric CartograpHY (SCIAMACHY), and the newest generation TROPOspheric Monitoring instrument (TROPOMI), all orbit the Earth, collecting data during daylight hours when sunlight is interacting with the atmosphere over a particular location.

    In a recent paper from Fiore’s group, former graduate student Xiaomeng Jin (now a postdoc at the University of California at Berkeley) demonstrated that she could bring together and “beat down the noise in the data,” as Fiore says, to identify trends in ozone formation chemistry over several U.S. metropolitan areas that “are consistent with our on-the-ground understanding from in situ ozone measurements.”

    “This finding implies that we can use these records to learn about changes in surface ozone chemistry in places where we lack on-the-ground monitoring,” says Fiore. Extracting these signals by stringing together satellite data — OMI, GOME, and SCIAMACHY — to produce a two-decade record required reconciling the instruments’ differing orbit days, times, and fields of view on the ground, or spatial resolutions. 
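
    One small piece of that reconciliation can be sketched concretely: block-averaging a finer-gridded retrieval onto a coarser common grid before the records are stitched into one time series. The grid sizes and averaging factor below are placeholders, not the real OMI, GOME, or SCIAMACHY footprints.

    ```python
    # Illustrative sketch: block-average a finer-gridded retrieval onto a coarser
    # common grid so records from different instruments can be stitched together.
    # Grid sizes are placeholders, not the real instrument footprints.
    import numpy as np

    def regrid_to_coarse(field: np.ndarray, factor: int) -> np.ndarray:
        """Average factor x factor blocks of a 2-D field (NaNs treated as missing)."""
        ny, nx = field.shape
        blocks = field[: ny - ny % factor, : nx - nx % factor]
        blocks = blocks.reshape(ny // factor, factor, nx // factor, factor)
        return np.nanmean(blocks, axis=(1, 3))

    fine_no2 = np.random.default_rng(2).random((120, 120))  # fine-grid NO2 columns
    coarse_no2 = regrid_to_coarse(fine_no2, factor=4)        # 30 x 30 common grid
    print(coarse_no2.shape)   # (30, 30)
    ```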

    Currently, such spectrometer instruments aboard satellites retrieve data over a given location only about once per day. However, newer instruments, such as the Geostationary Environment Monitoring Spectrometer launched in February 2020 by the National Institute of Environmental Research in the Ministry of Environment of South Korea, will monitor a particular region continuously, providing much more data in real time.

    Over North America, the Tropospheric Emissions: Monitoring of Pollution (TEMPO) collaboration between NASA and the Smithsonian Astrophysical Observatory, led by Kelly Chance of Harvard University, will provide not only a stationary view of the atmospheric chemistry over the continent, but also a finer-resolution view — with the instrument recording pollution data from only a few square miles per pixel (with an anticipated launch in 2022).

    “What we’re very excited about is the opportunity to have continuous coverage where we get hourly measurements that allow us to follow pollution from morning rush hour through the course of the day and see how plumes of pollution are evolving in real time,” says Fiore.

    Data for the people

    Providing Earth-observing data to people in addition to scientists — namely environmental managers, city planners, and other government officials — is the goal for the NASA Health and Air Quality Applied Sciences Team (HAQAST).

    Since 2016, Fiore has been part of HAQAST, including collaborative “tiger teams” — projects that bring together scientists, nongovernment entities, and government officials — to bring data to bear on real issues.

    For example, in 2017, Fiore led a tiger team that provided guidance to state air management agencies on how satellite data can be incorporated into state implementation plans (SIPs). “Submission of a SIP is required for any state with a region in non-attainment of U.S. National Ambient Air Quality Standards to demonstrate their approach to achieving compliance with the standard,” says Fiore. “What we found is that small tweaks in, for example, the metrics we use to convey the science findings, can go a long way to making the science more usable, especially when there are detailed policy frameworks in place that must be followed.”

    Now, in 2021, Fiore is part of two tiger teams announced by HAQAST in late September. One team is looking at data to address environmental justice issues, by providing data to assess communities disproportionately affected by environmental health risks. Such information can be used to estimate the benefits of governmental investments in environmental improvements for disproportionately burdened communities. The other team is looking at urban emissions of nitrogen oxides to try to better quantify and communicate uncertainties in the estimates of anthropogenic sources of pollution.

    “For our HAQAST work, we’re looking at not just the estimate of the exposure to air pollutants, or in other words their concentrations,” says Fiore, “but how confident are we in our exposure estimates, which in turn affect our understanding of the public health burden due to exposure. We have stakeholder partners at the New York Department of Health who will pair exposure datasets with health data to help prioritize decisions around public health.

    “I enjoy working with stakeholders who have questions that require science to answer and can make a difference in their decisions,” Fiore says.