Christopher Barnes is on a quest for a universal coronavirus vaccine

In January 2020, Caltech biochemist Pamela Bjorkman asked for volunteers to help work out the structures of immune proteins that attack a newly discovered coronavirus. The pathogen had emerged in China and was causing severe pneumonia-like symptoms in the people it infected. Knowing the molecular arrangements of these antibodies would be an important step toward developing drugs to fight the virus.

Christopher Barnes, a postdoc working in Bjorkman’s lab on the structure of HIV and the antibodies that target it, jumped at the chance to solve a new puzzle. “I was like, ‘Oh, I’ll do it!’” Barnes says. At the time he wasn’t aware how urgent the research would become.

Now, we are all too familiar with SARS-CoV-2, which causes COVID-19 and has killed more than 6 million people globally. Studies of the structure of the virus and the antibodies that target it have helped scientists quickly develop vaccines and treatments that have saved tens of millions of lives. But the virus continues to adapt, making changes to the spike protein that it uses to break into cells. That has left researchers scrambling for new drugs and updated vaccines.

Using high-resolution imaging techniques, Barnes is probing coronavirus spike proteins and the antibodies that attack them. His goal: Find a persistent weak spot and exploit it to create a vaccine that works against all coronaviruses.

Standout research
Barnes’ team used cryo-electron microscopy to reveal the structures of eight antibodies that stop the original version of SARS-CoV-2. The technique catches cells, viruses and proteins going about their business by flash freezing them. In this case, the team isolated coronavirus particles entwined with immune system proteins from people with COVID-19.

The antibodies had attached to four spots on the spike protein’s receptor binding domain, or RBD, the team reported in Nature in 2020. This fingerlike region anchors the virus to the cell it will infect. When antibodies bind to the RBD, the virus can no longer connect to the cell.

Barnes’ team also created an antibody classification system based on the RBD location where the immune system proteins tend to latch on. “That’s been really helpful for understanding the types of antibody responses that are elicited by natural infection,” and for identifying prime candidates for drug development, says structural biologist Jason McLellan, who wasn’t involved in the work.

“A major strength of Chris is that he does not limit himself or his research to one technique,” says McLellan, of the University of Texas at Austin. “He quickly adapts and incorporates new technologies to answer important questions in the field.”

Since launching his own lab at Stanford University, Barnes and his colleagues have determined the structures of six antibodies that attack the original SARS-CoV-2 virus and the delta and omicron variants. Those variants are skilled at evading antibodies, including lab-made ones given to patients to treat COVID-19.

The newly identified antibodies, described in the June 14 Immunity, target the spike protein’s N-terminal domain. The structures of the sites where the proteins attach are the same in delta and omicron, hinting that the sites might remain unchanged even in future variants, the team says. Eventually, scientists may be able to mass-produce antibodies that target these sites for use in new therapies.

What’s next
Barnes has now turned his attention to antibodies that can fend off all coronaviruses — from ones that cause the common cold to ones found in livestock and other animals that have the potential to spill over into people.

Barnes and immunologist Davide Robbiani of the University of Lugano in Switzerland identified classes of antibodies that target variants from all four coronavirus families, blocking the viruses’ ability to fuse with cells.

What’s more, the structure of one of the binding sites on the spike protein is the same across the coronavirus family tree, Barnes says. “This is something you wouldn’t want to mutate as you diversify your viral family because this is a critical component of how you enter the cell.”

Two independent teams have identified similarly broad action in the same antibody classes. Taken together, the findings suggest that a universal coronavirus vaccine is possible, Barnes says.

“We’ve all kind of discovered this at the same time,” he says. The teams are now thinking, “Wow, this exists. So let’s try to make a real, true pan-coronavirus vaccine.”

NASA’s DART spacecraft just smashed into an asteroid — on purpose

Mission control rooms rarely celebrate crash landings. But the collision of NASA’s DART spacecraft with an asteroid was a smashing success.

At about 7:15 p.m. EDT on September 26, the spacecraft hurtled into Dimorphos, an asteroid moonlet orbiting a larger space rock named Didymos. The mission’s goal was to bump Dimorphos slightly closer to its parent asteroid, shortening its 12-hour orbit around Didymos by several minutes.

The Double Asteroid Redirection Test, or DART, is the world’s first attempt to change an asteroid’s motion by ramming a space probe into it (SN: 6/30/20). Neither Dimorphos nor Didymos poses a threat to Earth. But seeing how well DART’s maneuver worked will reveal how easy it is to tamper with an asteroid’s trajectory — a strategy that could protect the planet if a large asteroid is ever discovered on a collision course with Earth.

“We don’t know of any large asteroids that would be considered a threat to Earth that are coming any time in the next century,” says DART team member Angela Stickle, a planetary scientist at the Johns Hopkins University Applied Physics Laboratory in Laurel, Md. “The reason that we are doing something like DART is because there are asteroids that we haven’t discovered yet.”

Astronomers have spotted almost all the kilometer-size asteroids in the solar system that could end civilization if they hit Earth, says Jessica Sunshine, a planetary scientist at the University of Maryland in College Park who’s also on the DART team. But when it comes to space rocks around 150 meters wide, like Dimorphos, “we only know where about 40 percent of those are,” Sunshine says. “And that is something that, if it did hit, would certainly take out a city.”

Dimorphos is a safe asteroid to give an experimental nudge, says Mark Boslough, a physicist at Los Alamos National Laboratory in New Mexico who has studied planetary protection but is not involved in DART. “It’s not on a collision course” with Earth, he says, and DART “can’t hit it hard enough to put it on a collision course.” The DART spacecraft weighs only as much as a couple of vending machines, whereas Dimorphos is thought to be nearly as hefty as Egypt’s Great Pyramid of Giza.

After a 10-month voyage, DART met up with Didymos and Dimorphos near their closest approach to Earth, about 11 million kilometers away. Up until the very end of its journey, DART could see only the larger asteroid, Didymos. But about an hour before impact, DART spotted Dimorphos in its field of view. Using its onboard camera, the spacecraft steered itself toward the asteroid moonlet and slammed into it at some 6.1 kilometers per second, or nearly 14,000 miles per hour.

DART’s camera feed went dark after impact. But another probe nearby is expected to have caught the collision on camera. The Light Italian CubeSat for Imaging of Asteroids rode to Dimorphos aboard DART but detached a couple of weeks before impact to watch the event from a safe distance. Its mission was to whiz past Dimorphos about three minutes after DART’s impact to snap pictures of the crash site and the resulting plume of asteroid debris launched into space. The probe is expected to beam images of DART’s demise back to Earth within a couple of days.

“I was absolutely elated, especially as we saw the camera getting closer and just realizing all the science that we’re going to learn,” said Pam Melroy, NASA Deputy Administrator, after the impact. “But the best part was seeing, at the end, that there was no question there was going to be an impact, and to see the team overjoyed with their success.”

DART’s impact is expected to shove Dimorphos into a closer, shorter orbit around Didymos. Telescopes on Earth can clock the timing of that orbit by watching how the amount of light from the double asteroid system changes as Dimorphos passes in front of and behind Didymos.
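The expected nudge follows from simple momentum conservation. As a rough sketch only: the masses below are loose figures inferred from the comparisons in this story (a spacecraft of a few hundred kilograms, a moonlet roughly as massive as the Great Pyramid), not mission measurements.

```python
# Back-of-envelope estimate of the speed change DART imparts to Dimorphos.
# All numbers are rough illustrative values, not official mission data.

M_SPACECRAFT = 570.0    # kg, approximate mass of DART at impact
V_IMPACT = 6_100.0      # m/s, approximate impact speed from the story
M_DIMORPHOS = 5.0e9     # kg, rough Great-Pyramid-scale mass for Dimorphos
BETA = 1.0              # momentum-enhancement factor; >1 if ejecta adds extra push

def delta_v(m_spacecraft, v_impact, m_asteroid, beta=1.0):
    """Speed change of the asteroid from conservation of momentum."""
    return beta * m_spacecraft * v_impact / m_asteroid

dv = delta_v(M_SPACECRAFT, V_IMPACT, M_DIMORPHOS, BETA)
print(f"orbital speed change: ~{dv * 1000:.2f} mm/s")  # prints ~0.70 mm/s
```

Even a sub-millimeter-per-second change, accumulated over many 12-hour orbits, shifts where Dimorphos sits in its orbit by enough for telescopes to time the difference.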

“It’s really a beautifully conceived experiment,” Boslough says. In the coming weeks, dozens of telescopes across every continent will watch Dimorphos to see how much DART changed its orbit. The Hubble and James Webb space telescopes may also get images.

“It’ll be really interesting to see what comes out,” says Amy Mainzer, a planetary scientist at the University of Arizona in Tucson who is not involved in DART. “Asteroids have a way of surprising us,” she says, because it’s hard to know a space rock’s precise chemical makeup and internal structure based on observations from Earth. So Dimorphos’ motion post-impact may not exactly match researchers’ expectations.

The DART team will compare data on Dimorphos’ new orbit with their computer simulations to see how close those models were to predicting the asteroid’s actual behavior and tweak them accordingly. “If we can get our models to reproduce what actually happened, then you can use those models to [plan for] other scenarios that might show up in the future” — like the discovery of a real killer asteroid, says DART team member Wendy Caldwell, a mathematician and planetary scientist at Los Alamos National Laboratory.

“No matter what happens,” she says, “we will get information that is valuable to the scientific community and to the planetary defense community.”

In Maya society, cacao use was for everyone, not just royals

In ancient Maya civilization, cacao wasn’t just for the elites.

Traces of the sacred plant show up in ceramics from all types of neighborhoods and dwellings in and around a former Maya city, researchers report September 26 in the Proceedings of the National Academy of Sciences. The finding suggests that, contrary to previous thinking, cacao was consumed at every social level of Maya society.

“Now we know that the rituals the elite depict with cacao were likely played out, like Thanksgiving, like any other ritual, by everyone,” says Anabel Ford, an archaeologist at the University of California, Santa Barbara.

Cacao — which chocolate is made from — was sacred to the ancient Maya, consumed in rituals and used as a currency. The cacao tree (Theobroma cacao) itself was linked to Hun Hunahpu, the maize god. Previous research found cacao in ceremonial vessels and elite burials, suggesting that its use was restricted to those at the top.

To explore the extent to which cacao was used in broader Maya society, Ford and colleagues examined 54 ceramic shards dating from A.D. 600 to 900 (SN: 9/27/18). The shards come from jars, mixing bowls, serving plates and vases thought to be drinking vessels. All the pieces were found in residential and ceremonial civic areas of varying size and status from city centers, foothills, upland areas and the valley around the former Maya city of El Pilar, on the present-day border of Guatemala and Belize.

To identify cacao, the researchers searched for theophylline, a compound found in trace amounts in the plant. The team found the compound on more than half of the samples, on all types of ceramics and distributed throughout social contexts.

Future research will move beyond who consumed cacao and explore the role of farmers in managing the critical resource. “A better question is to understand who grew it,” Ford says, because those people probably had greater access to the prized commodity.

A protogalaxy in the Milky Way may be our galaxy’s original nucleus

The Milky Way left its “poor old heart” in and around the constellation Sagittarius, astronomers report. New data from the Gaia spacecraft reveal the full extent of what seems to be the galaxy’s original nucleus — the ancient stellar population that the rest of the Milky Way grew around — which came together more than 12.5 billion years ago.

“People have long speculated that such a vast population [of old stars] should exist in the center of our Milky Way, and Gaia now shows that there they are,” says astronomer Hans-Walter Rix of the Max Planck Institute for Astronomy in Heidelberg, Germany.

The Milky Way’s ancient heart is a round protogalaxy that spans nearly 18,000 light-years and possesses roughly 100 million times the mass of the sun in stars, or about 0.2 percent of the Milky Way’s current stellar mass, Rix and colleagues report in a study posted September 7 at arXiv.org.

“This study really helps to firm up our understanding of this very, very, very young stage in the Milky Way’s life,” says Vasily Belokurov, an astronomer at the University of Cambridge who was not involved in the work. “Not much is really known about this period of the Milky Way’s life,” he says. “We’ve seen glimpses of this population before,” but the new study gives “a bird’s-eye view of the whole structure.”

Most stars in the Milky Way’s central region abound with metals, because the stars originated in a crowded metropolis that earlier stellar generations had enriched with those metals through supernova explosions. But Rix and his colleagues wanted to find the exceptions to the rule, stars so metal-poor they must have been born well before the rest of the galaxy’s stellar denizens came along — what Rix calls “a needle-in-a-haystack exercise.”

His team turned to data from the Gaia spacecraft, which launched in 2013 on a mission to chart the Milky Way (SN: 6/13/22). The astronomers searched about 2 million stars within a broad region around the galaxy’s center, which lies in the constellation Sagittarius, looking for stars with metal-to-hydrogen ratios no more than 3 percent of the sun’s.

The astronomers then examined how those stars move through space, retaining only the ones that don’t dart off into the vast halo of metal-poor stars engulfing the Milky Way’s disk. The end result: a sample of 18,000 ancient stars that represents the kernel around which the entire galaxy blossomed, the researchers say. By accounting for stars obscured by dust, Rix estimates that the protogalaxy is between 50 million and 200 million times as massive as the sun.
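The selection described above is a two-stage filter: first a chemistry cut, then a kinematic cut. A minimal sketch of that logic follows; the field names, the orbit criterion and the thresholds other than the 3 percent metallicity cut are illustrative assumptions, not the team’s actual pipeline.

```python
# Illustrative two-stage star selection: keep metal-poor stars whose orbits
# stay near the galactic center, mimicking the approach described in the text.
# The apocenter criterion is a stand-in for the study's real kinematic analysis.

from dataclasses import dataclass

@dataclass
class Star:
    metallicity: float    # metal-to-hydrogen ratio relative to the sun (1.0 = solar)
    apocenter_kpc: float  # farthest the star's orbit reaches from the galactic center

def select_proto_galaxy(stars, max_metallicity=0.03, max_apocenter_kpc=5.0):
    """Stage 1: metallicity cut (<= 3% of solar). Stage 2: drop halo-like orbits."""
    return [
        s for s in stars
        if s.metallicity <= max_metallicity and s.apocenter_kpc <= max_apocenter_kpc
    ]

sample = [
    Star(0.01, 3.0),   # metal-poor, tightly bound: kept
    Star(0.50, 2.0),   # too metal-rich: dropped
    Star(0.02, 30.0),  # metal-poor but darts into the halo: dropped
]
print(len(select_proto_galaxy(sample)))  # → 1
```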

“That’s the original core,” Rix says, and it harbors the Milky Way’s oldest stars, which he says probably have ages exceeding 12.5 billion years. The protogalaxy formed when several large clumps of stars and gas conglomerated long ago, before the Milky Way’s first disk — the so-called thick disk — arose (SN: 3/23/22).

The protogalaxy is compact, which means little has disturbed it since its formation. Smaller galaxies have crashed into the Milky Way, augmenting its mass, but “we didn’t have any later mergers that deeply penetrated into the core and shook it up, because then the core would be larger now,” Rix says.

The new data on the protogalaxy even capture the Milky Way’s initial spin-up — its transition from an object that didn’t rotate into one that now does. The oldest stars in the proto–Milky Way barely revolve around the galaxy’s center but dive in and out of it instead, whereas slightly younger stars show more and more movement around the galactic center. “This is the Milky Way trying to become a disk galaxy,” says Belokurov, who saw the same spin-up in research that he and a colleague reported in July.

Today, the Milky Way is a giant galaxy that spins rapidly — each hour our solar system speeds through 900,000 kilometers of space as we race around the galaxy’s center. But the new study shows that the Milky Way got its start as a modest protogalaxy whose stars still shine today, stars that astronomers can now scrutinize for further clues to the galaxy’s birth and early evolution.

Ancient ‘demon ducks’ may have been undone by their slow growth

Giant flightless birds called mihirungs were the biggest birds to ever stride across what is now Australia. The animals, which weighed up to hundreds of kilograms, died out about 40,000 years ago. Now researchers might have a better idea why.

The birds may have grown and reproduced too slowly to withstand pressures from humans’ arrival on the continent, researchers report August 17 in the Anatomical Record.

Mihirungs are sometimes called “demon ducks” because of their great size and close evolutionary relationship with present-day waterfowl and game birds. The flightless, plant-eating birds lived for more than 20 million years.

Over that time, some species evolved into titans. Take Stirton’s thunderbird (Dromornis stirtoni). It lived about 7 million years ago, stood 3 meters tall and could exceed 500 kilograms in weight, making it the largest known mihirung and a contender for the largest bird ever to live.

Most research on mihirungs has been on their anatomy and evolutionary relationships with living birds. Little is known about the animals’ biology, such as how long they took to grow and mature, says Anusuya Chinsamy-Turan, a paleobiologist at the University of Cape Town in South Africa.

So Chinsamy-Turan and colleagues at Flinders University in Adelaide, Australia, took samples from 20 fossilized leg bones of D. stirtoni, from animals at varying life stages.

“Even after millions of years of fossilization, the microscopic structure of fossil bones generally remains intact,” and it can be used to decipher important clues about extinct animals’ biology, Chinsamy-Turan says.

The team examined the thin bone slices under a microscope, detailing the presence or absence of growth marks. These marks provide information on how fast the bone grew while the birds were alive.

D. stirtoni took 15 years or more to reach full size, the team found. It probably became sexually mature a few years before that, based on the timing of a shift from rapidly growing bone to a slower-growing form that’s thought to be associated with reaching reproductive age.

These results differ from the team’s earlier analysis of the bones of another mihirung, Genyornis newtoni. That species — the last-known mihirung — was less than half the size of D. stirtoni. It lived as recently as about 40,000 years ago and was a contemporary of the continent’s earliest human inhabitants. G. newtoni grew up much faster than its giant relative, reaching adult size in one to two years and growing a bit more in the following years and possibly reproducing then.

This difference in developmental speed between mihirung species separated by millions of years may have been an evolved response to Australia’s shift toward a drier, more variable climate over the last few million years, the researchers say. When resources are unpredictable, growing and reproducing quickly can be advantageous.

Even so, that seeming pep in the developmental step of more recent mihirungs was still slower than that of the emus they lived alongside. Emus grow up quickly, reaching adult size in less than a year and reproducing not long after, laying large numbers of eggs.

This difference may explain why G. newtoni went extinct shortly after hungry humans arrived in Australia, yet emus continue to thrive today, the team says. Even though mihirungs as a group had, over millions of years, evolved to grow and reproduce more quickly than their ancestors, it wasn’t enough to survive the arrival of humans, who probably ate the birds and their eggs, the researchers conclude.

“Slowly growing animals face dire consequences in terms of their reduced ability to recover from threats in their environments,” Chinsamy-Turan says.

The scientists’ research on other giant, extinct, flightless birds thought to have met their end thanks to humans — such as the dodos of Mauritius (Raphus cucullatus) and the largest of Madagascar’s elephant birds (Vorombe titan) — shows that they too grew relatively slowly (SN: 8/29/17).

“It is very interesting to see this pattern repeating again and again with many large, flightless bird groups,” says Thomas Cullen, a paleoecologist at Carleton University in Ottawa who was not involved with the new study.

Modern ratite birds seem to be the exception in their ability to handle similar pressures, he says. Other ratites besides emus that have survived until the present day — such as cassowaries and ostriches — also grow and reproduce quickly (SN: 4/25/14).

Physicists dispute a claim of detecting a black hole’s ‘photon ring’

The first image of a black hole may conceal treasure — but physicists disagree about whether it’s been found.

A team of scientists say they’ve unearthed a photon ring, a thin halo of light around the supermassive black hole in the galaxy M87. If real, the photon ring would provide a new probe of the black hole’s intense gravity. But despite multiple news headlines suggesting the ring has been found, many physicists remain unconvinced.

Unveiled in 2019 by scientists with the Event Horizon Telescope, or EHT, the first image of a black hole revealed a doughnut-shaped glow from hot matter swirling around the black hole’s dark silhouette (SN: 4/10/19). But according to Einstein’s general theory of relativity, a thinner ring should be superimposed on that thick doughnut. This ring is produced by photons, or particles of light, that orbit close to the black hole, slung around by the behemoth’s gravity before escaping and zinging toward Earth.

Thanks to this circumnavigation, the photons should provide “a fingerprint of gravity,” more clearly revealing the black hole’s properties, says astrophysicist Avery Broderick of the University of Waterloo and the Perimeter Institute for Theoretical Physics in Canada. He and his colleagues, a subset of scientists from the EHT collaboration, used a new method to tease out that fingerprint, they report in the Aug. 10 Astrophysical Journal.

Creating images with EHT isn’t a simple point-and-shoot affair (SN: 4/10/19). Researchers stitch together data from EHT’s squad of observatories scattered across the globe, using various computational techniques to reconstruct an image. Broderick and colleagues created a new black hole image assuming it featured both a diffuse emission and a thin ring. On three out of four days of observations, the data better matched an image with the added thin ring than one without the ring.

But that method has drawn harsh criticism. “The claim of a photon ring detection is preposterous,” says physicist Sam Gralla of the University of Arizona in Tucson.

A main point of contention: The photon ring is brighter than expected, emitting around 60 percent of the light in the image. According to predictions, it should be more like 20 percent. “That’s a giant red flag,” says physicist Alex Lupsasca of Vanderbilt University in Nashville. More light should come from the black hole’s main glowing doughnut than from the thin photon ring.

This unexpected brightness, Broderick and colleagues say, occurs because some of the light from the main glow gets lumped in with the photon ring. So the ring’s apparent brightness doesn’t depend only on the light coming from the ring. The researchers note that the same effect appeared when testing the method on simulated data.

But that mishmash of purported photon ring light with other light doesn’t make for a very convincing detection, critics say. “If you want to claim that you’ve seen a photon ring, I think you have to do a better job than this,” says astrophysicist Dan Marrone of the University of Arizona, a member of the EHT collaboration who was not a coauthor on the new paper.

The new result suggests only that an added thin ring gives a better match to the data, Marrone says, not whether that shape is associated with the photon ring. So it raises the question of whether scientists are seeing a photon ring at all, or just picking out an unrelated structure in the image.

But Broderick argues that the features of the ring — the fact that its size and location are as expected and are consistent day-to-day — support the photon ring interpretation.

Meanwhile, in a similar, independent analysis, Gralla and physicist Will Lockhart, also of the University of Arizona, find no evidence for a photon ring, they report in a paper submitted August 22 at arXiv.org. Their analysis differed from Broderick and colleagues’ in part because it limited how bright the photon ring could be.

To convincingly detect the photon ring, some scientists propose adding telescopes in space to the EHT’s crew of observatories (SN: 3/18/20). The farther apart the telescopes in the network are, the finer details they may be able to pick out — potentially including the photon ring.
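The baseline argument above is the standard diffraction relation for an interferometer: resolution scales as wavelength divided by the longest separation between telescopes. A rough sketch, assuming EHT’s roughly 1.3-millimeter observing wavelength and an Earth-diameter baseline (the space-extended baseline is an arbitrary illustration):

```python
# Diffraction-limited angular resolution of an interferometer: theta ≈ wavelength / baseline.
# Figures are approximate; the space baseline is a hypothetical example.

import math

WAVELENGTH = 1.3e-3        # m, EHT's approximate observing wavelength
EARTH_BASELINE = 1.27e7    # m, roughly Earth's diameter
SPACE_BASELINE = 4.0e7     # m, a hypothetical orbit-extended baseline

# One radian is (180/pi) degrees; one degree is 3.6e9 microarcseconds.
RAD_TO_MICROARCSEC = math.degrees(1) * 3.6e9

def resolution_uas(wavelength, baseline):
    """Finest resolvable angle, in microarcseconds, for a given baseline."""
    return (wavelength / baseline) * RAD_TO_MICROARCSEC

print(resolution_uas(WAVELENGTH, EARTH_BASELINE))  # ~21 microarcseconds
print(resolution_uas(WAVELENGTH, SPACE_BASELINE))  # finer, since baseline is longer
```

Tripling the baseline with a space telescope would sharpen the finest resolvable angle by the same factor, which is why the idea is attractive for isolating a thin ring.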

“If there were a photon ring detection,” Lupsasca says, “that would be the best thing in physics this year, if not for many years.”

The curious case of the 471-day coronavirus infection

As omicron subvariant BA.5 continues to drive the coronavirus’ spread in the United States, I’ve been thinking about what could come next. Omicron and its offshoots have been topping the variant charts since last winter. Before that, delta reigned.

Scientists have a few ideas for how new variants emerge. One involves people with persistent infections — people who test positive for the virus over a prolonged period of time. I’m going to tell you about the curious case of a person infected with SARS-CoV-2 for at least 471 days and what can happen when infections roil away uncontrolled.

That lengthy infection first came onto epidemiologist Nathan Grubaugh’s radar in the summer of 2021. His team had been analyzing coronavirus strains in patient samples from Yale New Haven Hospital when Grubaugh spotted something he had seen before. Known only as B.1.517, this version of the virus never got a name like delta or omicron, nor rampaged through communities quite like its infamous relatives.

Instead, after springing up somewhere in North America in early 2020, B.1.517 tooled around in a handful of regions around the world, even sparking an outbreak in Australia. But after April 2021, B.1.517 seemed to sputter, one of the who-knows-how-many viral lineages that flare up and then eventually fizzle.

B.1.517 might have been long forgotten, shouldered aside by the latest variant to stake a claim in local communities. “And yet we were still seeing it,” Grubaugh says. Even after B.1.517 had petered out across the country, his team noticed it cropping up in patient samples. The same lineage, every few weeks, like clockwork, for months.

One clue was the samples’ specimen ID. The code on the B.1.517 samples was always the same, Grubaugh’s team noticed. They had all come from a single patient.

That patient, a person in their 60s with a history of cancer, relapsed in November of 2020. That was right around when they first tested positive for SARS-CoV-2. After seeing B.1.517 show up again and again in their samples, Grubaugh worked with a clinician to get the patient’s permission to analyze their data.

Ultimately, the patient has remained infected for 471 days (and counting), Grubaugh, Yale postdoctoral researcher Chrispin Chaguza and their team reported last month in a preliminary study posted at medRxiv.org. Because of deteriorating health and a desire to maintain their anonymity, the patient was not willing to be interviewed, and Grubaugh has no direct contact with them.

But all those samples collected over all those days told an incredible tale of viral evolution. Over about 15 months, at least three genetically distinct versions of the virus had rapidly evolved inside the patient, the team’s analyses suggested.

Each version had dozens of mutations and seemed to coexist in the patient’s body. “Honestly, if any one of these were to emerge in a population and begin transmitting, we would be calling it a new variant,” Grubaugh says.

That scenario is probably rare, he says. After all, lots of prolonged infections have likely occurred during the pandemic, and only a handful of concerning variants have emerged. But the work does suggest that persistent viral infections can provide a playground for speedy evolutionary experimentation — perhaps taking advantage of weakened immune systems.

Grubaugh’s work is “probably the most detailed look we’ve had at a single, persistent infection with SARS-CoV-2 so far,” says Tom Friedrich, a virologist at the University of Wisconsin–Madison, who was not involved with the work.

The study supports an earlier finding about a different immunocompromised patient — one with a persistent omicron infection. In that work, researchers documented the evolution of the virus over 12 weeks and showed that its descendant infected at least five other people.

Together, the studies lay out how such infections could potentially drive the emergence of the next omicron.

“I am pretty well convinced that people with persistent infection are important sources of new variants,” Friedrich says.

Who exactly develops these infections remains mysterious. Yes, the virus can pummel people with weakened immune systems, but “not every immunocompromised person develops a persistent infection,” says Viviana Simon, a virologist at the Icahn School of Medicine at Mount Sinai who worked on the omicron infection study.

In fact, doctors and scientists have no idea how common these infections are. “We just don’t really have the numbers,” Simon says. That’s a huge gap for researchers, and something Mount Sinai’s Pathogen Surveillance Program is trying to address by analyzing real-time infection data.

Studying patients with prolonged infections could also tell scientists where SARS-CoV-2 evolution is heading, Friedrich says. Just because the virus evolves within a person doesn’t mean it will spread to other people. But if certain viral mutations tend to arise in multiple people with persistent infections, that could hint that the next big variant might evolve in a similar way. Knowing more about these mutation patterns could help researchers forecast what’s to come, an important step in designing future coronavirus vaccine boosters.

Beyond viral forecasting, Grubaugh says identifying people with prolonged infections is important so doctors can provide care. “We need to give them access to vaccines, monoclonal antibodies and antiviral drugs,” he says. Those treatments could help patients clear their infections.

But identifying persistent infections is easier said than done, he points out. Many places in the world aren’t set up to spot these infections and don’t have access to vaccines or treatments. And even when these are available, some patients opt out. The patient in Grubaugh’s study received a monoclonal antibody infusion about 100 days into their infection, then refused all other treatments. They have not been vaccinated.

Though the patient remained infectious over the course of the study, their variants never spread to the community, as far as Grubaugh knows.

And while untreated chronic infections might spawn new variants, they could emerge in other ways, too, like from animals infected with the virus, from person-to-person transmission in groups of people scientists haven’t been monitoring, or from “something else that maybe none of us has thought of yet,” he says. “SARS-CoV-2 has continued to surprise us with its evolution.”

How to make recyclable plastics out of CO2 to slow climate change

It’s morning and you wake on a comfortable foam mattress made partly from greenhouse gas. You pull on a T-shirt and sneakers containing carbon dioxide pulled from factory emissions. After a good run, you stop for a cup of joe and guiltlessly toss the plastic cup in the trash, confident it will fully biodegrade into harmless organic materials. At home, you squeeze shampoo from a bottle that has lived many lifetimes, then slip into a dress fashioned from smokestack emissions. You head to work with a smile, knowing your morning routine has made Earth’s atmosphere a teeny bit carbon cleaner.

Sound like a dream? Hardly. These products are already sold around the world. And others are being developed. They’re part of a growing effort by academia and industry to reduce the damage caused by centuries of human activity that has sent CO2 and other heat-trapping gases into the atmosphere.
The need for action is urgent. In its 2022 report, the United Nations Intergovernmental Panel on Climate Change, or IPCC, stated that rising temperatures have already caused irreversible damage to the planet and increased human death and disease (SN: 5/7/22 & 5/21/22, p. 8). Meanwhile, the amount of CO2 emitted continues to rise. The U.S. Energy Information Administration predicted last year that if current policy and growth trends continue, annual global CO2 emissions could rise from about 34 billion metric tons in 2020 to almost 43 billion by 2050.

Carbon capture and storage, or CCS, is one strategy for mitigating climate change long noted by the IPCC as having “considerable” potential. A technology that has existed since the 1970s, CCS traps CO2 from smokestacks or ambient air and pumps it underground for permanent sequestration. Today, 27 CCS facilities operate around the world — 12 in the United States — storing an estimated 36 million tons of carbon per year, according to the Global CCS Institute. The 2021 Infrastructure Investment and Jobs Act includes $3.5 billion in funding for four additional U.S. direct capture facilities.

But rather than just storing it, the captured carbon could be used to make things. This year for the first time, the IPCC added carbon capture and utilization, or CCU, to its list of options for drawing down atmospheric carbon. CCU captures CO2 and incorporates it into carbon-containing products like cement, jet fuel and the raw materials for making plastics. Still in early stages of development and commercialization, CCU could reduce annual greenhouse gas emissions by 20 billion tons in 2050 — more than half of today’s global emissions, the IPCC estimates.

Such recognition was a big victory for a movement that has struggled to emerge from the shadow of its more established cousin, CCS, says chemist and global CCU expert Peter Styring of the University of Sheffield in England. Many CCU-related companies are springing up and collaborating with each other and with governments around the world, he adds.

The potential of CCU is “enormous,” both in terms of its volume and monetary potential, said mechanical engineer Volker Sick at a CCU conference in Brussels in April. Sick, of the University of Michigan in Ann Arbor, directs the Global CO2 Initiative, which promotes CCU as a mainstream climate solution. “We’re not talking about something that’s nice to do but doesn’t move the needle,” he added. “It moves the needle in many, many aspects.”
The plastics paradox
The use of carbon dioxide in products is not new. CO2 is used to make soda fizzy, keep foods frozen (as dry ice) and convert ammonia to urea for fertilizer. What’s new is the focus on making products with CO2 as a strategy to slow climate change. Today’s CCU market, estimated at $2 billion, could mushroom to $550 billion by 2040, according to Lux Research, a Boston-based market research firm. Much of this market is driven by adding CO2 to cement — which can improve its properties as well as reduce atmospheric carbon — and to jet fuel, which can lower the industry’s large carbon footprint. CO2-to-plastics is a niche market today, but the field aims to battle two crises at once: climate change and plastic pollution.

Plastics are made from fossil fuels, a mix of hydrocarbons formed by the remains of ancient organisms. Most plastics are produced by refining crude oil, which is then broken down into smaller molecules through a process called cracking. These smaller molecules, known as monomers, are the building blocks of polymers. Monomers such as ethylene, propylene, styrene and others are linked together to form plastics such as polyethylene (detergent bottles, toys, rigid pipes), polypropylene (water bottles, luggage, car parts) and polystyrene (plastic cutlery, CD cases, Styrofoam).
But making plastics from fossil fuels is a carbon catastrophe. Each step in the plastics life cycle — extraction, transport, manufacture and disposal — emits massive amounts of greenhouse gases, mostly CO2, according to the Center for International Environmental Law, a nonprofit law firm based in Geneva and Washington, D.C. These emissions alone — more than 850 million tons of greenhouse gases in 2019 — are enough to threaten global climate targets.

And the numbers are about to get much worse. A 2018 report by the Paris-based intergovernmental International Energy Agency projected that global demand for plastics will increase from about 400 million tons in 2020 to nearly 600 million by 2050. Future demand is expected to be concentrated in developing countries and will vastly outstrip global recycling efforts.

Plastics are a serious crisis for the environment, from fossil fuel use to their buildup in landfills and oceans (SN: 1/16/21, p. 4). But we’re a society addicted to plastic and all it gives us — cell phones, computers, comfy Crocs. Is there a way to have our (plastic-wrapped) cake and eat it too?

Yes, says Sick. First, he argues, cap the oil wells. Next, make plastics from aboveground carbon. Today, there are products made with anywhere from 20 to more than 40 percent CO2. Finally, he says, build a circular economy, one that reduces resource use, reuses products, then recycles them into other new products.

“Not only can we eliminate the fossil carbon as a source so that we don’t add to the aboveground carbon budget, but in the process we can also rethink how we make plastics,” Sick says. He suggests they be specifically designed “to live very, very long so that they don’t have to be replaced … or that they decompose in a benign manner.”

But creating plastics from thin air is not easy. CO2 needs to be extracted, from the atmosphere or smokestacks, for example, using specialized equipment. It often needs to be compressed into liquid form and transported, generally through pipelines. Finally, to meet the overall goal of reducing the amount of carbon in the air, the chemical reaction that turns CO2 into the building blocks of plastics must be run with as little extra energy as possible. Keeping energy use low is a special challenge when dealing with the carbon dioxide molecule.

A bond that’s hard to break
There’s a reason that carbon dioxide is such a potent greenhouse gas. It is incredibly stable and can linger in the atmosphere for 300 to 1,000 years. That stability makes CO2 hard to break apart and add to other chemicals. Lots of energy is typically needed for the reaction.

“This is the fundamental energy problem of CO2,” says chemist Ian Tonks of the University of Minnesota in Minneapolis. “Energy is necessary to fix CO2 to plastics. We’re trying to find that energy in creative ways.”

Catalysts offer a possible answer. These substances can increase the rate of a chemical reaction, and thus reduce the need for energy. Scientists in the CO2-to-plastics field have spent more than a decade searching for catalysts that can work at close to room temperature and pressure, and coax CO2 to form a new chemical identity. These efforts fall into two broad categories: chemical and biological conversion.
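The energy argument behind catalysis can be made concrete with the Arrhenius equation, k = A·exp(−Ea/RT): a reaction’s rate falls exponentially as its activation energy Ea rises, so a catalyst that lowers Ea can speed a room-temperature reaction enormously. A rough sketch, using hypothetical activation energies rather than measured values for any real CO2 reaction:

```python
import math

R = 8.314  # gas constant, J/(mol*K)
T = 298.0  # room temperature, K

def rate_constant(ea_kj_per_mol, prefactor=1.0):
    """Arrhenius rate constant k = A * exp(-Ea / RT)."""
    return prefactor * math.exp(-ea_kj_per_mol * 1000 / (R * T))

# Illustrative numbers only: an uncatalyzed route vs. a catalyzed one
k_uncat = rate_constant(120.0)  # kJ/mol
k_cat = rate_constant(80.0)

# A 40 kJ/mol drop in activation energy at room temperature
print(f"speedup: {k_cat / k_uncat:.2e}")
```

With these made-up numbers, shaving 40 kJ/mol off the barrier speeds the reaction by roughly ten millionfold — which is why the search for better catalysts, rather than brute-force heat and pressure, dominates the field.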

First attempts
Early experiments focused on adding CO2 to highly reactive monomers like epoxides to facilitate the reaction. Epoxides are three-membered rings composed of one oxygen atom and two carbon atoms. Like a spring under tension, they can easily pop open. In the early 2000s, industrial chemist Christoph Gürtler and chemist Walter Leitner of RWTH Aachen University in Germany found a zinc catalyst that allowed them to break open the epoxide ring of propylene oxide and combine it with CO2. Following the reaction, the CO2 was joined permanently to the resulting polymer and was no longer in gas form — something that is true of all CO2-to-plastic reactions. Their work resulted in one of the first commercial CO2 products — a polyurethane foam containing 20 percent captured CO2. Today, the German company Covestro, where Gürtler now works, sells 5,000 tons of the product annually in mattresses, car interiors, building insulation and sports flooring.

More recent research has focused on other monomers to expand the variety of CO2-based plastics. Butadiene is a hydrocarbon monomer that can be used to make polyester for clothing, carpets, adhesives and other products.

In 2020, chemist James Eagan at the University of Akron in Ohio mixed butadiene and CO2 with a series of catalysts developed at Stanford University. Eagan hoped to create a polyester that is carbon negative, meaning it has a net effect of removing CO2 from the atmosphere, rather than adding it. When he analyzed the contents of one vial, he discovered he had created something even better: a polyester made with 29 percent CO2 that degrades in high pH water into organic materials.
“Chemistry is like cooking,” Eagan says. “We took chocolate chips, flour, eggs, butter, mixed them up, and instead of getting cookies we opened the oven and found a chicken potpie.”

Eagan’s invention has immediate applications in the recycling industry, where machines can often get gummed up from the nondegradable adhesives used in packaging, soda bottle labels and other products. An adhesive that easily breaks down may improve the efficiency of recycling facilities.

Tonks, described by Eagan as a friendly competitor, took Eagan’s patented process a step further. By putting Eagan’s product through one more reaction, Tonks made the polymer fully degradable back to reusable CO2 — a circular carbon economy goal. Tonks created a start-up this year called LoopCO2 to produce a variety of biodegradable plastics.

Microbial help
Researchers have also harnessed microbes to help turn carbon dioxide into useful materials including dress fabric. Some of the planet’s oldest-living microbes emerged at a time when Earth’s atmosphere was rich in carbon dioxide. Known as acetogens and methanogens, the microbes developed simple metabolic pathways that use enzyme catalysts to convert CO2 and carbon monoxide into organic molecules. In the atmosphere, CO will react with oxygen to form CO2. In the last decade, researchers have studied the microbes’ potential to remove these gases from the atmosphere and turn them into useful products.

LanzaTech, based in Skokie, Ill., uses the acetogenic bacterium Clostridium autoethanogenum to metabolize CO2 and CO emissions into a variety of industrial chemicals, including ethanol. Last year, the clothing company Zara began using LanzaTech’s polyester fabric for a line of dresses.

The ethanol used to create these products comes from LanzaTech’s two commercial facilities in China, the first to transform waste CO, a main emission from steel plants, into ethanol. The ethanol goes through two more steps to become polyester. LanzaTech partnered with steel mills near Beijing and in north-central China, feeding carbon monoxide into LanzaTech’s microbe-filled bioreactor.

Steel production emits almost two tons of CO2 for every ton of steel made. A life cycle assessment study found that LanzaTech’s ethanol production process lowered greenhouse gas emissions by approximately 80 percent compared with ethanol made from fossil fuels.

In February, researchers from LanzaTech, Northwestern University in Evanston, Ill., and others reported in Nature Biotechnology that they had genetically modified the Clostridium bacterium to produce acetone and isopropanol, two other fossil fuel–based industrial chemicals. Company CEO Jennifer Holmgren says the only waste product is dead bacteria, which can be used as compost or animal feed.

Other researchers are skipping the living microbes and just using their catalysts. More than a decade ago, chemist Charles Dismukes of Rutgers University in Piscataway, N.J., began looking at acetogens and methanogens as a way to use atmospheric carbon. He was intrigued by their ability to release energy when making carbon building blocks from CO2, a reaction that usually requires energy. He and his team focused on the bacteria’s nickel phosphide catalysts, which are responsible for the energy-releasing carbon reaction.

Dismukes and colleagues developed six electrocatalysts that are able to make monomers at room temperature and pressure using only CO2, water and electricity. The energy-releasing pathway of the nickel phosphide catalysts “lowers the required voltage to run the reaction, which lowers the energy consumption of the process and improves the carbon footprint,” says Karin Calvinho, a former student of Dismukes who is now chief technical officer at RenewCO2, the start-up Dismukes’ team formed in 2018.

RenewCO2 plans to sell its monomers, including monoethylene glycol, to companies that want to reduce their carbon footprint. The group proved its concept works using CO2 brought into the lab. In the future, the company intends to obtain CO2 from biomass, industrial emissions or direct air capture.
Barriers to change
Yet researchers and companies face challenges in scaling up carbon capture and reuse. Some barriers lurk in the language of regulations written before CCU existed. An example is the U.S. Environmental Protection Agency’s program to provide tax credits to companies that make biofuels. The program is geared toward plant-based fuels like corn and sugarcane. LanzaTech’s approach for making jet fuel doesn’t qualify for credits because bacteria are not plants.

Other barriers are more fundamental. Styring points to the long-standing practice of fossil fuel subsidies, which in 2021 topped $440 billion worldwide. Global government subsidies to the oil and gas industry keep fossil fuel prices artificially low, making it hard for renewables to compete, according to the International Energy Agency. Styring advocates shifting those subsidies toward renewables.

“We try to work on the principle that we recycle carbon and create a circular economy,” he says. “But current legislation is set up to perpetuate a linear economy.”
The happy morning routine that makes the world carbon cleaner is theoretically possible. It’s just not the way the world works yet. Getting to that circular economy, where the amount of carbon above ground is finite and controlled in a never-ending loop of use and reuse, will require change on multiple fronts. Government policy and investment, corporate practices, technological development and human behavior would need to align perfectly and quickly in the interests of the planet.

In the meantime, researchers continue their work on the carbon dioxide molecule.

“I try to plan for the worst-case scenario,” says Eagan, the chemist in Akron. “If legislation is never in place to curb emissions, how do we operate within our capitalist system to generate value in a renewable and responsible way? At the end of the day, we will need new chemistry.”

DNA reveals donkeys were domesticated 7,000 years ago in East Africa

From pulling Mesopotamian war chariots to grinding grain in the Middle Ages, donkeys have carried civilization on their backs for centuries. DNA has now revealed just how ancient humans’ relationship with donkeys really is.

The genetic instruction books of over 200 donkeys from countries around the world show that these beasts of burden were domesticated about 7,000 years ago in East Africa, researchers report in the Sept. 9 Science.

“The history of the donkey has puzzled scientists for years,” says Ludovic Orlando, a molecular archaeologist at the Centre for Anthropobiology and Genomics of Toulouse in France. This discovery shows that donkeys were domesticated in one fell swoop, roughly 3,000 years before horses.
DNA has great potential for unraveling humankind’s shared history with our animal companions. In 2021, Orlando and his colleagues used DNA from the bones of horses to track their domestication to the Eurasian steppes, in what’s now southwestern Russia, more than 4,200 years ago (SN: 10/20/21).

But the history of donkeys (Equus asinus) had remained murky. Today, domesticated donkeys are found all over the globe. A dwindling number of wild asses in Asia and Africa — the closest wild relatives of donkeys — pointed toward one of those continents as the likely donkey homeland.

Archaeological evidence — including a 5,000-year-old Egyptian tablet depicting marching asses, sheep and cattle — zeroed in on Africa as the most probable contender. But genetic studies attempting to pin down when and where donkeys were domesticated have been largely inconclusive.

This was probably because scientists were lacking donkey DNA from many regions of the world, Orlando says. For example, to date, there have been no published genomes from donkeys living south of the equator in Africa. To get a broader diversity of DNA, Orlando and his colleagues gathered 207 genomes from donkeys living in 31 countries, ranging from Brazil to China, along with DNA belonging to 31 donkeys that lived between 4,000 and about 100 years ago.

By comparing these genomes with those of wild asses, the researchers found that all donkeys could trace their lineage back to a single domestication event in East Africa, perhaps in the Horn of Africa, around 5000 B.C. From there, domesticated donkeys spread to the rest of the continent and into Europe and Asia, where they formed genetically distinct groups based on region. Humans have now brought donkeys to nearly every continent on Earth, carrying their genetic legacy with them.

These results add new clarity to the story of donkey domestication, says Emily Clark, a livestock geneticist at the Roslin Institute at the University of Edinburgh. “Donkeys are extraordinary working animals that are essential to the livelihoods of millions of people around the globe,” she says. “As humans, we owe a debt of gratitude to the domestic donkey for the role they play and have played in shaping society.”
Exactly why people chose to tame wild asses in Africa thousands of years ago is unclear. But the timing of their first spread across eastern Africa coincides with a period when the Sahara started becoming more arid and expanded (SN: 5/8/08).

“Donkeys are champions when it comes to carrying stuff and are good at going through deserts,” Orlando says. As the desert grew larger, donkeys could have provided much needed help moving goods across the increasingly dry terrain, he says.

The archaeological record for donkeys in Africa outside of Egypt is sparse. The new result could help archaeologists narrow their search to new areas to learn more about the first donkeys and the people who tamed them, Orlando says.

Meanwhile, digging into the genetic diversity that has allowed donkeys to support human endeavors across a range of environmental conditions could put donkeys in the spotlight as climate change exacerbates droughts and threatens to expand deserts around the world (SN: 3/10/22).

“Donkeys still provide tons of support for people living in low- and middle-income countries,” Orlando says. Understanding humankind’s shared history with donkeys “is not just about the past, but could actually be useful in the future.”

An AI can decode speech from brain activity with surprising accuracy

An artificial intelligence can decode words and sentences from brain activity with surprising — but still limited — accuracy. Using only a few seconds of brain activity data, the AI guesses what a person has heard. It lists the correct answer in its top 10 possibilities up to 73 percent of the time, researchers found in a preliminary study.

The AI’s “performance was above what many people thought was possible at this stage,” says Giovanni Di Liberto, a computer scientist at Trinity College Dublin who was not involved in the research.
Developed at Meta, the parent company of Facebook, the AI could eventually be used to help thousands of people around the world unable to communicate through speech, typing or gestures, researchers report August 25 at arXiv.org. That includes many patients in minimally conscious, locked-in or “vegetative states” — what’s now generally known as unresponsive wakefulness syndrome (SN: 2/8/19).

Most existing technologies to help such patients communicate require risky brain surgeries to implant electrodes. This new approach “could provide a viable path to help patients with communication deficits … without the use of invasive methods,” says neuroscientist Jean-Rémi King, a Meta AI researcher currently at the École Normale Supérieure in Paris.

King and his colleagues trained a computational tool to detect words and sentences on 56,000 hours of speech recordings from 53 languages. The tool, also known as a language model, learned how to recognize specific features of language both at a fine-grained level — think letters or syllables — and at a broader level, such as a word or sentence.

The team applied an AI with this language model to databases from four institutions that included brain activity from 169 volunteers. In these databases, participants listened to various stories and sentences from, for example, Ernest Hemingway’s The Old Man and the Sea and Lewis Carroll’s Alice’s Adventures in Wonderland while the people’s brains were scanned using either magnetoencephalography or electroencephalography. Those techniques measure the magnetic or electrical component of brain signals.

Then with the help of a computational method that helps account for physical differences among actual brains, the team tried to decode what participants had heard using just three seconds of brain activity data from each person. The team instructed the AI to align the speech sounds from the story recordings to patterns of brain activity that the AI computed as corresponding to what people were hearing. It then made predictions about what the person might have been hearing during that short time, given more than 1,000 possibilities.
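The headline number that follows is a top-10 score: a trial counts as decoded if the segment the person actually heard ranks among the model’s ten highest-scoring candidates out of more than 1,000. As an illustrative sketch only — this is not the study’s code, and the scores below are made up — the metric can be computed like this:

```python
import numpy as np

def top_k_accuracy(scores, true_idx, k=10):
    """Fraction of trials where the true candidate ranks in the top k.

    scores: (n_trials, n_candidates) similarity between each brain-activity
            segment and every candidate speech segment (higher = better).
    true_idx: (n_trials,) index of the segment actually heard.
    """
    # Indices of the k highest-scoring candidates for each trial
    top_k = np.argsort(scores, axis=1)[:, -k:]
    hits = [true_idx[i] in top_k[i] for i in range(len(true_idx))]
    return float(np.mean(hits))

# Toy example: 4 trials, 1,000 candidates, with the true candidate's
# score inflated so it tends to land near the top.
rng = np.random.default_rng(0)
scores = rng.normal(size=(4, 1000))
true_idx = np.array([3, 1, 4, 1])
scores[np.arange(4), true_idx] += 3.0  # boost the true segments
print(top_k_accuracy(scores, true_idx, k=10))
```

A top-10 criterion is far more forgiving than requiring the single best guess to be correct, which is worth keeping in mind when reading the 73 percent figure below.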

Using magnetoencephalography, or MEG, the correct answer was in the AI’s top 10 guesses up to 73 percent of the time, the researchers found. With electroencephalography, that value dropped to no more than 30 percent. “[That MEG] performance is very good,” Di Liberto says, but he’s less optimistic about its practical use. “What can we do with it? Nothing. Absolutely nothing.”

The reason, he says, is that MEG requires a bulky and expensive machine. Bringing this technology to clinics will require scientific innovations that make the machines cheaper and easier to use.

It’s also important to understand what “decoding” really means in this study, says Jonathan Brennan, a linguist at the University of Michigan in Ann Arbor. The word is often used to describe the process of deciphering information directly from a source — in this case, speech from brain activity. But the AI could do this only because it was provided a finite list of possible correct answers to make its guesses.

“With language, that’s not going to cut it if we want to scale to practical use, because language is infinite,” Brennan says.

What’s more, Di Liberto says, the AI decoded information of participants passively listening to audio, which is not directly relevant to nonverbal patients. For it to become a meaningful communication tool, scientists will need to learn how to decrypt from brain activity what these patients intend on saying, including expressions of hunger, discomfort or a simple “yes” or “no.”

The new study is “decoding of speech perception, not production,” King agrees. Though speech production is the ultimate goal, for now, “we’re quite a long way away.”