Giant flightless birds called mihirungs were the biggest birds to ever stride across what is now Australia. The animals, which weighed up to hundreds of kilograms, died out about 40,000 years ago. Now researchers might have a better idea why.
The birds may have grown and reproduced too slowly to withstand pressures from humans’ arrival on the continent, researchers report August 17 in the Anatomical Record.
Mihirungs are sometimes called “demon ducks” because of their great size and close evolutionary relationship with present-day waterfowl and game birds. The flightless, plant-eating birds lived for more than 20 million years. Over that time, some species evolved into titans. Take Stirton’s thunderbird (Dromornis stirtoni). It lived about 7 million years ago, stood 3 meters tall and could exceed 500 kilograms in weight, making it the largest-known mihirung and a contender for the largest bird ever to live.
Most research on mihirungs has been on their anatomy and evolutionary relationships with living birds. Little is known about the animals’ biology, such as how long they took to grow and mature, says Anusuya Chinsamy-Turan, a paleobiologist at the University of Cape Town in South Africa.
So Chinsamy-Turan and colleagues at Flinders University in Adelaide, Australia, took samples from 20 fossilized leg bones of D. stirtoni, from animals of varying life stages. “Even after millions of years of fossilization, the microscopic structure of fossil bones generally remains intact,” and it can be used to decipher important clues about extinct animals’ biology, Chinsamy-Turan says.
The team examined the thin bone slices under a microscope, detailing the presence or absence of growth marks. These marks provide information on how fast the bone grew while the birds were alive.
D. stirtoni took 15 years or more to reach full size, the team found. It probably became sexually mature a few years before that, based on the timing of a shift from rapidly growing bone to a slower-growing form that’s thought to be associated with reaching reproductive age.
These results differ from the team’s earlier analysis of the bones of another mihirung, Genyornis newtoni. That species — the last-known mihirung — was less than half the size of D. stirtoni. It lived as recently as about 40,000 years ago and was a contemporary of the continent’s earliest human inhabitants. G. newtoni grew up much faster than its giant relative, reaching adult size in one to two years, then growing a bit more, and possibly reproducing, over the following years.
This difference in developmental pace between mihirung species separated by millions of years may have been an evolved response to Australia’s climate becoming drier and more variable over the last few million years, the researchers say. When resources are unpredictable, growing and reproducing quickly can be advantageous.
Even so, that seeming pep in the developmental step of more recent mihirungs was still slower than that of the emus they lived alongside. Emus grow up quickly, reaching adult size in less than a year and reproducing not long after, laying large numbers of eggs.
This difference may explain why G. newtoni went extinct shortly after hungry humans arrived in Australia, yet emus continue to thrive today, the team says. Although mihirungs as a group evolved to grow and reproduce faster than their ancestors did, the shift wasn’t enough to survive the arrival of humans, who probably ate the birds and their eggs, the researchers conclude.
“Slowly growing animals face dire consequences in terms of their reduced ability to recover from threats in their environments,” Chinsamy-Turan says.
The scientists’ research on other giant, extinct, flightless birds thought to have met their end thanks to humans — such as the dodos of Mauritius (Raphus cucullatus) and the largest of Madagascar’s elephant birds (Vorombe titan) — shows that they too grew relatively slowly (SN: 8/29/17).
“It is very interesting to see this pattern repeating again and again with many large, flightless bird groups,” says Thomas Cullen, a paleoecologist at Carleton University in Ottawa who was not involved with the new study.
Modern ratite birds seem to be the exception in their ability to handle similar pressures, he says. Other ratites besides emus that have survived until the present day — such as cassowaries and ostriches — also grow and reproduce quickly (SN: 4/25/14).
The first image of a black hole may conceal treasure — but physicists disagree about whether it’s been found.
A team of scientists says it has unearthed a photon ring, a thin halo of light around the supermassive black hole in the galaxy M87. If real, the photon ring would provide a new probe of the black hole’s intense gravity. But despite news headlines suggesting the photon ring has been found, many physicists remain unconvinced.

Unveiled in 2019 by scientists with the Event Horizon Telescope, or EHT, the first image of a black hole revealed a doughnut-shaped glow from hot matter swirling around the black hole’s dark silhouette (SN: 4/10/19). But according to Einstein’s general theory of relativity, a thinner ring should be superimposed on that thick doughnut. This ring is produced by photons, or particles of light, that orbit close to the black hole, slung around by the behemoth’s gravity before escaping and zinging toward Earth.
Thanks to this circumnavigation, the photons should provide “a fingerprint of gravity,” more clearly revealing the black hole’s properties, says astrophysicist Avery Broderick of the University of Waterloo and the Perimeter Institute for Theoretical Physics in Canada. He and his colleagues, a subset of scientists from the EHT collaboration, used a new method to tease out that fingerprint, they report in the Aug. 10 Astrophysical Journal.
Creating images with EHT isn’t a simple point-and-shoot affair (SN: 4/10/19). Researchers stitch together data from EHT’s squad of observatories scattered across the globe, using various computational techniques to reconstruct an image. Broderick and colleagues created a new black hole image assuming it featured both a diffuse emission and a thin ring. On three out of four days of observations, the data better matched an image with the added thin ring than one without the ring.
But that method has drawn harsh criticism. “The claim of a photon ring detection is preposterous,” says physicist Sam Gralla of the University of Arizona in Tucson.
A main point of contention: The photon ring is brighter than expected, emitting around 60 percent of the light in the image. According to predictions, it should be more like 20 percent. “That’s a giant red flag,” says physicist Alex Lupsasca of Vanderbilt University in Nashville. More light should come from the black hole’s main glowing doughnut than from the thin photon ring.
This unexpected brightness, Broderick and colleagues say, occurs because some of the light from the main glow gets lumped in with the photon ring. So the ring’s apparent brightness doesn’t depend only on the light coming from the ring. The researchers note that the same effect appeared when testing the method on simulated data.
But that mishmash of purported photon ring light with other light doesn’t make for a very convincing detection, critics say. “If you want to claim that you’ve seen a photon ring, I think you have to do a better job than this,” says astrophysicist Dan Marrone of the University of Arizona, a member of the EHT collaboration who was not a coauthor on the new paper.
The new result suggests only that an added thin ring gives a better match to the data, Marrone says, not whether that shape is associated with the photon ring. So it raises the question of whether scientists are seeing a photon ring at all, or just picking out an unrelated structure in the image.
But Broderick argues that the features of the ring — the fact that its size and location are as expected and are consistent day-to-day — support the photon ring interpretation.
Meanwhile, in a similar, independent analysis, Gralla and physicist Will Lockhart, also of the University of Arizona, find no evidence for a photon ring, they report in a paper submitted August 22 at arXiv.org. Their analysis differed from Broderick and colleagues’ in part because it limited how bright the photon ring could be.
To convincingly detect the photon ring, some scientists propose adding telescopes in space to the EHT’s crew of observatories (SN: 3/18/20). The farther apart the telescopes in the network are, the finer details they may be able to pick out — potentially including the photon ring.
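The resolution gain from longer baselines follows the standard diffraction limit: an interferometer’s finest resolvable angle scales as the observing wavelength divided by the longest baseline. A rough illustration of the idea (the 1.3-millimeter wavelength and roughly Earth-diameter baseline reflect how EHT actually operates; the tripled space baseline is a hypothetical example, not a specific proposal):

```python
import math

def angular_resolution_uas(wavelength_m, baseline_m):
    """Diffraction-limited resolution (lambda / baseline) in microarcseconds."""
    radians = wavelength_m / baseline_m
    return radians * (180 / math.pi) * 3600 * 1e6  # radians -> microarcseconds

WAVELENGTH = 1.3e-3      # EHT observes at about 1.3 millimeters
EARTH_BASELINE = 1.27e7  # roughly Earth's diameter, in meters

# Ground-only EHT: about 21 microarcseconds
ground = angular_resolution_uas(WAVELENGTH, EARTH_BASELINE)

# A hypothetical telescope in a distant orbit triples the baseline,
# and so resolves features three times finer.
space = angular_resolution_uas(WAVELENGTH, 3 * EARTH_BASELINE)

print(f"Ground array: {ground:.1f} microarcseconds")
print(f"With space telescope: {space:.1f} microarcseconds")
```

The same wavelength over a baseline a few times longer is what would let the network separate a thin photon ring from the broad doughnut of emission.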
“If there were a photon ring detection,” Lupsasca says, “that would be the best thing in physics this year, if not for many years.”
As omicron subvariant BA.5 continues to drive the coronavirus’ spread in the United States, I’ve been thinking about what could come next. Omicron and its offshoots have been topping the variant charts since last winter. Before that, delta reigned.
Scientists have a few ideas for how new variants emerge. One involves people with persistent infections — people who test positive for the virus over a prolonged period of time. I’m going to tell you about the curious case of a person infected with SARS-CoV-2 for at least 471 days and what can happen when infections roil away uncontrolled.

That lengthy infection first came onto epidemiologist Nathan Grubaugh’s radar in the summer of 2021. His team had been analyzing coronavirus strains in patient samples from Yale New Haven Hospital when Grubaugh spotted something he had seen before. Known only as B.1.517, this version of the virus never got a name like delta or omicron, nor did it rampage through communities quite like its infamous relatives.
Instead, after springing up somewhere in North America in early 2020, B.1.517 tooled around in a handful of regions around the world, even sparking an outbreak in Australia. But after April 2021, B.1.517 seemed to sputter, one of the who-knows-how-many viral lineages that flare up and then eventually fizzle.
B.1.517 might have been long forgotten, shouldered aside by the latest variant to stake a claim in local communities. “And yet we were still seeing it,” Grubaugh says. Even after B.1.517 had petered out across the country, his team noticed it cropping up in patient samples. The same lineage, every few weeks, like clockwork, for months.
One clue was the samples’ specimen ID. The code on the B.1.517 samples was always the same, Grubaugh’s team noticed. They had all come from a single patient.
That patient, a person in their 60s with a history of cancer, relapsed in November of 2020. That was right around when they first tested positive for SARS-CoV-2. After seeing B.1.517 show up again and again in their samples, Grubaugh worked with a clinician to get the patient’s permission to analyze their data. All told, the patient has remained infected for 471 days (and counting), Grubaugh, Yale postdoctoral researcher Chrispin Chaguza and their team reported last month in a preliminary study posted at medRxiv.org. Because of deteriorating health and a desire to maintain their anonymity, the patient was not willing to be interviewed, and Grubaugh has no direct contact with them.
But all those samples collected over all those days told an incredible tale of viral evolution. Over about 15 months, at least three genetically distinct versions of the virus had rapidly evolved inside the patient, the team’s analyses suggested.
Each version had dozens of mutations and seemed to coexist in the patient’s body. “Honestly, if any one of these were to emerge in a population and begin transmitting, we would be calling it a new variant,” Grubaugh says.
That scenario is probably rare, he says. After all, lots of prolonged infections have likely occurred during the pandemic, and only a handful of concerning variants have emerged. But the work does suggest that persistent viral infections can provide a playground for speedy evolutionary experimentation — perhaps taking advantage of weakened immune systems.
Grubaugh’s work is “probably the most detailed look we’ve had at a single, persistent infection with SARS-CoV-2 so far,” says Tom Friedrich, a virologist at the University of Wisconsin–Madison, who was not involved with the work. The study supports an earlier finding about a different immunocompromised patient — one with a persistent omicron infection. In that work, researchers documented the evolution of the virus over 12 weeks and showed that its descendant infected at least five other people.
Together, the studies lay out how such infections could potentially drive the emergence of the next omicron.
“I am pretty well convinced that people with persistent infection are important sources of new variants,” Friedrich says.
Who exactly develops these infections remains mysterious. Yes, the virus can pummel people with weakened immune systems, but “not every immunocompromised person develops a persistent infection,” says Viviana Simon, a virologist at the Icahn School of Medicine at Mount Sinai who worked on the omicron infection study.
In fact, doctors and scientists have no idea how common these infections are. “We just don’t really have the numbers,” Simon says. That’s a huge gap for researchers, and something Mount Sinai’s Pathogen Surveillance Program is trying to address by analyzing real-time infection data.
Studying patients with prolonged infections could also tell scientists where SARS-CoV-2 evolution is heading, Friedrich says. Just because the virus evolves within a person doesn’t mean it will spread to other people. But if certain viral mutations tend to arise in multiple people with persistent infections, that could hint that the next big variant might evolve in a similar way. Knowing more about these mutation patterns could help researchers forecast what’s to come, an important step in designing future coronavirus vaccine boosters. Beyond viral forecasting, Grubaugh says identifying people with prolonged infections is important so doctors can provide care. “We need to give them access to vaccines, monoclonal antibodies and antiviral drugs,” he says. Those treatments could help patients clear their infections.
But identifying persistent infections is easier said than done, he points out. Many places in the world aren’t set up to spot these infections and don’t have access to vaccines or treatments. And even when these are available, some patients opt out. The patient in Grubaugh’s study received a monoclonal antibody infusion about 100 days into their infection, then refused all other treatments. They have not been vaccinated.
Though the patient remained infectious over the course of the study, their variants never spread to the community, as far as Grubaugh knows.
And while untreated chronic infections might spawn new variants, they could emerge in other ways, too, like from animals infected with the virus, from person-to-person transmission in groups of people scientists haven’t been monitoring, or from “something else that maybe none of us has thought of yet,” he says. “SARS-CoV-2 has continued to surprise us with its evolution.”
It’s morning and you wake on a comfortable foam mattress made partly from greenhouse gas. You pull on a T-shirt and sneakers containing carbon dioxide pulled from factory emissions. After a good run, you stop for a cup of joe and guiltlessly toss the plastic cup in the trash, confident it will fully biodegrade into harmless organic materials. At home, you squeeze shampoo from a bottle that has lived many lifetimes, then slip into a dress fashioned from smokestack emissions. You head to work with a smile, knowing your morning routine has made Earth’s atmosphere a teeny bit carbon cleaner.
Sound like a dream? Hardly. These products are already sold around the world. And others are being developed. They’re part of a growing effort by academia and industry to reduce the damage caused by centuries of human activity that has sent CO2 and other heat-trapping gases into the atmosphere.

The need for action is urgent. In its 2022 report, the United Nations Intergovernmental Panel on Climate Change, or IPCC, stated that rising temperatures have already caused irreversible damage to the planet and increased human death and disease (SN: 5/7/22 & 5/21/22, p. 8). Meanwhile, the amount of CO2 emitted continues to rise. The U.S. Energy Information Administration predicted last year that if current policy and growth trends continue, annual global CO2 emissions could rise from about 34 billion metric tons in 2020 to almost 43 billion by 2050.
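That projection implies a small but steadily compounding growth rate. A quick back-of-the-envelope check (the 34- and 43-billion-ton figures come from the EIA projection above; the growth-rate arithmetic is our own illustration):

```python
# Implied compound annual growth rate (CAGR) of global CO2 emissions
# under the EIA projection cited above.
emissions_2020 = 34e9  # metric tons of CO2 per year, approximate
emissions_2050 = 43e9  # metric tons of CO2 per year, projected
years = 2050 - 2020

cagr = (emissions_2050 / emissions_2020) ** (1 / years) - 1
print(f"Implied growth rate: {cagr * 100:.2f}% per year")  # under 1% per year
```

Even a sub-1-percent annual rise, compounded over three decades, adds roughly 9 billion tons to yearly emissions.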
Carbon capture and storage, or CCS, is one strategy for mitigating climate change long noted by the IPCC as having “considerable” potential. A technology that has existed since the 1970s, CCS traps CO2 from smokestacks or ambient air and pumps it underground for permanent sequestration. Today, 27 CCS facilities operate around the world — 12 in the United States — storing an estimated 36 million tons of carbon per year, according to the Global CCS Institute. The 2021 Infrastructure Investment and Jobs Act includes $3.5 billion in funding for four additional U.S. direct capture facilities.
But rather than just storing it, the captured carbon could be used to make things. This year for the first time, the IPCC added carbon capture and utilization, or CCU, to its list of options for drawing down atmospheric carbon. CCU captures CO2 and incorporates it into carbon-containing products like cement, jet fuel and the raw materials for making plastics. Still in early stages of development and commercialization, CCU could reduce annual greenhouse gas emissions by 20 billion tons in 2050 — more than half of today’s global emissions, the IPCC estimates.
Such recognition was a big victory for a movement that has struggled to emerge from the shadow of its more established cousin, CCS, says chemist and global CCU expert Peter Styring of the University of Sheffield in England. Many CCU-related companies are springing up and collaborating with each other and with governments around the world, he adds.
The potential of CCU is “enormous,” both in volume and in monetary terms, said mechanical engineer Volker Sick at a CCU conference in Brussels in April. Sick, of the University of Michigan in Ann Arbor, directs the Global CO2 Initiative, which promotes CCU as a mainstream climate solution. “We’re not talking about something that’s nice to do but doesn’t move the needle,” he added. “It moves the needle in many, many aspects.”

The plastics paradox

The use of carbon dioxide in products is not new. CO2 is used to make soda fizzy, keep foods frozen (as dry ice) and convert ammonia to urea for fertilizer. What’s new is the focus on making products with CO2 as a strategy to slow climate change. Today’s CCU market, estimated at $2 billion, could mushroom to $550 billion by 2040, according to Lux Research, a Boston-based market research firm. Much of this market is driven by adding CO2 to cement — which can improve its properties as well as reduce atmospheric carbon — and to jet fuel, which can lower the industry’s large carbon footprint. CO2-to-plastics is a niche market today, but the field aims to battle two crises at once: climate change and plastic pollution.
Plastics are made from fossil fuels, a mix of hydrocarbons formed from the remains of ancient organisms. Most plastics are produced by refining crude oil, which is then broken down into smaller molecules through a process called cracking. These smaller molecules, known as monomers, are the building blocks of polymers. Monomers such as ethylene, propylene, styrene and others are linked together to form plastics such as polyethylene (detergent bottles, toys, rigid pipes), polypropylene (water bottles, luggage, car parts) and polystyrene (plastic cutlery, CD cases, Styrofoam).

But making plastics from fossil fuels is a carbon catastrophe. Each step in the plastics life cycle — extraction, transport, manufacture and disposal — emits massive amounts of greenhouse gases, mostly CO2, according to the Center for International Environmental Law, a nonprofit law firm based in Geneva and Washington, D.C. These emissions alone — more than 850 million tons of greenhouse gases in 2019 — are enough to threaten global climate targets.
And the numbers are about to get much worse. A 2018 report by the Paris-based intergovernmental International Energy Agency projected that global demand for plastics will increase from about 400 million tons in 2020 to nearly 600 million by 2050. Future demand is expected to be concentrated in developing countries and will vastly outstrip global recycling efforts.
Plastics are a serious crisis for the environment, from fossil fuel use to their buildup in landfills and oceans (SN: 1/16/21, p. 4). But we’re a society addicted to plastic and all it gives us — cell phones, computers, comfy Crocs. Is there a way to have our (plastic-wrapped) cake and eat it too?
Yes, says Sick. First, he argues, cap the oil wells. Next, make plastics from aboveground carbon. Today, there are products made of 20 to over 40 percent CO2. Finally, he says, build a circular economy, one that reduces resource use, reuses products, then recycles them into other new products.
“Not only can we eliminate the fossil carbon as a source so that we don’t add to the aboveground carbon budget, but in the process we can also rethink how we make plastics,” Sick says. He suggests they be specifically designed “to live very, very long so that they don’t have to be replaced … or that they decompose in a benign manner.”
But creating plastics from thin air is not easy. CO2 needs to be extracted, from the atmosphere or smokestacks, for example, using specialized equipment. It often needs to be compressed into liquid form and transported, generally through pipelines. Finally, to meet the overall goal of reducing the amount of carbon in the air, the chemical reaction that turns CO2 into the building blocks of plastics must be run with as little extra energy as possible. Keeping energy use low is a special challenge when dealing with the carbon dioxide molecule.
A bond that’s hard to break

There’s a reason that carbon dioxide is such a potent greenhouse gas. It is incredibly stable and can linger in the atmosphere for 300 to 1,000 years. That stability makes CO2 hard to break apart and add to other chemicals. Lots of energy is typically needed for the reaction.
“This is the fundamental energy problem of CO2,” says chemist Ian Tonks of the University of Minnesota in Minneapolis. “Energy is necessary to fix CO2 to plastics. We’re trying to find that energy in creative ways.”
Catalysts offer a possible answer. These substances can increase the rate of a chemical reaction, and thus reduce the need for energy. Scientists in the CO2-to-plastics field have spent more than a decade searching for catalysts that can work at close to room temperature and pressure, and coax CO2 to form a new chemical identity. These efforts fall into two broad categories: chemical and biological conversion.
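The payoff from a catalyst comes from the Arrhenius relation, in which reaction rate depends exponentially on the activation energy: lowering the barrier even modestly speeds the reaction enormously at room temperature. A generic sketch (the activation energies and prefactor below are made-up illustrative numbers, not measured values for any actual CO2 reaction):

```python
import math

R = 8.314  # molar gas constant, J/(mol*K)

def rate_constant(prefactor, activation_energy_j_mol, temperature_k):
    """Arrhenius equation: k = A * exp(-Ea / (R * T))."""
    return prefactor * math.exp(-activation_energy_j_mol / (R * temperature_k))

T_ROOM = 298  # kelvin

# Illustrative, made-up activation energies in J/mol:
uncatalyzed = rate_constant(1e13, 120_000, T_ROOM)  # high barrier, sluggish
catalyzed = rate_constant(1e13, 80_000, T_ROOM)     # catalyst lowers the barrier

speedup = catalyzed / uncatalyzed
print(f"Rate increase from lowering the barrier: {speedup:.1e}x")
```

In this toy example, shaving a third off the barrier speeds the room-temperature reaction by millions of times, which is why the hunt for the right catalyst dominates the field.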
First attempts

Early experiments focused on adding CO2 to highly reactive monomers like epoxides to facilitate the reaction. Epoxides are three-membered rings composed of one oxygen atom and two carbon atoms. Like a spring under tension, they can easily pop open. In the early 2000s, industrial chemist Christoph Gürtler and chemist Walter Leitner of Aachen University in Germany found a zinc catalyst that allowed them to break open the epoxide ring of propylene oxide and combine it with CO2. Following the reaction, the CO2 was joined permanently into the resulting polymer and was no longer in gas form — something that is true of all CO2-to-plastic reactions. Their work resulted in one of the first commercial CO2 products — a polyurethane foam containing 20 percent captured CO2. Today, the German company Covestro, where Gürtler now works, sells 5,000 tons of the product annually in mattresses, car interiors, building insulation and sports flooring.
More recent research has focused on other monomers to expand the variety of CO2-based plastics. Butadiene is a hydrocarbon monomer that can be used to make polyester for clothing, carpets, adhesives and other products.
In 2020, chemist James Eagan at the University of Akron in Ohio mixed butadiene and CO2 with a series of catalysts developed at Stanford University. Eagan hoped to create a polyester that is carbon negative, meaning it has a net effect of removing CO2 from the atmosphere, rather than adding it. When he analyzed the contents of one vial, he discovered he had created something even better: a polyester made with 29 percent CO2 that degrades in high pH water into organic materials. “Chemistry is like cooking,” Eagan says. “We took chocolate chips, flour, eggs, butter, mixed them up, and instead of getting cookies we opened the oven and found a chicken potpie.”
Eagan’s invention has immediate applications in the recycling industry, where machines can often get gummed up from the nondegradable adhesives used in packaging, soda bottle labels and other products. An adhesive that easily breaks down may improve the efficiency of recycling facilities.
Tonks, described by Eagan as a friendly competitor, took Eagan’s patented process a step further. By putting Eagan’s product through one more reaction, Tonks made the polymer fully degradable back to reusable CO2 — a circular carbon economy goal. Tonks created a start-up this year called LoopCO2 to produce a variety of biodegradable plastics.
Microbial help

Researchers have also harnessed microbes to help turn carbon dioxide into useful materials, including dress fabric. Some of the planet’s oldest-living microbes emerged at a time when Earth’s atmosphere was rich in carbon dioxide. Known as acetogens and methanogens, these microbes developed simple metabolic pathways that use enzyme catalysts to convert CO2 and carbon monoxide into organic molecules. (In the atmosphere, CO reacts with oxygen to form CO2.) In the last decade, researchers have studied the microbes’ potential to remove these gases from the atmosphere and turn them into useful products.
LanzaTech, based in Skokie, Ill., uses the acetogenic bacterium Clostridium autoethanogenum to metabolize CO2 and CO emissions into a variety of industrial chemicals, including ethanol. Last year, the clothing company Zara began using LanzaTech’s polyester fabric for a line of dresses.
The ethanol used to create these products comes from LanzaTech’s two commercial facilities in China, the first to transform waste CO, a main emission from steel plants, into ethanol. The ethanol goes through two more steps to become polyester. LanzaTech partnered with steel mills near Beijing and in north-central China, feeding carbon monoxide into LanzaTech’s microbe-filled bioreactor.
Steel production emits almost two tons of CO2 for every ton of steel made. By contrast, a life cycle assessment study found that LanzaTech’s ethanol production process lowered greenhouse gas emissions by approximately 80 percent compared with ethanol made from fossil fuels.
In February, researchers from LanzaTech, Northwestern University in Evanston, Ill., and others reported in Nature Biotechnology that they had genetically modified the Clostridium bacterium to produce acetone and isopropanol, two other fossil fuel–based industrial chemicals. Company CEO Jennifer Holmgren says the only waste product is dead bacteria, which can be used as compost or animal feed.
Other researchers are skipping the living microbes and just using their catalysts. More than a decade ago, chemist Charles Dismukes of Rutgers University in Piscataway, N.J., began looking at acetogens and methanogens as a way to use atmospheric carbon. He was intrigued by their ability to release energy when making carbon building blocks from CO2, a reaction that usually requires energy. He and his team focused on the bacteria’s nickel phosphide catalysts, which are responsible for the energy-releasing carbon reaction.
Dismukes and colleagues developed six electrocatalysts that are able to make monomers at room temperature and pressure using only CO2, water and electricity. The energy-releasing pathway of the nickel phosphide catalysts “lowers the required voltage to run the reaction, which lowers the energy consumption of the process and improves the carbon footprint,” says Karin Calvinho, a former student of Dismukes who is now chief technical officer at RenewCO2, the start-up Dismukes’ team formed in 2018.
RenewCO2 plans to sell its monomers, including monoethylene glycol, to companies that want to reduce their carbon footprint. The group proved its concept works using CO2 brought into the lab. In the future, the company intends to obtain CO2 from biomass, industrial emissions or direct air capture.

Barriers to change

Yet researchers and companies face challenges in scaling up carbon capture and reuse. Some barriers lurk in the language of regulations written before CCU existed. An example is the U.S. Environmental Protection Agency’s program to provide tax credits to companies that make biofuels. The program is geared toward plant-based fuels like corn and sugarcane. LanzaTech’s approach for making jet fuel doesn’t qualify for credits because bacteria are not plants.
Other barriers are more fundamental. Styring points to the long-standing practice of fossil fuel subsidies, which in 2021 topped $440 billion worldwide. Global government subsidies to the oil and gas industry keep fossil fuel prices artificially low, making it hard for renewables to compete, according to the International Energy Agency. Styring advocates shifting those subsidies toward renewables.
“We try to work on the principle that we recycle carbon and create a circular economy,” he says. “But current legislation is set up to perpetuate a linear economy.”

The happy morning routine that makes the world carbon cleaner is theoretically possible. It’s just not the way the world works yet. Getting to that circular economy, where the amount of carbon above ground is finite and controlled in a never-ending loop of use and reuse, will require change on multiple fronts. Government policy and investment, corporate practices, technological development and human behavior would need to align perfectly and quickly in the interests of the planet.
In the meantime, researchers continue their work on the carbon dioxide molecule.
“I try to plan for the worst-case scenario,” says Eagan, the chemist in Akron. “If legislation is never in place to curb emissions, how do we operate within our capitalist system to generate value in a renewable and responsible way? At the end of the day, we will need new chemistry.”
From pulling Mesopotamian war chariots to grinding grain in the Middle Ages, donkeys have carried civilization on their backs for centuries. DNA has now revealed just how ancient humans’ relationship with donkeys really is.
The genetic instruction books of over 200 donkeys from countries around the world show that these beasts of burden were domesticated about 7,000 years ago in East Africa, researchers report in the Sept. 9 Science.
“The history of the donkey has puzzled scientists for years,” says Ludovic Orlando, a molecular archaeologist at the Centre for Anthropobiology and Genomics of Toulouse in France. This discovery shows that donkeys were domesticated in one fell swoop, roughly 3,000 years before horses.

DNA has great potential for unraveling humankind’s shared history with our animal companions. In 2021, Orlando and his colleagues used DNA from the bones of horses to track their domestication to the Eurasian steppes, in what’s now southwestern Russia, more than 4,200 years ago (SN: 10/20/21).
But the history of donkeys (Equus asinus) had remained murky. Today, domesticated donkeys are found all over the globe. A dwindling number of wild asses in Asia and Africa — the closest wild relatives of donkeys — pointed toward one of those continents as the likely donkey homeland.
Archaeological evidence — including a 5,000-year-old Egyptian tablet depicting marching asses, sheep and cattle — zeroed in on Africa as the most probable contender. But genetic studies attempting to pin down when and where donkeys were domesticated have been largely inconclusive.
This was probably because scientists were lacking donkey DNA from many regions of the world, Orlando says. For example, to date, there have been no published genomes from donkeys living south of the equator in Africa. To get a broader diversity of DNA, Orlando and his colleagues gathered 207 genomes from donkeys living in 31 countries, ranging from Brazil to China, along with DNA belonging to 31 donkeys that lived between 4,000 and about 100 years ago.
By comparing these genomes with those of wild asses, the researchers found that all donkeys could trace their lineage back to a single domestication event in East Africa, perhaps in the Horn of Africa, around 5000 B.C. From there, domesticated donkeys spread to the rest of the continent and into Europe and Asia, where they formed genetically distinct groups based on region. Humans have now brought donkeys to nearly every continent on Earth, carrying their genetic legacy with them.
These results add new clarity to the story of donkey domestication, says Emily Clark, a livestock geneticist at the Roslin Institute at the University of Edinburgh. “Donkeys are extraordinary working animals that are essential to the livelihoods of millions of people around the globe,” she says. “As humans, we owe a debt of gratitude to the domestic donkey for the role they play and have played in shaping society.” Exactly why people chose to tame wild asses in Africa thousands of years ago is unclear. But the timing of their first spread across eastern Africa coincides with a period when the Sahara started becoming more arid and expanded (SN: 5/8/08).
“Donkeys are champions when it comes to carrying stuff and are good at going through deserts,” Orlando says. As the desert grew larger, donkeys could have provided much needed help moving goods across the increasingly dry terrain, he says.
The archaeological record for donkeys in Africa outside of Egypt is sparse. The new result could help archaeologists narrow their search to new areas to learn more about the first donkeys and the people who tamed them, Orlando says.
Meanwhile, digging into the genetic diversity that has allowed donkeys to support human endeavors across a range of environmental conditions could put donkeys in the spotlight as climate change exacerbates droughts and threatens to expand deserts around the world (SN: 3/10/22).
“Donkeys still provide tons of support for people living in low- and middle-income countries,” Orlando says. Understanding humankind’s shared history with donkeys “is not just about the past, but could actually be useful in the future.”
An artificial intelligence can decode words and sentences from brain activity with surprising — but still limited — accuracy. Using only a few seconds of brain activity data, the AI guesses what a person has heard. It lists the correct answer in its top 10 possibilities up to 73 percent of the time, researchers found in a preliminary study.
The AI’s “performance was above what many people thought was possible at this stage,” says Giovanni Di Liberto, a computer scientist at Trinity College Dublin who was not involved in the research. Developed at the parent company of Facebook, Meta, the AI could eventually be used to help thousands of people around the world unable to communicate through speech, typing or gestures, researchers report August 25 at arXiv.org. That includes many patients in minimally conscious, locked-in or “vegetative states” — what’s now generally known as unresponsive wakefulness syndrome (SN: 2/8/19).
Most existing technologies to help such patients communicate require risky brain surgeries to implant electrodes. This new approach “could provide a viable path to help patients with communication deficits … without the use of invasive methods,” says neuroscientist Jean-Rémi King, a Meta AI researcher currently at the École Normale Supérieure in Paris.
King and his colleagues trained a computational tool to detect words and sentences on 56,000 hours of speech recordings from 53 languages. The tool, also known as a language model, learned how to recognize specific features of language both at a fine-grained level — think letters or syllables — and at a broader level, such as a word or sentence.
The team applied an AI with this language model to databases from four institutions that included brain activity from 169 volunteers. In these databases, participants listened to various stories and sentences from, for example, Ernest Hemingway’s The Old Man and the Sea and Lewis Carroll’s Alice’s Adventures in Wonderland while the people’s brains were scanned using either magnetoencephalography or electroencephalography. Those techniques measure the magnetic or electrical component of brain signals.
Then with the help of a computational method that helps account for physical differences among actual brains, the team tried to decode what participants had heard using just three seconds of brain activity data from each person. The team instructed the AI to align the speech sounds from the story recordings to patterns of brain activity that the AI computed as corresponding to what people were hearing. It then made predictions about what the person might have been hearing during that short time, given more than 1,000 possibilities.
Using magnetoencephalography, or MEG, the correct answer was in the AI’s top 10 guesses up to 73 percent of the time, the researchers found. With electroencephalography, that value dropped to no more than 30 percent. “[That MEG] performance is very good,” Di Liberto says, but he’s less optimistic about its practical use. “What can we do with it? Nothing. Absolutely nothing.”
The reason, he says, is that MEG requires a bulky and expensive machine. Bringing this technology to clinics will require scientific innovations that make the machines cheaper and easier to use.
It’s also important to understand what “decoding” really means in this study, says Jonathan Brennan, a linguist at the University of Michigan in Ann Arbor. The word is often used to describe the process of deciphering information directly from a source — in this case, speech from brain activity. But the AI could do this only because it was provided a finite list of possible correct answers to make its guesses.
“With language, that’s not going to cut it if we want to scale to practical use, because language is infinite,” Brennan says.
What’s more, Di Liberto says, the AI decoded information of participants passively listening to audio, which is not directly relevant to nonverbal patients. For it to become a meaningful communication tool, scientists will need to learn how to decrypt from brain activity what these patients intend on saying, including expressions of hunger, discomfort or a simple “yes” or “no.”
The new study is “decoding of speech perception, not production,” King agrees. Though speech production is the ultimate goal, for now, “we’re quite a long way away.”
Biggest bacteria? A newfound species of bacteria, Thiomargarita magnifica, averages 1 centimeter long and can be seen by the naked eye, making it the largest bacteria yet discovered, Erin Garcia de Jesús reported in “Newfound bacteria make a big splash” (SN: 7/16/22 & 7/30/22, p. 17).
Reader J.C. Smith pointed out that another article in the magazine seems to contradict the findings in this story. In “Live wires,” Nikk Ogasa reported that cable bacteria, which channel electricity, can grow up to 5 centimeters long (SN: 7/16/22 & 7/30/22, p. 24).
This is no contradiction, Garcia de Jesús says. T. magnifica is a single-celled species of bacteria, which means all of the cellular functions necessary for the organism’s survival happen within its one cell. Cable bacteria, on the other hand, are multicellular, with different cells performing different functions. “T. magnifica is the largest single-celled bacterium ever found,” Garcia de Jesús says.
Given that bacteria are typically defined as single-celled organisms, reader Barry Maletzky wondered how multicellular cable bacteria can be considered part of the group.
Most bacteria are single-celled, Ogasa says, but several multicellular species do exist. “For instance, some cyanobacteria, sometimes called blue-green algae, are also multicellular. That allows the organisms to split the jobs of photosynthesis and nitrogen absorption between cells.”
Mapping out space Massive objects that warp spacetime can redirect gravitational waves. Researchers might someday leverage those waves as a kind of gravity “radar” to peer inside stars and find globs of dark matter, Asa Stahl reported in “Gravitational wave ‘radar’ could map the universe” (SN: 7/16/22 & 7/30/22, p. 12).
Reader Neil Kaminar wondered if changes in the frequency of light coming from massive objects could be used to detect the distortion of spacetime.
In theory, yes, says Glenn Starkman, a physicist at Case Western Reserve University in Cleveland. When light travels through spacetime toward or away from a massive object, gravity changes the frequency of the light, he says. Scientists have witnessed one form of this process, called gravitational redshift, in action on Earth.
But this effect would probably not be very useful when it comes to gravitational radar, Starkman says. After light moves toward a massive object, changing its frequency, it would then move away from the object. That process would shift the light’s frequency toward what it was before the encounter, mostly canceling out the effect, Starkman says.
Science and society In “We won’t shy away from covering politicized science,” editor in chief Nancy Shute reflected on Science News’ history of reporting on the science of politically contentious issues and asserted our commitment to continue that coverage (SN: 7/16/22 & 7/30/22, p. 2).
Brigitte Dempsey was glad to read Shute’s editor’s note in the wake of the U.S. Supreme Court striking down Roe v. Wade, the landmark decision that had protected a person’s right to an abortion. Since then, debates around abortion and pregnancy biology have become more heated, and accurate science is often missing from the discussions (SN: 7/16/22 & 7/30/22, p. 6). “Bravo for meeting the issue square on,” Dempsey wrote. “Our only hope to bring reason to bear … is to let science speak.”
A particular kind of poliovirus is spreading in the United States. The U.S. Centers for Disease Control and Prevention has confirmed that the country now joins a list of around 30 other countries where circulation of the virus has been identified. Those countries include the United Kingdom, Israel, Egypt, Yemen and around two dozen in Africa.
The news, announced September 13, comes after the identification in July of a case of paralytic polio in an unvaccinated adult in Rockland County in New York. Public health officials found the case was caused by what’s called a vaccine-derived poliovirus (find out more about this kind of poliovirus below). This spurred wastewater surveillance in Rockland and the surrounding counties, because people shed poliovirus in their stool. The wastewater samples showed that the virus was spreading in Rockland and neighboring areas. In response, New York Governor Kathy Hochul declared a state of emergency on September 9 to expand access to polio vaccination statewide. Three of the counties where poliovirus has been detected in wastewater — Rockland, Orange and Sullivan — have polio vaccination rates of only around 60 percent. The virus has also turned up in New York City and Nassau County.
While most people infected with polio don’t have symptoms, some might feel like they have the flu, with fever, fatigue or a sore throat. In rare cases, the virus can cause permanent paralysis, and the disease can turn deadly if that paralysis hits the muscles that control breathing or swallowing. Anyone unvaccinated is at risk of paralytic polio if they get infected.
Widespread vaccination efforts helped eliminate wild polioviruses from the United States in 1979, but public health officials are still working toward eradicating the disease globally (SN: 9/12/19). The new worries about polio in the United States are driven by vaccine-derived versions of the virus spreading in areas with low vaccination.
Here are six things to know about polio right now.
There are two types of polio vaccines. What’s the difference? Polio vaccines come as a shot, given in the arm or leg, or a liquid given orally. These vaccines provide protection against wild poliovirus and vaccine-derived poliovirus. Both polio vaccines used to be given in the United States, but since 2000, the shot has been the only polio vaccine available in the country (SN: 10/27/21).
The shot is an inactivated vaccine given as part of routine childhood vaccinations in the United States. It is made using poliovirus that has been “killed,” stripped of its ability to cause disease. Kids receive a total of four shots. The inactivated vaccine protects against paralysis.
The oral vaccine, still used in many countries, is an attenuated vaccine, made with live but weakened poliovirus. This vaccine can help prevent wild poliovirus from being passed along further if a vaccinated person drinks water or eats food that has been contaminated with stool containing the pathogen. That means it can prevent the spread of poliovirus in a community while also protecting against paralysis (SN: 1/8/21).
But because these attenuated versions can replicate, the virus can spread from cell to cell and possibly to other people. Which leads us to the next question.
What are vaccine-derived polioviruses? These viruses are related to the oral vaccine. Since the viruses used in the vaccine can replicate, they can spread but they’re too weakened to cause serious disease. The problem comes when an attenuated virus from the oral vaccine spreads among too many people and regains its ability to cause paralysis, says Adam Lauring, a virologist and infectious diseases physician at the University of Michigan in Ann Arbor. “Because it can replicate, it will evolve.”
In a community with low or no vaccination against polio, such vaccine-derived polioviruses can cause disease.
So why do some countries still use the oral vaccine? The Global Polio Eradication Initiative, which includes the World Health Organization, CDC, United Nations Children’s Fund and other groups, has been working since 1988 to eradicate polio. The oral vaccine has been a key tool for global efforts to get rid of polio, Lauring says. That’s not only because that vaccine is inexpensive and easy to use in low- and middle-income countries, but also because studies suggest it better protects the gut, the part of the body where the virus grows. The more protected the gut, the better the chances of reducing transmission and stopping an outbreak.
What does it mean that poliovirus is being detected in wastewater? It’s a sign that poliovirus is spreading among people in those regions.
Paralysis from poliovirus is rare — affecting around 1 out of 200 infected people. So the single paralytic case identified in July in New York was already a hint that there may have been hundreds of other infections. The virus has since been detected in wastewater samples from as early as May. The virus’s continued presence in wastewater suggests people are still getting infected and passing it on to others. Should unvaccinated people get vaccinated? Yes. “If you don’t know if you received polio shots, then you should probably get your polio shots,” Lauring says. “If you didn’t [get vaccinated], you should get a polio shot.”
Vaccine-derived polioviruses are largely a problem in communities where not enough people are vaccinated. “That’s one piece of the puzzle of what’s been going on in New York,” Lauring says. Low immunization rates mean vaccine-derived viruses can spread, largely among unvaccinated people, and circulate silently before someone gets sick.
Places that have sanitation issues or struggle with other intestinal diseases are also hot spots for vaccine-derived polioviruses. When there’s not enough immunity to stop poliovirus from circulating, the virus can evolve further.
What about people who got vaccinated as kids? People who got vaccinated, even decades ago, are likely still protected.
Adults who have a high risk of exposure to the virus are eligible for one lifetime booster shot, according to the CDC. Otherwise, people should make sure they received all the recommended doses.
It’s unknown exactly how durable childhood polio vaccines are at protecting against severe disease in adults. With little polio circulating around the world, it’s a hard question to study, Lauring says. Still, for years there haven’t been any cases of polio in the United States, and we’ve largely had immunity from the inactivated vaccine, he says. “I’m not sounding the alarm bells.”
Hunter-gatherer groups living in southwest Asia may have started keeping and caring for animals nearly 13,000 years ago — roughly 2,000 years earlier than previously thought.
Ancient plant samples extracted from present-day Syria show hints of charred dung, indicating that people were burning animal droppings by the end of the Old Stone Age, researchers report September 14 in PLOS One. The findings suggest humans were using the dung as fuel and may have started animal tending during or even before the transition to agriculture. But what animals produced the dung and the exact nature of the animal-human relationship remain unclear. “We know today that dung fuel is a valuable resource, but it hasn’t really been documented prior to the Neolithic,” says Alexia Smith, an archaeobotanist at the University of Connecticut in Storrs (SN: 8/5/03).
Smith and her colleagues reexamined 43 plant samples taken in the 1970s from a residential dwelling at Abu Hureyra, an archaeological site now lost under the Tabqa Dam reservoir. The samples date from roughly 13,300 to 7,800 years ago, spanning the transition from hunter-gatherer societies to farming and herding.
Throughout the samples, the researchers found varying amounts of spherulites, tiny crystals that form in the intestines of animals and are deposited in dung. There was a noticeable uptick between 12,800 and 12,300 years ago, when darkened spherulites also appeared in a fire pit — evidence they were heated to between 500⁰ and 700⁰ Celsius, and probably burned.
The team then cross-referenced these findings against previously published data from Abu Hureyra. It found that the dung burning coincided with a shift from circular to linear buildings, an indication of a more sedentary lifestyle, along with steadily rising numbers of wild sheep at the site and a decline in gazelle and other small game. Combined, the authors argue, these findings suggest humans may have started tending animals outside their homes and were burning the piles of dung at hand as a supplement to wood.
“The spherulite evidence reported here confirms that dung of some sort was used as fuel,” says Naomi Miller, an archaeobotanist at the University of Pennsylvania who was not involved with the study.
Figuring out what animal left the dung could reveal whether animals were tethered outside or not. While the authors propose wild sheep, which would have been more accommodating to capture, Miller suggests the source was probably roaming wild gazelle.
“Spherulites coming from off-site collection of gazelle dung, stored until burned as fuel, is to my mind a more plausible interpretation,” Miller says. Even if kept for a few days, she says, sheep wouldn’t produce large amounts of dung.
“The whole thing is a classic whodunit,” says anthropologist Melinda Zeder, something perhaps DNA analysis could solve (SN: 7/6/17). Gazelle might be the source, she says, and if captured young, the animals may have even been tended for a while — even if they weren’t eventually domesticated.
“The interesting thing is that people [were] experimenting with their environment,” says Zeder, of the Smithsonian Institution in Washington, D.C. “Domestication is almost incidental to that.”
Gravity doesn’t discriminate. An experiment in orbit has confirmed, with precision a hundred times greater than previous efforts, that everything falls the same way under the influence of gravity.
The finding is the most stringent test yet of the equivalence principle, a key tenet of Einstein’s theory of general relativity. The principle holds to about one part in a thousand trillion, researchers report September 14 in Physical Review Letters.
The idea that gravity affects all things equally might not seem surprising. But the slightest hint otherwise could help explain how general relativity, the foundational theory of gravity, meshes with the standard model of particle physics, the theoretical framework that describes all fundamental particles of matter. General relativity is a classical theory that sees the universe as smooth and continuous, whereas the standard model is a quantum theory involving grainy bits of matter and energy. Combining them into a single theory of everything has been an unfulfilled dream of scientists extending back to Einstein . “The equivalence principle is the most important cornerstone of Einstein’s theory of general relativity,” says Sabine Hossenfelder, a physicist with the Frankfurt Institute for Advanced Studies in Germany who was not involved in the study. “We know [it] eventually has to be altered because it cannot in its present form take into account quantum effects.”
To help search for potential alterations, the MICROSCOPE experiment tracked the motion of nested metal cylinders — a 300-gram titanium outer cylinder and a 402-gram platinum inner one — as they orbited the Earth in near-perfect free fall. Any difference in the effect of gravity on the respective cylinders would cause them to move relative to each other. Small electrical forces applied to bring the cylinders back into alignment would have revealed a potential violation of the equivalence principle.
From April 2016 to October 2018, the cylinders were shielded inside a satellite that protected them from the buffeting of solar winds, the minuscule pressure that sunlight exerts and the residual atmosphere at an orbital altitude of a little over 700 kilometers high.
By performing the experiment in orbit, the researchers could compare the free fall of two different materials for extended periods without the confounding effects of vibrations or of objects nearby that could exert gravitational forces, says Manuel Rodrigues, a MICROSCOPE team member and physicist with the French aerospace lab ONERA in Palaiseau. “One of the lessons learned by MICROSCOPE is … that space is the best way to get an important improvement in the accuracy for this kind of test.”
Over its two-and-a-half-year mission, MICROSCOPE found no sign of cracks in the equivalence principle, the new study reports. The finding builds on a previous interim report from the experiment that found the same thing, but with less precision (SN: 12/4/17).
Some physicists suspect that limits to the equivalence principle may never turn up in experiments, and that Einstein will perpetually be proven right.
Even 100 times greater precision from a follow-up MICROSCOPE 2 mission, tentatively planned for the 2030s, is unlikely to reveal an equivalence principle breakdown, says Clifford Will, a physicist at the University of Florida in Gainesville who is not affiliated with the experiment. “It really is still this basic idea that Einstein taught,” he says. What we see as the force of gravity is actually the curvature of spacetime. “Any body simply moves along the path in Earth’s spacetime,” whether it’s made of dense platinum, lighter titanium or any other material.
But even if physicists never prove Einstein wrong, Hossenfelder says, experiments like MICROSCOPE are still important. “These tests aren’t just about the equivalence principle,” she says. “They implicitly look for all other kinds of deviations, new forces and so on,” that aren’t part of general relativity. “So really it’s a multiple-purpose, high-precision measurement.”
Now that the mission is complete, the MICROSCOPE satellite will slowly spiral out of orbit. “It’s difficult to bet where in 25 years it will fall down,” Rodrigues says. Along with a reference set of platinum cylinders on board, “it’s [a] couple of millions of euros [in] platinum.” Where that precious platinum metal will land is anyone’s guess, but the gravity that pulls it down will tug on the titanium just as hard, to one part in a thousand trillion at least.