With AI warning, Nobel winner joins ranks of laureates who’ve cautioned about the risks of their own work

By Meg Tirrell, CNN

(CNN) — When computer scientist Geoffrey Hinton won the Nobel Prize in physics on Tuesday for his work on machine learning, he immediately issued a warning about the power of the technology that his research helped propel: artificial intelligence.

“It will be comparable with the Industrial Revolution,” he said just after the announcement. “But instead of exceeding people in physical strength, it’s going to exceed people in intellectual ability. We have no experience of what it’s like to have things smarter than us.”

Hinton, who famously quit Google to warn about the potential dangers of AI, has been called the godfather of the technology. Now affiliated with the University of Toronto, he shared the prize with Princeton University professor John Hopfield “for foundational discoveries and inventions that enable machine learning with artificial neural networks.”

And while Hinton acknowledges that AI could transform parts of society for the better – leading to a “huge improvement in productivity” in areas like health care, for example – he also emphasized the potential for “a number of possible bad consequences, particularly the threat of these things getting out of control.”

“I am worried that the overall consequence of this might be systems more intelligent than us that eventually take control,” he said.

Hinton isn’t the first Nobel laureate to warn about the risks of the technology that he helped pioneer. Here’s a look at others who issued similar cautions about their own work.

1935: Nuclear weapons

The 1935 Nobel Prize in chemistry was shared by a husband-and-wife team, Frédéric Joliot and Irène Joliot-Curie (daughter of laureates Marie and Pierre Curie), for discovering the first artificially created radioactive atoms. It was work that would contribute to important advancements in medicine, including cancer treatment, but also to the creation of the atomic bomb.

In his Nobel lecture that year, Joliot concluded with a warning that future scientists would “be able to bring about transmutations of an explosive type, true chemical chain reactions.”

“If such transmutations do succeed in spreading in matter, the enormous liberation of usable energy can be imagined,” he said. “But, unfortunately, if the contagion spread to all the elements of our planet, the consequences of unloosing such a cataclysm can only be viewed with apprehension.”

Nonetheless, Joliot predicted, it would be “a process that [future] investigators will no doubt attempt to realize while taking, we hope, the necessary precautions.”

1945: Antibiotic resistance

Sir Alexander Fleming shared the 1945 Nobel Prize in medicine with Ernst Chain and Sir Howard Florey for the discovery of penicillin and its application in curing bacterial infections.

Fleming made the initial discovery in 1928, and by the time he gave his Nobel lecture in 1945, he already had an important warning for the world: “It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them, and the same thing has occasionally happened in the body,” he said.

“The time may come when penicillin can be bought by anyone in the shops,” he went on. “Then there is the danger that the ignorant man may easily underdose himself and, by exposing his microbes to non-lethal quantities of the drug, make them resistant.”

It was “such an important and prescient thought so many years ago,” said Dr. Jeffrey Gerber, an infectious diseases physician at Children’s Hospital of Philadelphia and medical director of its Antimicrobial Stewardship Program.

Nearly a century after Fleming’s initial discovery, antimicrobial resistance – the resistance of pathogens like bacteria to drugs meant to treat them – is considered one of the biggest threats to global public health, according to the World Health Organization, responsible for 1.27 million deaths in 2019 alone.

The bigger problem, though, may be antibiotics’ excessively wide use rather than the underdosing Fleming described.

“More often, people are given antibiotics entirely unnecessarily,” Gerber told CNN in an email. And “more and more often, we see bugs that are resistant to almost every (and sometimes every) antibiotic we have.”

1980: Recombinant DNA

Paul Berg, who won the 1980 Nobel Prize in chemistry for the development of recombinant DNA, a technology that helped jump-start the biotechnology industry, didn’t issue as stark a warning about the potential risks of his research as some of his fellow laureates.

But he did acknowledge fears around what genetic engineering could lead to, including biological warfare, genetically modified foods and gene therapy, a form of medicine that involves replacing a defective gene that causes disease with a normally functioning one.

In his 1980 Nobel lecture, Berg focused specifically on gene therapy, saying the approach “has many pitfalls and unknowns, amongst which are questions concerning the feasibility and desirability for any particular genetic disease, to say nothing about the risks.”

“It seems to me,” he continued, “that if we are ever to proceed along these lines, we shall need a more detailed knowledge of how human genes are organized and how they function and are regulated.”

In an interview decades later, Berg noted that he and other scientists in the field had already come together publicly to acknowledge the potential dangers of the technology and to work on guardrails at a 1975 conference known as Asilomar.

“The concerns about the recombinant DNA or genetic engineering came from the scientists, so that was a very crucial fact,” he told science writer Joanna Rose in 2001, according to a transcript on the Nobel website.

Through publicly acknowledging the risks and the need to examine them, Berg said, “we gained an enormous amount of public admiration, if you will, and tolerance, and so we were allowed to actually begin to deal with the question of how can we prevent any dangerous things coming out of our work?”

By 2001, he said, “the experience and experiments that have been done have shown that the original concerns which we really believed were possible, in fact, didn’t exist.”

Now, gene therapy is a growing area of medicine, with treatments approved for sickle cell disease, muscular dystrophy and some inherited forms of blindness, although it’s not widely used because it remains complicated to administer and very expensive. In its earlier days, the technology led to the 1999 death of an 18-year-old clinical trial participant, Jesse Gelsinger, raising ethical questions about how the research was conducted and slowing work in the field.

And though Berg raised concerns himself, he concluded his Nobel lecture in 1980 with a call for optimism and the “need to proceed.”

“The recombinant DNA breakthrough has provided us with a new and powerful approach to the questions that have intrigued and plagued man for centuries,” he said. “I, for one, would not shrink from that challenge.”

2020: Gene editing

Four years ago, Jennifer Doudna and Emmanuelle Charpentier shared the Nobel Prize in chemistry for the development of a method for genome editing called CRISPR-Cas9.

In her lecture, Doudna detailed “extraordinary and exciting opportunities” for the technology across public health, agriculture and biomedicine.

But she specified that work must proceed much more carefully when applied to human germ cells, whose genetic changes would be passed down to progeny, versus somatic cells, where any genetic changes would be limited to the individual.

“Heritability makes genome editing of germ cells a very powerful tool when we think about using it in plants or using it to create better animal models of human diseases, for example,” Doudna said. “It’s very different when we think about the enormous ethical and societal issues raised by the possibility of using germline editing in humans.”

Doudna, who founded the Innovative Genomics Institute, told CNN this week that she believed “appropriate warnings from scientists about the potential misuse of their discoveries is an important responsibility and helpful public service, particularly when the work has broad societal implications.”

“Those of us closest to the science of CRISPR understand that it’s a powerful tool that can positively transform our health and world but could potentially be used nefariously,” she said. “We’ve seen that dual-use capability with other transformative technologies like nuclear power – and now with AI.”

CNN’s Christian Edwards and Katie Hunt contributed to this report.
