From Einstein to AI: how 100 years have shaped science
Looking back a century reveals how much the research landscape has changed — and how unclear the consequences of scientific innovation can be.
Earlier this year, Nature published a paper concluding that science is getting less disruptive1. Looking back a century might seem to support that idea. The twentieth century began with a revolution in physics. In 1900, Max Planck laid the foundation for quantum theory. This was followed by Albert Einstein’s annus mirabilis: in 1905, he published four groundbreaking papers on the photoelectric effect2, Brownian motion3, the special theory of relativity4 and the mass–energy relationship5 described by his famous formula, E = mc². Subsequent decades saw the establishment of the general theory of relativity and of the field of quantum mechanics.
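A back-of-the-envelope calculation (not from the papers cited, and using the rounded value c ≈ 3 × 10⁸ m s⁻¹) illustrates the scale that formula implies. Converting just one gram of mass entirely into energy yields

$$E = mc^2 = (10^{-3}\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m\,s^{-1}})^2 = 9 \times 10^{13}\,\mathrm{J},$$

equivalent to roughly 20 kilotonnes of TNT.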
Other scientific areas also saw rapid developments. In 1910, US geneticist Thomas Hunt Morgan used the fruit fly Drosophila to show how genes reside on chromosomes — a crucial step on the path to modern genetics. That same year, Marie Curie successfully isolated pure radium (element 88 in the periodic table). And, in 1925, Australian anthropologist Raymond Dart’s description of an Australopithecus africanus skull provided the first evidence that Africa is the cradle of humankind6.
Other scientific breakthroughs would shape people’s lives in more practical ways. In 1907, Belgian chemist Leo Baekeland commercialized an invention that he called Bakelite — the forerunner of today’s plastics. The material was made up of long polymer chains cross-linked into a rigid, durable network. It didn’t conduct electricity, and was mouldable, heat resistant and rather easy on the eye when dyed.
And in 1909, German chemist Fritz Haber discovered a method for producing ammonia, which he and fellow chemist Carl Bosch commercialized at the German chemical company BASF in 1913. Their process of manufacturing ammonia by fixing nitrogen from the air became the basis of the fertilizers that remain crucial to global food security today.
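For reference, the chemistry underlying the Haber–Bosch process is the following equilibrium, driven towards ammonia at high pressure over an iron catalyst (the conditions quoted here are standard textbook figures, not taken from this article):

$$\mathrm{N_2 + 3\,H_2 \rightleftharpoons 2\,NH_3}, \qquad \Delta H^\circ \approx -92\ \mathrm{kJ\,mol^{-1}},$$

typically run at around 400–500 °C and 150–300 atmospheres.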
The scientific landscape has changed so much that it would be unrecognizable to someone who lived 100 years ago. The scale of science and innovation, performed by large, globally collaborating teams, and how it is funded (predominantly by industry) would be utterly alien to scientists of old. How research is disseminated to scientific peers and to society would be both foreign and familiar: papers are still published, but that is only part of how science is now communicated. And researchers bear many new ethical, legal and societal responsibilities.
It’s hard to argue that some of the discoveries of the twenty-first century so far haven’t been disruptive, in the sense of providing new directions for science. Through global collaborations and with the help of multinational funding, scientists produced the first draft sequence of the whole human genome7 in 2001 and found a way8 to edit genes efficiently in 2012. These achievements also enabled researchers to develop mRNA vaccines swiftly during the COVID-19 pandemic.
Particle physicists discovered the Higgs boson9,10 in 2012, nearly 50 years after its prediction. And in 2015, gravitational waves were detected directly for the first time11, almost exactly 100 years after general relativity provided a theoretical basis for their existence.
Science and society have changed in other ways, too. The past century has taught researchers a lot about the risks of innovations such as plastics and artificial fertilizers. In response, countries have established legally binding agreements through the United Nations to limit the harms of scientific and technological innovations.
Baekeland’s life-changing plastics are now the subject of talks to limit the pollution they cause. The process for producing ammonia is controlled by at least two international conventions. The first aims to limit, or reduce the risks from, greenhouse-gas emissions arising from the production of this chemical. The second is a treaty to eliminate chemical weapons, an application of Haber’s chemistry that he championed during the First World War.
Recent developments, such as artificial intelligence (AI) technologies, are not yet governed by global agreements, but they, too, must be. Large language models and generative AI — this year’s biggest disruptive innovations — need to be applied in such a way that their potential to do harm does not outweigh their benefits. Nature regularly reports on the challenges posed by generative AI technologies and on the current lack of regulation. At some point, such systems will need to be regulated by globally coordinated agreements, as is the case for innovations such as nuclear materials, drugs and vaccines.
It is impossible to predict precisely what impacts this century’s innovations will have 100 years from now. But it is safe to say that the world’s societies, economies and environment will once again have changed, possibly beyond recognition. All the more reason for the international community to continue coordinating regulatory responses to new inventions, such as AI technologies — to avoid disruptive innovations that do more harm than good.