This piece was published at Palladium magazine in 2023, so a few years after Nvidia's Jensen Huang started talking about AI and medicine and artificial biology*, but before the possibilities really started to become apparent. Now genomics looks to be very investable.**
From Palladium, November 17, 2023:
On June 26, 2000, President Bill Clinton announced the completion of
the draft of the human genome at a press conference with the two project
leads, Francis Collins and J. Craig Venter. A genome is all the genetic
information of an organism. Scientists had conceived of the Human
Genome Project in the 1980s, and, in the first half of the 1990s,
expected it to be an endeavor that would go on for decades. But an
unexpected technological revolution of faster computers and better
chemistry accelerated the ten-year effort toward the finish line, just
as the 20th century came to a close.
The American-led international effort cost more than $3 billion
and involved thousands of people. Since then, the first 23 years
of the 21st century have seen a sea change in the landscape of genomics,
from blue-sky basic science to mass-market consumer products. Companies
like Nebula now provide entire genome sequences that are medical-grade
quality for $200, down from a price point of $20,000 just 13 years ago.
We’ve gone from a single mapped genome—that of humanity—to more than a
million genomes. This is a case where quantity has a quality all its
own; the commoditization of genomic sequencing has radically transformed
how we do genetics.
Yet at the dawn of this brave new genomic era, it is not health and
well-being outcomes that have been revolutionized. Rather, genomics as a
window into the human past has vindicated Alfred Tennyson’s poetic
assertion that nature is “red in tooth and claw.” Where a few decades
ago archaeologists and historians had to cobble together inferences from
pottery shards, slotting their data into theories that owed more to
political fashions of the present than scientific facts of the past,
today they can chart the rise and fall of peoples from the clear
evidence of the genes.
Collins and Venter promised a shiny future of good health and a more
enlightened understanding of humanity’s place in the world, but their
invention has, instead, unleashed knowledge of a bygone age of brutality
reminiscent of Conan the Barbarian’s Hyborian Age. Historians can
list Genghis Khan’s concubines, but it is genetics that tells us that
10% of Central Asian men are his direct paternal descendants, bringing
home the magnitude of his conquests. But obviously, we aren’t fated to
relive the brutality of the past; just as technology can open a window
back in time, it can unlock the door to a brighter future. The question
is what advances we as a species wish to make.
The Book of Nature Has a Billion Pages
A single human genome has two copies of each gene, of which there are 19,000. These 19,000 genes are distributed across three billion base pairs of adenine, cytosine, guanine, and thymine, or ACGT for short.
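To make those magnitudes concrete, here is a back-of-envelope sketch in Python (my illustration, not from the article): with four possible letters, each position carries two bits of information.

```python
# Back-of-envelope arithmetic (illustrative, not from the article).
BASE_PAIRS = 3_000_000_000   # ~3 billion positions in one genome copy
BITS_PER_BASE = 2            # four letters (A, C, G, T) -> 2 bits each
GENES = 19_000               # protein-coding genes

one_copy_bytes = BASE_PAIRS * BITS_PER_BASE / 8
print(f"one genome copy: ~{one_copy_bytes / 1e6:.0f} MB raw")        # ~750 MB
print(f"two copies (diploid): ~{2 * one_copy_bytes / 1e9:.1f} GB")   # ~1.5 GB
print(f"average span per gene: ~{BASE_PAIRS // GENES:,} base pairs")
```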
Notably, the number of genes that humans have was determined only within the last twenty years, even though genetics as a scientific field is over 150 years old. The reason for this recent explosion in our knowledge is that, before the 1990s, genetics probed a digital process—the recombination of discrete units of heredity from the same and different individuals—with analog means. The correlation of characteristics between parents and offspring is intuitively obvious, but the mechanisms by which inheritance occurs are not self-evident.

Our naïve assumption is that the characteristics blend together, resulting in a child who is a synthesis of the traits of the parents; e.g., a short parent and a tall parent will produce medium-height offspring. But the implication of this model is that, over the
generations, all human variation should be blended away as each
generation is the average of the previous one. That simply does not
occur. Humans remain as variable as they have been in the past. The
insight of Mendelian genetics is that inheritance does not proceed
through blending, but through the rearrangement of discrete units of
variation.
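That variance argument can be made concrete with a small simulation. The sketch below is mine, not the article's, and its parameters are assumptions (random mating, an additive trait, 10,000 individuals): under blending inheritance the trait variance halves every generation, while under Mendelian segregation of discrete alleles it holds steady.

```python
import random
import statistics

N, GENERATIONS = 10_000, 10

# Blending model: each child's trait is the average of two random parents.
blend_pop = [random.gauss(0, 1) for _ in range(N)]
for _ in range(GENERATIONS):
    blend_pop = [(random.choice(blend_pop) + random.choice(blend_pop)) / 2
                 for _ in range(N)]

# Mendelian model: each child inherits one discrete allele from each of
# two random parents (law of segregation); the trait is the allele sum.
mendel_pop = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]
for _ in range(GENERATIONS):
    mendel_pop = [(random.choice(random.choice(mendel_pop)),
                   random.choice(random.choice(mendel_pop)))
                  for _ in range(N)]

print("blending variance: ", statistics.variance(blend_pop))  # ~1/1024: collapses
print("Mendelian variance:",
      statistics.variance(a + b for a, b in mendel_pop))       # stays ~2
```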
At about the same time that Charles Darwin was revolutionizing our
understanding of the tree of life with his theory of evolutionary change
through natural selection, an Austrian monk named Gregor Mendel
stumbled upon the framework that would later be called genetics. Between
1856 and 1863, he realized that inheritance seemed to be mediated by
particular units he called “factors,” which would later be
called genes. Mendel hypothesized that complex organisms had two copies of
many factors, discrete bundles of information that were rearranged every
generation through the law of segregation—that you inherit one copy of a
gene from each parent—and the law of independent assortment—that you
inherit factors independently from each other.
Mendel came to these insights through a famous set of experiments
where he crossed lines of peas with distinct characteristics and noted
that some traits bred true and others did not. Two short pea plants
always produced short pea plants. But two tall pea plants sometimes also
produced short pea plants. A model of blending inheritance cannot
explain recessive traits, but a Mendelian framework can. Whereas
intuitive blendings of inheritance take the visible traits as the only
variables of interest in understanding intergenerational change in
characteristics, Mendelian genetics implies that phenotypes emerge from
the interactions of underlying genotypes.
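As a concrete illustration of how two tall plants can yield short offspring, here is a minimal Monte Carlo sketch (mine, not the article's) of Mendel's cross: both parents are tall but each carries a hidden recessive allele.

```python
import random

def cross(parent1: str, parent2: str) -> str:
    # Law of segregation: each parent contributes one of its two alleles.
    return random.choice(parent1) + random.choice(parent2)

# "T" (tall) is dominant over "t" (short); both parents are tall Tt plants.
offspring = [cross("Tt", "Tt") for _ in range(10_000)]
short = sum(1 for genotype in offspring if genotype == "tt")
print(f"short offspring: {short / len(offspring):.1%}")  # ~25%: the 3:1 ratio
```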
These genotypes are the true factors through which variation is
preserved from generation to generation; an organism’s visible
characteristics are only pointers to the true underlying heritable
variation present in the genes. Darwin’s Origin of Species was
published in 1859 to great fanfare, but Darwin famously lacked a
plausible mechanism of inheritance that could maintain the variation
that was necessary for natural selection. Mendel provided the answer,
but the Austrian monk’s single 1866 paper, “Experiments on Plant
Hybridization,” was ignored by the scientific community of the time,
only to be rediscovered around 1900, when the modern field of genetics
was born.
But twentieth-century genetics very much worked within Gregor
Mendel’s methodological framework. Genes were analytical units,
abstractions necessary to explain the patterns of inheritance visible in
breeding experiments, but not understood in physical terms. Genetics
proceeded through analyses of patterns of inheritance in pedigrees and
populations, a laborious matter of inspection and inference. The journey
to where we are today, when we can read out the sequence of any
organism that we choose, began in the 1940s when biologists realized
nucleic acids were the medium through which genetic information was
transmitted.
After James Watson and Francis Crick’s elucidation of the structure
of DNA in 1953, the molecular biological revolution that it ushered in
allowed geneticists to conceive of the idea of mapping genes in a direct
physical manner, rather than inferring them through the transmission of
phenotypes within pedigrees. But even as late as 1975 only
one hundred genetic positions were mapped in the human genome across
all populations. The first complete biophysical genetic map of an
organism, Haemophilus influenzae, was published in 1995 with
a 1.83-million-base-pair sequence. By 2020, tens of thousands of
different species had been sequenced. The story of the mutation of
genetics from a data-poor to a data-rich science is one of exponential
technological change; it is very much a synergy between rapid advances
in computing and novel innovations in chemistry.
But more interesting than the exponential growth in data are the
surprising things we have inferred from the data. In the heady early
days of the publication of the draft of the human genome over twenty
years ago, co-author Francis Collins asserted that the combination of
molecular biology and genomics would “make a significant impact” on our
attempt to understand and cure cancer. Despite some early instances
where genomic sequencing was performed on cancer patients, like Steve
Jobs in 2009, the overall impact of the new science on healthcare has
been modest at best. Instead, paleoanthropology, prehistory, and history
were transformed as genetics surveyed the pedigrees of the human past
with a power and precision that would have been unimaginable a
generation ago.
Even though the Swedish geneticist Svante Pääbo published a paper in
1984 on the DNA of mummies, pioneering the field that would become
paleogenomics, it is clear that much of his work in the 1980s and early
1990s was simply reporting sample contamination; the DNA detected was
that of lab workers or people who had handled artifacts and specimens.
But in 2022, Pääbo was awarded the Nobel Prize in Physiology or
Medicine for the transformative work that began in the 2000s. He and his
colleagues had learned from earlier errors, and taken to the new
genomic technology with gusto.
The first modern
human genetic map was published one hundred years after the founding of
the field, but the first prehistoric human genetic map was published ten
years after that, when Pääbo’s group released the draft of the
Neanderthal genome in May 2010. The team then unveiled the genome of a
new human species, Denisovans. Named after the Denisova cave in Siberia,
where a broken finger bone and a single molar yielded their genome,
they are a whole additional branch of humanity, distinct from both but
closer to Neanderthals than to modern humans. While Neanderthals are well-known
from paleontology and archaeology, Denisovans were novel because they
have been identified only from their distinct genetic markers. Genomics
was resurrecting the DNA of vanished species of humans that
were totally unknown to science.
Humanity Was Once Not One Species, But Many....
*In 2021 Nvidia really, really wanted to buy Britain's ARM Holdings. Part of the diplomatic dance was gifting a really nifty computer:
NVIDIA Opens UK's Fastest Supercomputer To Outside Researchers, Academic and Commercial
NVIDIA Claims Install of UK’s Top Supercomputer, for Research in AI and Healthcare
Announced last October, NVIDIA today launched Cambridge-1, calling it
the United Kingdom’s most powerful supercomputer. Enabling scientists
and healthcare experts to use the combination of AI and simulation to
accelerate the digital biology revolution, Cambridge-1 represents a $100
million investment by NVIDIA.....
November 16, 2023, we posted:
Britain's Most Powerful Supercomputer And The Butterfly Effect Of Weather Modeling In the Cloud

That 'puter was the fastest in Britain. From Nvidia:
NVIDIA Launches UK’s Most Powerful Supercomputer, for Research in AI and Healthcare
NVIDIA CEO Unveils ‘First Big Bet’ on Digital Biology Revolution with UK-Based Cambridge-1
First Wave of Startups Harnesses UK’s Most Powerful Supercomputer to Power Digital Biology Breakthroughs
In August 2024 we posted "Nvidia Looks To Disrupt The Health Care Industry With AI" (NVDA). Note: Google has since put that post behind a "Sensitive Content" warning.
In July 2025 a new "Britain's fastest supercomputer" was fired up at the University of Bristol:
"UK’s Most Powerful Supercomputer, the Isambard-AI, Goes Live" (NVDA)
NVIDIA’s CEO says the Isambard-AI supercomputer is a “vital national
asset” that will help “scientists and developers unlock new frontiers in
science.”
The UK’s most powerful supercomputer, Isambard-AI, is officially live at
the University of Bristol. Clocking in at 100,000 times faster than a
typical laptop, Isambard-AI has officially become the UK’s fastest
supercomputer and is expected to rank 11th globally.
NVIDIA has supplied the facility with 5,448 GH200 Grace Hopper
Superchips, enabling it to deliver 21 exaFLOPs of AI performance. An
exaFLOP represents a quintillion (10¹⁸) floating‑point operations per
second, whereas a typical smartphone achieves merely trillions (10¹²) of
operations per second....
....MUCH MORE
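For scale, the quoted figures imply a ratio that is easy to check. A quick sketch; the smartphone number is my assumption, since the excerpt says only "trillions":

```python
# Sanity-checking the quoted performance figures (illustrative).
exa, tera = 10**18, 10**12     # quintillion vs. trillion ops per second
isambard_ops = 21 * exa        # 21 exaFLOPs of AI performance, as quoted
phone_ops = 2 * tera           # assumed: a recent phone at ~2 teraFLOPs

print(f"Isambard-AI ≈ {isambard_ops / phone_ops:,.0f}× a smartphone")  # ~10,500,000×
```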
With a slightly more emotional take, here's The Sun (U.K.):
OAPs being Old Age Pensioners.
And going back to Venter in 2012:
J. CRAIG VENTER: THE BIOLOGICAL-DIGITAL CONVERTER, OR, BIOLOGY AT THE SPEED OF LIGHT @ THE EDGE DINNER IN TURIN
It's not just Mr. Huang. In March 2023 we posted:
"The Biorevolution: Its Implications for U.S. National Security, Economic Competitiveness, and National Power"
The author of this piece, Dr. Tara O'Toole, is Senior Vice-President of the CIA's venture capital arm, In-Q-Tel.
**In many ways the vibe feels similar to the zeitgeist around chips and training AI a dozen years ago.
I don't know if it is going to work out as well as 2013's "Why Is Machine Learning (CS 229) The Most Popular Course At Stanford?"—which was followed by 2014's Deep Learning is VC Worthy—which was followed by 2015-to-date: "Saaaay, this Nvidia may be on to something."
But we shall see.