Williams College Commencement · June 2023 · Valedictorian Address
Class of 2023,
At this point in our lives, we are about to receive a lot of advice.
As we listen, we may realize that many pieces of wisdom can pull us in
opposite directions.
Should we always have a fixed direction in life?
The author Tolkien reminds us that "Not all those who wander are lost"
...but the philosopher Seneca argues the opposite: "If one does not
know to which port one is sailing, no wind is favorable."
On the topic of consequences, Eleanor Roosevelt tells us "I am who I
am today because of the choices I made yesterday"
...but we've also been told we should live for "this moment," because
"we might not get tomorrow."
(That was the esteemed lyricist Pitbull.)
Considering opposing views is valuable because true wisdom often lies
in the synthesis of the two extremes. I'd like to highlight another
topic I believe we ought to view in this diametric way, since we
typically only hear one side of the story. That topic is critical
thinking.
All our lives, we've been told we should strive to be critical
thinkers. It's an essential skill that enables us to reason
analytically and reflect rationally. In the age of AI and the
Internet, critical thinking equips us to examine the world with clear
vision even as it becomes increasingly clouded by an overabundance of
information.
Yet, just as our immune system keeps us safe from infection but causes
autoimmune disease when it becomes overactive, critical thinking gives
us the healthy skepticism we need to survive in a world filled with
negative viral ideas but can also do us harm when left unchecked.
The mental pathways we've trained to critique ideas in our classes and
papers can also produce excessive criticism of ourselves and others.
The word critical comes from ancient Greek terms that not only mean to
discern and investigate, but also to judge, to accuse, and to separate
or divide groups of people. Finding the proper balance is challenging,
and perhaps it is no surprise that our generation, which has been so
thoroughly taught to think critically, is also the one that finds
itself in a polarized world, in which our judgments of others are
rivaled only by our anxious judgments of ourselves.
All that to say: critical thinking is necessary but not sufficient for
us to live good and productive lives. I hope that none of us will ever
be remembered first and foremost as critical thinkers. Instead, I hope
our success will be made possible by our critical thinking, but will
be created through our constructive action.
In a 2005 commencement speech, Steve Jobs reminded graduates to "stay
hungry, stay foolish." In a way, this advice is the very antithesis of
critical thinking. Hunger drives action, not thought, and to be
foolish is to view the world with an utter lack of skepticism.
My fellow classmates:
As we leave Williams, let us embrace the contradiction that we should
be hungry and foolish critical thinkers.
Let us think as critically as we should, and no more than that.
TEDxOxford · January 2022 · Innovation, Stagnation, and Progress Studies
When I was little, I loved a series of books called The Magic Tree House. In every story, the protagonists, Jack and Annie, were
whisked away to a faraway land — usually somewhere in the past. Their
adventures were thrilling, and I became fascinated by history.
Now I’ve learned much more about the past. It turns out that books
like Thanksgiving on Thursday, Mummies in the Morning, and Tonight on
the Titanic leave out key details about what everyday life used to be
like. So if anyone knows the author Mary Pope Osborne, please pass
along my ideas for more representative stories: Teatime with Typhoid
Fever; Evenings without Electricity (or Running Water, or Central
Heating); Strenuous, 14-Hour Days of Manual Labor at Sunrise.
Despite what The Magic Tree House might have you believe, the past was
a miserable time for most people by our current standards. Life expectancy was a fraction of what it is today, the average diet was bland and unvaried, workdays were long and grueling, and knowledge about health and hygiene was sparse.
Around the mid-1800s, things began to change. Fast. In his book The Rise and Fall of American Growth, Robert Gordon describes the period from 1870 to 1970 as “The Special Century.” This era saw revolutions in
virtually every facet of life across industrialized nations:
transportation, communication, food, clothes, shopping, homes, work,
health.
This growth was unprecedented in all of human history. Since pre-20th
century life was such a primitive and uncomfortable baseline, these
improvements amounted to an almost incomprehensible increase in the
standard of living. And since our understanding of many subjects was
so basic, it was possible for a torrent of revolutionary discoveries
to be unleashed. These were low-hanging fruits of progress, and they
were plucked with great speed and in great quantities.
This era of rapid, transformative growth sets the scene for the first
of two ideas I’d like to share with you today. It’s a concept known as
the Stagnation Hypothesis, and it explains how and why progress has
slowed since the Special Century. The second big idea is what we ought
to do about this stagnation. Studying the salient questions prompted
by this slowdown and figuring out how we can jumpstart our stalling
growth are the goals of a new field of study called progress
studies.
In 2013, entrepreneur and investor Peter Thiel famously commented on
the disappointing state of recent technological progress. “We wanted
flying cars, instead we got 140 characters.” Of course, there has been some progress since 1970 — it just hasn’t been as broad or as life-changing as in the previous decades. The Special Century upended
every aspect of how we lived, while more recent growth has been
narrowly concentrated in entertainment, communication, and information
— industries powered by the invention of the Internet.
Growth in other areas has slowed from an all-out sprint to more of a
casual stroll. Modern improvements in healthcare, for example, should not be diminished, but they fall far short of the nearly 30 years of life expectancy added during the Special Century. Let’s face it,
a time traveler from the mid-1900s would be disappointed by 2022.
They’d enter a present-day home and find it looks quite familiar;
they’d get a job and learn that the 40-hour workweek, which became the standard in 1940, remains the norm; they’d watch the news and realize
it’s been 50 years since we last set foot on the moon.
This lackluster progress can’t be explained by a lack of effort.
Researchers and innovators are just as hardworking and creative as
they used to be — in fact, we’re investing more resources than ever
into their work. We can see here that the amount of scientific funding
and the number of PhDs and publications are all growing
exponentially.
Unfortunately, however, the output from this investment is not growing
at the same rate; when asked to rank the importance of discoveries in
their fields over time, top scientists produced fairly flat
results.
Productivity trends across health care, electronics, and agriculture
show a similar pattern.
The fact is, it’s getting more difficult to innovate.
We’re using more inputs to achieve the same — sometimes even less — progress. An economist would say we’re seeing diminishing marginal returns. In everyday language, we’re getting less bang for our buck.
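To make “less bang for our buck” concrete, here is a minimal numerical sketch. The growth rates are hypothetical, chosen purely for illustration: if research inputs grow exponentially while output grows only linearly, output per researcher steadily declines.

```python
# A stylized sketch of diminishing marginal returns (hypothetical
# growth rates, not data from the talk): inputs grow exponentially
# while output grows only linearly, so output per unit of input
# shrinks over time.
for t in range(0, 60, 10):
    researchers = 1.04 ** t       # inputs: 4% exponential growth per year
    discoveries = 1 + 0.05 * t    # output: roughly linear growth
    print(f"year {t:2d}: discoveries per researcher = {discoveries / researchers:.2f}")
```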
There are a few explanations for why this is happening. Our culture and our
political and economic systems may not be as conducive to growth as
they used to be. But equally important is the fact that in every field
of study, the frontier of knowledge is as far out as it’s ever been.
Yet humans are born with an utter lack of knowledge, a mind that’s a
tabula rasa, and this means that aspiring innovators who want to push
the boundaries of today’s knowledge stock must spend increasing
amounts of time and effort just getting to the frontier.
Legendary physicist Paul Dirac called the 1920s a time when “even
second-rate physicists could make first-rate discoveries.” The field
was positively bursting with insights to be made. Now it might be the
case that even first-rate physicists struggle to make second-rate
discoveries since it takes decades to learn the foundational topics
that bring them to the knowledge frontier. With all the low-hanging
fruit picked, we have to expend more resources climbing to higher
branches just to reap the same bushel of progress.
The economist Benjamin Jones laid out this argument more formally —
and with fewer fruit analogies — in a 2009 research paper titled “The
Burden of Knowledge and the ‘Death of the Renaissance Man.’” Jones
analyzed data to show that society is trying to cope with the
expanding knowledge frontier using three main strategies: we’re
spending more time in school, we’re becoming more specialized, and
we’re relying more on collaboration. Unfortunately, each of these has
limitations.
Education is crucial, but more time spent in schools means less time
spent applying what we learn. More education is also not a sustainable
solution — what happens when it takes a lifetime to reach the
knowledge frontier for a particular topic? Isaac Newton famously said,
“If I have seen further it is by standing on the shoulders of Giants.”
But what if the giants’ shoulders become too high for future
generations to climb?
Increased specialization isn’t ideal either. Being deeply specialized makes it nearly impossible to switch careers or fields of study, and it limits cross-disciplinary thinking, since specialists cannot readily draw on ideas from multiple domains.
And while more collaboration sounds positive, it’s also indicative of
the growing limits of what a single, motivated creator can accomplish.
What’s more, collaboration is often the result of a fortuitous encounter between individuals with complementary skills. Do we really
want to rely on fortune to determine our ability to make meaningful
progress? I love stories of serendipitous breakthroughs as much as
anyone else — Alexander Fleming’s famous accidental discovery of
penicillin, for example — but I’d certainly prefer to live in a world
in which luck plays less of a role in deciding whether we’re able to
find cures to diseases or invent new breakthrough technologies.
So we find ourselves in the present, with progress becoming
increasingly difficult and our instinctive responses proving
insufficient. Needless to say, this is problematic. We’re facing
significant, even existential, challenges as a species, from climate
change to poverty to disease. Life expectancies in many nations have
seen unprecedented drops, some even before the pandemic. And many predict that millennials will be the first generation in recent memory to end up worse off than their parents.
This is a good point to pause and take a deep breath. I promise this
marks the end of the gloomy facts and figures. See, I’m actually quite
hopeful despite the challenges. The growing difficulty of reaching and
extending the knowledge frontier presents an incredible opportunity
for humanity. It has exposed the fact that we’ve invested frightfully little time, money, and focus in the systematic study of human progress.
A new field called progress studies attempts to remedy this. Progress
studies is a cross-cutting domain that employs lessons from history,
economics, politics, and more niche areas such as organizational
studies.
What distinguishes progress studies from these related fields is that
it is prescriptive and not just descriptive. Economists and historians
do an excellent job explaining how and why our world looks and works
the way it does, but progress studies goes one step further and uses
these insights to recommend tangible actions for maximizing
development. It’s no wonder, then, that the difference between progress studies and the fields it draws upon has been compared to the distinction between medicine and the underlying sciences of biology and anatomy. Prescription versus description. Our ability to create progress is experiencing a malaise, and we can’t cure it with more explanation. What we need is a treatment, an intervention, a dose of carefully prescribed reform to get well again.
Progress studies is important because we can no longer afford to wait
around for progress to arise naturally and spontaneously. There are
numerous historical examples of fantastically productive societies —
Ancient Greece, Renaissance Florence, Edinburgh during the Scottish
Enlightenment, and, more recently, Silicon Valley — but we tend to
think of these as exogenous events. They just happen to happen. But
what if we could make them happen? What if we could plan and
strategize and engineer these types of societies? Progress studies
offers the chance to do just that, by providing a framework for us to
study and experiment with how our communities are educated, funded,
supported, and organized.
Although it’s in its early days, progress studies already has much to
say about addressing the Great Stagnation. It suggests we can make
improvements that fall into two broad categories: our culture, and our
institutions and systems.
In terms of culture, our society needs to overhaul how we think about
progress. First of all, we must recognize that it is possible to
deliberately create progress. Great ideas don’t have to fall from the
sky like the apocryphal apple that hit Newton on the head. Innovation
isn’t some fickle lightbulb that randomly blinks on above a lucky
genius’s head — it’s an outcome for which we can strategically
optimize. If we regard progress as arbitrary or unachievable, we risk
becoming a cynical society with a zero-sum mindset. If there’s no
progress to be made, then my gain must be your loss and vice versa.
But when we understand that we have agency, that we can shape our
future and not let it shape us, when we adopt the positive-sum mindset that centuries of invention since the Enlightenment have borne out, we become doers and builders and creators. And then my
gain is your gain and everyone else’s gain. Societies that believe
they can make progress and societies that don’t are both usually
right.
We also must address the growing backlash against progress. There are
those who contend it does more harm than good, pointing to negative
repercussions like climate change or technology’s pernicious impacts
on our privacy and mental health. These are significant, alarming
drawbacks. But this is no reason to shun progress. Imagine if our prehistoric ancestors had rejected fire as too dangerous the first time they got burned! Where progress instigates problems, it also offers
solutions. Carbon credits, advanced renewables, geoengineering,
lab-grown meat — the innovative answers to climate change are exciting
beyond belief, as long as we’re able to effectively implement them.
The dark sides of progress merely provide another reason to study it
closely. Technologies such as general AI or advanced bioengineering
have the potential to reshape our world, but whether they change it
for better or for worse depends on our preparation for their arrival.
Studying progress — both our past successes and mistakes — will help us anticipate unintended consequences so that we don’t get burned again.
And then we have our institutions. Many crucial systems were good
enough to support historical progress but are now insufficient. In a
period of stasis, it’s worth asking ourselves: “If we could rebuild from the ground up, would we design our institutions to be exactly as they are now?” If the answer is no, then we ought to proactively make
those improvements.
Let’s scrutinize our education system to determine whether it’s properly promoting the modern outcomes we care about. Most universities have remained fundamentally unchanged since the early 1900s. Tyler Cowen, a
professor and one of the pioneers of progress studies, has advocated
for more experimentation in higher ed by, for example, considering
alternatives to the current tenure system. We can also explore novel
classroom structures, devise curricula that nurture creativity, and
ensure all educators are compensated with both the salaries and
prestige they deserve so that the best and brightest are drawn to
teaching.
Let’s review current funding mechanisms to ensure capital flows
towards the most promising endeavors. The private sector is notorious
for underinvesting in risky, long-term projects, even those with high
upside, which presents challenges for startups in areas like
renewables and energy storage. Meanwhile, public funding for research
is woefully understudied. Patrick Collison, another early proponent of
progress studies, points out that traditional government funding
schemes seem to be less successful than alternative grant systems
since they don’t focus enough on research autonomy and risk-taking.
Let’s weigh the costs and benefits of current regulations and consider
novel pieces of legislation to promote progress. Are we properly
incentivizing R&D? Are all of our laws up-to-date, or is our
regulatory state burdened with red tape that prevents us from seizing
new opportunities or responding to emergencies such as COVID?
Changes to our institutions are often contentious, so let me be clear
— I’m not advocating for any specific proposals. I’m advocating for
exploration.
I’m advocating for humility, open-mindedness, and courage. Humility to
acknowledge that our current institutions may be suboptimal,
open-mindedness to consider alternatives, and courage to experiment
with these alternatives to see if they might work better. Progress
studies is not about promoting particular answers — it’s about
fighting for these important questions to be asked in the first place
and given the attention they deserve.
Improving our ability to generate progress and cope with expanding
knowledge frontiers will benefit everyone, but it also requires
everyone’s support. Policymakers, educators, business leaders,
students, parents — everyone has a role to play in reshaping our
society so that we can innovate on how we innovate. Deepening our
understanding of progress studies will make us all more informed
citizens and more conscientious champions of growth. And, at the very
least, it will make us all more appreciative beneficiaries of the
fruits of human progress we often take for granted.