I experienced a weird constellation of three events last week. First, my editor, my agent, and I hashed out the jacket copy for my upcoming novel, Exit Black. It was strange to realize that I was basically helping write ad copy for a book I labored over for two years. Second, I sent out a query to a literary agent for the book I drafted after Exit Black, called Pacifica. (My current excellent and extant agent, Scott, works on film and TV projects, and Pacifica definitely isn’t movie material, so I am looking for a bookish agent as well.)
And third, I was driving down Broadway in Portland when I saw this bumper sticker ahead of me:
Millennials and Zoomers–and their parents who might have watched Spongebob Squarepants with them twenty years ago–will recognize this as Squidward Tentacles’ self-portrait, “Bold and Brash.” I was happily shocked to see it on a bumper sticker. And I realized when I saw it that this publication process I am experiencing with my books is drawing up all of my deeply Squidwardian impulses: my vanity, my hunger for approval, my inner conflict about how art intersects with commerce. Squidward, c’est moi.
If you need a blast from the past, or you somehow never saw the original, here is a clip of me–er, Squidward–from the episode “Artist Unknown”:
I’m late bringing this news to The Subway Test–a sign of the recent moribunditude of my blog. Last year I entered the Aura Estrada Short Story Contest at Boston Review and…I didn’t win. But my story, called “Nomenclator of the Revolution,” was a finalist for the award, and that means the story will be appearing in Boston Review soon!
One of the reasons I feel so proud to have “Nomenclator” coming out in Boston Review–besides the obvious, that Boston Review is an excellent publication which has hosted writing from titans like Rita Dove, John Updike, Susan Sontag, and Saul Bellow–is that “Nomenclator” had been rejected by 26 other magazines before it was accepted at Boston Review.
Which is another way of saying don’t invest too much meaning in any particular rejection of your work. When a story of mine is rejected a few times–say, half a dozen rejections or so–I do need to look hard at what I’m sending out. Maybe the piece isn’t ready. Maybe it’s not as good as I would like to believe it is. But once in a while, I scrutinize the much-rejected story and it still looks good to me. It’s hard to assess my own work honestly. But if I really believe the story is good, then I’ll keep sending it out until I find someone who agrees with me.
One of my formative memories as a writer was seeing the great poet William Stafford read at the end of his career. He would have been 75 or 76 years old at the time–a wizened, kindly fellow whom I appreciated only later as one of the most important poetic voices of the 20th century. At one point in the reading, he introduced one of his poems by reciting the names of all the magazines and journals that had rejected it. The list was long–I don’t remember how many publications were on it–but, journal names being what they are, the list sounded like a poem of its own. I didn’t have the courage to tell Stafford how important his reading had been to me, and he died not much later. But thank you, William Stafford, for reading that list when I was 19 years old.
But the novel I’m working on now, Pacifica, begins each of its 74 chapters with an epigraph. Much like the computer game Civilization, each chapter is named after one of the technologies that have made modern humanity possible. And, much like Civilization, each technology is accompanied by an apposite quote. Leonard Nimoy was the gold standard narrator for those quotes in Civilization IV (though Sean Bean has his moments in Civilization VI).
One of the most fun parts of drafting Pacifica has been finding the right quotes for each chapter. I picked from books and poems that I love (as well as a few books that I hated) to put together what I imagined as a kind of collage or mosaic of human knowledge. I imagined the task as something like a literary version of the cover of Sgt. Pepper’s Lonely Hearts Club Band, where The Beatles assembled a photo-collage crowd of their favorite thinkers and artists and goofball influences.
Many technologies were easy to find quotes for. Especially for early technologies like pottery, masonry, and currency, there are a thousand great writers who had something pithy to say. Mostly I would page through books in my office, or CTRL-F through digitized books in Archive.org, to find quotes that spoke to the technology in question and also, hopefully, to the action of the chapter. Sometimes I had to draw the connections myself, in which case the quote turned into something of a writing prompt; other times the quote fit the chapter in deep and unexpected ways that I couldn’t have engineered if I tried.
Some of the later technologies were much harder: for instance, no one from Homer to Virginia Woolf seems to have much to say about the superconductor. Who could I quote for a tech like that?
It just so happened that by the time I got to the superconductor chapter of the book, everybody was talking about ChatGPT. At my college, the discussion revolves entirely around students’ using ChatGPT to plagiarize their essays, an issue which seems to me as trivial, in the grand scheme of dangers that ChatGPT represents, as the crew of the Titanic arguing about a shortage of urinal cakes in the men’s rooms of the Saloon Deck.
So I asked ChatGPT to find me some quotes about superconducting. It suggested some quotes from Larry Niven’s Ringworld and Niven and Jerry Pournelle’s The Mote in God’s Eye. They weren’t bad references, exactly–those books do mention superconducting–but none of them resonated with me. So I asked about Arthur C. Clarke, a fave of mine: surely, I thought, Clarke must have written somewhere about superconducting.
According to ChatGPT, Clarke had indeed written about superconducting. Of the two references ChatGPT gave me, the one that jumped out at me was this: “Clarke’s short story ‘The Ultimate Melody,’ published in 1957, briefly mentions the use of superconducting materials in the construction of a futuristic musical instrument called the ‘ultimate melody.'” Now that’s a resonant reference–that would work perfectly for Pacifica!
So I looked up the story and read it (like 90% of Clarke’s short fiction, I had never read it before). Here’s the thing, though: there’s absolutely nothing about superconducting in that story! (For that matter, the futuristic musical instrument is called “Ludwig”; the ultimate melody was the ideal music the instrument was designed to find.)
And here’s the other thing, which I discovered later: Arthur C. Clarke did write a short story, called “Crusade,” in which superconductivity is a central plot point. ChatGPT didn’t think to mention it (because ChatGPT doesn’t think yet). I tracked that story down with a simple DuckDuckGo search for “Arthur C. Clarke superconducting.” It’s an excellent story, by the way–very Arthur C. Clarke. And that story had the perfect quote, which fits both Pacifica and the life I feel I am living lately: “It was a computer’s paradise. No world could have been more hostile to life.”
So, for now, I agree with John Scalzi’s excellent assessment: “If you want a fast, infinite generator of competently-assembled bullshit, AI is your go-to source. For anything else, you still need a human.” That’s all changing, and changing faster than I would like, but I’m relieved to know that I’m still smarter than a computer for the next year or maybe two.
I’ve known that community college teaching was my calling almost from the moment I knew what a community college was. Working at an open door institution–that is, offering an education to anyone who comes through the door–spoke to something deep in my moral DNA.
But it didn’t take many years of actually working in a community college for me to see how far the reality falls short of the dream: there are many community colleges, including the one where I teach, where students are likelier to default on their student loans than they are to graduate on time. And, as with just about every other institution in the United States, there are serious equity gaps between how easy it is for middle class, traditional age (usually white) students to navigate the system, compared to how many roadblocks exist for first generation and other “non-traditional” students, who are disproportionately people of color.
In the twenty-plus years of my career, I’ve imagined the work of my college as analogous to the function of a large, overburdened public hospital: the community is glad that such places exist, but anyone who knows better takes their kids elsewhere if they can.
Yet the educational ecosystem of the US (indeed, of the entire world) is changing more rapidly, and more profoundly, than at any time in decades and perhaps in centuries. The ultimate driver of these changes is the internet: no information technology since the printing press has had such a seismic effect on people’s access to knowledge. And, if our society approaches the changes mindfully, I believe that this transformation will lift the stock value of America’s community colleges.
I am not speaking here of the wholesale move to online education that began to accelerate twenty-odd years ago and then sped up cataclysmically during the coronavirus pandemic. Years of teaching both online and face to face have convinced me that online learning is a pale substitute for the educational experience that many students are hoping for. But that’s an argument for another essay. For this post, I will say that the internet has done more than simply spur the growth of a million mediocre online courses; far more importantly, the internet has upended some of the fundamental assumptions of what school is for.
Before the internet, the central educational challenge for any society was access to content, whether that knowledge was locked up in books or in the experience of elders, who are limited in the number of people they can teach at one time. It is still the case today that where access to content is scarce, societies have difficulty in delivering even basic literacy to their citizens. Back in the pre-internet age, even where literacy was widespread it was hard out there for an autodidact. Anyone who wished to know more than the barest rudiments of chemistry or mechanical engineering or ancient history or whatever had to have physical access to an institution of learning: a library, a museum, a university. Advanced knowledge in many fields was locked up in these ivory towers, preserved for the elect who had the social connections, the money, or the talent to access the lectures and the rare manuscripts, the academic journal subscriptions and the Erlenmeyer flasks. Thomas Hardy’s Jude the Obscure offers a poignant description of this state of affairs: Jude’s failed attempt to enter “Biblioll College” at Oxford because of his background as a stonemason was thought to have been drawn from Hardy’s real-life experience of failing to gain entry to Balliol College, Oxford.
The real-life Balliol College, photo by Steve Cadman. Note the literal tower.
The community college was conceived as a disruptor of this elitist system. It’s hardly the only one: the public library, Wikipedia, and the land grant university system were also developed to increase ordinary people’s access to educational content. But the community college has come to occupy a special niche in the educational ecosystem: unlike land grant universities, the community college is a truly open door institution. Pound for pound, the community college helps to lift far more people out of poverty than universities do, given the formal and informal barriers to entry at most universities. And yet the community college is also unlike other great open door institutions such as the public library, Khan Academy, and Wikipedia: at a community college, whatever subject you hope to study, there is a knowledgeable guide there to speak with you personally, to offer you personal feedback on your writing, to help you frame your questions and offer suggestions for tracking down the answers. It is the personal relationship between teachers and students–what the parents of elite students pay tens of thousands of dollars for at small liberal arts colleges–that the community college can offer.
Of course, anyone who has actually studied at a community college knows that not everyone who works there is a knowledgeable guide: some community college teachers are lackluster, ineffective, or worse. Outside the classroom, the processes for getting academic advising or help in the financial aid office can be so byzantine that they would be at home in a Franz Kafka novel. And many college administrations mismanage their institutions with such energy that one can be forgiven for wondering whether there are saboteurs among them.
But despite these defects, many of which are the result of America’s decades-long disinvestment in public services, the community college remains one of the only institutions where an adult can walk in, without any prior credentials or letters of recommendation, and receive caring, personalized instruction in nearly any field from an experienced teacher. The community college aims to help those students who are most vulnerable to misinformation and disinformation; those most vulnerable to the predatory sales pitch of the for-profit university; those least likely to be able to afford an internet paywall, or the more consequential paywall of university tuition. The internet may have exploded many people’s assumptions about how education works. But here is one thing the internet hasn’t changed: most students still want to be seen, to be recognized, to be known by other human beings. Students with money can get those attentions at hundreds of prestigious universities. But anyone, rich or poor, young or old, neurotypical or not, can find teachers who see them, recognize them, and know them at a community college.
Today the temperature on our backyard weather station topped out at 112 degrees Fahrenheit. Apparently the reading at the Portland International Airport was 116 degrees. It was the hottest day ever recorded in the history of Portland. Indeed, it was very likely the hottest day that has ever occurred in this valley in the entire history of human habitation at this site. The second hottest day in Portland’s history was yesterday; the third hottest was the day before that.
For years, ever since I knew what climate change was–ever since we used the term global warming instead of climate change–experts have cautioned the public not to point at any specific weather event and say “See? That’s climate change at work.” With my own students, I’ve taken pains to differentiate weather from climate and to help them understand that extreme weather events have always been with us, that extreme weather is a natural consequence of living on a planet with an atmosphere and oceans and an axial tilt. However, extreme weather events do not happen by magic. And I am thankful that more and more Americans seem to have awakened to the reality that these shocking extremes in the weather are being driven by human-caused climate change.
A few years ago, I decided to devote the rest of my career to fighting anthropogenic climate change. Like a lot of people, I feel overwhelmed by how puny my influence is in relation to the scope of the problem. But I can work to address climate inaction at my college, and I can help shepherd into being academic programs devoted to restoration ecology and climate remediation and environmental policy change. And I know that I can work with students in ways both formal and informal to help them see the political and economic transformation ahead of us.
You can see the transformation ahead of us as well. It will cost you and me a good deal of money to address the catastrophe that is upon us. However, you and I will pay it: either we will pay the cost to save human civilization or we will pay for our civilization’s collapse.
I hope that a few locals who have been snookered by Fox News and its ilk into climate change skepticism (some of them students of mine) will be jostled into cognitive dissonance by the heat of the last three days. I have less hope for the cynics and nihilists that broadcast to them or who pretend to represent them politically. But it was ever so: those who today claim that climate science is unsettled are close cousins of those who used to argue that cigarettes don’t cause cancer or that black people were happier as slaves than as free people. For whatever social evil one cares to name, there is a powerful constituency that benefits from its existence and that will fight to keep it. For the last several decades, that force has been concentrated in the Republican Party and its various media outlets. The names may change at some point–just as the Republicans used to be a far more progressive party than today and the Democrats far more socially regressive–but there will always be a group of powerful people ready to defend an exploitative or oppressive status quo.
But here’s the good news, to the extent that any news about what is happening to us can be good: climate change is not going away. The problem will continue to knock at our doors more and more insistently. And in the words attributed to my favorite Republican, “you may fool people for a time; you can fool a part of the people all the time; but you can’t fool all the people all the time.”
It’s hard to have much love for 2020. This year–which, I remind myself when I am feeling down, is only about 77% finished–feels like a self-reinforcing system of catastrophes. I suspect I would find this a tough year even without the basso continuo of a global pandemic: the corner of the world I live in has suffered the most ruinous wildfires in decades; the president of the United States has announced his intention to replace democracy with authoritarianism and minority rule; his party, long ago one of the great intellectual traditions of the country, has shown itself to be led by nihilists, cynics, time-servers, and predators. I’ve awakened in the middle of the night more than once this year overcome with the thought that life as we know it is ending, to be replaced by something more solitary, poor, nasty, brutish, and short.
Perhaps my 3:00 am dread is an accurate picture of what is to come. Perhaps, like Job, “the thing which I greatly feared is come upon me” and we are watching the collapse of the American experiment. Or, perhaps, what we are witnessing are the beginnings of the wholesale collapse of the entire human experiment, as the planet’s many life support systems go offline one by one. These outcomes seem possible: the beginning of the end of the republic by next month, the end of human civilization by the end of my children’s lifetimes.
And yet, what wakes me at 3:00 in the morning is not the certainty that those are our fates. Rather, what wakes me is uncertainty, the sense that much of what I could count on for the first half of my life can’t be counted on today. A related dread is the knowledge of the limits of my influence: I can work towards a civic renewal and towards ecological restoration, but the outcome of my work is out of my hands.
Paradoxically, this cloud of unknowing is also where I have taken some comfort. Old things are passing away–because of the pandemic, because of climate change, because of the presidency of an authoritarian strongman. It does not necessarily follow, however, that what comes next must be worse. The United States of America still purports to be a democracy. It is not impossible–if we vote, if we participate, if we work towards it–to build a more just society than the one we live in today, a healthier society, a more sustainable economy, a restored ecosystem.
By whatever name you care to call it–providence, karma, feedback loops–we are in a moment when the world itself seems to be pushing back on the outrages of the last four years, or four centuries: not just the fires and the supra-alphabetical roster of hurricanes, but Donald Trump’s own infection with COVID-19. Because he is a public man, his illness and suffering take on symbolic dimensions, as though he were a character being punished for his hubris in Dante’s Inferno or the Book of Daniel. Trump’s posturing about his strength, even when it’s obvious that he is in pain and struggling for breath, only goes to show that he is as unprepared for his life as a metaphor as he is for his life as President of the United States.
The times are cataclysmic, but they will pass. A new day may be closer than you think. And there will be a moment on the other side of the cataclysm that calls for new balances. It’s time to vote Donald Trump and his enablers out of office. It’s time to push. It’s time to work.
I had the joy of watching 2001: A Space Odyssey on the big screen for the first time in my life a little while ago. For those of you living near Portland, The Hollywood Theater purchased a 70 mm print of the film a couple of years back, and they show the movie to a sold-out house a couple of times every year. I had seen the film many times before on video–it’s one of the truly formative pieces of art in my life–but seeing it in a literally larger-than-life format impressed me deeply: the movie reminds me why I work in the genre of science fiction.
One of the most celebrated elements of the film has been its technological accuracy. Stanley Kubrick and Arthur C. Clarke, working before CGI or the moon landing, were able to predict so many of the challenges and curiosities of living and working in space. As much as I loved Star Trek and Star Wars growing up, I always had the sense that those two franchises were more science fantasy than science fiction (especially Star Wars). 2001, by contrast, looked like some thrillingly plausible documentary footage from a future just over the horizon.
But it is not the accuracy of the film that affects me so much now. Rather, 2001 is worth watching because of what Tolkien would have called its mythopoesis: its creation of a new mythology through which we can view our modern predicament. As much as any other work of art I can think of, 2001 gets at the painfully intermediate position of our species as part animal and part divine: the film is a 164-minute meditation on Hamlet’s musing: “What a piece of work is a man! How noble in reason! how infinite in faculties! in form and moving, how express and admirable! in action how like an angel! in apprehension, how like a god! the beauty of the world! the paragon of animals! And yet, to me, what is this quintessence of dust?”
(Another quote, just as apt, comes from Nietzsche’s Thus Spake Zarathustra, the book which also inspired the iconic theme music for 2001: “Man is a rope, tied between beast and overman—a rope over an abyss … what is great in man is that he is a bridge and not an end.”).
While the film is set in space in the near future, as realistically as Kubrick and Clarke could conceive of it, the setting is just as much a place of the inscrutable divine: in other words, its setting is really The Dreamtime, the Underworld, Faerie. Even though the US Space Program was deeply influenced in real life by 2001, the movie is closer to the mystical cave paintings of Chauvet or Lubang Jeriji Saléh than it is to the Space Shuttle and the International Space Station.
Of course, there are many elements of any piece of science fiction that won’t hold up well after 50+ years. In the case of 2001, Kubrick and Clarke seriously underestimated the amount of progress our species would make in some aspects of information technology, while at the same time overestimating the progress we would make in artificial intelligence and manned spaceflight. Those are easy mistakes to make, by the way: I can’t think of any science fiction before the 1980s that successfully anticipated the internet, and of course a movie made in 1968, the year before Apollo 11, would extend the logic of manned spaceflight out to regular orbital shuttles and populous moon bases and manned Jupiter missions.
But the beauty of 2001 is not how much the movie correctly predicted but rather how well it explores the timeless theme of what it means to be a human being. What strange gods called out of the darkness to our rude, frightened hominid ancestors to make us human? What awaits us if we can survive the deadly unintended consequences of our own ingenuity? In wrestling with those questions, 2001 is every bit as bottomless a work of art as Paradise Lost or Faust or the Popol Vuh. One can argue that there are no gods that made us, that the monoliths of the movie will never be found because they never existed in the first place. However, 2001 speaks to something very deep in our cultural DNA (and, for all I know, in our literal DNA): the yearning for our spiritual parents.
Two hundred years from now, if we somehow survive this dreadful bottleneck of overpopulation and ecological collapse, our descendants may be living in domed cities on the moon and Mars; we may be gliding in beautiful submarines through the oceans of Europa and Ganymede. We will still be looking for the monoliths.
The author in the process of failing the subway test.
Without the gargantuan cave of Facebook to amplify my voice, I don’t know how many people will see my writing here. But it helps me to write here nonetheless.
Coming soon, I’ll reprint one of my favorite early stories in honor of its 10th anniversary. Keep watching the skies…
I’ve spent months away from The Subway Test and from social media in general, deep in the burrows of a new writing project. And, as exciting as that new project has been (it’s so exciting that I can’t really tell you much about it), I have missed the writing practice that I had before, working on short stories, my novel Pacifica, and the odd blog post that most people read when I cross-post it to Facebook.
But regarding Facebook, I have had another reason for my radio silence: I just haven’t known how to respond to the mounting news about what a monstrous company Facebook is. On the face of it, I’m not sure it should be such a hard decision for me to leave Facebook (and its horrible little sister, Instagram): a company that seems devoted to permitting, even encouraging, the spread of political disinformation, up to and including disinformation that drives genocide, is a company I want nothing to do with.
Copyright Adbusters
One of the only reasons I’ve had trouble leaving is that I don’t normally think of Facebook the company when I’m connecting with friends over Facebook the platform. That is, until about six months ago I was doing a fair amount of compartmentalization regarding my Facebook feelings: I would hear the news about Facebook’s business practices with mounting disgust, then log on and hand out a bunch of likes and haha faces and hearts to my friends’ pictures and memes and political links. Part of me knew that Facebook’s poetic PR language about connecting the world was just so much corporate bullshit. But then I would get on Facebook and act like all of that bullshit was true.
That’s because Facebook has very effectively built a business model which exploits our love for our friends and family. There’s nothing inherently wrong with such a business model: a thousand major companies, from Hallmark to Hasbro to TGIFridays, monetize our desire to connect with people we love. But I do expect such a company, if it claims to be devoted to connecting me with my loved ones, not to sell my personal data to political dirty tricks operations, to voter suppression outfits, to election oppo researchers. And I definitely expect such a company to step in when its platform is being used to encourage genocide.
So, please consider this my last post on Facebook. If you are reading this post on that platform, know that I will miss you. You I like. But so long as Facebook continues under its current leadership, with its mix of smarmy public apologies accompanied by no meaningful change in policy, I won’t be back. As a small potatoes writer who would like to have more exposure, I do understand that leaving Facebook behind will mean cutting off one of the few channels by which most people see my work. But the internet is a big place–there will still be lots of places that an interested reader can find me.
If you happen to be an interested reader, feel free to subscribe to my blog, The Subway Test–you can also find the blog simply by googling “Joe Pitkin.” Until then, I’ll say goodbye and deactivate my accounts on New Year’s Day.
I’m open to coming back someday. In fact, I’ll be happy to come back to Facebook and Instagram if the company will take meaningful action to clean up its act. For starters, the Board of Directors needs to fire Mark Zuckerberg and Sheryl Sandberg. I know that Zuckerberg could fire the board right back–he controls a majority of Facebook’s voting shares, after all–but the board needs to grow a spine and do its job. If Zuck wants to fire the board in return, let him go ahead and do that: at the very least his doing so will make public what a morally bankrupt human being he is. If the board is able to replace Facebook’s top executives with people who will shepherd a transformation at Facebook, creating a company with meaningful privacy policies, meaningful informed consent about how our data is used, and a serious effort to clamp down on disinformation and incitement, Facebook could be fun again.
I was an indifferent student of math growing up. I wasn’t bad at math exactly, but I didn’t much like the subject (except for geometry, which I took in high school from a brilliant and generous teacher who had left off being a rocket scientist–literally–so that he could teach young people). I pretty much stopped taking math as soon as I was allowed to in high school–I stopped out at algebra III.
A couple of years later, in a spasm of optimism, I signed up to take a 7:00 am calculus class to meet my math requirement in my freshman year of college. I was influenced in this fool’s errand by one of my heroes, my writing professor Tom Lyon, whose hypoglycemia obliged him to teach at 7:00 and 8:00 am exclusively. I believed that something would blossom in me, and I would develop into the scholar and writer I was destined to be, a scholar and writer like Tom Lyon, if I got up every morning for calculus in the early hours.
Alas, my 7:00 am calculus teacher was no Tom Lyon: I remember her as earnest and competent, but not particularly skilled or experienced as a teacher. Probably, given that I was a freshman at a land grant university in a 7:00 am calculus class, she was a relatively new graduate teaching assistant. More importantly, what seeds of knowledge she sowed my way fell on rocky ground, or weedy ground–I remember not a lick of calculus from that class. Practically my only memory of that whole term was one morning watching the sun stream into the room late in the quarter and feeling the joy of being an 18-year-old in springtime.
Somehow I managed to pass that class despite all the time I spent gazing out the window. And 25 years later, somehow I managed to get a master of science degree in environmental science without much knowledge of calculus. I knew enough to be able to recognize that something was a calculus problem–the same way I might recognize that the people next to me are speaking Portuguese–but as for using calculus to model a problem or make a useful prediction about the world, the little glyphs and grammars of differential equations were utterly alien to me.
The gaps in my math knowledge were worse than this, actually: I remember as I was gathering the last data for my thesis that my classmate Alison Jacobs had to explain to me the slope-intercept equation of a line (y = mx + b) for about 30 seconds before I realized that she was talking about something that I had studied for months and months in junior high school. It comforted me a bit to learn later that the great E. O. Wilson had gotten his PhD in biology at Harvard without calculus–in Letters to a Young Scientist he talks about sitting in calculus class as a 32-year-old assistant professor, trying to atone for his crime of omission. But for me, it has been hard to shake the sense that however well I might use words to describe the thicket of the world, I’ll never know the trails by which I might, using math, penetrate to the heart of things.
I had to climb over my own emotional palisades, then, to set out on a journey to teach myself calculus at age 45. For me, coming back to differential calculus via Khan Academy has felt less like atonement and more like the discovery that someone I had regarded as homely in high school showed up at the 30-year reunion looking like a knockout. Somehow over the thirty years since I first sat in that 7:00 am calculus class, I have discovered that I’m in love with mathematics.
So far as I can tell, there’s no direct benefit to me in learning calculus or any other kind of math. No matter how good I may get at it in middle age, there will always be others around me who know math better and who use it more naturally than I. And what would I use calculus for anyway? I’m no better an English teacher or outcomes assessment specialist because of it. One could argue that I’m a worse English teacher because of it, opportunity costs being what they are–every hour I spend learning about limits and differentiation is an hour I don’t spend honing my knowledge of composition theory or something else I might actually use in the classroom.
But I don’t want to stop myself: I study math because math has become beautiful to me. Perhaps it seems more beautiful to me because it has no obvious use to me. I’m long past the spring term of my life now. Perhaps I can love math now because “the heyday of the blood is tame”–though in so many areas of life I feel I am entering a second youth, or even a long-delayed first youth. I never became, never will become, the scholar that Tom Lyon was in my life. But I’ve come back to scribbling out derivatives at 7:00 in the morning as I did when I was 18. The morning sun in springtime fills me with a different kind of joy.