Here is the opening of an article that I read for the first class of my first course of my first year of graduate school. This may well have been one of the first passages I read as an official PhD student:
The initial stage, the act of conceiving or inventing a theory, seems to me neither to call for logical analysis nor to be susceptible of it. The question of how it happens that a new idea occurs to a man - whether it is a musical theme, a dramatic conflict, or a scientific theory - may be of great interest to empirical psychology, but it is irrelevant to the logical analysis of scientific knowledge.
- Karl Popper, The Logic of Scientific Discovery (1961), p. 31
It has taken me almost a year to fully grapple with my discomfort after reading these words. And I have finally arrived at what I think is a proper response: Not only is the act of conceiving of an idea relevant to the analysis of that idea, it is essential.
Now, Popper may have been correct, taken strictly. To critique a research design, you may not always need to ask the researcher whether s/he was in the shower or on the subway when the lightbulb appeared. What matters to an article’s reviewers is less that the idea came on the way to the office than whether we can rest easy about the internal validity of the design. But the message in assigning this text for our class was clear: Welcome to your doctoral training in a social science, where we’d rather interrogate what came after the moment of intellectual genesis than that moment itself. We’re interested in ideas, not how they happen. All that marshy creative stuff? You’re on your own.
And this - the notion that inspiration is a rocky terrain that we can’t tread on with any certainty or grace - is dangerous. It obliterates the existence of what I want to call the contagion of creative spirit.
The first part of this little neologism is about infection. A good idea, when it arrives, rarely arrives on its own. It’s what Jonathan Lethem has so perfectly called the “ecstasy of influence” in his Harper’s essay of the same name from February 2007. And so talking about a good idea as if it were a bacterium is actually not a bad idea. Good ideas spread, they take over, they set up shop in a corner of your brain and - if they’re good enough - distort the way you consume other ideas. Most important, ideas are ambulating entities. They slip and slide across borders as if by osmosis.
Maybe the deepest infection I’ve had in a long time set in a few weeks ago when I was in LA. I was doing a short course at our sister Annenberg School at USC. The fact that it happened at Annenberg West is probably as important as the fact that it happened after a 6-hour plane ride. There are just some things that cannot be learned from the comforts of home.
At USC I got to know a bunch of the faculty and doctoral students doing amazing work. Some of it I wouldn’t be qualified to carry on a sustained conversation about (big data analysis!), but it was the fact of their enthusiasm that mattered most. In the mornings we had spirited class discussions about what it means to be a communication scholar. In the afternoons we tinkered with data visualization tools and discussed the risks of digital scholarship.
One night I went to dinner with a few of the Annenberg students working on similar projects about hacking. One of them, talking about the other two I was about to meet, said, “You’ll love them. They’re so generous with their time and spirit.” Being generous with time and spirit is a wonderful way to capture what being at USC taught me. It’s about bandying meandering emails, it’s about reaching out to people you might only half-know, it’s about listening to good ideas even if they seem tangential…at first. Needless to say, the dinner with the USC students was a whorl of excited geeking out. And only part of this is explained by the fact that we have common interests. The other part - the bubbling sense of creative irruption - was a product of our shared infection.
What I sense happens at my home department is probably something that happens among graduate students everywhere. “Why” is the reigning metric in deciding how to spend one’s time and energy, instead of “why not.” We’re far more concerned with figuring out the added benefit of each potential project than worrying about what we’ll miss out on. We should be more worried about missing out than about proving to ourselves (and our advisors and our dissertation readers and our search committees) that each new project will advance our careers.
The backseat status of the “why not” imperative makes complete sense, though. For however much we like to criticize the capitalistic drive to define ourselves solely in terms of what we do to make money, we academics are just as susceptible as anyone else. Grad school, as one of my favorite undergrad professors used to say, is vocational school. We learn to speak in a professional language. And so to argue that we should do away with all the strictures of academic productivity is foolish. The university, like any other workplace, sets its own rules. And we should play by those rules without rashly criticizing each of them.
Fair enough. But this doesn’t mean we can use academe’s tried-and-true infrastructure as an excuse to reinforce the very things we’d like to see changed. We can acknowledge the importance of the rules of engagement even as we lobby to change them.
Grad school is often measured in terms of ego-warping units of distinction. How unique is a project? How much does it add to the existing literature? And this, too, makes sense when you think about it. Grant applications are judged by how well they build upon conventional designs. Journal articles are accepted or rejected based on their perceived ingenuity.
But something strange happens when we strive for uniqueness and distinction as ends in themselves. Uniqueness is only epiphenomenal; it cannot be generative ipso facto. It’s creativity that produces a product, which is later evaluated in terms of distinction. The means to the end must be different from that end. If we want to write successful grant applications, we can’t solely aim to write successful grant applications. We must tether ourselves to a standard somewhere else, so that when we say we’ve found something “new” to say about a topic, it’s because we really found it, not because we thought up all the potentially convincing ways to couch a new project.
When we think blinkeredly about uniqueness, we fall into what Richard Sennett calls “invidious comparison.” It stems from the dilemma of the “anti-social expert,” the person who uses others’ lack of knowledge as support and rationale for his/her own expert status. Talking about the Italian luthier Antonio Stradivari, he writes:
“He could not pass on his experience, which had become his own tacit knowledge. Too many modern experts imagine themselves in the Stradivari trap – indeed, we could call Stradivari Syndrome the conviction that one’s expertise is ineffable.”
The Stradivari Syndrome is also bolstered by the idea that, as tortured and brilliant thinkers, we must always be in pain. (Or, as Rachel Toor unpacks in her recent Chronicle piece, that the brilliant thinker’s writing should always be inscrutable.) It’s the idea that gets conveyed in the many phatic exchanges heard in university hallways all around the world: “I’m doing okay, but I’m just so tired and busy. I stayed up all night last night reading.” These kinds of quips are usually accompanied by a proud little smirk. As if what should follow is something like: “Can’t you see how willing I am to hurt myself for my research?”
The antidote to all of this is collaboration. Plain and simple. Research needs to be thought of in terms of networks, in terms of the deep networks we build with people who infect us with their ideas. Research should make us busy, but not in a way that compels us to champion our sacrifices. It should be about enthusiasm. It should be about spirit.
Let’s let it spread, this contagion of creative spirit. Let’s let it be a total fucking epidemic.