Academia Shrugged? or, Why You Should Just Quit Your Ph.D. Now

July 27, 2011

Grad school and academia as a potential career have taken a real beating in the media lately. It seems the world has finally woken up and smelled the (three-day-old, re-used) coffee that is all grad students can afford. The bottom line is that humanities students should run, not walk, away from a life of debt and uncertainty, and a “dream job” that will never quite pan out.

In an article for Slate.com, William Pannapacker, himself a professor at a liberal arts college, proposes a few steps to fix graduate school in the humanities. Some of what he advises – such as providing networking opportunities and internships, and recognizing that it may be better to keep one’s passion for obscure fields of study as a hobby – is similar to what I proposed in my own post on a Three-Pronged Approach to Saving Humanities Departments.

But I was really intrigued by his addition of a final, “nuclear option”: quit. In his words:

Just walk away. Do not let your irrational love for the humanities make you vulnerable to ongoing exploitation. Do not remain a captive to dubious promises about future rewards. Cut your losses, now. Accumulate work experiences and contacts that will enable you to support yourself, have health coverage, and something like a normal life. Even the more privileged students I mentioned earlier—and the ones who are not seeking traditional employment—could do a lot of good by refusing to support the current academic labor system. It exists because so many of us who care about the humanities and higher education in a sincere, idealistic way have been passively complicit with the destruction of both. You don’t have to return to school this fall, but the academic labor system depends on it.

Wow. A group of highly intelligent, capable individuals upon whom “the system” depends but who are scorned by it decide one day to “go on strike” in the hopes of seeing said system implode and leave behind a twitching lump of ingrates suddenly begging them to return and save them.

This sounds familiar. Where did I read about that recently? Oh, yes – in Atlas Shrugged, Ayn Rand’s infamous treatise on Objectivism. In it, the heroes grapple with their own superior morality in a world of incompetent ingrates and eventually come to realize they are complicit in the very system that condemns them for their unchecked ambition and capitalistic greed. (Of course, unchecked ambition and capitalistic greed are positive attributes in Rand’s heroes.) So, one by one, they go on strike by withdrawing to a hidden, magical place where labour is justly rewarded, and nobody ever gives anybody anything they haven’t worked for, while they watch the world crumble around them without their input.

(There, I’ve saved you from slogging through 1000+ pages of libertarian/objectivist bluster that would probably outrage and offend anyone who believes in silly things like equality of opportunity and altruism.)

But putting aside the absurd pairing of tweed-jacketed academics and Wall Street “fat cats,” let’s think a minute about the implications of this Randian proposition for academics. Would it work? As Pannapacker points out, there is always the possibility of having a day job with health care and indulging in one’s “irrational love for the humanities” as a hobby. As he says, “more and more people are learning [that] universities do not have a monopoly on the ‘life of the mind.’”

Maybe. But I think universities should at least have a competitive edge on it, or else they stand to become exactly what vocationalists want them to be: training for jobs that exist today and have clear mandates and required preparation. This would certainly be the case if all the most brilliant liberal arts minds suddenly decided to be brilliant elsewhere in the world.

Because if not universities, then where? Will we have to start paying people to hear about their ideas? Will we have the same level of interaction if everyone is decentralized and off thinking interesting thoughts in different industries? How will we prepare people to think innovatively, and prepare them for the jobs of tomorrow that don’t have clear mandates or preparatory courses?

The whole point of a university is that it is a hub, a place a little apart from the rest of the world (yes, perhaps even an ivory tower) where people can reflect on the “big questions” that may not be directly related to the pressing fads of the moment. What happens when this education becomes more decentralized? Can we trust that individuals will independently seek out all of the different perspectives they’re likely to get in an undergraduate humanities course?

I reflect on what Stanley Fish wrote in the New York Times a few weeks ago: basically, that academics, and by extension universities, should just abandon trying to be relevant and focus instead on academic inquiry for the sake of it alone. I think that would be unwise. Knowledge for the sake of knowledge is a great thing, and we need independent places (i.e. not ones funded by corporations) that will ensure that people continue to seek it. But relevance is important too, and while it should not be the only goal, it needs to be a goal.

In short, the current academic system needs to be refined from within, not by walking away and shrugging off all its problems. (Besides, academic types don’t have a magical land in Colorado where we can retreat and live out our ideals of hard work and free love and no taxes.) Professors and administrators could start with being honest about the reality of life as a grad student, i.e. mostly unemployed and without the health coverage Pannapacker so enjoys having. And they should stop denigrating non-academic career choices, instead framing them as a continuation of the path of intelligent, creative thinking, not a deviation from it.

And then we – all of us – can start changing the way we view “amateur” non-academics outside the system, and invite them in. Let’s not exclude people with insider jargon and inaccessible writing. Let’s make a place inside the ivory tower for people who think about the “big questions” of life outside of it, so we can examine the practical implications of our ideas. Let’s show the vocationalists that “academic” is not a dirty word but one that can bring significant insight and positive change to the world outside universities, as well as within their libraries.

Let’s ask people to help us hold up the world, instead of just dropping it.


Tiger Moms, Drop-Outs, and Credentialism (oh my!)

May 31, 2011

Now here’s a controversial news item: Peter Thiel, famous for having co-founded PayPal and invested early in Facebook, and now a billionaire, is paying young entrepreneurs to drop out of school. His Thiel Foundation has just named 24 “fellows” under 20 who will receive $100 000 and mentoring from high-powered, successful entrepreneurs in order to launch a profitable initiative. They are all drop-outs (of college or high school), a requirement for the prize.

His logic is that many would-be students of elite schools would be better off going right out into the world to generate “significant wealth,” rather than learning about the theories behind what others have said and done and invented. And while I would never blindly advocate that anyone drop out of school, given the prevailing societal opinion about education and the very real value of exposure to new ways of thinking, his initiative is perhaps a useful antidote to those who do blindly advocate more schooling as the solution to all of society’s ills. Education is a wonderful thing – I would even say it is the key to solving many of the world’s great scourges, such as intolerance, authoritarianism, and the solid grip of misinformation. But in a way, Thiel is saying that it is the ideas, and the work behind them to make them successful, that count – not the name of one’s alma mater (or even the existence of one).

Credentialism – in the form of collecting degrees and designations from acclaimed institutions – has become a powerful shorthand for measuring societal status. It is clad in an aura of meritocracy, because in theory only the best are able (and would choose) to obtain a degree or three at the world’s (and especially America’s) finest educational institutions. But, as with all shorthands, a focus on credentials alone as a stand-in for intellectual or societal worth is insufficient and at times unfair.

The education situation in many developed countries is dire. Every year, millions of the world’s best students vie for a place in one of the mere hundreds of top institutions, trying to best each other in any way possible. A recent issue of the Atlantic explores the phenomenon in depth as part of an extended look at the “tiger mom” debate brought about by the now-infamous book by Amy Chua (there is a good write-up on it here, in case you live under a rock and missed it). Much of the furore over the book was caused by the implicit challenge of Chua’s winner-take-all style of parenting. In refusing to give in to her daughters’ tears, frustration, exhaustion, and in some cases disturbing behaviour (biting the piano?), Chua claims she paved the way to their happiness by allowing them to know what they were capable of. More recently, her eldest daughter’s acceptance to Harvard has renewed the wave of anxious hand-wringing by “Western” parents who think they aren’t pushing their children hard enough to get into good schools and assure their futures.

But are the hours of heartache, rebellions and tooth marks on family instruments worth it? Is pushing a child to his or her limit, encouraging activities like building orphanages in Africa, chairing the local youth orchestra, and volunteering as an assistant surgeon on weekends in order to secure a spot at the Ivies the key to lifelong success and happiness? Is it even likely to yield a coveted admission letter? Not really, according to what Caitlin Flanagan writes in response to Chua’s book:

Elite-college admissions offices drive professional-class parents crazy because in many respects they do not operate as meritocracies. Consider, for example, those students admitted via one of the two programs that stand as strange mirror opposites: those that give preferential treatment to the sons and daughters of alumni, and those that extend it to the children of underrepresented minorities. The latter practice suggests that generations of injustice and prejudice can be redressed by admission to a fancy college, the former that generations of inclusion and privilege demand their own special prize; the two philosophies would seem to cancel one another out, but each has its place in the larger system.

In fact, when you account for all of the “hooked” seats in the freshman class—spaces specifically set aside for kids who have some kind of recruited talent or family connection or who come from an underrepresented minority group—you accomplish, at the most selective colleges, two things: you fill a large percentage of the class (some researchers believe the figure is as high as 60 percent), and you do so with kids whose average grades and scores are significantly lower than your ideal. Now it’s time to swing a meritocracy into place; the caliber of the class is at stake. All of the unhooked students are now going to be thrown into a hypercompetitive pool, the likes of which the layperson can’t imagine. As daunting as the median grades and test scores of the typical Princeton admittee may appear, those statistics have taken into account all of the legacies and volleyball players and rich people’s children who pushed the averages down.

Sounds terrifying, doesn’t it? And what’s more, there is a growing pile of literature that argues it isn’t worth it. These days, people go to university for four main reasons:

  1. To attain practical/vocational knowledge that will tangibly help them get a job.
  2. To attain theoretical or other knowledge that will expand their minds in an area of interest.
  3. To please their parents/society/employers who consider a post-secondary education to be a mandatory status symbol. The better the reputation of the school, the better the status symbol.
  4. To make connections with peers and professors.

The main benefits of a “top-tier” education, as opposed to one from a large public American or Canadian school, lie in the last two: status and connections. Sharing a room with a future Mark Zuckerberg or getting to vacation on the yachts of the rich and famous must be worth the price of admission, right?

William D. Cohan thinks not, writing in the New York Times that getting into an Ivy League school is a “Pyrrhic victory,” with the outcome of monstrous student debt (from fees of $50 000+ per year) and only slightly better-than-average job prospects in a glum economy. Many other American schools have astronomical fees, and even relatively cheap Canadian educations place graduates in debt. There is also, as I referred to above, the non-monetary cost of an education at a prestigious school. Whole families are swept up in the hyper-competitive race to the top, where lazy summer vacations and boredom and play are replaced with summer volunteer trips to Kenya, SAT prep courses, and the endless repetition of mastering a musical instrument.

But the saddest part is that it may all be for naught. A sister article in the same Atlantic issue as the above quotation charts the potential life course of many products of tiger-led households:

Harangued by my own Tiger Dad, I grew up believing in crack math skills and followed—at least initially—a stereotypical Chinese path of acing my tests; getting into the world’s most prestigious science university, Caltech (early admission, no less); majoring in the hardest, most rarefied subject, physics … And then what? Almost 50 years old now, some 30 years after graduation, I look at my Caltech classmates and conclude that math whizzes do not take over the world. The true geniuses—the artists of the scientific world—may be unlocking the mysteries of the universe, but the run-of-the-mill really smart overachievers like me? They’re likely to end up in high-class drone work, perfecting new types of crossword-puzzle-oriented screen savers or perhaps (really) tweaking the computer system that controls the flow in beer guns at Applebee’s. As we know, in this tundra-like new economy, even medical degrees, and especially law degrees, may translate into $250,000 of unrecoverable higher-education debt and no job prospects, despite any amount of hard work and discipline.

The reality, of course, is that there is life after graduation, and I imagine that a lot of students and parents who sacrifice their lives perfecting their viola performance and polishing their resumes will get there and wonder what the hell happened — and what to do next. The same is true for all graduates who feel lost after school, and who may have underplayed their social and entrepreneurial skills in favour of tailoring their lives to academic pursuits that will not help them once they have their degrees. And so I support Peter Thiel’s initiative because it addresses the fact that it takes more than a few letters from any school to achieve success in life.


Minimum Impact, Maximum Time, and the Goodness of Work

February 10, 2011

Is ambling antithetical to success? Is a life of purpose the only path to happiness? And is Gen Y really all that different from previous generations in wanting meaningful work?

On Marx, Meaning, and Materialism

I think often on Marx’s theory of alienation; namely, that under the capitalist system of increasing specialization, workers become alienated from the fruits of their labour, and from their own capacity as workers to work/produce things and grow in doing so. Instead of being an end in itself – a source of fulfilment in seeing the fruit of one’s labour go from raw materials to completed items – work, according to Marx, had become but a means to an end as workers were increasingly slotted into automated lines of production. Instead of creating the whole shoe, they would nail in a piece of the sole, as it were, with no satisfaction in seeing the “end-to-end process” (as we might say in today’s corporatenewspeak).

Certainly, with the rise of industrialization, Fordist assembly lines and globalization, the idea of work as a means to an end gained popularity as a way to describe life in the twentieth century. And in some ways, this was acceptable. In the 1930s, one was fortunate to have a job at all – any job. One did not pick and choose. The generation after that (those ubiquitous Boomers) observed their parents’ work ethic and adopted it without thinking, as a means to gain material prosperity. Nice cars, big houses, creature comforts, holidays in Boca Raton, and well-educated children became status symbols, ends worth working for. A life of middle-management drudgery and rarely seeing one’s children was, for many, an acceptable trade-off.

But we expect so much more from our work today. Making a living, and a living that will support the lifestyle we’re used to, is mere “table stakes” (more corporatenewspeak). Because, with good education and attentive parenting and the opportunity to develop our skills as children, we have so many options for a career. Consequently, we expect much, much more out of the time we spend at work. (And before someone brings up 40% unemployment among global youth, yes, the recession has, to an extent, made Gen Ys a little less choosy – but only for now.)

The theory of work as an end in itself – and a means to happiness and fulfilment – has important research to back it up. A study out of California a few years ago remarked on the importance of hard work and purpose in achieving happiness in life. The conclusion is worth quoting at length:

A central presumption of the ‘‘American dream’’ is that, through their own efforts and hard work, people may move towards greater happiness and fulfillment in life. This assumption is echoed in the writings of philosophers, both ancient and modern. In Nicomachean Ethics, Aristotle (1985) proposed that happiness involves engagement in activities that promote one’s highest potentials. And, in the Conquest of Happiness, Bertrand Russell (1930/1975) argued that the secrets to happiness include enterprise, exploration of one’s interests, and the overcoming of obstacles. …Our data suggest that effort and hard work offer the most promising route to happiness.

Wow. Good work, it seems, is the answer to all our problems. The only thing left to do is find work that contains enough meaty, purposeful, interesting content – related to our skills, of course, and with excellent “work-life balance” and good benefits – to meet our needs. Simple!

But is this expectation reasonable?

Really, it’s a wonder anybody finds jobs like this, let alone the majority of people. Even Marx’s (clearly idealized) autonomous, cottage-industry shoe-makers (or soldiers, or second sons forced into trade…) no doubt achieved very little of this all-encompassing fulfilment through their work. Yet today we pile expectations onto our jobs. While there are certainly those out there who caution that work will not make anybody happy all on its own, the prevailing narrative remains that fulfilling work is the surest route to happiness. Consider: it’s just not socially acceptable for anyone able to participate in the “knowledge economy” to opt out and instead choose to make money solely as a means to an end with no other agenda – let alone anyone under 30. Do you know anyone? And do they want the situation to be permanent?

Minimizing Impact: Lowering our expectations? Or relieving the pressure?

While I was vacationing in the vineyards of Mendoza (rewards for a life of corporate drudgery?), I got to thinking meta thoughts about what people tend to expect from life. We use a lot of language today that revolves around impact. We want to “make a splash.” We long to stand out in interviews, on dates, and in applications. People everywhere seek to be famous for something (anything! Jersey Shore, anyone?) or to leave a legacy, something that will let current and future generations know they existed as individuals, and left something behind. Modern society refers to the more noble side of this feeling as the desire to change the world, whether through volunteering, winning a Nobel Prize or raising well-adjusted children. We have, as I have pointed out before, a strong bias to action which makes us want to do good and make things “better.” Most of us put a lot of pressure on ourselves, a vague kind of weight that is associated with the Victorian ideal of the innate goodness of work and the possibility of having a hand in making a better future. The idea of finding work that allows us to, as the above-quoted study notes, “promote [our] highest potentials,” is tied up in this pressure.

At the same time we are acutely aware that life is, as an honorary TED talk I watched recently put it, fragile and vulnerable – and short. (This fact creates a very un-Hobbesian empathy, the talk argued, not only for those with whom we share blood ties, but with other humans, other creatures, and the biosphere generally. Worth watching.) It is little wonder that, with the perception of the sand in the hourglass ever running out, we feel pressed for time, overwhelmed, and run off our feet. We try to make every moment count. We multi-task and are always tied to a communication device of some kind. Most things are done for a purpose: we educate ourselves in order to gain employment, money and “success”; we sleep and eat for our health; we watch our health to extend our lives (so we can keep doing it all longer). It has often been noted, with bitter irony, that with all the myriad time-saving devices we employ on a daily basis, we find ourselves busier than ever before. Trying to do things in the minimum amount of time has not made us happy.

So I decided to try an experiment in reverse-thinking. What if we sought to – even just for a day – minimize our impact, and maximize the amount of time we spent doing things? What would this look like? What does “counter-urgency” feel like in practice? Would it lessen the pressure?

Experiments in living “Slow”

I suspect that it would in many ways resemble the slow movement, which has grown exponentially in popularity recently in response to the speed of life and destruction of the environment and local communities in the name of convenience. It must also be a response to the pressure of the purposeful life. The slow movement includes slow food, which is (in contrast to fast food) grown locally, often organically, and savoured. Slow reading is similar, and involves savouring text instead of skimming or summarizing, or any other kind of speed-reading I learned about in university.

A minimum-impact day would also result in fewer outputs (and here I use a very corporatenewspeak word deliberately). We would do purposeless things: ambling with no direction, daydreaming, journaling, writing poetry, reading fiction. There would be no book club to report to. No destination. Poetry, lyrics and plays could be memorized for the sake of the words themselves, lines savoured like chocolates instead of potential “gobbets” to drop into future conversations or be recalled on trivia nights.

Sadly, my brief experiment in slowly minimizing my impact was a failure: I wanted outputs. I wanted to write about it, to share it on this blog. I wanted to tie it into my life’s work and be fulfilled by it.

I sense I would not be unique in feeling this way. Is our desire for impact innate, or learned? Here we have contradictory evidence. An article in the Economist a few months ago referred to a study that concluded that the desire for good, hard work actually isn’t all that innate, particularly in Britain. But if it is learned – part of the Marxist legacy which holds that fulfilling work is an end in itself – how do we handle the pressure of finding such fulfilment?

Perhaps the idea of work-as-end is a way to rationalize the fact that our time on Earth is short, and that we spend most of it working. But are we destined not to find all we seek in our jobs? Is it possible to use work only as currency to “buy” time for our true passions? Should we seek to maximize the good in our work (whether employment at all, a means to material comfort and status, or even autonomous shoe-making) — even if we hate it? Do you amble purposelessly?

I’d love to hear your thoughts…


Make Money First: The Trouble With Meritocracies

October 19, 2010

For a while now, I’ve been trying to put together a post about the value of polymaths in modern society. Two hundred or even one hundred years ago, such people would have needed no defenders. What could be more valuable or intrinsically rewarding than being interested in everything and interesting to others? Yet today, polymaths are often seen as dilettantes, unable to focus enough to be serious about something and get a job. There is work, and then there are hobbies, and one should learn to tell the difference and divide one’s life into segments. Few careers reward diversity of knowledge. Fewer still pay well. My tentative title was going to be, “Great Careers for Polymaths,” but the idea made me queasy. Why, I asked myself, do I need to justify having multiple interests with the language of making money?

Because, I realized, we value wealth first. What I mean by “first” is that the goal is to be “secure” financially before seeking career satisfaction, getting in shape or getting married. Wealth is the elusive gateway to a complete life, but many mistake it for a complete life in and of itself.



How Gen Y Can Reinvent Work-Life Balance

May 4, 2010

It’s May again, that exciting time of year when newly minted college graduates venture out into the world and attempt to find a job. Or perhaps go to Europe and attempt to find themselves instead until the hiring freezes are lifted. What will increase their chances of success?

It seems as though it’s getting harder and harder just to get onto the bottom rung of the “career ladder” (a term which, as someone who works in HR, I can tell you is on its way out as an inappropriate metaphor for the working world – think less in terms of defined rungs and more in terms of the moving staircases in the Harry Potter movies – you never know where you’ll end up). What happened to slogging through a terrible entry-level job booking meeting rooms and fetching coffee, paying one’s dues in order to move up to a better job in a year or two? Is that still necessary, or have things changed?

Well, as it turns out, a lot of things have changed. Many articles have been written about them: an economic slump which has meant declining hire rates and more people being let go; a majority of baby boomers who were supposed to be leaving the workforce to live out their golden years on pensions we’re paying for, but who are not; a glut of “over-qualified” university graduates with little practical experience (which, as we all know, entry-level coffee-making jobs require) who are driving up competition for the few full-time jobs that are out there; and organization structures that are getting flatter, with fewer roles at the top. So the situation now is that one can work making coffee and booking meeting rooms for three or four years and perhaps find there’s no promotional pot-of-gold at the end of the rainbow, or find that it’s still a few years out.

So where does that leave new graduates? If “paying your dues” was the baby boomer way to climb the corporate ladder (which actually existed then), what happens now? As my favourite career blogger, Penelope Trunk, once wrote: paying dues is out; that kind of lifestyle doesn’t allow for real growth or balance at work, because it forces new recruits to work ridiculous hours doing menial tasks. (It also sets a precedent that’s hard to follow once you have commitments outside of work.)

What’s better? In theory, doing many different things to acquire enough experiences to figure out what we really want to do over the long term. One of the advantages new grads have is the freedom to move around and go where the jobs are. But the trouble with this theory is that the way the job market is structured now, we need to be very sure of what jobs we want, specialize early, and be prepared to slog it out for several years gaining “relevant experience” in our field. There is little room now for dilettantism, or having jobs “on the side.” Everything is a career choice.

Take the classic “job on the side” for everything from aspiring writers to rock stars: teaching. Teaching used to be the kind of thing that anybody could do (and there were, accordingly, great teachers and some not-so-great teachers in the mix). Now students are fighting tooth and nail to get a place at teacher’s college, often resorting to attending a school in a different country. And once they graduate, the job market looks terrible – there is a two-stage application process even to be considered for a supply teaching job.  And don’t even get me started on academia as a career.

So despite the fact that it’s better to do different things, we’re now seeing a kind of apprenticeship model reborn, with high entrance requirements to every guild. Career experts say that Gen Yers will have something like 10 different careers in their lives – but in order to do so, we’ll need to have transferable skills, and know very well how to market them. In practical terms, this means that job-hopping, or even industry-hopping, is key, to prove all the different places in which one’s skills have been useful. It’s a kind of paradox where focus and diversity of experience are battling for dominance.

One solution might be to have multiple income streams, or to get experience with various combinations of paid and unpaid work. (Or maybe to start a blog and wind up with a movie or book deal out of it.) Like the realization that your romantic partner can’t be everything to you, we’re now seeing the idea that your main job can’t be everything either, from a remunerative or skills-building perspective. (Forget the idea that a job by itself can make you happy in life; we exposed that fallacy several years ago.) This trend is called having a “portfolio career,” that is, using a functional skill to diversify revenue streams.

We’re used to seeing this with careers in things like music, where a conductor will (for example) have a community choir, a church gig, some wedding performances on the go, and a few students all at the same time. When one revenue stream dries up, he or she will pick up another. But it’s new for accountants, or those who might want to mix traditional employment (at a major corporation, say) with self-employment. The key is diversity within a specialization: having skills that people will pay for and capitalizing on them in several different ways.

It also means that members of this generation will have to live with more uncertainty about their careers. Perhaps this is the price we’ll pay for more control over the skills we use and how we spend our time day-to-day. Does this signify a shift back to a pre-industrial time where people could choose how much they worked? Not fully, I’m sure, but it may be the beginning of a new, hybrid system where workers can control their output and work to their real interests more. Maybe this is the new “work-life balance.”

If, that is, all these new grads ever manage to get hired into that first job.

What do you think? Will you try to mix paid and unpaid work? Do you plan on job-hopping or industry-hopping? Do you anticipate that many members of Gen Y will choose to have multiple/multifaceted careers? Or is this a trend that will only affect a small subset of the population? Is it better to work a terrible (paying) job for three years or to get lots of volunteer experience instead?


What is History Writing Now?

April 27, 2010

People reach post historical all the time by searching for odd little historical, philosophical and political science-related phrases. Given the obscure nature of many of these terms to those not deep within postcolonial or imperial studies, I assume they’re doing research for some paper or project. I wonder if they feel they can trust what they read. Am I a reliable source? Are my ideas sound? Can one cite a blog, or is this an even bigger research no-no than citing Wikipedia?

If it is, why? Consider this blogger: I have a graduate history degree from a good school, which, for many, constitutes formal “training” in the discipline.  I know how to cite sources and (hopefully) construct a logical and well-supported argument. Does this make me “qualified” to comment on things? Does being qualified today require being intelligent, well-trained, and peer-reviewed (in the traditional sense), or does it come from an even more democratic approvals process based on sheer number of readers? Would having six million hits to my blog make me a “qualified” opinion leader? Or do I need to have published six books through a university press that only 6 000 people will ever read in order to be a real “expert”?  And is either something to which I should aspire?

These questions have far-reaching implications for me as I go through the process of deciding whether to continue on with studying history as a career, or do something else entirely – something more practical, that would affect people more directly than a well-researched book in an obscure field and a few impassioned lectures about Lord Curzon and the Raj for a dwindling number of undergraduates who don’t care. Because it’s very important to me that I influence the way people think, not in a creepy mind control kind of way but by presenting a fresh perspective that makes them reconsider the world around them and how things work within it.

I’m not sure academic writing is the best way to do that: its scope is too narrow, and its audience is those who are already predisposed to thinking from many angles, and who likely know a lot about the subject already. Traditional academic writing is also very dry. It connects with the reader because it is persuasive, and offers a sourced argument with little personal point of view. Blogs and new media, in contrast, connect with readers because they cover current events and are often based on personal biases or feelings. They are inherently populist, because the vast majority of bloggers want others to read their blogs, and so they talk about things that appeal to a large audience: fashion, entertainment, celebrities, popular political news, etc. And the vast majority of people who read blogs read about the above topics. But does this make them experts in their fields? And does it translate to “academic” subjects like history?

One of my main goals for post historical is to bridge this gap with a forum that is flexible enough to talk about current events and timeless philosophical questions at the same time, yet with a focus that isn’t so personal or academically specialized as to be unappealing to a broad audience outside of a strict historical discipline. One might call this “accessible” writing, though as I wrote about in my last post, “accessible” can be a bit of a loaded term. What matters most to me is making an impact in a way that is direct and tangible, which is why the thought of another history degree and a life as a struggling academic is slightly off-putting at times. It’s very clear what such a life could do for me: I’d be a recognized expert in my field; I wouldn’t have to get out of bed torturously early every morning to go to another soul-crushing corporate meeting; I’d be able to have great chats over coffee with fellow bright people and give speeches about things like maps; I could help out engaged students by giving them interesting research suggestions; and I would generally get to run around having people think I was a big smartypants. Clearly, these things all sound fantastic. But what would a life like that do for others, even if I did manage to actually get a job out of it (which these days, as my fellow blogger and history professor trivium points out on his excellent blog, almost nobody does)? How would it contribute to my big life goal of being a respected public intellectual who makes people think in exciting new ways?

I don’t mean to criticize academics, who are generally brilliant, insightful, creative people. It’s the system that is at fault, a system that encourages people to go to school for 10 years with absolutely no hope of finding employment of any kind at the end of it, a system that encourages killing trees by publishing books nobody cares about, and a system that has created the popular feeling that it is so removed from the everyday that it serves only to train its own. I fear academia is becoming so specialized that it just doesn’t have the impact, or the scope, or the popular appeal, to be taken seriously. When the people who make the most money and important decisions all have MBAs and law degrees, humanities majors are in for some trouble. Actually, we’re all in trouble because we’re losing out on diversity of thought and experience – big time.

As I’ve written before, I think great writing is all about having a conversation, which necessitates a connection between readers and writers. One of the great things about blogs, and Wikipedia, and other new media is that the connection – and the feedback, via the comments or revisions – is immediate, and the process of forming consensus iterative. This is when history and philosophy are really exciting (and this is why I love to receive comments and feedback from readers, particularly when you disagree or want to point out something I’ve missed). Traditional academic writing just isn’t set up to react quickly enough to changes in events, or popular feeling.

So, to paraphrase the great E.H. Carr, what is history writing now? One would think that it would adapt to the changing relationship between reader and writer, from words sent down from a lofty perch in an ivory tower to those that are picked over in comments forums around the world. It hasn’t. And we’ve all lost something in the process.  The Economist ran an article today about how this election (in Britain) is bound to be a landmark one, and yet has no landmark book or philosophy written about the popular mood to match it, or to spur discussion, as was the case in 1945, 1964, 1979 and 1997. (I was particularly excited to see that the article cited one of my historian idols, Linda Colley, as having written a significant work from which Tony Blair drew inspiration in 1997.)

Can it be that nobody has written anything groundbreaking in the past five or ten years that bears mention? Or is it that the political audience is too fragmented – or too busy writing their own blog posts – to notice? Is there still a place for the academic as a public intellectual, or has academic writing been pushed to the fringes of literate society by virtue of being irrelevant to everyday concerns? And if academia is on the fringes, who is in the centre?

I suppose we can all take comfort in the fact that there is still the expectation of something among the intelligent people who read and write for publications like The Economist. There is an intellectual void that will always need filling, by academics or writers or particularly insightful and far-reaching bloggers. The question for the next few years, it seems, is whether those who step up to fill it will have new job titles, and if so, what they will be.


The Educated Class and Its Discontents

April 13, 2010

In a recent Special Report on Germany in the Economist, the traditional German system of education, while excellent at producing great engineers and skilled trade workers, came under criticism for its rigidity and unfairness. In Germany, ten-year-olds are marked out for a career of manual labour (skilled or otherwise), white-collar work, or the bureaucratic/professional work that comes after university, and sent to separate schools accordingly. Ten is too young, its critics argue, to give a child a direction for life, which will become difficult to change later on with guild-like labour markets that prohibit entry into professions without the right qualifications. And many complain that Germany does not have equality of opportunity. Family background is more likely to determine test scores and social status in life in Germany than it is in any other country.

With any talk of equality of opportunity, it comes up again, that old aspirational myth of moving between classes, the Horatio Alger or perhaps Will Hunting story of a genius saved from poverty by good education, mentoring or his own perseverance to rise to a different class. Because it is about class. Germans (and the writers of the Economist) are not concerned as much about eventual income distribution, which is quite fair, as they are about having the opportunity to do something else: move up the social ladder.

Focusing on class seems to be a very Old Europe thing. Only in Europe do we see that holdover of a very, very privileged elite (or aristocracy) that has old family wealth, and a poor or working class that never really seems to shrink outside of meddling with statistics, and isn’t going to because those within it have a sense of pride in being working class. A recent article on class and politics in Britain in the Economist seems to describe the six established statistical class divisions as essentially fixed. David Cameron must appeal to the same middle-class voters as Margaret Thatcher, who appreciated their aspirations to “improve their homes and their lives; to get gradually better cars, washing machines and televisions; to go on holiday in Spain rather than Bournemouth.” Hardly a rapid rise to the upper echelons of power – really just a desire to keep up with what is expected from being “middle class.”

In fact, it seems the most common way of achieving a material increase in living standards is immigration. The quality of life is much higher in “New World” countries like Canada and Australia because the basic cost of living is less, while health care and education are still available at the same high standard, or higher. It’s hard not to notice that eight out of 10 cities ranked “most liveable” by the Economist last year were in Canada, Australia, and New Zealand.

And there is more opportunity for movement between classes in the New World (a term I’ll keep using despite the fact that it makes me sound like Columbus, because I can’t think of a better one), not least because there is less emphasis on “class” in general as something that matters, at least explicitly. The class system of North America has less of a focus on income and history and more on the combination of these with other factors, such as education. My theory is that because New World societies were formed based on merit, and evolved with much less distinction based on income or family wealth (since most everyone was a poor immigrant upon arrival), education and occupation became the primary means of separating out the kind of people with whom one should associate.

The North American system is thus designed to provide more equality of opportunity. In theory, all have the same access to education, even, in some ways, up to the university level. It is a noble goal, and higher education is certainly more accessible in Commonwealth countries and the US than in continental Europe, as this 2005 study ranking university enrollment in developed countries shows.

But the result of our comparatively open and well-attended university system has been a generation or two of liberal arts or natural science graduates who spend ten years flailing around the entry-level job market before eventually settling into corporate middle management in a completely unrelated field somewhere, making essentially the same money they would have had they been pre-classified at age ten, as in Germany. Most look back fondly on the days they spent at university, but more for the social connections they made than the time spent reading Cicero. And we, as a society, have trouble finding enough people to sell us mortgages or build our houses, because there aren’t really university programs that teach those skills. Universities have become training grounds for the “middle class” as a whole – including the low end of white collar work – instead of training grounds for occupations where they actually provide valuable preparation, that is, the “upper middle class” work of medicine, law, academia and the like.

If nothing else, we North Americans are certainly losing efficiency with all of this finding ourselves that comes after attaining our university qualifications. We’ve also created a society in which having a B.A. means you’re under-qualified for many jobs – either in experience, or because everyone else applying also has an M.A. or the college-level diploma which is all that’s really required to do the job. It isn’t going to change, though, because we value two things too highly: our “right” to attend school (especially university) for as long as we want to, and the class position that doing so will get us.

True, recently there has been a real push by the government and colleges to recognize skilled labour and professional work as viable career options for high school graduates to consider, and one often hears flippant comments about the world needing more plumbers and electricians, who “actually make a fair bit of money.” (Reality check: this website puts a plumber’s average hourly wage at $24 in Toronto, which over a year works out to about $47 000. This is around what your average white collar worker earns, at least at first, and a plumber doesn’t carry the same student loan debt.)
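(For the curious, the arithmetic behind that figure – assuming a standard 37.5-hour week, which is my assumption, not the website’s – runs: $24/hour × 37.5 hours/week × 52 weeks ≈ $46 800, or roughly $47 000 per year.)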

But while the logic of matching skills to actual jobs may have (almost) caught up, the overall effect on what class one will end up in has not. Doctors and lawyers are still far more likely to associate with white collar workers who have attended university than electricians who earn the same amount, because education and occupation are still important class signifiers.

What would it take to change these biases? And would changing the biases reverse the trend toward hiring managers requiring ever-more degrees when hiring someone to answer telephones and make photocopies? Is there a happy medium between the German and North American systems, where there is still mobility between classes, and still equality of opportunity, but more cultural acceptance that skilled trades and professional work are respectable ways to earn a living? I’m not sure – but for all that, I would still struggle to recommend that anybody give up learning about politics or history or biology and instead learn about practical data models in order to secure a job. We are fortunate to have the privilege of being able to buy those three or four (or more) years of time to learn. I would advise anybody who asked to enjoy it while it lasts, because there’s plenty of time for uninspiring desk work later, if they so choose.