7 Things I’ve Learned About History Since Moving to the Land of the Future

April 25, 2014

“Why on earth did you study history?” I was asked last night, and on many days since I arrived in what is perhaps the world’s most future-oriented place. What answer can I give to an engineer or venture capitalist who can’t rotate his perspective enough to look backward, or see the importance of doing so? I usually say that I love to explore the rich context of our modern world, so much of which was influenced by the past. Or that history, like all the humanities, is a mirror that shows us a different version of ourselves.

But such answers will not satisfy many people here, and in wondering why, I realize I’ve learned a few things about history and its uses since learning the way (to San José):

1. America ≠ California and American History ≠ Californian History.

I write a lot about nationalism, because it is one of the ways we identify as part of a group, with shared history. I feel very Canadian, and not very Ontarian at all because I don’t see Ontario’s history as disconnected from that of the Canadian historical narrative. So I assumed it would be very “American” here, like places I’ve been on the East Coast and Midwest.

I was wrong.

The United States, though a young country, seems to be very aware of (certain parts of) its history. After all, how many other countries refer so frequently to and preserve so faithfully the intentions of their founding documents? America has an acute sense of its founding myths, and the historical reenactment culture here is an ongoing source of fascination and delight. (Who wants to be the Union soldier who gets shot in the first moment of battle and lies on the field the rest of the day in period costume? Is there a hierarchy, and does one get promoted each successive year based on seniority until eventually he is General Lee, or is it merit-based, depending on how well you keel over in your fleeting moment of glory? Such pressing questions.)

California Republic

California is not, however, America. It is, as the t-shirts say, “California Republic,” with its “Governator” and strange direct democracy and fiercely independent, contrarian streak. Very few people here identify as “American” so much as “Californian,” and they don’t seem to share the same historical touch points. More common are nods to the Spanish and Mexican roots of the region, through the missions and street names, or a focus on the history of global trade and cosmopolitan capitalism.

2. People have a different definition of “history” in Silicon Valley.

Silicon Valley is a whole other animal (a shark, perhaps?).

In a place where the next iOS release, must-have gadget or earnings report is breathlessly anticipated, “history” becomes something that matters mostly in your browser. “Legacies” and “artifacts” are usually bad things to Valley dwellers, being outmoded or standing in the way of progress. The tech industry does not look kindly on the past – or rather, doesn’t think much of it at all, an indifference which is, as we all know, much more the opposite of love than dislike.

San José then…

Silicon Valley isn’t kind to its physical history either. The historic orchards and cherry trees that once ringed San José have been paved over to make way for sprawling, two-story rental accommodations and carefully landscaped corporate lawns. Giant redwoods are regularly felled to allow a better view of the advertisements on the sides of buildings (seen from the freeway, of course). Dome-shaped Space Age cinemas once frequented by Steven Spielberg are in danger of being torn down, likely so newer, bigger malls can rise in their place.

Even churches, those bastions of beautiful architecture, look like something out of an IKEA catalogue, all light wood and glass – nary a flying buttress in sight. It’s a full-on assault of the past by the present, in the name of the future.

3. Transience produces ambivalence and a lack of investment in the past.

Many people are new here, as the region’s explosive growth in the last 30 years can attest. Others are “just passing through.” So a lot of people feel disconnected from anything greater than their jobs or family/friend networks here, and there is a pervasive sense of rootlessness.

So why bother to invest in their communities? Or care what they used to look like? So goes the logic, and thus the “San José Historic District” encompasses a single square block, with fewer than ten historic monuments. These are mainly just buildings that have survived – earthquakes, vacancy and neglect. This website catalogs the “boneyard of unwanted San José monuments” that are slowly crumbling away near the freeway and the very shiny corporate HQ of Adobe.

Santa Clara County Courthouse

The courthouse, crumbling in disrepair. San José is falling down, falling down, falling down…

It’s not all that surprising though when you consider that…

4. …it is personal history that fosters pride and connection.

Perhaps I and others feel disconnected from the history here because so much of historical connection depends on identifying with who made the history in the first place. Several recent studies from the British Commonwealth (Britain itself, Canada, and Australia) and the US indicate that museum attendance increases where a greater percentage of the population identifies with the ancestry of the area. That is, if you are of Scottish origin in Toronto, you are more likely to be interested in a museum about Canadian history, which was largely architected by Scots, than if you are a Native Canadian whose world was essentially trampled on by those same Scots. You’re likely still less interested if you are a recent immigrant to Toronto from Bangladesh. Feeling as though a part of you helped to make a place what it is makes it more real and more interesting. Rightly or wrongly, you feel as if you have more of a stake in the future because “your people” had more of a stake in the past.

Even people who grew up here can barely recognize it, and feel as though a part of their past has been taken from them. Wherefore the cherry blossoms and apple orchards that used to dot the landscape of the “Valley of the Heart’s Delight”? One woman told me her family used to live bordering a fruit farm, and moved six times as the farms were paved over by housing subdivisions, until “we lived backing on to the mountain, and there were no farms left.”

…and San José now.

And yet, I can only feel that history is critical, from my experiences in Toronto where historical consciousness, like love and Christmas, is all around.


5. History is often the most beautiful part.

I used to love walking through downtown Toronto because every so often a beautiful Art Deco or neo-Gothic gem would emerge amid the drab tower blocks of the 1960s and 1970s. Variations in architectural style provide interest and colour in an otherwise monotonous world of glassy office towers and utilitarian apartment buildings. Grand plazas, churches and monuments make statements about what is important to a place, and what it values.

What do these people value? It is worth cherishing and celebrating the few beautiful examples of history that exist here.

Like this one!


6. Historical traditions provide comfort.

This surprised me. History, of course, is about customs passed down as much as it is about actual events or physical buildings. Traditions ground us and give us some consistency in a world that changes rapidly. This is part of the reason weddings, funerals, and general church-going still exist. We need traditions to mark the big events in life.

We also need traditions to mark out who we are and how we should behave. To take a small but non-trivial example I wrote about recently: our clothing sends out signals about who we are and what we expect from life. There are no standards of dress here, at work or at play. Twenty-five-year-old men dictate the business ambiance, so beards, flip flops and holey t-shirts abound, and you can’t find a restaurant in California fancy enough that you can’t wear jeans.

It is utterly unconventional, which is perhaps precisely the point. Wearing jeans to a meeting with someone in a suit will instantly destabilize them. It’s the same idea with non-standard working hours, perfected by the tech industry, and turning work into play (both the work itself and the space in which it is done). Even the critical and traditional accent in “José” has all but disappeared, which leads me to wonder if people in future will think this city was pronounced as something that rhymes with “banjos.”

It is groundbreaking to blow up established norms, but also somewhat unsettling. And history is necessary, if only to have something to conscientiously reject.

7. Culture clusters around history.

Life without history would not only be ignorant and untethered, but very boring.

People often view San José and its surrounds as soulless, and it’s easy to see why. One need only look at the cultural draw San Francisco has on the region to appreciate why places with deep roots are attractive. Most of San Francisco’s biggest tourist attractions are historical landmarks. What would the City be without the bridge, cable cars, Alcatraz, Haight-Ashbury, the Ferry Building, or Pier 39? Just a bunch of expensive apartments and hills, really.

History infuses places with meaning, and communities gather to add more layers. So next time someone asks me why on earth I would bother to study history, I think I will tell him that it’s because I care about beauty and culture and connection to the people and places around me — and that if he wants to live in somewhere even half-decent, he should too.

History, paved over


Follow Up: Exploring Identity Through Historical Crime Fiction

October 12, 2010

We tend to like our heroes in fiction to come with obstacles, moments of indecision and crisis, and genuine confusion about how they fit into the world, so much the better when they overcome them. Yet our historical heroes are often portrayed as fully formed characters, certain of themselves and their actions. Hindsight has a way of smoothing out the details and sharpening the focus of individual and group identities.

Fiction does not. Like the best revisionist historian, it “problematizes” identities by zeroing in on the conflict and confusion. This is why I was happy to discover a novel recently that illuminated the confusion of a whole historical period through the identity struggles of its main character.

In response to my last post on problems with agency in historical crime fiction (and indeed, history in general), one of my readers suggested I read A Conspiracy of Paper by David Liss. I am very glad I did, for aside from being a delightful work of fiction, it also allowed me to reflect further upon the differences between writing non-fictional and fictional history. The plot, in brief, involves murder and conspiracy, as the protagonist attempts to discover which shadowy figures and institutions are responsible for the death of his father. It centres on the beginnings of the London Stock Exchange in 1719, just before the South Sea Bubble, when the idea of money was changing rapidly. Instead of tangible coin or goods, the new wealth was in paper, in the form of stocks and other promises of money. The story grew out of the author’s graduate research on how people at this time viewed themselves through their money, and we see that this is in fact a central focus of the book.

Read the rest of this entry »

Historical Crime Fiction, Agency, and the Contemporary Mindset

September 30, 2010

I am a big reader of crime novels. From Agatha Christie to Simon Kernick to the ubiquitous Stieg Larsson, they delight me with their rapid plot progression and suspenseful chapter endings, which is something I’ve never been very good at writing myself. Nonetheless, in my daydreams, I have envisioned a series of my own that would combine two of my principal interests: crime novels and historical fiction. It would ideally contain all the charm of the nineteenth-century novel, with the plot development of a modern thriller. Call it Pride & Prejudice & Warrant Cards.

I have occasionally forayed into the world of historical crime fiction, and been disappointed with what I’ve read each time, less because of the lack of CSI-type technology (and have you also noticed that computer hackers of some kind are now essential to move the plot along?) than because they are such unrealistic representations of the eras they claim to inhabit. Admittedly, I am more of a stickler for historical accuracy than most, but it strikes me that there are some major barriers to writing a character in the past that most authors overcome simply by ignoring them. Moreover, being a feminist of sorts, I would insist that the protagonist of my series be female. This choice, of course, compounds the difficulty of the situation – there are a number of alterations that would need to be made to maintain accuracy, all of which would seriously compromise my character’s agency.

Read the rest of this entry »

The Rise and Fall of the Grand Narrative

August 12, 2010

Those of you who read my blog regularly will know how frequently I lament the increasing specificity required of academic writing, and how it threatens to render the profession obsolete due to lack of readership or general interest in the subject matter. My thoughts were echoed in a recent book review which, in discussing the life of Hugh Trevor-Roper, a prominent historian, remarked that he could never be the great academic he wanted to be – an E.P. Thompson, or a Thomas Macaulay, or an Edward Gibbon – because of two key factors. The first was the passing of the “grand narrative” approach to history, which is now seen as unprofessional, or worse, imperialistic in the Marxist teleological sense. The second was a result of his being British, and, as the article notes, “By Trevor-Roper’s day … Britain had become too insignificant to provide the subject of a grand narrative of progress in the style of Macaulay.” The only nation that could conceivably produce historians claiming to write the story of its own empire today would be the United States, and those who do are usually right-wing polemicists who garner little respect in academic circles.

It’s true that the grand narrative has its drawbacks, as I’ve written before. Huge swaths of history that don’t fit in can be glossed over or ignored entirely in order to weave a tight story. And the grand narrative remains a common way for writers to (consciously or otherwise) impose a single, usually Western, trajectory upon world events that can be interpreted as modern intellectual imperialism. But it remains an anchoring lens through which historical events can be contextualized and patterns examined, and is usually more interesting than a narrow study. So what has caused the violent turn away from the grand narrative?  Is it justified?

Read the rest of this entry »

What is History Writing Now?

April 27, 2010

People reach post historical all the time by searching for odd little historical, philosophical and political science-related phrases. Given the obscure nature of many of these terms to those not deep within postcolonial or imperial studies, I assume they’re doing research for some paper or project. I wonder if they feel they can trust what they read. Am I a reliable source? Are my ideas sound? Can one cite a blog, or is this an even bigger research no-no than citing Wikipedia?

If it is, why? Consider this blogger: I have a graduate history degree from a good school, which, for many, constitutes formal “training” in the discipline. I know how to cite sources and (hopefully) construct a logical and well-supported argument. Does this make me “qualified” to comment on things? Does being qualified today require being intelligent, well-trained, and peer-reviewed (in the traditional sense), or does it come from an even more democratic approvals process based on sheer number of readers? Would having six million hits to my blog make me a “qualified” opinion leader? Or do I need to have published six books through a university press that only 6,000 people will ever read in order to be a real “expert”? And is either something to which I should aspire?

These questions have far-reaching implications for me as I go through the process of deciding whether to continue on with studying history as a career, or do something else entirely – something more practical, that would affect people more directly than a well-researched book in an obscure field and a few impassioned lectures about Lord Curzon and the Raj for a dwindling number of undergraduates who don’t care. Because it’s very important to me that I influence the way people think, not in a creepy mind control kind of way but by presenting a fresh perspective that makes them reconsider the world around them and how things work within it.

I’m not sure academic writing is the best way to do that: its scope is too narrow, and its audience is those who are already predisposed to thinking from many angles, and who likely know a lot about the subject already. Traditional academic writing is also very dry. It connects with the reader because it is persuasive, and offers a sourced argument with little personal point of view. Blogs and new media, in contrast, connect with readers because they cover current events and are often based on personal biases or feelings. They are inherently populist, because the vast majority of bloggers want others to read their blogs, and so they talk about things that appeal to a large audience: fashion, entertainment, celebrities, popular political news, etc. And the vast majority of people who read blogs read about the above topics. But does this make them experts in their fields? And does it translate to “academic” subjects like history?

One of my main goals for post historical is to bridge this gap with a forum that is flexible enough to talk about current events and timeless philosophical questions at the same time, yet with a focus that isn’t so personal or academically specialized to be unappealing to a broad audience outside of a strict historical discipline. One might call this “accessible” writing, though as I wrote about in my last post, “accessible” can be a bit of a loaded term. What matters most to me is making an impact in a way that is direct and tangible, which is why the thought of another history degree and a life as a struggling academic is slightly off-putting at times. It’s very clear what such a life could do for me: I’d be a recognized expert in my field; I wouldn’t have to get out of bed torturously early every morning to go to another soul-crushing corporate meeting; I’d be able to have great chats over coffee with fellow bright people and give speeches about things like maps; I could help out engaged students by giving them interesting research suggestions; and I would generally get to run around having people think I was a big smartypants. Clearly, these things all sound fantastic. But what would a life like that do for others, even if I did manage to actually get a job out of it (which these days, as my fellow blogger and history professor trivium points out on his excellent blog, almost nobody does)? How would it contribute to my big life goal of being a respected public intellectual who makes people think in exciting new ways?

I don’t mean to criticize academics, who are generally brilliant, insightful, creative people. It’s the system that is at fault, a system that encourages people to go to school for 10 years with absolutely no hope of finding employment of any kind at the end of it, a system that encourages killing trees by publishing books nobody cares about, and a system that has created the popular feeling that it is so removed from the everyday that it serves only to train its own. I fear academia is becoming so specialized that it just doesn’t have the impact, or the scope, or the popular appeal, to be taken seriously. When the people who make the most money and important decisions all have MBAs and law degrees, humanities majors are in for some trouble. Actually, we’re all in trouble because we’re losing out on diversity of thought and experience – big time.

As I’ve written before, I think great writing is all about having a conversation, which necessitates a connection between readers and writers. One of the great things about blogs, and Wikipedia, and other new media is that the connection – and the feedback, via the comments or revisions – is immediate, and the process of forming consensus iterative. This is when history and philosophy are really exciting (and this is why I love to receive comments and feedback from readers, particularly when you disagree or want to point out something I’ve missed). Traditional academic writing just isn’t set up to react quickly enough to changes in events, or popular feeling.

So, to paraphrase the great E.H. Carr, what is history writing now? One would think that it would adapt to the changing relationship between reader and writer, from words sent down from a lofty perch in an ivory tower to those that are picked over in comments forums around the world. It hasn’t. And we’ve all lost something in the process.  The Economist ran an article today about how this election (in Britain) is bound to be a landmark one, and yet has no landmark book or philosophy written about the popular mood to match it, or to spur discussion, as was the case in 1945, 1964, 1979 and 1997. (I was particularly excited to see that the article cited one of my historian idols, Linda Colley, as having written a significant work from which Tony Blair drew inspiration in 1997.)

Can it be that nobody has written anything groundbreaking in the past five or ten years that bears mention? Or is it that the political audience is too fragmented – or too busy writing their own blog posts – to notice? Is there still a place for the academic as a public intellectual, or has academic writing been pushed to the fringes of literate society by virtue of being irrelevant to everyday concerns? And if academia is on the fringes, who is in the centre?

I suppose we can all take comfort in the fact that there is still the expectation of something by the intelligent people who read and write for publications like The Economist. There is an intellectual void that will always need filling, by academics or writers or particularly insightful and far-reaching bloggers. The question for the next few years, it seems, is whether those who step up to fill it will have new job titles, and if so, what they will be.

The Modern Good Life, Part 3: The End of Progress

March 25, 2010

What is the modern “good life,” and how do we know if we are living it?  Is what we have now “good”? Can we honestly look to the past and say that the way we live now is better? And can we reasonably expect that things will continue to improve? These are the questions that started me thinking about this series in the first place.

In Part 1, I wrote about our peculiarly modern bias to action, and in Part 2 I discussed the different ways in which we can become slaves to history. In Part 3, I will address our unconscious and seemingly unshakeable assumption of human progress and how our current historical “moment” is unsettling because it may be challenging its dominance.

Gen Y is supposed to be more optimistic than past generations: according to a recent article in Time magazine, 88% of its members believe that one day they will lead the lives they desire. The “hope gap” (presumably the ‘gap’ is with reality) increases with age, apparently, as people get more disillusioned — but deep down we all remain, at heart, subscribers to a fundamentally optimistic narrative of our present. It is the progress narrative, articulated by no one better than John Stuart Mill, its eternally optimistic Victorian proponent, when he said that the goal of progress was for individuals to live long, happy lives without physical or mental suffering, including “indigence, unkindness, worthlessness or premature loss of objects of affection.” Who can argue with that?

I’m sure many of you have heard of Whig History, the idea that humans are progressing toward an ever more liberal and enlightened society: freer, more peaceful, more democratic, more comfortable, and more convenient. Historians like to scoff that Whiggish histories are teleological, Eurocentric, and poorly sourced. We criticize the philosophies of Mill and G.W.F. Hegel, among others, who argued that modern European (now “Western”) society was located at the apex of historical development, and was its logical conclusion. We laugh that Mill and his contemporaries considered nineteenth-century British society to be the most advanced on the scale of civilizations, a trajectory based on liberal criteria such as constitutional checks on rulers, and freedom of the individual enabling the full use of his faculties. But in reality, we think the same thing in our own time. We know that things have been continually improving, and expect that they will continue to do so. And we expect that we too will always be at the apex of historical progress.

Amongst all of this certainty, the past few years have been a stumbling block. Suddenly, the balance of media coverage is negative. Is it a temporary setback, we wonder, or a lasting trend? We feel a deep-seated unease as a reputable voice – or collection of voices – begins to think that the past was better than the present. And the main area in which we have concerns is ethical, societal, moral. We can see that technology is advancing, making us smarter (perhaps), wealthier, and more comfortable. But we are no more able to solve society’s eternal ills – poverty, violence, want, fear – than before. New technologies, government policies, or even human kindnesses still have not managed to create our Utopia.

Of course, it isn’t rational to expect Utopia. We all know that. But secretly, we hope that we can achieve it, and we have a vision of the future as “the best of all possible worlds,” as our Panglossian friends would say. And we want to be a part of it, and we want to help it along. We have a bias toward action.

So the question becomes, has the West become a slave to its own idea of progress? I wrote in my last post that today we are unique in seeing history as linear and cumulative. But have we been fooled, and is the “progress” we have seen not really progress at all? Could our technological progress be in fact contributing to a moral decline?

This line of thinking has certainly had its supporters. Several centuries ago, Jean-Jacques Rousseau contested the established idea of progress in his time: economic development, the creation of a state and protection of private property, and the ability to live comfortably. (It appears not much has changed since the eighteenth century.) As he wrote in his Second Discourse:

Due to a multitude of new needs, [man is] subjected…to all of nature and especially to his fellow-men, whose slave he becomes in a sense even in becoming their master; rich, he needs their services; poor, he needs their help.

It certainly isn’t a powerful exhortation to buy that new flat screen TV. Though it is perhaps a given that having more things engenders a need for more things, it doesn’t seem to say much for our evolution as a species. In Toward a History of Needs, Ivan Illich writes that “The model American puts in 1600 hours to get 7500 miles: less than five miles per hour.” Most of us can walk almost that fast, with a lot less effort spent selling our souls for a salary.

Nietzsche continued this anti-progress train of thought in the Genealogy of Morals, deriding those who thought comfort and luxury were the end of life:

The diminution and leveling of European man constitutes our greatest danger…We can see nothing today that wants to grow greater, we suspect that things will continue to go down, down, to become thinner, more good-natured, more prudent, more comfortable, more mediocre, more indifferent…there is no doubt that man is getting “better” all the time.

For both Rousseau and Nietzsche, the economic and technological progress that had led to large societies, sedentary means of acquiring food (i.e. non-hunter-gatherer communities), and the general ease of life that Mill had in mind had caused humans to lose something along the way. This something was morality. They defined it differently, but meant much the same thing.

In truth, I don’t think morality is declining, not even with the advent of sexting, or video games, or La-Z-Boy recliners. It’s natural that, measured against objective progress in so many other areas, the persistence of our human constants of good and evil will make us feel like failures. Because there certainly is evidence of objective progress. Are we, the middle class in a developed country, better off today than 25, 50, or 100 years ago? In a multitude of ways, absolutely: we have extended many basic rights to larger populations (de jure and de facto), have much more advanced medical care (and likely better access to it), use a host of labour-saving devices which reduce the amount of manual drudgery we have to endure day to day, have technologies that allow us to control our reproductive output (and therefore our careers, financial situation, etc.) better, and, perhaps most importantly, can access vast amounts of information near-instantaneously.

Utopia? Certainly not. But I feel pretty good about being part of a society that is free, and liberal, and generally supportive of those who can’t support themselves. And I have a recurring dream in which (dork alert!) John Stuart Mill comes to visit me in the present, and he’s pretty pleased with how things have turned out as well, though of course we still have a lot of work to do.

In an excellent article on the idea of progress, a columnist in The Economist writes that our constant striving for morality is like aiming for an “unattainable horizon,” and that the eternal battle between the forces of altruism and selfishness keeps society on an even keel (clearly, this author also has a bias to action). I think it’s more important that we keep up the faith that we’ll get there. Gen Y has it right: optimism is one of the keys to happiness. Society may not be perfect, but we have to believe we can keep improving it.

I started this post series with Madonna, so it only seems appropriate to end with the Beatles: I’ve got to admit it’s getting better; a little better all the time.

Read the other posts in this series:

Part 1: The Bias to Action

Part 2: History and its (Ab)Uses

The Modern Good Life, Part 2: History and its (Ab)Uses

March 24, 2010

My brother was a history major 10 years before I ever was, and I distinctly remember one weekend when he was visiting from university and asked me why we (as a people) study history. “Because we need to know about the past so we don’t make the same mistakes in the future,” I answered, quite proud of myself. (Not the most inspiring answer, but I was 8. Give me some credit here.) I think he was impressed too – little did he know I would grow up to write a nerdy history blog! Ha HA!

What I said then is not a novel idea: historians have long advocated the necessity of knowing about the past in order to inform our decisions in the present, and to justify those decisions once made. And everybody loves history, because they love the stories of overcoming great odds, or seeing how much things have changed (or, indeed, stayed the same), or thinking about how with one small shift things could have been very different.

But we tend to forget that our fascination with the past is unique. Other worldviews don’t see it this way. To the followers of many Eastern religions, and humans from most of human history, the past was just a series of fluctuations around the same human constant. I’ll go back to John Gray’s Straw Dogs, where he argues that attempting to make sense of history, and giving history meaning that has the potential to inform the present and future, is just a “Christian folly,” part of Christianity’s central, mistaken assumption that humans are different from other animals and can direct our lives. “History” was never before considered cumulative, or linear, but cyclical. It was not studied. It was not important. It was as much an unknown as the future. And it certainly did not direct anybody’s actions in the present.

There is a concept within the discipline of the “silent referent,” a particular narrative or idea that acts as the standard against which something else is measured. The narrative is usually the European, Marxist master narrative that charts the “progressive” transition from a feudal, mythical, communal past to a capitalist, secular, modern present. This narrative is celebratory, teleological, and complete. It wraps us all up in the confidence that we have trod a good path that has ended in a happy, modern present. [More on this in my next post.] The idea of the “silent referent” is often used in postcolonial history, most notably in a landmark book by Dipesh Chakrabarty titled Provincializing Europe. Chakrabarty argues that Indian history needs to escape from this master European narrative, of which it was never truly a part and against which it can never measure up.

We would all do well, I think, to take note of his caution. I’m not sure even we can measure up. We run the risk today of being so tied to this celebratory history we have told ourselves that we can barely function without referencing it, or live outside of its temporality. The silent referent of our lives today is the past.

Perhaps this is a simplistic statement. It is natural that the future we imagine for ourselves is a direct output of the past we have experienced. We can hardly imagine anything else. (This is why aliens in movies look like small, green people.) But our high regard for preserving our history – even if it is largely unconscious – is unique to our species, our culture, and our age. We live very historically contextualized, temporized lives. The title of this blog (elitist meta moment alert!) is an ironic nod to the fact that even those who try to escape “time” or “history” by adding the prefix “post-” to things are still temporizing themselves: by declaring that we are in the phase that comes after, they reinforce the idea that linear time is of paramount importance.

There are two traps in particular that we might fall into: overspecialization, and overgeneralization. The first can occur when we endlessly analyze, categorize, and pull apart the past in an attempt to preserve it for future generations. This kind of history in the end takes everything from the past as equally worth preserving, with no distinction (historians specializing in German shoemakers from Frankfurt between 1522 and 1523, take note!). Wallowing in the “good old days” is a recipe for disaster, especially because even the most objective historical narrative has a bias and an angle. Nietzsche wrote about this tendency in “On the Use and Abuse of History for Life,” warning that it can effectively prevent any innovation or aspiration for the future.

I have written before about the dangers of overspecialization: information overload leading to a societal inability to discern what is really important, and even paralysis by analysis – the inability to do anything for fear of breaking too strongly with the past. This is exactly what Nietzsche was talking about. Individuals become slaves to history and cannot act outside of or without it. Is this, perhaps, some of what plagues us today?

Maybe. I do suspect we as a population need to be wary of those who seek to ‘preserve’ a traditional way of life, or go back to it – and I don’t mean that we should stop the trend of going back to 80s fashion before it really takes off. I mean that factions arguing for “traditional family values” or established religion carry mistaken and destructive beliefs that contribute to our present woes.

However, I think we are more often slaves to the past in a different way, one that Nietzsche also considered problematic: overgeneralization. This is the kind of history that seeks out role models and teachers from the past when we feel unable to find them in the present. In excess, it creates typologies to serve as standard scripts for the present, which, as Herr N. wrote, “generalize and finally equate differences,” doing a disservice to the past by masking its historical (and geographic) particularities. Think of how many times you’ve heard “the worst economic climate since the Depression” or “the largest deficit we’ve ever seen” in the past two years – is this contextualization helpful? Does it help to know that Hillarycare failed in 1993? It isn’t 1993. How many other unchecked assumptions about the past are we dragging around, using them as props to justify not changing or trying something new?

History is an anchor, and a necessary one, but it can also be a deadweight that prevents us from moving on. Being tied to the past – afraid to spend more money because we have never spent so much before, unwilling to make bold moves, content merely to speculate about our downfall as a society – does not serve us well, because no script prepares us for the present. History never repeats itself, except in overwhelmingly general terms.

There is much to learn from the past. We can find the human characteristics that will inspire us in the present – perseverance, ingenuity, humility, and many more – but not the right political or economic blueprint for the future we’re trying to build.

Previous post in this series: The Bias to Action

Next post in this series: The End of Progress