Minimum Impact, Maximum Time, and the Goodness of Work

February 10, 2011

Is ambling antithetical to success? Is a life of purpose the only path to happiness? And is Gen Y really all that different from previous generations in wanting meaningful work?

On Marx, Meaning, and Materialism

I think often on Marx’s theory of alienation; namely, that under the capitalist system of increasing specialization, workers become alienated from the fruits of their labour and from their own capacity to produce things and to grow in doing so. Instead of seeing work as an end in itself, and gaining fulfilment from watching the fruit of their labour go from raw materials to completed items, workers slotted into automated lines of production came, in Marx’s account, to treat work as merely a means to an end. Instead of creating the whole shoe, they would nail in a piece of the sole, as it were, with no satisfaction in seeing the “end-to-end process” (as we might say in today’s corporatenewspeak).

Certainly, with the rise of industrialization, Fordist assembly lines and globalization, the idea of work as a means to an end gained popularity as a way to describe life in the twentieth century. And in some ways, this was acceptable. In the 1930s, one was fortunate to have a job at all – any job. One did not pick and choose. The generation after that (those ubiquitous Boomers) observed their parents’ work ethic and adopted it without thinking, as a means to gain material prosperity. Nice cars, big houses, creature comforts, holidays in Boca Raton, and well-educated children became status symbols, ends worth working for. A life of middle-management drudgery and rarely seeing one’s children was, for many, an acceptable trade-off.

But we expect so much more from our work today. Making a living, and a living that will support the lifestyle we’re used to, is mere “table stakes” (more corporatenewspeak). Because, with good education and attentive parenting and the opportunity to develop our skills as children, we have so many options for a career. Consequently, we expect much, much more out of the time we spend at work. (And before someone brings up 40% unemployment among global youth, yes, the recession has, to an extent, made Gen Ys a little less choosy – but only for now.)

The theory of work as an end in itself – and a means to happiness and fulfilment – has important research to back it up. A study out of California a few years ago remarked on the importance of hard work and purpose in achieving happiness in life. The conclusion is worth quoting at length:

A central presumption of the “American dream” is that, through their own efforts and hard work, people may move towards greater happiness and fulfillment in life. This assumption is echoed in the writings of philosophers, both ancient and modern. In Nicomachean Ethics, Aristotle (1985) proposed that happiness involves engagement in activities that promote one’s highest potentials. And, in the Conquest of Happiness, Bertrand Russell (1930/1975) argued that the secrets to happiness include enterprise, exploration of one’s interests, and the overcoming of obstacles. …Our data suggest that effort and hard work offer the most promising route to happiness.

Wow. Good work, it seems, is the answer to all our problems. The only thing left to do is find work that contains enough meaty, purposeful, interesting content – related to our skills, of course, and with excellent “work-life balance” and good benefits – to meet our needs. Simple!

But is this expectation reasonable?

Really, it’s a wonder anybody finds jobs like this, let alone the majority of people. Even Marx’s (clearly idealized) autonomous, cottage industry shoe-makers (or soldiers, or second sons forced into trade…) no doubt achieved very little of this all-encompassing fulfilment through their work. Yet today we pile the expectations on our jobs. While there are certainly those out there who caution that work will not make anybody happy all on its own, the prevailing narrative remains that fulfilling work is the surest route to happiness. Consider: it’s just not socially acceptable for anyone able to participate in the “knowledge economy” to opt out and instead choose to make money solely as a means to an end with no other agenda – let alone anyone under 30. Do you know anyone? And do they want the situation to be permanent?

Minimizing Impact: Lowering our expectations? Or relieving the pressure?

While I was vacationing in the vineyards of Mendoza (rewards for a life of corporate drudgery?), I got to thinking meta thoughts about what people tend to expect from life. We use a lot of language today that revolves around impact. We want to “make a splash.” We long to stand out in interviews, on dates, and in applications. People everywhere seek to be famous for something (anything! Jersey Shore, anyone?) or to leave a legacy, something that will let current and future generations know they existed as individuals, and left something behind. Modern society refers to the more noble side of this feeling as the desire to change the world, whether through volunteering, winning a Nobel Prize or raising well-adjusted children. We have, as I have pointed out before, a strong bias to action which makes us want to do good and make things “better.” Most of us put a lot of pressure on ourselves, a vague kind of weight that is associated with the Victorian ideal of the innate goodness of work and the possibility of having a hand in making a better future. The idea of finding work that allows us to, as the above-quoted study notes, “promote [our] highest potentials,” is tied up in this pressure.

At the same time we are acutely aware that life is, as an honorary TED talk I watched recently put it, fragile and vulnerable – and short. (This fact creates a very un-Hobbesian empathy, the talk argued, not only for those with whom we share blood ties, but for other humans, other creatures, and the biosphere generally. Worth watching.) It is little wonder that, with the perception of the sand in the hourglass ever running out, we feel pressed for time, overwhelmed, and run off our feet. We try to make every moment count. We multi-task and are always tied to a communication device of some kind. Most things are done for a purpose: we educate ourselves in order to gain employment, money and “success”; we sleep and eat for our health; we watch our health to extend our lives (so we can keep doing it all longer). It has often been noted with bitter irony that for all the myriad time-saving devices we employ on a daily basis, we find ourselves busier than ever before. Trying to do things in the minimum amount of time has not made us happy.

So I decided to try an experiment in reverse-thinking. What if we sought to – even just for a day – minimize our impact, and maximize the amount of time we spent doing things? What would this look like? What does “counter-urgency” feel like in practice? Would it lessen the pressure?

Experiments in living “Slow”

I suspect that it would in many ways resemble the slow movement, which has grown exponentially in popularity recently in response to the speed of life and destruction of the environment and local communities in the name of convenience. It must also be a response to the pressure of the purposeful life. The slow movement includes slow food, which is (in contrast to fast food) grown locally, often organically, and savoured. Slow reading is similar, and involves savouring text instead of skimming or summarizing, or any other kind of speed-reading I learned about in university.

A minimum-impact day would also result in fewer outputs (and here I use a very corporatenewspeak word deliberately). We would do purposeless things: ambling with no direction, daydreaming, journaling, writing poetry, reading fiction. There would be no book club to report to. No destination. Poetry, lyrics and plays could be memorized for the sake of the words themselves, lines savoured like chocolates instead of potential “gobbets” to drop into future conversations or be recalled on trivia nights.

Sadly, my brief experiment in slowly minimizing my impact was a failure: I wanted outputs. I wanted to write about it, to share it on this blog. I wanted to tie it into my life’s work and be fulfilled by it.

I sense I would not be unique in feeling this way. Is our desire for impact innate, or learned? Here we have contradictory evidence. An article in the Economist a few months ago referred to a study that concluded that the desire for good, hard work actually isn’t all that innate, particularly in Britain. But if it is learned – part of the Marxist legacy that holds fulfilling work to be an end in itself – how do we handle the pressure of finding such fulfilment?

Perhaps the idea of work-as-end is a way to rationalize the short time we have on Earth, and the fact that we spend most of it working. But are we destined not to find all we seek in our jobs? Is it possible to use work only as currency to “buy” time for our true passions? Should we seek to maximize the good in our work (whether employment at all, a means to material comfort and status, or even autonomous shoe-making) — even if we hate it? Do you amble purposelessly?

I’d love to hear your thoughts…


Silence and Schematics: The Things You Don’t See

December 16, 2010

In my last post I wrote about context and perspective in mapping, and the biases that are inherent in the information presented in different kinds of maps. Biases, of course, can be dangerous because we generally trust the information maps give us. They are more powerful for their apparent objectivity. The science behind them is sound, we think – after all, cartography is based on empirical data.

But just as maps can inform us, they can also make us ignorant – of context, of specific details, and of what we don’t know – even while they’re giving us other information. It isn’t just what we see in the frame that matters, but also what we don’t see, what’s left out. In conveying information, art can be as important as accuracy, and sometimes even more so.

Most early maps contained a lot of information. When little was known about the area beyond what had been explored, cartographers would create a sense of danger and excitement by inserting allegorical images, fantastical creatures, or mythical mountain ranges. They would decorate the frames with pictorial Biblical references, or symbols of their nation’s prowess at exploration and conquest.

A very busy map of Africa from the 1600s

In the above (relatively complete!) map of Africa from the 1600s, note the prevalence of mountain ranges and large rivers (that don’t really exist) and the animal drawings used to take up space. Also note the many decorations of ships in the ocean around the frame (side note: web address watermark not included on the original). What is silent? The cartographer’s ignorance – about the interior topography and other geographical markers. But a casual observer then would not have known this.

It was considered a great leap forward when, in 1759, cartographers – influenced by French mapmaker Jean Baptiste Bourguignon d’Anville and the Enlightenment tradition dictating that all maps be empirically verifiable – began to leave blank spaces where precise information about the areas they were mapping was lacking. The practice served to encourage new forays into the “unconquered” and “uninhabited” areas they depicted to determine, for example, the as-yet undiscovered mouths of rivers or the potential treasure/glory/conquest that lay beyond established borders. But primarily these blank spaces lent increasing credibility to what was shown (whether it was accurate or not), by silencing everything else.

Accentuating some pieces of information over others with emphasis and silence grew in popularity even further as the centuries progressed. The most common world map we see, for example, privileges the northern hemisphere over the southern through the use of Mercator’s projection. It also puts the Western world – whether Europe or North America – in the centre of the frame, relegating all other areas to the peripheries.
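For the curious, that distortion is easy to quantify: Mercator’s projection stretches both east–west and north–south distances by a factor of 1/cos(latitude), so apparent areas are inflated by roughly the square of that. A minimal sketch, in Python, makes it concrete (the example latitudes are my own illustrative picks, not anything taken from the maps below):

    import math

    def mercator_area_inflation(latitude_deg: float) -> float:
        """Approximate factor by which the Mercator projection inflates
        apparent area at a given latitude, relative to the equator."""
        linear_scale = 1.0 / math.cos(math.radians(latitude_deg))
        return linear_scale ** 2  # both axes are stretched, so area grows as the square

    for place, lat in [("Equator", 0.0), ("London", 51.5), ("Moscow", 55.8), ("60 deg North", 60.0)]:
        print(f"{place:12s} ~{mercator_area_inflation(lat):.1f}x apparent area")

By 60° north the apparent area has roughly quadrupled, which is why Britain and Canada loom so large on such maps while equatorial lands recede toward the margins.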

"The Queen's Dominions at the End of the Nineteenth Century"

In the map above, the bright red colour of Britain’s imperial territories contrasts with the neutral colour of other lands. Islands of small geographical significance jump from the page with red underlines and heavy black labels indicating that they are strategic refuelling outposts, places that ship spices back to Britain, or simply more territory in red. Mercator’s projection is used to great effect, enlarging North America even above the bounds of the map’s frame, at the expense of the southern hemisphere.

It is all intended to provide a sense of a vast, interconnected Empire. While looking at this, viewers might fail to notice the absence of information not related to Britain’s imperial conquest. About other lands, the map is relatively silent, because they are not the focus.

Maps are now used for all kinds of things – everything from directions to websites or thoughts. As maps have proliferated, more and more of them are designed for a single purpose, and the trend seems to be toward more specificity but less context.

Consider subway maps, most of which are a legacy from the modernist era. They fall squarely into the “art” over “accuracy” way of conveying information, and are characterized by highly stylized lines, multiple colours and use of sans-serif fonts. The most famous, of course, is Harry Beck’s map of the London Underground, which dates to 1931. Its genius lies in its abstraction, its ability to draw order in the form of clean and easy-to-read visuals from the confusion and complexity of the actual system. Compare the official underground map with the actual map of the subway stations from above the ground:

Schematic Tube Map, Zone 1

 

Tube Lines Mapped to Actual Geography

It takes a certain genius to create schematic subway map order from chaos; no doubt this is the reason these maps are such iconic art pieces, found on buttons, t-shirts, and posters the world over. It’s fascinating to me that they are so simple and so focused – and yet divorced from the actual geography they represent. Almost every major city is the same.

Paris:

Paris Metro


Washington DC:

Washington DC Metro


Moscow:

Moscow Subway Map - Like an Alien Creature

Even maps of New York City’s frenzied system are relatively simple. But sometimes accuracy wins out over art. In 1975, the New York City transit authority determined that the map it had been using to that point leaned too far toward art, and commissioned something that would line up more closely with the streets above ground. (You will find a fascinating interview with the designer of the 1979 map, which was only just retired a few years ago, as well as several old subway maps from NYC, here.) Yet even this more “accurate” and “realistic” map has some deviations from reality: Manhattan, and lower Manhattan in particular, has been expanded to accommodate the landmarks and subway lines that all seem to converge there; Brooklyn and the other boroughs are drawn relatively smaller than their actual size.

It would seem that for clarity or for a great story, some alteration is always necessary, and a bit of silence too. No map designed to emphasize transit lines could hope to show every street, and of course designers realize this. People are perhaps more willing to put up with silence and abstraction in maps now because they are used to it, and because maps are no longer expected to be geographically accurate in order to be authoritative. It’s an interesting trend that points to our increasing ability to cope with the abstraction and de-contextualization of cartography, even as the broader minimalist modernism movement appears to be winding down (the ever-popular clean lines of IKEA products notwithstanding). What does it mean for the future of maps? Will the definition of a map become ever broader as we incorporate variations from site maps to schematics? Or do we need a new name for this kind of information vehicle altogether?

This post is part two of a three-part series on the past, present and future of mapping. Check back for a wrap-up later this week.


Secrets and Lies — and Google

November 29, 2010

There is an almost hysterical paranoia in the air these days about the information being collected on us and used to guide our decisions, in many ways without our knowledge. What am I talking about? The secret of success for everything from Google searches to your mortgage approval process: algorithms.

And secret they are. Much of the fear about them is that they are “black boxes” about which we know little, and yet they are making decisions for us every day. In the process, some worry, they are taking away our capacity for decision-making and automating processes in which human input may be necessary to correct inconsistencies or mistakes. An extended report in The Globe & Mail last week examined the impact such incomprehensible and inaccessible mathematical formulas can have: according to the data collected, buying floor protectors at Canadian Tire might signal a conscientious borrower; late-night website browsing may indicate anxiousness and, in combination with a host of other minor activities, derail a credit application.
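To make the “black box” a little more concrete, here is a minimal sketch of how such a behavioural scoring formula might work. The signal names echo the Globe & Mail examples above, but the weights, threshold and structure are entirely hypothetical – invented for illustration, not drawn from any real lender’s model.

    # Hypothetical behavioural credit-scoring sketch (illustrative only).
    # Signal names echo the examples above; the weights and threshold are invented.
    BASE_SCORE = 600
    APPROVAL_THRESHOLD = 620

    SIGNAL_WEIGHTS = {
        "bought_floor_protectors": +25,     # read as a sign of conscientiousness
        "late_night_browsing": -15,         # read as a sign of anxiousness
        "frequent_address_changes": -30,
        "utility_bills_paid_on_time": +40,
    }

    def score_applicant(signals):
        """Combine observed behavioural signals into a single score."""
        score = BASE_SCORE
        for signal, present in signals.items():
            if present:
                score += SIGNAL_WEIGHTS.get(signal, 0)
        return score

    applicant = {
        "bought_floor_protectors": True,
        "late_night_browsing": True,
        "frequent_address_changes": True,
        "utility_bills_paid_on_time": False,
    }
    score = score_applicant(applicant)
    print(f"score={score}, approved={score >= APPROVAL_THRESHOLD}")

The point is not the particular numbers but the opacity: an applicant has no way of knowing which of these minor activities moved the needle, or by how much.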

Google is another example: it uses complex algorithms to filter information to find exactly what it thinks we need, or, as its mission statement says, to “organize the world’s information and make it universally accessible and useful.” It also provides us with ads, of course, based on our search history and preferences and in theory tailored to our needs. Even online dating websites such as OkCupid and eHarmony make extensive use of algorithms to predict who will make a good match. The information that comes out of such sites is a fascinating look at the likes and dislikes of a broad cross-section of the population.

The formulas used are secret, of course, in order to protect the competitive advantages of the organizations they serve. What surprises me is why there is such intense fear of them, these unknown equations that guide our choices. We are not forced to click on any of the links Google serves up. We’re not even forced to use Google as our search engine. If we want a local plumber, we can always use the Yellow Pages, where prominence is determined by advertising payments. Is this any better?

Perhaps it is the lack of control that is so terrifying. Because algorithms filter information for us, there is an unimaginable amount that we just never see. We literally don’t know what we don’t know. Somehow this seems more sinister than the way it used to be when we were all relatively more ignorant, perhaps because, through the Internet, we are now aware of there being a lot more information out there.

Does Google have a sinister hidden agenda? One would think that such a thing would go against its code of conduct of not being evil. Does OkCupid? Likely not, but in filtering information to satisfy our (perceived) needs and wants, argues Alexis Madrigal in this month’s Atlantic, algorithms can serve to maintain the status quo – or even prevent shifts in societal norms:

By drawing on data about the world we live in, [algorithms] end up reinforcing whatever societal values happen to be dominant, without our even noticing. They are normativity made into code—albeit a code that we barely understand, even as it shapes our lives.

Madrigal goes on to say that Google, OkCupid and their ilk give us only “a desiccated kind of choice,” and that we need to break the patterns by choosing against type. We need to make ourselves less predictable, to click unexpected links and choose unexpected partners, presumably in order to ensure that society in general doesn’t stagnate. Don’t trust The Man and all that.

The paranoia that unseen and unchecked forces are predicting – even controlling – our behaviour seems to be growing even faster than the fear of Yemeni terrorists. I think it relates back to our growing cynicism and distrust toward all large organizations. Believing in anything at all is seen by many as a mug’s game. Trust in governments declines further the more we find out about how they conceal the truth from citizens, tap their phone lines, or watch their goings-on. People now, on average, trust NGOs (even ones that are affiliated with large government organizations) much more than governments themselves, and certainly more than the politicians and bureaucrats that staff them. Faith in organized religion has plummeted amid endless sex scandals that are officially acknowledged too late (if at all), refusals from the highest levels to acknowledge the damage done by outdated policies, and values that generally diverge from those of most Westerners on gay marriage, reproductive rights, and female clergy.

I’ve written before about what apathy and extreme cynicism look like in modern society. I neglected to mention an obsession with knowing the “truth,” even if part of us believes that truth to be fictional or compromised. Hence the enduring popularity of the “exposé,” tabloid journalism, insider specials, and now WikiLeaks, the non-profit whistle-blower organization that is making news (again) this week with the release of thousands of diplomatic cables sent by US ambassadors. Despite pleas from the White House not to release the information (potentially jeopardizing thousands of lives, and undermining US diplomacy and counter-terrorism efforts), the obsession to reveal won out, and the cables were posted anyway.

Why? Secrets may not be entirely benign, but what seems to be missing from the discussion is the idea that their release may not be benign either. In an age of over-sharing, of laying open our most personal thoughts for the world to see, is even the necessary secrecy of diplomacy unwelcome? It has fallen victim to the public’s need to know anything and everything — or else there must be some ominous conspiracy at play. In democracies, utter transparency seems to be the only option palatable to citizens, and we are unnerved when it isn’t available, so we turn to (often illegal) means of obtaining information, such as WikiLeaks.

It seems we are experiencing a seismic shift in how much information we use, and how much more of it we continually desire. Should we expect it to be entirely accessible at all times, to all people? Knowledge is power, as they say, and everybody wants more. The irony, of course, is that everybody also wants privacy: WikiLeaks, for example, will not disclose its sources, or its founders. One wonders how long they can expect to keep that a secret.


Democracy Rules! 10 Great Reasons to Vote

October 25, 2010

Voting is both a privilege and a duty. If you’re an apathetic type, consider the following 10 less commonly heard (and only slightly sanctimonious) reasons why you should take some time off work to mark an “X” on a ballot today.

1. You’re one of the lucky few in the world who is able to do so.

Accurate numbers on this score are not easy to come by, but this report from the Hoover Institution ranks about 60% of the world’s nations as democratic in the broadest sense, namely that they hold elections. The more stringent classification of a full “liberal democracy” includes electoral competition for power but also:

  • Freedom of belief, expression, organization, and demonstration
  • Protection from political terror and unjustified imprisonment
  • A rule of law under which all citizens are treated equally and due process is secure
  • Political independence and neutrality of the judiciary and of other institutions of “horizontal accountability” that check the abuse of power
  • An open, pluralistic civil society
  • Civilian control over the military

By this measure, the number of global democracies drops to only 37% of nations worldwide. Wikipedia tells us that this is less than 15% of the global population. When you think that (due to age) only about 60-70% of the population in a full democracy can actually vote, that number drops to under 10% of people living in the world today.
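A rough back-of-envelope version of that arithmetic, taking the midpoint of the 60–70% voting-age estimate above (the result is only approximate):

    # Back-of-envelope check of the "under 10%" claim, using the figures cited above.
    liberal_democracy_population_share = 0.15   # share of world population in full liberal democracies
    voting_age_share = 0.65                     # rough midpoint of the 60-70% estimate

    eligible_share = liberal_democracy_population_share * voting_age_share
    print(f"~{eligible_share:.1%} of people alive today can vote in a full liberal democracy")
    # prints: ~9.8% of people alive today can vote in a full liberal democracy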

2. Voting makes you disproportionately powerful over your fellow citizens.

Read the rest of this entry »


Champions of Ignorance and Mediocrity

October 22, 2010

The world is down on merit, it seems. In addition to the post I wrote on the subject, three separate articles this week have argued that the decline of the meritocratic society and rejection of current elites is proof that something has gone very wrong. But it’s not our meritocratic society that’s the problem. It’s the way we feel about it.

Whither Elitism?

Maureen Dowd in the New York Times writes that Sarah Palin and her ilk are “making ignorance chic” by disparaging the cold and cowardly “elites” who went to Ivy League schools and “refudiating” proper English. As Dowd writes, Palin “believes in American exceptionalism, but when it comes to the people running the country, exceptionalism is suspect; leaders should be — as Palin, [Christine] O’Donnell and [Sharron] Angle keep saying — just like you.” Presumably, the “you” in this case is also ignorant, and proudly so.

It’s enough to make any politician shy away from a good education, lest he or she be labelled another spineless member of the establishment. At best they face the charge of wasting their potential and failing to implement good ideas, like Obama’s health care plan, or just about anything in Miller’s original plan for Toronto. At worst, they are humiliated, losing their seat (or their legacy) to candidates who think homosexuality is caused by brainwashing or who refer to fellow elected officials with racial slurs.

It’s a sad decline for a noble idea, and it might be just the beginning.
Read the rest of this entry »


Privatopias and the New Social Capital

September 27, 2010

What is the impact of homogeneous thought on political action? It is a pressing question, and one that has received extensive media and scholarly treatment since the explosion of information enabled by recent technological advancements. On the one hand, we have more access to information than ever before, and with it more access to diversity of thought. Such access makes those who can sift through and aggregate information into easily understood patterns and trends extremely valuable, as I discussed in my post about cultural intermediaries. They can make sense of it all, and turn the incoherent information noise into music.

But information can also divide. More information means more segregation, as like-minded individuals take advantage of technology to seek each other out and self-select into communities of shared interests. The result is millions of small forums for like-minded individuals, and less and less interaction with those who think differently in broader, more general social settings. It has also led to decreasing tolerance for those with different views, since it is easier and easier simply to retreat into isolation with those who will not challenge how we think.

It is an ever-quickening acceleration of what Robert Putnam famously wrote about in his 1995 article “Bowling Alone: America’s Declining Social Capital.” These days, he argued, we (and by “we” he meant “Americans,” though I believe the trend can be seen in other western societies) are more likely to join organizations centred on specific common goals and interests, such as professional associations, and less likely to participate in more general ones, like community action groups, boy scouts/girl guides, or (hence the title) bowling leagues, as citizens did forty or fifty years ago. Instead of bowling as a collective with others who may have different backgrounds, we are bowling alone.

Read the rest of this entry »


The Rise and Fall of the Grand Narrative

August 12, 2010

Those of you who read my blog regularly will know how frequently I lament the increasing specificity required of academic writing, and how it threatens to render the profession obsolete due to lack of readership or general interest in the subject matter. My thoughts were echoed in a recent book review which, in discussing the life of Hugh Trevor-Roper, a prominent historian, remarked that he could never be the great academic he wanted to be – an E.P. Thompson, or a Thomas Macaulay, or an Edward Gibbon – because of two key factors. The first was the passing of the “grand narrative” approach to history, which is now seen as unprofessional, or worse, imperialistic in the Marxist teleological sense. The second was a result of his being British, and, as the article notes, “By Trevor-Roper’s day … Britain had become too insignificant to provide the subject of a grand narrative of progress in the style of Macaulay.” The only nation that could conceivably produce historians claiming to write the story of its own empire today would be the United States, and those who do are usually right-wing polemicists who garner little respect in academic circles.

It’s true that the grand narrative has its drawbacks, as I’ve written before. Huge swaths of history that don’t fit in can be glossed over or ignored entirely in order to weave a tight story. And the grand narrative remains a common way for writers to (consciously or otherwise) impose a single, usually Western, trajectory upon world events that can be interpreted as modern intellectual imperialism. But it remains an anchoring lens through which historical events can be contextualized and patterns examined, and is usually more interesting than a narrow study. So what has caused the violent turn away from the grand narrative?  Is it justified?

Read the rest of this entry »


How Bronzed Gods Triumphed Over Pale Britannia

July 28, 2010

It’s summer in the northern hemisphere, the season when the attentions of those who follow fashion shift to achieving that suitable all-over skin blistering we commonly refer to as a “suntan.” I always marvel at how the desire for a deep brown “glow” exists in the same societies in which racism against all those not of European origin still flourishes. I also wonder at how these bronze aspirations exist so strongly in the western world, when pale skin is still the preferred look in much of India, the Far East, and Africa.

Less than a century ago a tan, anywhere in the world, was seen as the unquestionable mark of someone who laboured outside in the sun because he could not afford to pay someone else to do it for him. The sun makes the skin tough and leathery, the opposite of what Victorian ladies desired. For centuries, women used everything from arsenic powder to drawn-on blue veins to highlight the soft, pale, translucent nature of their skin. The look was a very European courtly one, where the majority of social gatherings occurred indoors away from the prying eyes of the lower classes. And the ideal spread with European imperialism, condemning those races with naturally darker skin tones to perpetual inferiority. In the famous ad below, for Pears soap, a white boy uses Pears as part of a cleansing ritual with a black boy, the end result being lighter, more desirable skin.

Read the rest of this entry »


Today’s Nihilism: The Sound and Fury of Apathy

April 19, 2010

Nihilism is often defined as believing in nothing: having no purpose, no loyalties and no true beliefs. For Nietzsche, who is most commonly identified with it, nihilism is the realization of the extreme subjectivity of human existence. By this philosophy, all of the structures and beliefs we are raised with are simply imposed upon us; they are not objective realities. Since the twentieth century, nihilism has most commonly been of the existential type – the belief that life is purposeless and meaningless.

Existential nihilism may have been common among early pagans, but ever since the major religions took hold, life has always been thought by most to have a higher purpose: kindness to others, enlightenment, contemplation, even “making something” of ourselves. We may not think of ourselves as nihilists today: there are still a lot of people who believe in an afterlife to strive for, or the possibility of easing the burdens of others in this life.

But there is a steadily growing cynicism in the Western world that approaches the limit of what humans can take and still have any hope for the future. One of the less common definitions of nihilism is, in fact, extreme skepticism. In some ways, this is just hipster culture writ large.

I was recently directed to an interesting article in Adbusters by one of my readers (thanks!) that speaks of the “coming barbarism,” an extreme anti-capitalist reaction by Gen Y that includes a wilful return to “barbaric” unreason. It is choosing to be in the dark as the antidote to, as the article quotes, “all the cant and bullshit and sales commercials fed to us by politicians, bishops and academics…People are deliberately re-primitivizing themselves.”

This is meta-post-modernism, in the sense that everything is a parody of itself, somewhere back along the line. It is an arresting idea, that there is no escape from having one’s sincerely-held beliefs turned into the backdrop of a music video or an ironic ad campaign for jeans. And it leads to a society in which the ideologies of inter-generational conflict play out almost as though they’re scripted by those in power. As the article puts it:

Unlike Gen Xers, many of whom found ways to express anticapitalist sentiment through subculture, Gen Y has nowhere to run or hide. All forms of cultural rebellion have long since been appropriated and integrated into the ideology of capital. Marketing firms and advertising agencies now enjoy an unprecedented relationship with the avant-garde, so much so that they’ve become one and the same.

By this logic, war protests are not unwelcome but expected as part of the success of a functioning democracy. Ad execs steal from hipsters in order to market jeans to their mothers. Even the existential nihilist is an identifiable brand, a beret-wearing, cigarette-smoking philosopher who takes his expected societal place even in the midst of his narrative of pointlessness.

Is this not nihilism by abstraction, this sense that we are all players in someone else’s game, with our moves predetermined and ultimately ineffective? And, it seems, the only remedy is opting out – the coming barbarism of which the article speaks. It is the willing removal of oneself from any genuine commitments or passions. And it is terrifying. With no passion, no investment – and so on in a vicious cycle.

Adbusters advocates political involvement as the solution, to “storm and occupy whatever political and economic space we can.” But I suspect this new meta-nihilism has spread to politics as well: how can one support a politician when in five years it is conceivable that he will have “crossed the floor” or possibly be found in hypocritical violation of every principle he espoused? (Former Representative Mark Foley, anyone?) Even Obama, who actually managed to turn the tide and reach those who traditionally wouldn’t care with his politics of audacious hope, seems to have let us down by not being the messiah he was purported to be. The disillusionment has spread, because if he can’t change Washington (or our world as we know it), stop climate change, end the wars, quiet radical Islam, and bridge all divides, then who can?

Still, for those who can swim against the tide of apathy, perhaps political action is indeed a cure. I can think of another: power. When (if) the members of the current generation stop listening to Lady Gaga and start to buy into (pun intended) the established structures and perks of capitalism – pensions, mortgages, SUVs, fancy titles on business cards – they will not only have more invested in making the system work, but also the ability to actually effect change. And maybe then the rebellious thing to do will be just that: the gradual dismantling of what we know, and what we know is wrong with it, into an uncertain future. It would be a postmodern reckoning with the structures and beliefs in which we have been raised, and an examination of whether they hold true as objective “good.”

It would be nihilism … with purpose. A new dark age indeed.


A Culture of Free = Communism 2.0

April 14, 2010

Past ideologies and cultural movements were usually associated with a class, or a gender, or a specific subset of the population that wears funny hats and goes to art shows. These days they’re associated with whole generations, Gen Y in particular. And Gen Y’s ideological leanings are ambitious. A recent article in Adbusters claims that “there is a revolutionary current running through the subconscious of this generation that has yet to be realized or defined. We champion piracy, instinctively believing that information should be free and open, that intellectual property law is contra-progress and that capital is not a necessary intermediary for social organization.”

Capital is not necessary for social organization – that is, we want things to be free. Today, everything from news to music to classified ad services has a new benchmark to attract our attention: no cost to us, and preferably none of those pesky ads, either.  It is a race to the bargain basement, which Gen Y began, but which now encompasses everyone. Meanwhile, content providers are struggling to keep up (and many are not). And the “culture of free” has become an ideology.

I’m going to make the bold statement that I don’t believe we are revolutionary for wanting to get things for free. This supposed worldview – that information should be accessible and open – was and is just a convenient position to hold right now. It is no coincidence that the noble championship of piracy arose when most of the members of Gen Y were teenagers, making them a) not yet old enough to be generating capital of their own and happy to get something for nothing; b) at the age when they wanted to “stick it to the man” (or however that sentiment is phrased now), especially the capitalist pigs who profited off their (parents’) hard-earned cash; and c) able to master new pirating technologies before anybody else could devise a clever way to stop them doing it.

Piracy therefore developed so rapidly simply because there was an option. People figured out how to share music (and books, and movies, and opinions) in a new way, and so they did.  It started with just ripping and burning, making mix CDs rather than mix tapes.  Eventually P2P took off because it was offering a useful new service: downloading music directly to one’s computer.  There was no legal competitor, so the free (not-yet-illegal-but-definitely-immoral) version took off. And now it is expected by all that the thieving will continue unabated. We are affronted when record labels attempt to regain their lost profits from hapless downloaders. We scorn those who prosecute the programmers at Pirate Bay. And we revel in the fact that blogs are thriving while subscription-fuelled media giants are hemorrhaging readers.

Now, there is certainly something to the argument that the freer exchange of copyrighted materials that enables piracy can be a good thing. It exposes people to more music, and many people who “pirate” music become engaged and then go on to purchase more music or attend more concerts than they would have otherwise. But I dispute the idea that the free stuff movement is anything more than a convenient justification of existing behaviour. Ideologies rarely stick solely because they are noble and altruistic. More often they are useful, and solve practical problems. The “free stuff” movement solved the problem of paying for copyrighted materials.

History has seen many excellent, convincing justifications for getting something from nothing. Real pirates and thieves perfected the art of it, and were/are only stopped with effective policing (whether by international tribunals or the more traditional method of hanging). Aristocrats, priests, and the nobility for most of human existence claimed that they deserved by divine right to profit from their vassals’ labour. They were only coerced into some semblance of fairness by the threat or occurrence of violent uprisings. And communists claimed the right to free things with an ideology based on the natural inequalities between humans. From each according to his abilities, to each according to his needs, as Karl Marx wrote. But Communism has never really been successful because whenever some people are getting something for free (or with minimal labour), others are working hard and getting nothing.

This is a fact that modern pirates inherently know: the record label executives and established acts are suffering, but so are the sound engineers and indie artists. So how did stealing from others turn into an altruistic ideology?

Part of it is the appeal of the “web 2.0” culture: it is democratic, innovative, and authentic. It bypasses the elitist filters of Hollywood, publishing or old media. According to Andrew Keen, a leading critic of the Web 2.0 movement, it “worships the creative amateur: the self-taught filmmaker, the dorm-room musician, the unpublished writer. It suggests that everyone–even the most poorly educated and inarticulate amongst us–can and should use digital media to express and realize themselves.” It is appealing because it allows us all to be dabblers, part-time writers (through blogs) or directors (through YouTube) or experts (through Wikipedia). This is, Keen argues, exactly what Marx promised of Communism:

[I]n communist society, where nobody has one exclusive sphere of activity but each can become accomplished in any branch he wishes, society regulates the general production and thus makes it possible for me to do one thing today and another tomorrow, to hunt in the morning, fish in the afternoon, rear cattle in the evening, criticise after dinner, just as I have a mind, without ever becoming hunter, fisherman, shepherd or critic.

And yet the dabbling, for all its appeal, is why the “culture of free” is ultimately unsustainable. Humans want recognition as individuals, and Gen Y wants this more than anybody. But dabblers are rarely experts, and their output is rarely singled out for recognition. As Keen notes, the problem with the democratization of media is that it creates a situation in which everybody has an opinion but nobody has an audience. And no audience means no capital, which will become a problem when Gen Y moves out of their capitalist-pig-baby-boomer-parents’ houses and has to pay for their own internet connections.