The Empire Strikes Back … with Hammers

March 4, 2014

This is a post about curling.

It is also a post about colonialism and the sadness and rhetoric that accompanies the sunset of an empire.

Toward the end of the 2014 Olympics came the men’s curling final, a dramatic showdown between Great Britain and Canada. Watching in Europe, as I was, meant coverage was courtesy of the BBC and commentary by two storied skips from the grand Team GB of yesteryear. (Let’s put aside the fact that, like most British curlers, the commentators and players were all Scottish, because they all displayed a sufficient amount of “national” pride to be considered British. I will get into the whole Scottish nationalism affair later.) The stage was set: the Canadian women had beaten the female British team in the semi-finals and gone on, undefeated, to win the gold medal the day before. There was an enormous amount of pressure from home on the Canadian men to repeat their gold-medal successes of the 2010 and 2006 games. The tension was palpable.

Canada ended up winning a lopsided 9-3 for the gold.

Now, the Canadians were the odds-on favourites in this match. Despite curling being originally a Scottish sport, Canada is its foremost powerhouse nation. Since curling was introduced to the Winter Olympics at Nagano in 1998, Canada has won medals in both the women’s and men’s tournaments every time. Only Sweden comes close. This particular Team GB was also very good – its members have won several World and European Curling Championships – but I doubt many people would have bet on them for the gold.

Our Boys Aren’t Like That

And yet, to listen to the BBC commentary, the victory was Britain’s almost by rights. The callers made a valiant effort at neutrality at first, but later abandoned the impartiality to lament the way the game was going for “our boys.” But what was most fascinating to me, as a student of nationalism and empire, was the language they used. I’ve written before about how the Olympics brings out the very best/worst in our jingoistic selves and allows the media and advertising to fall back on hoary old national tropes (the whole #wearewinter Canadian Twitter campaign being just one example – do they not have winter elsewhere?). But I had never before seen this rhetoric play out between a former imperial power and its precocious colony. According to the BBC, the Canadian team was (and please say this with a Scottish accent in your heads, because I assure you it’s better) “a wee bit too aggressive,” “quite loud with their calls” and “not as polite as some of the other teams.” At one point, jokes were made that the Canadians’ shirts were too tight — or perhaps their biceps were too big? It was all just too masculine for Britain! “Our boys aren’t like that.”


Canadian curling skip Brad Jacobs: too much muscle mass and yelling for Britain!


Uncouth colonies! How dare you go to the gym and yell at the rink and celebrate your victories! It was a distant echo of the accusations that have always been aimed at settlement colonies, like Australia and Canada – and internal colonies, like the untamed “Wild West” within the United States – as justifications for the continuation of central control. Australia, incidentally, has never shaken off its image as the raucous outpost of empire “Down Under.” (Google Suggest says: “Why are Australians so…” “Racist? Obnoxious? Violent?” Notably masculine traits, and not in a good way.)

It is odd that the British should still be falling back on this language. Perhaps sport commentary, like holiday foods, preserves tradition longer than the everyday. After all, it is hardly news that the games that originated in the former imperial capitals have since spread around the world and been mastered by foreign nationals to a far greater degree than those in the home country. Golf, a typically Scottish exercise in hitting objects with sticks, has been perfected by Americans like Tiger Woods or Fijians like Vijay Singh. Cricket is now the almost exclusive realm of South Asians. And then of course there is (sigh) soccer, an originally English sport which is now dominated at the international level by South Americans and Southern Europeans, much to my biennial chagrin.

Rugger for the Empire

Perhaps the general British population is now past the point with these sports that they feel they should win, as the original players. But that is patently not the case with every sport. For comparison, I thought a look at another English game – rugby, a product of the Victorian English public school system – would be interesting. Rugby spread about as far as the former settlement colonies of Australia, New Zealand and South Africa (though, judging by the top teams, really not much further), and my hypothesis is that British commentary would deem those foreign players rough and aggressive as well. Indeed, a short search of British news outlets finds the formidable NZ All Blacks masters of “thuggery” and the English team still fending off accusations of being hampered by its antiquated class system and uselessness on the pitch. One author, a former English international rugby player, talks about how the “relentless,” “ruthless” All Blacks laughed at him and assaulted his manliness when he twisted his knee, and how a recent match between the Aussies and the All Blacks was “a frightening gauntlet thrown down to all the players in the northern hemisphere.” You can’t make this stuff up.


The New Zealand All Blacks: to the English, “all things dark and Kiwi”


It is competitive and familiar and has overtones of parent-child conflict. This same language was appropriated by the colonies themselves to justify their independence from Mother England: “You’re right: we are stronger and healthier and more willing to get our hands dirty, so we’ll have that control of our own government now, thank you.” Canada and Australia in particular used the physical superiority of their young men as indications that the centres of empire should shift to these places where willing hands were stronger at carrying its mission forth. As one former Canadian Governor General said, “It is in climates and countries where the white man may multiply…that we must look for the strongest elements of Empire, and it is only at the Cape of Good Hope, in British North America, and in Australasia that we find these conditions realized.” And so it was that British men became stereotyped as effete weaklings more interested in their cravats than the serious business of governing a plurality of the world’s population.

And we’re still talking about it, a century later.

Hammer Time

In curling, the team that gets to throw the last stone (and has the opportunity to win points) in each end has the “hammer.” At the moment, the imperial hammer lies with the United States. And yet, Olympic jingoism was muted this year in the US, with various news outlets decrying the “step back” from previous triumphs: fewer medals and some surprise podium shut-outs. Much national hand-wringing and poor sportsmanship ensued, perhaps signs of an empire uncertain of its own strength.

A sign of decline? Stay tuned for accusations of China’s uncouth aggression.

Oh wait…

US News Reports of Chinese Aggression

Sparkling Water or Water Lilies? The Comfort vs. Beauty Problem

January 29, 2014

First things first: posthistorical is back! I am very excited to be blogging again. The world seems much the same: Obama, Harper, and Merkel have won more elections; politicians everywhere squabble over ridiculously trivial things and generally accomplish nothing; we collectively still spend way too much time on Facebook. And yet much has changed: this blogger now lives in the Golden State instead of the True North Strong and Free, and with a government-enforced sabbatical now has a lot fewer excuses not to post frequently.

It’s also 4 years (ish) since I started posting on this blog, and that means the exciting quadrennial spectacle of nationalism that got many of my juices flowing last time (otherwise known as the Winter Olympics) will soon be upon us. Once more, in Russian! More to come.

But first!

A dichotomy for the ages

One of the things that started me on blogging again was a rush of ideas I encountered while re-reading Aldous Huxley’s Brave New World for a book club meeting. I will likely tease out a number of themes and their repercussions in the modern world in future posts, but the one that resonated most strongly with me was the dichotomy that one of the book’s main characters presents between truth & beauty and comfort & happiness. To have beauty and truth, he reasons, one needs to endure political and emotional instability, heartbreak, misery and terror – basically, opportunities to show mercy, wisdom and courage in the face of overwhelmingly bad odds. Happy and comfortable people have no need to rise above their situations in such a manner.

But who would choose discomfort and misery, given the choice?

The general trend of world history has been toward comfort, both in a material way and in the sense of social stability. If the nineteenth century was the century of engineering and industry, the twentieth century was the century of comfort. It was the century of spandex, widespread air conditioning and La-Z-Boy. More people than ever before were lifted out of poverty, and industrialization led to middle-class (or relative middle-class) comfort worldwide.

The number of people who choose sneakers over high heels or jeans and t-shirts over Little Lord Fauntleroy suits seems to back up comfort’s victory over beauty. And from the range of falsehoods – from “spin” to blatant lies – evident in government, advertising and many other areas, truth doesn’t seem to do very well either.

Have we already made the choice? And if so, is this progress?

The truth/beauty vs. happiness/comfort dichotomy mirrors the idea of moral vs. technological progress. Some thinkers, such as John Gray, whose anti-humanist work Straw Dogs I’ve written about before, believe that technological progress is in theory limitless, but that our moral progress as humans is essentially stalled. Nuclear technology, to use an example he gave, while a huge technological boon that can supply power to millions, has simultaneously allowed us to wipe cities off the map, a more efficient killing machine than had ever been known before.

Systematic discrimination

Perhaps truth and beauty – or moral progress, if we can equate the two – have seemingly lost out to comfort and happiness – technological progress – because the large-scale systems that largely control our lives have focused mainly on the latter. Take governments: funding for truth and beauty (whatever that would look like) will almost always come second to funding for hospitals, police, and even infrastructure – that is, the necessary building blocks for a comfortable life. The Brave New World character I mentioned earlier also points out that rule of the people leads to an emphasis on comfort and happiness over truth and beauty – certainly, this is the credo of America, “life, liberty, and the pursuit of happiness,” not, incidentally, the pursuit of truth. Comfort, or at least freedom from harm and repression, was the first priority of the revolutionaries.

I went back to examine some other modern revolutionaries. Re-reading The Communist Manifesto, I discovered that the aims of Communism also begin with comfort before proceeding to truth, even if the ideals contained within the movement are based on so-called universal truths. Guaranteeing subsistence was the first step, through a radical change in property rights, the tax system, etc. followed by universal education (i.e., the pursuit of truth and beauty).

The other large system that governs our lives, free-market capitalism, is also geared toward profits, which can more easily be made from comfort than from beauty. This is why Procter & Gamble, which sells deodorant and diapers, made US$81 billion in 2013, while the New York Times, winner of more Pulitzer Prizes than any other newspaper, struggles each quarter to make a profit. Perhaps this also explains the existence of the phrase “starving artist.”

First things first

There may be a way to see a positive outcome in this. Perhaps it is not so much a dichotomy between truth/beauty and comfort/happiness, as a ladder, or hierarchy, if you will. Perhaps, like ol’ Maslow said, we focus first on satiating the need for food, clean water and safety before striving for self-actualization.

Now, we all know how much I love Maslow (and so does everyone else, apparently, because this is by far my most-read post). But this theory would disagree with Huxley’s characters, who imagine either a comfortable, drugged-out existence devoid of anything so confusing and challenging as truth, or starving artists capitalizing on their misery and discomfort by creating beauty – that is, skipping straight to the top of the hierarchy.

I posit this theory: those who can truly move to the self-actualization stage can only do so because they feel their more basic needs have already been met. This is true even though they live in the same world as those more susceptible to advertising campaigns which introduce needs we never knew we had (for the new iPhone, elective rhinoplasty, or gluten-free dog food, for example). Maybe it’s just that those seeking truth and beauty seem deprived and miserable to those who couldn’t imagine taking their places.

Our need for comfort will stay the same as our definition of comfort changes; perhaps those who can be comfortable enough already, without soma and shiny new things, can have their truth/beauty cake and eat it too – happily.

New Money and How To Buy Things Anonymously

June 16, 2011

The more I read, the more I am determined that privacy/anonymity vs. openness/sharing will be the defining dichotomy of our age. The more web sites start to track pieces of information about what we buy and sell, where we browse, and what we like, the greater the number of calls for regulation and privacy protection. The battle lines between privacy and the power of information have been drawn.

But now there is a way to keep spending private, at least. Bitcoin, a digital currency created by the pseudonymous Satoshi Nakamoto, relies on public-key cryptography to allow its holders to buy and sell anything, anywhere in the world over the Internet, without revealing their real names or having to pay any kind of exchange fees or taxes. (For an interesting and accessible overview of Bitcoin and its implications, see this article in Ars Technica.) Bitcoin has all the advantages of cash – anonymity – but without the hassle of having to physically transport it anywhere. It also has all the advantages of a “trust-based electronic currency,” such as credit cards, in that it allows instant, ubiquitous transactions, but without the need for an identity attached to them.
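To make the pseudonymity concrete, here is a toy sketch in Python of the underlying idea: participants appear on the ledger only as hashes of public keys, so a recorded transaction never needs to contain a real name. (This is a simplification for illustration only – real Bitcoin also uses ECDSA signatures to authorize spending and derives addresses with SHA-256 followed by RIPEMD-160; the function names here are my own inventions, not Bitcoin’s.)

```python
import hashlib

def toy_address(public_key: bytes) -> str:
    # A pseudonymous "address" is just a hash of a public key.
    # (Real Bitcoin hashes with SHA-256 then RIPEMD-160; this toy
    # uses SHA-256 alone, truncated for readability.)
    return hashlib.sha256(public_key).hexdigest()[:40]

def toy_transaction(sender: str, receiver: str, amount: float) -> dict:
    # A transaction references only addresses, never identities,
    # and is itself identified by a hash of its own contents.
    payload = f"{sender}->{receiver}:{amount}".encode()
    return {
        "from": sender,
        "to": receiver,
        "amount": amount,
        "txid": hashlib.sha256(payload).hexdigest(),
    }

alice = toy_address(b"alice's public key")
bob = toy_address(b"bob's public key")
tx = toy_transaction(alice, bob, 0.5)
print(tx["txid"])  # the ledger records hashes, not names
```

The point of the sketch is simply that every field in the recorded transaction is derived by hashing: anyone can check that the ledger is internally consistent without ever learning who “Alice” or “Bob” actually are.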

Bitcoin has consequently been embraced by Anonymous, an anarchic online community that first came to mass public attention when it disrupted the sites of PayPal, MasterCard, Visa and others in response to perceived censoring of WikiLeaks last year. It is disrupting them again with Bitcoin, but this time more indirectly.

Normally, when new currencies appear on the scene, they have a hard time with what is termed “adoption and valuation,” that is, getting people to use them, and determining what they are worth compared with other currencies. New currencies are usually the prerogative of federal governments, or supranational ones (as in the case of the Euro), which automatically gives them a head start because citizens need to pay taxes in the new currency and generally use it to make purchases. Even then, as this history of the Euro points out, there are remarkably complex logistical and emotional hurdles to overcome, from swapping the money found in ATMs to choosing the images and words for the notes that so many people identify with to establishing the value of the new currency against other existing ones.

It is very rare for new currencies to spring up without national backing, and perhaps Bitcoin has only been able to gain the market’s attention and adoption because it is digital, and thus doesn’t have the physical/logistical barriers to overcome. But why are people using it? Just like a new national currency, Bitcoin has appeared and boldly declared that it stands for a new order, in a sense. Its users can now engage in economic activity outside of the sphere of government control, or the control of multinational credit corporations, in total privacy.

As an article on BigThink puts it, “You don’t need a banking or trading account to buy and trade Bitcoins – all you need is a laptop. They’re like bearer bonds combined with the uber-privacy of a Swiss bank account, mixed together with a hacker secret sauce that stores them as 1’s and 0’s on your computer.” Bitcoin represents the complete disengagement of the buyer from the seller, the furthest distance yet discovered from bartering or exchanging one good for another. Purchases now require approval from no one.

Is this radical new territory, or a return to what currency is intended to be? As a means of exchange, currency technically need not have an identity attached to it. It stands as a measure of commensurability; buyers and sellers can rely on the value of the currency as a standard without having to ascertain the value of goods being exchanged every time they buy or sell. And it was only very recently in the trajectory of human history that currency was created with no direct link to an existing good like gold (this is called fiat currency), relying instead on the backing of a national government with whose laws and regulations the buying and selling parties tacitly agree to comply. New virtual currencies like Bitcoin are similar to all modern government currencies in that their value is not intrinsic but imposed by decree (and perceived rarity, and a bunch of other factors). But they lack the oversight of institutions and regulators that comes with a national means of exchange.

Whether Bitcoins will remain as seemingly anonymous and valuable as they have recently become is questionable. This week, the Bitcoin plot thickened with an apparent heist in which approximately $500,000 worth of Bitcoins was stolen from one veteran user. The theft pointed to the limits of exchange without third-party oversight, whether in the form of a government or a corporation to monitor fraud and prosecute offenders. Is the anonymity of exchange worth the risk?

It seems as though this has come down to the same “privacy vs. security” debate that has dominated public discourse since the rise of the Internet (and, of course, September 11). In all likelihood, some third-party institutions will step in to regulate Bitcoin trading with limited liability and criminal activity investigations, as the above-linked article details. But these would decrease the anonymity of the users of the currency, in some ways negating the whole point. Perhaps the main take-away of Bitcoin is that anonymity, in today’s world, has its trade-offs too, and can never be an absolute good.

What Canada Desperately Needs: Visionary Leadership

April 20, 2011

Many people have been calling Canadians parochial throughout this election. Apparently we’re not comfortable with our leaders having opinions about politics outside our own country (and casting votes to back them up). We are apparently less involved internationally than ever before, especially in leadership roles. As a country, Canada is “retreating in on itself, clinging to the security of its own cultural stereotypes.”

Quite frankly, I think the kind of parochialism described above is but an aspiration at this point. I would love to see nation-wide parochialism. Instead, we have something closer to the real, historical definition of the word: looking no further than one’s own church parish. The campaign has showcased several variations of such limited and narrow outlooks, and the dialogue has largely been confined to pet causes, special interests, and the concerns of small minorities.

The real tragedy of this election is not that we will have spent several hundred million dollars to get to about the same place, give or take a few seats. It is that we – led by our fearful leaders – have failed to take the opportunity to engage in dialogue about the path Canada is on and, more importantly, what that path should be. This election has mostly been fought over the past: disrespect for Parliament, carpetbaggery, where money was and wasn’t spent, what was and wasn’t allowed to happen, and generally the same tired policies and pot shots we’ve heard for years.

Thus far, there has been a woeful lack of debate about the real issues that will shape the future, such as youth unemployment and skill development, education, and the role of urban areas. Nobody has yet talked about a solution to the looming crisis in pensions. The critical and contentious issue of technology scored nary a mention at the debates. Overall, there is a chronic lack of an overriding, national vision.

This is why I cringe every time I hear someone talk about how Gilles Duceppe would be the best person to elect. “He’s just sooo charismatic, and such a great speaker.” Indeed. (Especially en français in comparison to the other party leaders whose first language is English, n’est-ce pas?) Let’s not forget that he is running on a platform that, 150 years ago, would likely have been considered treasonous, and continues to act as a catalytic force for ill in Canadian politics.

It is very easy for Gilles Duceppe and his Bloc Québécois colleagues to say whatever is most appealing to Canadians because 1) they know they will never have enough power to actually act on any of their promises; 2) they know they will never have to find any money for their schemes; and 3) since they are at heart a regional party, they need not come up with any coherent vision. They can borrow from the left and the right with no regard for the practicality of their position. As Tasha Kheiriddin wrote recently in the National Post:

For federalists, the Bloc continues to represent an immovable force, not only an obstacle to a majority government, but a siphon for political talent and resources which would otherwise be deployed in the other parties, most notably the Tories and the NDP.

Instead of allowing federal politics to develop on a left-right continuum, as in the Rest of Canada, the Bloc continues to perpetuate the federalist-separatist dichotomy, and run an effective extortion scheme to boot.

Basically, the Bloc constitutes a wedge between voters in Quebec and national policies enacted by widely-supported national parties.

I don’t mean to vilify the Bloc above all others, as there are several parties at fault here. I have heard the Green Party criticized for similar reasons, namely being a single-issue party. I can certainly see the merits of that argument, given that the Green party’s platform is neither particularly left- nor right-wing, but mixes and matches policies to suit its “Green” foundation. (It also siphons votes and resources away from other parties, ones that could perhaps be more usefully employed formulating policies within mainstream parties that have a hope of being elected in numbers.)

I personally disagree and think the Green Party is coherent in its vision of offering policies undergirded by a focus on sustainability, in the same way the Tories offer policies broadly based on the principles of personal accountability and small government, and Liberals’ policies are broadly based on the idea of equality of opportunity and greater state involvement. What differentiates the national parties from the Bloc is that their policies (for the most part) allow Canada to work together without demanding rights and special privileges for some and not all.

To be clear, I don’t believe that parties should stick strictly to where their political forebears have trod. But political parties are important because they organize political thought and allow voters to make decisions based on what they imagine will be consistent ideologies. No election campaign can cover every possible scenario, so we want those we elect to act along predictable lines when something unexpected occurs. Those who elected George W. Bush in 2000 should not have been surprised that he reacted to the September 11 attacks as a conservative Republican would; this was the blueprint he ran on. With some exceptions, right-wing American politicians have often shown less regard for multilateral institutions like the UN than their left-wing counterparts. It is part of their ideology.

The American comparison is useful because it also shows us what a visionary candidate for a nation’s leader looks like. Vision is a mandatory quality for American presidents. They need to be able to energize vast numbers of voters into believing in their vision of the future. George W. Bush had a vision, that of “compassionate conservatism.” Obama certainly had a vision – of hope, change, empowerment of communities and international bodies, and support for social programs. Some might argue that he was awarded the Nobel Peace Prize for enacting his vision of an America in partnership with other nations around the globe so soon after taking office.

The last Canadian PM to win a Nobel Peace Prize was Lester Pearson. During his minority government, he implemented what are now seen as the signature Canadian social programs and icons, including universal healthcare, the CPP, and our current national flag.

Do any of our current potential national leaders have that kind of vision? Please, someone, convince me – my vote is up for grabs.

ANARCHY! (or, Why the State Does Too Much And Yet Not Enough)

April 15, 2011

Nations are ever-present structures in our lives. Their appearance as political entities and actors around the nineteenth century marked an inflection point in how we think about groups of individuals and how power is aggregated. I have outlined before why nations are perennially important, and the enduring popularity of my post on A Hierarchy of National Needs among search engine indices attests to the currency nationalism has in the public sphere.

But I’ve been reading some fascinating criticisms of nationalism (in the form of national structures and governments) lately, which attack nations as ineffective from an ideological standpoint. The attacks come both from below – the position in which I would classify libertarian thinking, in that it desires less state control over aspects of individuals’ lives – and above – those who believe more multi-national or supra-national organizations are necessary to create global solutions to global problems.

(At this point I’ll remark that the root of both of the preceding words is “nation.” If people are going to start thinking differently about the base controlling structures in our lives, they must stop referring to them as, essentially, collections of nations, or at the level above nations.)

Criticisms from Below

A feature on the Pileus political science blog recently discussed Habermas and one of his critics, Hayek, on the issue of public discourse in the nation state. For those unfamiliar with recent political philosophy, Habermas stands as a giant in the area, perhaps most famous for his ideas about the creation of a culture of public interaction within societies in the early eighteenth century, which he termed the “public sphere.” He asserted that humans have the ability to make society more equal and just through rational communication (I can’t help but imagine that he would love erudite and insightful blogs with intelligent commentators for this very reason).

Hayek’s criticism, expanded on Pileus by Mark Pennington (who has just written a book on the subject), is that discussion in a free and protected public sphere is not enough. The actions of individuals can in many cases be better proof than simply the airing of ideas or theories, particularly if such actions are brought about by acting in opposition to the prevailing belief system of the times. As Pennington writes:

The spread of knowledge in markets, the arts and science does not typically proceed via collective deliberation, but advances best when individuals and groups have a ‘private sphere’ that secures the freedom to experiment with projects that do not conform to majority opinions. Then, incrementally, through a process of emulation the prevailing wisdom may change over time. It is not sufficient for people to be able to talk about their ideas. Rather, they must have scope to act on those ideas – and this requires ‘property rights’, not ‘speech rights’.

In a way, this argument can be summed up as advocating free market principles across the board, the ability of people to “vote with their feet” and come to agreement with popular action as well as discussion. Pennington also notes that the independently wealthy have an important role to play as “trail-blazers for new values and ideas.” I think of such individuals as venture capitalists for ideas.

In essence, I see this argument as, broadly, a repeat of one of the key arguments against socialism/communism, which is that it restricts choices and enforces conformity. Pileus clearly has free market, libertarian leanings, and seeks to decrease the role of the state in favour of individual accountability (much like David Cameron does). It’s fascinating to read a defence of this viewpoint from the perspective of the public sphere/political economy.

Shakespeare at the Supranational Level

And yet, on the other end of the spectrum, I see criticisms that the state is not powerful enough. In a discussion from Big Think’s series this month on Shakespeare’s continuing relevance, Kenji Yoshino notes the relationship between the plot of Titus Andronicus and modern statecraft:

Titus is about what happens when the revenge cycles spin out of control. Revenge tragedies represent something that happens when the state is very weak and so the Elizabethans had a very weak state where there wasn’t a standing army.  There wasn’t an effective police force and so when something happened that was horrible like someone kills a member of your family, you had to choose whether to rely on a very weak state that was basically going to do nothing or to take justice into your own hands. It strikes me that we’re at the international level where the Elizabethans were at the national level because we’re stepping onto an international theater in the way that you described and there isn’t a centralized authority that’s going to step in and quash the revenge cycle. So if terrorists fly planes into our buildings what are we going to do, go hat in hand to the UN?  No, we’re not going to do that.  We’re going to engage in vigilante justice, right, but we all know how those stories end.

What a fantastic concept – nations as vigilante justice-inflicting thugs. It speaks eloquently of the toothless nature of the supranational power structures in our world today. One excellent comment on a post I wrote about nations emphasizes the often unworkable nature of supranational bodies: in essence, they are made up of groups of nations that have wildly varying amounts of power, wealth, and desire to change the existing global power structures. The vested interests have no reason to want to give up their advantage – and why would they? The comparison to an early modern land with no police force is apt. Why would a powerful, wealthy family used to settling its own scores want to give up that privilege and pay into a communal system of policing that would essentially render them equal under a higher law to those who would seek to do them ill?

Moreover, supranational institutions are much more difficult to hold accountable for their actions, because who will hold them to account? Democracy functions well (usually) at the national level because there is always the threat of voters punishing the incumbents for abuse of power or poor decision-making. Such is not the case at the supranational level, where even democratic bodies (such as the EU parliament) are subject to the whims of national leaders and their fears for re-election, which always take precedence.

Considering that empathy is a theme I’ve been working with lately, I can paint this as a picture of its limits, too: while there are certainly many characteristics and loyalties and ideas that are shared among all humans, most of us can only extend our feelings of “sameness” and empathy so far. Those outside of our national community (or, also quite commonly, race/ethnicity, which often amounts to the same thing as a national community) are easier to ignore because they are not like us. They don’t share our histories, or national institutions, or language, so they are harder to comprehend.

In the longer term, I believe nations will fizzle out, gradually ceasing to hold the importance they do now. With pressure – both practical and ideological – from above and below, power cannot continue to function effectively at the national level forever. The anarchy in the title, however, refers more to the absence of a coherent and consistent “publicly recognized government or enforced political authority,” as Wikipedia puts it, at one level. Perhaps we have a future of shifting loyalties (local, national, supranational) and power brokers vying for our attention as voters and citizens. Sounds like a party.

A Communism of Pain

April 10, 2011

When I was younger I thought often about the idea of a communism of pain.  If all humans were somehow linked to the extent that pain could spread itself out among many, what would be the net effect at the individual level? How much pain – in terms of an impossible-to-quantify objective amount – is out there in the world? Would the extreme suffering of the few spread out to a chronic, if manageable, level of pain for the rest of us? Or would it, distributed amongst the billions of humans on the planet, amount to almost nothing in a single one?

Of course, I understand that pain is a biological imperative, our bodies’ way of telling us that something is wrong and that we should stop whatever we are doing that is causing it. But from a purely sociological (or maybe political) perspective, what would be the result of averaging it out? Perhaps equal distribution wouldn’t be optimal – after all, communism in theory espouses taking from each according to his ability, and giving to each according to his need. Varying pain thresholds might in some way be taken into account. Or perhaps those most in a position to inflict pain could be those who felt it most deeply. (No pain, no gain, as it were.)

Actual sharing of pain through embedded receptors or similar technological enhancements is more in the realm of science fiction or post/transhumanism than reality at present. But empathetic pain-sharing does in fact exist. Recent research has indicated that the same areas of the brain are activated in those observing someone in pain as in the actual sufferer. In both cases, our anterior insular cortex, the area that monitors how we feel about things inside our bodies, and the anterior cingulate cortex, the part of the brain that processes emotions and attention, are engaged. Moreover, the empathetic response is greater the higher the level of affection for, or perceived identification with, the sufferer.

Pain expert Sean Mackey theorizes that pain empathy played a role in mammalian evolution by signalling which members were in distress so a pack could stick together, heal together, and prosper. Noted primatologist Frans de Waal would agree. He studies bonobos, the great apes scientists now believe are as closely related to humans as chimpanzees. He has concluded, after studying bonobos extensively, that empathy is a much more basic instinct than many consider it to be, and much less intellectual. Instead of a fairness rationalization, or a sense that one can imagine himself in another’s position, he believes that empathy is much deeper, and less complex. His theory explains why infants show empathetic responses to fellow children crying, but only learn theory of mind, or the more intellectual basis for understanding others, around age four. Incidentally, a physical basis for empathy also explains the contagious nature of yawning, as he has explored in other research.

Communist bonobo

A communist bonobo (picture slightly adapted) - does he feel our pain?

Bonobos are also noted for their very sexy way of solving all kinds of problems, and for generally displaying much more cooperative and less competitive behaviour than that of chimpanzees. This is significant because the narrative of competition has coloured much of the modern period’s image of itself, and its image of the way early humans lived – nasty, brutish, and short, as Hobbes once wrote. De Waal locates the competitiveness myth around the time of the Industrial Revolution, as a necessary backbone for the proto-capitalist system that was then forming, and which has now come to dominate global economics and politics.

The political bent of the concept might be significant. A growing number of studies has pointed to those on the more liberal left end of the political spectrum being more open-minded and thus more empathetic than their more conservative counterparts. Tolerance, inclusiveness, and a passion for social justice have recently been linked with both political liberalism and high levels of empathy. (One might ask if this implies that communism is a political representation of empathy, which could set off hours of debate, I’m sure.)

Given the general trend toward a more liberal way of thinking and behaving over the past hundred or so years, and the ever-expanding list of encounters with “others” that telecommunications, air travel, and globalization has allowed us, is it possible that humans are in fact more empathetic today than they were, say, when Victoria ruled England? Or when Arthur did? Would the apparent recent setback of declining empathy and rising conservatism then be a blip, or a reversal?

And if we are more empathetic now, does that mean we inflict less pain on others than in the past?  Sadly, I believe conflicts arising out of urbanization, a skyrocketing global population, and scarce resources – coupled with the arrival every year of new ways to maim and torture others – would signal otherwise. After all, it appears that humans also share enjoyment of schadenfreude, the pleasure in seeing others’ misfortune (apparently as much as a good meal). Similar to the way being in a group can magnify feelings of competitiveness, it can also augment satisfaction in seeing rivals fail. This enjoyment also carries a political twist: in one study, Democrats were found to be secretly happy when reading about the recession, thinking it might benefit the party at the next election. And the stronger the political identification, the stronger the sense of schadenfreude.

It seems, then, that we are hardwired both for empathy towards those in pain, and a delicious satisfaction with seeing it. Perhaps a communism of pain would therefore make us more sensitive to the suffering of others, but all the more likely to enjoy it.

(Note: Almost all of the articles linked to in this post were fascinating to read; I’d highly recommend perusing the ones on primates and schadenfreude in particular.)

Revisiting Posthumanism: Technology and Empathetic Fallacies

April 8, 2011

Empathy is a critical component of human interactions, and has been essential to our evolution as a social species. It lies at the root of our dominance over other species that do not share the collaboration mindset. Effective social interactions and behaviour modelling create group cohesion and action. And as the world becomes ever more urban and crowded, empathy is more important than ever. There is among some scientists a palpable fear that modern technology decreases empathy, lessening our intuitive social skills. But the potential for technology to actually increase empathetic feelings is immense — so can the use of technologies therefore make us more human?

An article in the New York Times this week queried the effect of facebook on relationships: does using facebook make people less inclined to invest in face-to-face contact? It may be too soon to tell, but a recent study has indicated that technology is still just the medium. Those inclined toward fulfilling relationships will use facebook as a tool to expand and deepen them. Those inclined to withdraw from society will use facebook to withdraw still further.

One insightful commenter disagreed, noting that his own studies have found that, among college-age students, empathy has been declining over the past 30 years, and markedly so over the last 10. His findings jibed with a recent article in Scientific American on the same subject. The implication, of course, is that all that time at the keyboard, along with the general trend toward social isolation, reading less fiction for pleasure, and an uptick in the number of youth who describe themselves as conservative, has re-wired our brains in such a way that we can no longer relate as well to each other. Moreover, technology makes it easier for people to be exposed to only what they want to be exposed to, and only world views that align with their own – incomprehensible amounts of such one-sided content, in fact. Limiting exposure to those who think the same way is a choice increasingly made by those who can afford to do so.

But I can’t help but wonder if technology is, again, just the medium through which all of this plays out. Those who don’t want to encounter anyone who votes for a different political party or has a lesser socio-economic status and who consequently cloister themselves in a one-note Internet news digest, for example, are the same people who will live in a gated community in the real world, lessening empathy and social cohesion in that way.

And technology can also help empathy expand and grow in the real world. A TED talk I watched recently traced a historical extension of empathy from individual to blood relatives, clans, nations, and even other species. The key is exposure, understanding, and a feeling of shared goals. Without the Internet, there would be a lot less exposure to and understanding of different cultures. Would the “jasmine revolutions” have spread so quickly without knowledge sharing between underemployed 20-somethings with Twitter accounts? Thomas Friedman thinks not, and also credits other technologically-reliant factors with helping to spur them on. Among these are widespread reporting of what corrupt officials were up to through Al Jazeera, the ability to see vast swaths of underutilized government-owned farmland via Google Earth, and an image of China on the rise from the Beijing Olympics.

As far as shared goals, technological interactions are in some places considered to set the standard for cooperation and teamwork. A recent Economist article argued that playing World of Warcraft or similar team-oriented role-playing games can increase engagement and skill-building, leading to greater success in the workplace. (Hey, it worked for the CIO of Starbucks.)

The narrative of the game may be key here. Writing in the Journal of Evolution & Technology, PJ Manney locates storytelling at the centre of empathy. Stories are compelling ways of showing how humans share the same desires, values, hopes, dreams, and fears, she says, and technology has always been important in diffusing stories between different cultures. As the evolution of technology has taken us from the printing press and the novel to instantaneous news and the explosion of opinions in blogs, storytelling has become more immediate, more prolific, and more visual. And, returning to the theme of post-humanism (or the near-synonym transhumanism, or “H+”, which Manney refers to), the future human that has made use of sensory technologies to the point of incorporating them into his or her make-up can be even more directly connected to others by literally experiencing the world as they do. Manney refers to this sensory augmentation as “a more effective connection with others, through a merging of thought or telepathic link or internalized instant messaging.” This is WoW with human-human interactions, instead of human-character role-playing.

Posthumanism/transhumanism is feared, as I wrote about in an earlier post, because some believe technological “enhancements” would create inherent inequalities among humans. Yet it is possible that technology could incite a great equalization of feeling and experience — and empathy.  In effect such changes would therefore make us better able to relate to each other, and in the end more human, not less so.

Knowledge and Power in a Skeptical, Connected World

March 18, 2011

Who do we listen to, and why? In an age when we can find information on anything quickly, what does it take to be a voice that rises above many others? What kind of power does this represent?

I read in the latest edition of the Harvard Business Review that in 2011 companies are anticipating an increased focus not just on broadly saturating target markets with facebook ads and silly “viral” videos, but on targeting “influencers” as part of their “social media” strategies. These individuals are those who shape culture and get other people on board with new trends and ways of thinking. Oprah is an influencer. Radiohead are influencers. Steve Jobs is an influencer. And a lot of random bloggers, tweeters, and other social media characters whom you’ve never heard of are influencers, and they are going to be targets of corporations because they are both cheaper and perceived (perhaps) as more authentic shills than their more famous counterparts.

You can be sure that by the time something gets elevated to the level of an HBR trend to watch, it has already set the Internet abuzz. Further research on “measuring influence” yielded far more twenty-first-century social media examples than any others. It seems that organizations have (finally!) learned that a “social media strategy” on its own is of little benefit without real, grassroots endorsement. However, I’m more interested in what “influence” looked like in the past, before it morphed into a social media concept to be made into the next corporate buzzword, and what characteristics have stayed with perceived “influencers” since.

It seems it is a tricky thing to quantify, or even define. An article I discovered about the role of influence in economic history discusses how it is closely related to communication, but can range from impression to force in the amount of strength it implies. The other critical factors in determining long-term influence were time and space. The example given was Saint Thomas Aquinas, whose ideas were central to much medieval thought (throughout the Latin-speaking world, at least), but are relatively inconsequential today.

Influence and Power – and Money

Influence, as the article points out, is closely related to power. One of the concepts that has stayed with me since learning it in an Organizational Behaviour class years ago is the distinction between the kinds of power wielded by individuals. They can have positional power, stemming from one’s role as, say, a manager or a parent or some other official and likely formalized figure of authority; or they can have personal power, stemming from an individual’s character or beliefs, and likely more informal in nature. The difference between them parallels that of practical/mental authority vs. emotional authority, and the general consensus is that emotional authority goes much further in influencing others because it does not rely on a (potentially temporary) and wholly external power differential the way practical authority does.

When I consider what influence looked like in the past, it seems there was little distinction between the two types of power mentioned above. Perhaps the theory I just articulated is a fallout from our comparatively recent fixation on merit over birth status as a rationale for power. Indeed, the ideas (and names associated with them) that have survived best throughout history to influence many others have always been backed by great financial power. Take religion, for example, which has been perpetuated by wealthy organizations that held positional power in their communities. The familiar expression about history having been written by the victors speaks to the tendency of dominant individuals, families or states to justify their authority with historical precedent. And most of the theories in every field that are still with us today were dreamed up by men with solid financial backing and the ability to spend large amounts of time reading and philosophizing. (Even Marx lived off the generosity of his bourgeois co-author, after all.)

But today that is changing — to an extent. YouTube, twitter and other media that celebrate memes and all things viral can make ordinary people famous astonishingly quickly. Such fame is often fleeting and of dubious value to society, but savvier types can sometimes parlay their sudden name recognition into the more lasting sort of influence (Justin Bieber, anyone?). This can happen because influence is magnetic and self-perpetuating. Mommy bloggers who are already widely read and respected are natural candidates to push brand-name diaper bags or whatever else new mothers supposedly need and want. That corporations want to latch onto such people is hardly surprising – they are merging their corporate power with bloggers’ influence in new markets, and the bloggers want to in turn increase their own profile through association (or maybe just get free products).

Self-perpetuating influence applies to companies as well. The new techie term for this concept is “network effects” – as the Economist defined it recently, “the more users [services like facebook, eBay, etc.] have, the more valuable they become, thus attracting even more users.” Whereas in the past money and power begat more of the same, today we can add hits and click-throughs to the mix.

Knowledge Brokering from Darwin to Wikipedia

The common link between these people and corporations is the way they treat knowledge. They are what the corporate world now refers to as “knowledge brokers,” a title that refers to the ability to clarify and share information with different audiences or spheres, and determine what the common elements are between, say, Paul Revere, corporate marketing, and the AIDS epidemic. Knowledge brokering (and a bit of luck) is what separates widely-read bloggers from those who write solely for themselves (whether they want to or not). It is the ability to write things that people find interesting and useful. The CIA is investing heavily in such people after a series of incidents that demonstrated how segregated and impotent its different bodies of knowledge were.

Knowledge brokering is more than simply aggregating (though smart aggregators of information are helpful too). It is the ability to analyze and draw connections while becoming a trusted conduit of information. Knowledge brokers are perhaps an antidote to the pervasive and growing tendency to overspecialize, because they connect many specialists and their ideas with a broad audience. They are the reason we know about Darwin’s ideas. Or Jesus. Or celebrities’ latest faux-pas. Wikipedia is one giant knowledge broker that has an army of largely volunteer knowledge brokers in their own right mobilized on its behalf. That is power.

But what makes us listen to them? I suspect the key is authenticity. A lingering distaste and a keen sense for corporate marketing disguised as something else define our era. Perhaps the main difference between influencers from the past and those of today lies in the type of power they wield, as I outlined above. Personal power – like that wielded by bloggers and Oprah – is seen as more trustworthy because it lacks an agenda (whether or not this is true). Positional power is usually distrusted simply because of what it is. We only listen to Steve Jobs because we truly believe he has our best interests – in being cool and technologically savvy, regardless of the product – at heart. In contrast, many Americans discount everything Obama says because they believe he merely wants to increase his own power and impose his secret socialist agenda on an unwilling populace.

Is this a reflection of our philosophical allegiance to free-market democracy? Are influence and power of all kinds just the ability to get people to like and trust you? If so, many corporations are going to need a lot more than “influencers” on their side.

Food for thought: How do those with positional power gain credibility? Is this knee-jerk anti-authoritarian mindset in society as prevalent as I say it is? Do people who seek to perpetuate their influence by getting behind corporations somehow weaken their own authority (i.e. do they lose their ‘cred’)? Hm.

MARGINALIA: Though I did not explicitly link to it in this post, the Economist’s Intelligent Life ran a fascinating piece recently on The Philosophical Breakfast Club, a group of four Victorian scientists who were definitely knowledge brokers (and nifty polymaths) and who were key influencers in their time. I’d recommend reading it.

It Takes a Village: Why Not Outsource Childcare?

March 14, 2011

The 100th Anniversary of International Women’s Day last week got me thinking about how glad I am not to be Betty Draper. Yet despite our advances, the promise of happier people – which of course includes happier families – has not been borne out. The feminist movement has made great strides toward equality, but often at the expense of children, many of whom now grow up in an environment with no parents at home. We could debate at length why so many families feel the need to have two working parents (is it that corporations no longer pay a “family wage”? or have standards changed and now families believe they need more things, bigger houses, etc.?), but it would not alter the fact that most families have not substituted a father working all the time – and a mother at home – with two parents alternating working half the time. Throw in a divorce rate hovering around 50% in the Western world, and single parents who have no choice but to work long hours, and the result is millions of children with almost no parental direction for much of the time, let alone quality time with two parents.

One of the enduring themes of this blog is the increasing over-specialization of work, study, and entertainment, but I have yet to touch on the arena of parenthood. So allow me to play Jonathan Swift for a moment with my own modest proposal: outsourcing childcare to those who can do it efficiently and – most important – effectively.

Why not outsource parenting? We seem to have made most of the rest of our lives as efficient as possible. Instead of each of us owning farms that grow all our own food, we have created supermarkets and other supercentres that not only sell food, but everything from pharmaceuticals to car tires. Millions of office drones sit in cubicles doing the white-collar equivalent of screwing a bolt into a chassis over and over for eight or more hours a day, the epitome of over-specialized corporate work.

And childcare itself has changed from the days of one parent teaching her young how to get on in life. Public schools were established 1,000 years ago to teach Latin to poor children who could not afford private tutors. Today it is a legal requirement in most countries that children spend their weekdays in classrooms full of other children. (And most do: the latest statistics for homeschooled children that I could find put the number at only about 3% in the United States.) We have already outsourced the majority of education to professional teachers, from the fundamentals of literacy and numeracy to advanced calculus and classic literature.

At an even more basic level, many working parents outsource childcare to day cares, nannies or relatives. Crèches, the forerunner of modern day care, were established in France in the 1840s near factories so working women could drop their children off there during the day. Today they are everywhere. As the percentage of working women (in Canada) aged 25-54 rose from around 50% in the 1970s to over 80% today, there was an accompanying rise in the number of children in non-parental care.  In 2002, 54% of Canadian parents had someone else look after their children during the day, up from 42% in the mid-nineties. In the U.S., almost two-thirds of pre-schoolers are in non-parental child care.

So outsourcing our parenting – if I can be forgiven for using such a cold, economic term – is certainly palatable to the majority of parents, at least some of the time. And there is most definitely a broader need for it, though less quantifiable. I needn’t go into the many social ills connected with a lack of parental influence, attention, or role-modelling during childhood, as these are well known.

There are many bad parents out there, but while we are quick to want to get rid of other minders who are ineffective, like teachers or nannies, social and biological conventions dictate that it is a lengthy and difficult process to “fire” parents. Leaving children exposed on mountaintops or in the care of a nunnery (in which something like 80% of the unfortunates dropped off died anyway) has gone out of fashion in developed countries, except in certain safe havens like Nebraska, so instead they remain with bad parents, or in foster care, which for most is not the optimal solution. Even parents who love their children can make bad child-rearing decisions with the best of intentions.

But what if the default option for raising children, like public schooling, was communal (or private) care by qualified parent-like figures? The right to “home parenting” (like home schooling) could be awarded only to those who are qualified to practice it, with regular supervision by a central body. Consider: specialist “parents” rearing children in groups is hardly a radical idea. The old African proverb about a child needing more than one knee, or the much more famous one that serves as the title of this post, indicates that our modern way of raising children is little more than a hiccough in the trajectory of human history.

Most parents raise only a few children, but almost all say that it gets easier the more they have, as they build experience and knowledge. Specialized parent substitutes would have the benefit of raising perhaps tens of children, and, what’s more, they would love it, because it would be their career of choice. Children would also have the benefit of a diversity of tried-and-true, centrally vetted and approved child care methods, culled from what has been proven to work well internationally and throughout history — call it a “best practice” approach to parenting. Just think of what costs could be reduced or eliminated in a society with a higher proportion of well-adjusted children – everything from healthcare (therapy and counselling) to policing and incarceration costs.

Clearly, this is not likely to happen anytime soon, and I no doubt open myself up to charges of everything from heartless communism to wanting to run state finances into the ground by proposing elaborate centralized childcare schemes such as these. But consider: we wouldn’t trust spinal surgery to someone who has never done it before and who would spend half the time we’re in the operating theatre off in corporate meetings somewhere else or on his Blackberry. We wouldn’t want an unqualified engineer building a bridge we have to drive over, especially on almost zero sleep while laying the foundations.  Yet we allow complete amateurs to raise their own children armed with little more than evolved instinct and maybe a copy of Dr. Spock. Does that really make more sense?

Some Loose Thoughts on Americans and Trains

March 9, 2011

Apparently there is a movie version of Atlas Shrugged coming out soon, and while I have neither seen it nor read the book (something I plan to remedy within the next month or so), I have read a few of the many criticisms and laments out there about the book and philosophies contained within it. These come from all sides of the political spectrum, but one of the more interesting ones to me concerns the role of infrastructure and the changing nature of support among conservatives and libertarians for large-scale rail projects on American soil. While Ayn Rand’s magnum opus features libertarian railroad moguls who plough vast sums into railroad development, railroads today are pariahs of American transportation infrastructure, nowhere more so than on the political right.

David Weigel on Slate summarizes the opposition to high-speed rail (and rail in general) from the American right mainly as opposition to state subsidies. There is a widespread belief that money pours from government coffers into railroads – at a cost to the taxpayer of between 13 and 30 cents per passenger, compared with subsidies of between 1 and 4 cents for highways and other roads. Whether these numbers are accurate is not the point of this post; the mere perception that they are true is (as with most subjects in American politics these days) enough to colour the popular and official debate substantially. I’ve heard others comment that rail travel is seen as a form of communism.

The irony of that idea, of course, is that railroad owners were among the first übercapitalists of American business, sucking profits from their trade with an almost monopolistic hold on the industry. Names like Vanderbilt and J.P. Morgan are known to us now because many of these obscenely wealthy railroad barons wanted their legacies to live on in the form of grand houses, universities, and other large-scale public charitable works.

I’ve written before about how cars in the early days of automobile travel were seen as a “less technological” option than railroads, more rugged and democratic and, well, American. Travelling by car in those days was both challenging (tires exploding or parts falling off every few miles) and exhilarating (unprecedented access to tourist sites that railroads just didn’t go to). The ideal of the open road, and by extension the “open West,” has echoed down through the annals of American history from beat poets to “Boys of Summer,” and was undergirded by the Eisenhower administration’s creation of the extensive Interstate Freeway System in the 1950s.

But I never picked up on the “communism” angle, in part because that wasn’t a concern or a term bandied about frequently in American political discourse until the second decade of the twentieth century at least. Today, of course, high-speed rail and trains in general aren’t seen as feats of American engineering and technical prowess, but symbols of European- and Chinese-style communism.

Attitudes have changed: both railroads and cars have largely lost their breathless romantic and innovative associations and have become part of the humdrum reality of everyday transportation. Many people view their cars more as prisons (especially when stuck in rush-hour traffic) than gateways to the wonders of nature. And while European-ness today still has some cachet if it involves sitting in a café in Paris on vacation, Americans are confident enough in their own government that they certainly don’t aspire to managing their infrastructure like the Europeans.

The last paragraph of Weigel’s article clearly illustrates the link between railroads, communism and other un-American ideas:

Before and after 9/11, George Will was talking up rail as a way to take more people off planes and make America less vulnerable to terrorists. That argument has more or less vanished. Why? “It helped that somebody bombed a train in Spain,” says O’Toole. “If you concentrate people in one vehicle, then the vehicle is vulnerable. You concentrate society, and it’s vulnerable. So maybe it’s not a good thing to concentrate people.”

Makes sense. People concentrated together in one vehicle are vulnerable to attack without the ability to pick up and go whenever and wherever they want to, as in cars. Similarly, people who have a shared and singular collective mindset are vulnerable without the influence of democratic choices. Looks a lot like communism, right? So perhaps we shouldn’t be surprised the next time a state governor turns down a billion-dollar high-speed rail line subsidized by the federal government. He’s probably imagining that it’s the last stop on the Lenin line before Revolution Station…