What is History Writing Now?

April 27, 2010

People reach post historical all the time by searching for odd little historical, philosophical and political science-related phrases. Given the obscure nature of many of these terms to those not deep within postcolonial or imperial studies, I assume they’re doing research for some paper or project. I wonder if they feel they can trust what they read. Am I a reliable source? Are my ideas sound? Can one cite a blog, or is this an even bigger research no-no than citing Wikipedia?

If it is, why? Consider this blogger: I have a graduate history degree from a good school, which, for many, constitutes formal “training” in the discipline. I know how to cite sources and (hopefully) construct a logical and well-supported argument. Does this make me “qualified” to comment on things? Does being qualified today require being intelligent, well-trained, and peer-reviewed (in the traditional sense), or does it come from an even more democratic approval process based on sheer number of readers? Would having six million hits to my blog make me a “qualified” opinion leader? Or do I need to have published six books through a university press that only 6 000 people will ever read in order to be a real “expert”? And is either something to which I should aspire?

These questions have far-reaching implications for me as I go through the process of deciding whether to continue studying history as a career, or to do something else entirely – something more practical that would affect people more directly than a well-researched book in an obscure field and a few impassioned lectures about Lord Curzon and the Raj for a dwindling number of undergraduates who don’t care. Because it’s very important to me that I influence the way people think, not in a creepy mind-control kind of way but by presenting a fresh perspective that makes them reconsider the world around them and how things work within it.

I’m not sure academic writing is the best way to do that: its scope is too narrow, and its audience is made up of those who are already predisposed to thinking from many angles, and who likely know a lot about the subject already. Traditional academic writing is also very dry. It connects with the reader because it is persuasive, offering a sourced argument with little personal point of view. Blogs and new media, in contrast, connect with readers because they cover current events and are often based on personal biases or feelings. They are inherently populist, because the vast majority of bloggers want others to read their blogs, and so they talk about things that appeal to a large audience: fashion, entertainment, celebrities, popular political news, etc. And the vast majority of people who read blogs read about those same topics. But does a large readership make bloggers experts in their fields? And does it translate to “academic” subjects like history?

One of my main goals for post historical is to bridge this gap with a forum that is flexible enough to talk about current events and timeless philosophical questions at the same time, yet with a focus that isn’t so personal or academically specialized as to be unappealing to a broad audience outside of a strict historical discipline. One might call this “accessible” writing, though as I wrote about in my last post, “accessible” can be a bit of a loaded term. What matters most to me is making an impact in a way that is direct and tangible, which is why the thought of another history degree and a life as a struggling academic is slightly off-putting at times. It’s very clear what such a life could do for me: I’d be a recognized expert in my field; I wouldn’t have to get out of bed torturously early every morning to go to another soul-crushing corporate meeting; I’d be able to have great chats over coffee with fellow bright people and give speeches about things like maps; I could help out engaged students by giving them interesting research suggestions; and I would generally get to run around having people think I was a big smartypants. Clearly, these things all sound fantastic. But what would a life like that do for others, even if I did manage to actually get a job out of it (which these days, as my fellow blogger and history professor trivium points out on his excellent blog, almost nobody does)? How would it contribute to my big life goal of being a respected public intellectual who makes people think in exciting new ways?

I don’t mean to criticize academics, who are generally brilliant, insightful, creative people. It’s the system that is at fault, a system that encourages people to go to school for 10 years with absolutely no hope of finding employment of any kind at the end of it, a system that encourages killing trees by publishing books nobody cares about, and a system that has created the popular feeling that it is so removed from the everyday that it serves only to train its own. I fear academia is becoming so specialized that it just doesn’t have the impact, or the scope, or the popular appeal, to be taken seriously. When the people who make the most money and important decisions all have MBAs and law degrees, humanities majors are in for some trouble. Actually, we’re all in trouble because we’re losing out on diversity of thought and experience – big time.

As I’ve written before, I think great writing is all about having a conversation, which necessitates a connection between readers and writers. One of the great things about blogs, and Wikipedia, and other new media is that the connection – and the feedback, via the comments or revisions – is immediate, and the process of forming consensus iterative. This is when history and philosophy are really exciting (and this is why I love to receive comments and feedback from readers, particularly when you disagree or want to point out something I’ve missed). Traditional academic writing just isn’t set up to react quickly enough to changes in events, or popular feeling.

So, to paraphrase the great E.H. Carr, what is history writing now? One would think that it would adapt to the changing relationship between reader and writer, from words sent down from a lofty perch in an ivory tower to those that are picked over in comments forums around the world. It hasn’t. And we’ve all lost something in the process. The Economist ran an article today about how this election (in Britain) is bound to be a landmark one, and yet has inspired no landmark book or philosophy about the popular mood to match it, or to spur discussion, as was the case in 1945, 1964, 1979 and 1997. (I was particularly excited to see that the article cited one of my historian idols, Linda Colley, as having written a significant work from which Tony Blair drew inspiration in 1997.)

Can it be that nobody has written anything groundbreaking in the past five or ten years that bears mention? Or is it that the political audience is too fragmented – or too busy writing their own blog posts – to notice? Is there still a place for the academic as a public intellectual, or has academic writing been pushed to the fringes of literate society by virtue of being irrelevant to everyday concerns? And if academia is on the fringes, who is in the centre?

I suppose we can all take comfort in the fact that the intelligent people who read and write for publications like The Economist still expect something of the kind. There is an intellectual void that will always need filling, by academics or writers or particularly insightful and far-reaching bloggers. The question for the next few years, it seems, is whether those who step up to fill it will have new job titles, and if so, what they will be.


Jargon and Power: Why “Touching Base” Equals Linguistic Imperialism

April 26, 2010

I’ve always thought that jargon was just another way to measure inclusivity. Newcomers to the corporate scene are often barraged with inscrutable acronyms, and people who want to “touch base” and “connect” in order to decide on “actionable next steps.” Other favourites of mine are the ever-present “deck,” otherwise known as a PowerPoint presentation in which one expands five sentences into thirty slides with swirling slide transitions, and the “ask” [n.], which, from what I’ve been able to discern, is a way to cut down on the syllables required to say “request.” Efficiency indeed.

In academia, it’s even worse. It seems that no book or article can be taken seriously until the author has proven his or her credentials by name-checking every obscure work that has been written on the subject. This practice serves only to repeat the same tired debates ad nauseam, with little new beyond increasing specialization, which I’ve attacked at length before.

Considering how pernicious it is to the cause of the Plain Language Movement, however, there is shockingly little popular or academic treatment of the subject of jargon. Perhaps it is because, as New Left academic Peter Ives says in his fantastic 1997 article “In defense of jargon,” “jargon is only jargon for those who don’t use it.” Maybe we like to be inscrutable because it makes us feel more intelligent. Or maybe the world is changing so quickly these days that we need something familiar to hold onto, and clichéd language represents a security blanket of sorts.

The ways in which jargon has evolved seem to support this theory. In “‘As Per Your Request’: A History of Business Jargon,” Kitty Locker writes that jargon has eras, identifying the pre-1880s, the 1880s-1950s, and the post-1950s as distinct periods in business communication. (Given that the article appears in a relatively obscure academic journal and was published in 1987, it obviously doesn’t touch the Internet age, and so I imagine the author would have to add another era for the post-1990s period to cover all of the tech speak we use now.) But given that the 1880s-1950s (when jargon use was at its peak, apparently) saw the rise of corporate America, and with that an emphasis on professionalism and specialization, we can see the early roots of corporate-style conformity. And today there is just as much human need for conformity, but more arenas from which to choose one’s allegiance: corporate, social, technological, generational, geographical, etc.

Locker argues that corporate jargon and ‘stock phrases’ came about primarily because new employees tended to copy old correspondence, either in style or in actual phraseology. Often letters doubled as legal documents, and so the terminology had to be fairly set. Then, from the 1920s onward, American firms were interested in improving business communication, with big companies often having a person or department who monitored it and tried to get everyone to use the same words and phrases. (O, that I could have the job of whipping corporate employees’ communications into shape! Alas, cost cutting.)

Today, I suspect jargon use comes less from official processes than from subtle attempts to reinforce unofficial corporate/academic norms and hierarchies with new employees. Using jargon – in the form of acronyms, company-specific words, or highly technical language – creates a sense of inclusivity among workers, which is exactly why, if senior executives/group leaders ever thought about it, they would have a vested interest in keeping it around. It is a badge of honour even today for new recruits to master the new group’s/company’s lingo.

Interestingly, Locker points out that companies have had little success in eliminating jargon even when they have tried. A bank in the 1960s tried to freshen up its letters by taking out the standard greetings and salutations, and received numerous complaints from customers who were having trouble recognizing the letters for what they were. As she amusingly quotes, “the value the reader places on the distinctiveness of a business letter can easily be overestimated.” (Indeed.) And it is a daring academic who braves the censure of his or her peers by not mentioning what Foucault thought about the issue, or how “post-x” something is. (One might wonder if s/he even had an advanced degree.) It seems that there is comfort in the conventionality of jargon for both user and receiver.

I wonder if this emphasis on conventionality spreads beyond the walls of corporations and academia. Familiarity and belonging are powerful emotions, after all, and it takes a lot more effort to be fresh and original than to retreat into the comfort of clichéd words and phrases. It is often easier to be anonymous than to be articulate.

Jargon may also have more sinister undertones. Peter Ives argues that most of the jargon we use today (he was writing in 1997) originated in the right-wing military/political/business elite. It seems that we are endorsing a pro-capitalist, individualist language, because the section of society that uses such words also happens to have the means to diffuse their particular linguistic preferences more broadly.

By this logic, even our exhortations to “speak plainly” in language that is “accessible” can be read as elitist, because, as Ives asks, who gets to determine what “accessible” is? Democracy? If so, Chinese would be most accessible. Instead, we assume that “plain English” wins out, and enforce that presumption upon everyone else. Such is the stuff of linguistic imperialism.

It seems language is inextricably tied to power structures, existing hierarchies, and even imperialism. So next time someone asks you to “touch base” later, consider that by deciding just to “talk” instead, you’re standing up for the little guy.


The Educated Class and Its Discontents

April 13, 2010

In a recent Special Report on Germany in the Economist, the traditional German system of education, while excellent at producing great engineers and skilled trade workers, came under criticism for its rigidity and unfairness. In Germany, ten-year-olds are marked out for a career of manual labour (skilled or otherwise), white-collar work, or the bureaucratic/professional work that comes after university, and sent to separate schools accordingly. Ten is too young, its critics argue, to set a child’s direction for life, which becomes difficult to change later on in guild-like labour markets that prohibit entry into professions without the right qualifications. And many complain that Germany does not have equality of opportunity: family background is more likely to determine test scores and social status in Germany than in almost any other rich country.

With any talk of equality of opportunity, it comes up again, that old aspirational myth of moving between classes, the Horatio Alger or perhaps Will Hunting story of a genius saved from poverty by good education, mentoring or his own perseverance to rise to a different class. Because it is about class. Germans (and the writers of the Economist) are not concerned as much about eventual income distribution, which is quite fair, as they are about having the opportunity to do something else: move up the social ladder.

Focusing on class seems to be a very Old Europe thing. Only in Europe do we see that holdover of a very, very privileged elite (or aristocracy) with old family wealth, and a poor or working class that never really seems to shrink outside of statistical meddling, and isn’t going to, because those within it have a sense of pride in being working class. A recent article on class and politics in Britain in the Economist seems to describe the six established statistical class divisions as essentially fixed. David Cameron must appeal to the same middle-class voters as Margaret Thatcher, who appreciated their aspirations to “improve their homes and their lives; to get gradually better cars, washing machines and televisions; to go on holiday in Spain rather than Bournemouth.” Hardly a rapid rise to the upper echelons of power – really just a desire to keep up with what is expected of the “middle class.”

In fact, it seems the most common way of achieving a material increase in living standards is immigration. The quality of life is much higher in “New World” countries like Canada and Australia because the basic cost of living is less, while health care and education are still available at the same high standard, or higher. It’s hard not to notice that eight out of 10 cities ranked “most liveable” by the Economist last year were in Canada, Australia, and New Zealand.

And there is more opportunity for movement between classes in the New World (a term I’ll keep using despite the fact that it makes me sound like Columbus, because I can’t think of a better one), not least because there is less emphasis on “class” in general as something that matters, at least explicitly. The class system of North America has less of a focus on income and history and more on the combination of these with other factors, such as education. My theory is that because New World societies were formed based on merit, and evolved with much less distinction based on income or family wealth (since most everyone was a poor immigrant upon arrival), education and occupation became the primary means of separating out the kind of people with whom one should associate.

The North American system is thus designed to provide more equality of opportunity. In theory, all have the same access to education, even, in some ways, up to the university level. It is a noble goal, and higher education is certainly more accessible in Commonwealth countries and the US than in continental Europe, as this 2005 study ranking university enrolment in developed countries shows.

But the result of our comparatively open and well-attended university system has been a generation or two of liberal arts or natural science graduates who spend ten years flailing around the entry-level job market before eventually settling into corporate middle management in a completely unrelated field somewhere, making essentially the same money they would have made had they been pre-classified at age ten, as in Germany. Most look back fondly on the days they spent at university, but more for the social connections they made than the time spent reading Cicero. And we, as a society, have trouble finding enough people to sell us mortgages or build our houses, because there aren’t really university programs that teach those skills. Universities have become training grounds for the “middle class” as a whole – including the low end of white-collar work – instead of training grounds for occupations where they actually provide valuable preparation, that is, the “upper middle class” work of medicine, law, academia and the like.

If nothing else, we North Americans are certainly losing efficiency with all of this finding ourselves that comes after attaining our university qualifications. We’ve also created a society in which having a B.A. means you’re under-qualified for many jobs – either in experience, or because everyone else applying also has an M.A. or the college-level diploma which is all that’s really required to do the job. It isn’t going to change, though, because we value two things too highly: our “right” to attend school (especially university) for as long as we want to, and the class position that doing so will get us.

True, recently there has been a real push by the government and colleges to recognize skilled labour and professional work as viable career options for high school graduates to consider, and one often hears flippant comments about the world needing more plumbers and electricians, who “actually make a fair bit of money.” (Reality check: this website puts a plumber’s average hourly wage at $24 in Toronto, which over a year works out to about $47 000. This is around what your average white collar worker earns, at least at first, and a plumber doesn’t carry the same student loan debt.)
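For anyone who wants to check that arithmetic, here is a quick back-of-the-envelope sketch. The $24 hourly figure is the one cited above; the 37.5-hour week and 52 paid weeks per year are my own assumptions, added purely for illustration.

```python
# Back-of-the-envelope check of the plumber wage figure above.
# The $24/hour rate comes from the post; the 37.5-hour week and
# 52 paid weeks per year are assumptions made here for illustration.

hourly_wage = 24.00      # average hourly wage for a Toronto plumber (cited above)
hours_per_week = 37.5    # assumed full-time week
weeks_per_year = 52      # assumed paid weeks per year

annual_income = hourly_wage * hours_per_week * weeks_per_year
print(f"Estimated annual income: ${annual_income:,.0f}")
# Prints roughly $46,800 -- about the $47 000 cited above.
```

On a 40-hour week the same rate works out to about $50 000 a year, which only strengthens the point.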

But while the logic of matching skills to actual jobs may have (almost) caught up, the overall effect on what class one will end up in has not. Doctors and lawyers are still far more likely to associate with white collar workers who have attended university than electricians who earn the same amount, because education and occupation are still important class signifiers.

What would it take to change these biases? And would changing the biases reverse the trend toward hiring managers requiring ever more degrees when hiring someone to answer telephones and make photocopies? Is there a happy medium between the German and North American systems, where there is still mobility between classes, and still equality of opportunity, but more cultural acceptance that skilled trades and professional work are a respectable way to earn a living? I’m not sure – but for all that, I would still struggle to recommend that anybody give up learning about politics or history or biology and instead learn about practical data models in order to secure a job. We are fortunate to have the privilege of being able to buy those three or four (or more) years of time to learn. I would advise anybody who asked to enjoy it while it lasts, because there’s plenty of time for uninspiring desk work later, if they so choose.


Suitable – For Men Only

March 11, 2010

Clothing is a funny thing. Some people argue that it means nothing, and is a mere distraction from what lies underneath (figuratively speaking). Many others argue that it sends critical messages about its wearer, and obsess over what those messages are.

The most polarizing issues are always related to women: everything from whether Hillary Clinton’s sensible trouser suits make her qualified or matronly to whether followers of Islam should be permitted/forced to wear clothes that cover their faces or hair. I once heard a model claim that all fashion is women’s fashion, and that we only let men borrow it periodically. It was a joke, but one that implies that the control lies in the hands of women. I believe that it is the opposite, and that because they are free from all the attention, it is really men who have the power in this regard.

I wrote a paper a few years ago about how men’s fashion in the nineteenth century was instrumental in shifting feelings of “otherness” from those of class to those of gender. That is, the key differentiators in society before the 1800s were class-based, and reflected in clothing styles. After the 1800s, the key differentiators were between the sexes. Now, before all of you political historians tune out because you think I’m going to start using wacky postcolonial/postmodern/psychoanalytic/feminist arguments, let me say this: what people wear, and especially what they wear to work, speaks volumes about the values of the society in which they live.

(And, for what it’s worth, most of this post will be about men anyway.)

The nineteenth century was notable for spawning the first modern ideas about working: there was a real middle class for the first time, and it generally participated in a public sphere of manufacturing and commerce. Trade was no longer considered dirty by the upper classes; instead, it was England’s “nation of shopkeepers” that was leading the charge of modernity and Empire, and entrepreneurs were raised to the level formerly attained only by military men and the aristocracy. For the first time, hard work and professional expertise had respect, and this sense of respect bonded men together. Of course, it also separated them from women, who were rarely if ever allowed to participate in this glorious public work – they had to stay at home and raise children (which, of course, isn’t work at all, right? It’s pure joy! That’s why women don’t get paid for it!).

Thus arose the suit. Ah, the suit. That most modern uniform that signifies utilitarianism, seriousness, and piety (through its emphasis on black exterior and white collared shirt overlay, like priests!) all at once. The package that is so simple, easy and flattering that men (and those who see them) don’t even have to think about it. The modern suit was so revolutionary, after so many years of tights and funny short pants and ruffs and wigs, that one eminent historian of fashion has said that since its adoption, women’s fashion has been reduced to its imitation.

Because before the suit, all fashion was men’s fashion. Think ducks: men had to be ostentatious and showy while women merely had to be pure and able to produce offspring. And after, it was only women who had clothing that was complicated, deceptive, and silly. (Don’t even get me started on the kinds of mishaps that could occur while wearing a hoop skirt.)

So what’s changed? Is the suit still master of the professional clothing universe? I think it still represents all of the above (with maybe the exception of ‘piety’) and is still the defining answer to the question of what is appropriate to wear to work. Of course, there are signature looks (Steve Jobs and his black turtlenecks, Richard Branson’s lack of ties, and “Casual Fridays”) but these are remarkable because they stand out from the norm. The suit is so powerful because it is a uniform: it gives the wearer immediate currency in the professional world without his having to say a word about it. But women aren’t included: if a woman wears a suit proper, she stands out. If she wears a pantsuit, she stands out for being too much like Hillary Clinton. If she wears something more feminine, she stands out for that too – perhaps for overly expensive designer elitism, à la Sarah Palin. If she doesn’t wear a suit, she is unprofessional — or worse. Whatever she wears, she stands out. If you don’t believe me, check out this picture of world leaders and tell me who stands out to you.

In casual wear, of course, it doesn’t matter – the separation between work and fun is clear and thus lacks a value judgement about competence. And besides, everyone, of all classes and both genders, wears jeans. But overall very little has changed in the professional world: the classes may mingle, but the genders remain distinct.

So what? you may ask. Clothing doesn’t actually change how competent (or incompetent) a person is. Of course it doesn’t – but isn’t it interesting that as a society we still can’t get past using the outside packaging as an excuse for our real opinions? Without all of the discussion about pantsuits, would Hillary still be considered “traditional” and a “feminist”? And don’t even get me started on shoes…

What do you think? Does it matter to you what people in positions of power are wearing? Do you respect a suit more than a skirt? Do you think clothing enslaves us? If so, how do we escape?


What’s Your Personal Brand?

March 8, 2010

The last post I wrote looked at how countries are attempting to portray themselves internationally through their brands. It is perhaps a bit odd to speak of nations through the lens of branding, as though they are things that can be commoditized and “sold” like sneakers and cola. However, I believe it is part of the zeitgeist; everything these days seems to have a commercial lens, and anything can be processed, packaged, and marketed for a profit. Call it the triumph of capitalism. (Lloyd Dobler would be unimpressed.)

Because the commoditization of everything has come to make a bit of sense to us, I want to examine in a bit more detail another concept I think is novel, and more than slightly alarming: personal branding. I do a lot of workshops about this at work because of its seeming ubiquity as a concept in the business world right now. But what does it really mean, and why is it so popular? Is it a change in how we see each other, or just an iteration of something else?

Some have criticized personal branding as emphasizing “packaging” oneself well over focusing on self-improvement. I don’t think that’s actually true. I went back to what some have identified as the first extended discussion on personal branding, a 1997 article in Fast Company titled “The Brand Called You,” by Tom Peters, to see how he positions it. Peters posits that in the late twentieth-century knowledge economy and era of the Internet, workers are no longer mere employees in others’ corporations – they are instead “CEOs of Me, Inc.” The new professional world is all about the individual. He advises readers to describe, in 15 words or less, what their unique skills and contributions are – their “feature-benefit model.” This is their personal brand.

I decided to do a bit of historical contextualization to determine if the advent of personal branding really did up the ante for artifice in the business world, or if it was just another incarnation of self-help advice. I started with one of my favourite books, Stephen Covey’s “The Seven Habits of Highly Effective People,” first published in 1989. His opening section discusses the history of self-help books, and distinguishes between what he refers to as the “personality” and “character” ethics. The character ethic – a long-term approach to self-improvement that gets at the fundamental roots of behaviour in order to integrate sound principles into one’s life – dominated the literature until about the 1920s, when a new idea, the personality ethic, rose to prominence. The personality ethic, he says, was much more about strategies for achieving success by, essentially, showing others what they wanted to see. By implication, what others wanted to see may not have been one’s genuine self, and in time those who subscribed to the personality ethic might be exposed as insincere frauds. (The implicit criticism of books like “How to Win Friends and Influence People” here is hilarious.) Covey calls for a return to the character ethic – a principles-based approach rather than a superficial one.

So we see that, according to one of the leading writers in the genre, packaging oneself for success is not a new thing. Moreover, those who advocate personal branding do emphasize self-improvement. Peters in “The Brand Called You” clearly advocates building one’s skills in order to improve one’s personal product suite – but what he cites as the benefits are largely extrinsic. The difference, then, is not in the lack of focus on personal improvement, but in the desired outcomes from it. According to Peters, the beneficial outcomes are more power, more authority, and, most notably, more visibility. Presumably these benefits add up to personal happiness and fulfilment, but the link is not made explicit. Visibility in particular seems to be an end in itself. I suspect this is a change from the “personality ethic” kind of self-help, because that was much more focused on the interpersonal level.

To tease out the differences some more, I went back to the original in the self-help genre, Samuel Smiles’s 1859 work Self-Help, widely considered the literary embodiment of liberal-progressive Victorian morality. Contrast the personal branding mantra of visibility with what Smiles says about anonymous self-improvement:

Even the humblest person, who sets before his fellows an example of industry, sobriety, and upright honesty of purpose in life, has a present as well as a future influence upon the well-being of his country; for his life and character pass unconsciously into the lives of others, and propagate good example for all time to come.

Not exactly the same as buying a building to have one’s name on it, or founding a scholarship program, or sponsoring a business school. And the return to Smiles highlights another difference. A key feature of visibility as an end is that it stops at the individual, whereas Smiles advocates self-betterment for the “greater good.” National progress is the Victorian self-help goal, whereas advancement of the self is the personal-branding-era goal.

Perhaps the most alarming difference is in the means by which personal branding achieves its goals of visibility – through the commoditization of the self.  Such a concept had to be American, the true home of capitalism and democracy. It seems sometimes as though the history of the United States is the story of competition. Everything is settled through democratic process, and the best “product” wins. Think: religious freedom, competition among school districts, election of neighbourhood dog catchers, etc. (You can debate this concept with me in the comments section, if you like.) These days, it’s all about the money. How can I “sell” myself in a way that people want to invest in me?

Further evidence can be found in the fact that “self-help” as a genre has gone from an offshoot of liberal political philosophy to sitting largely within “business literature,” because it is practical and concerned with, at the root, the effectiveness of capitalist organizations and the individuals within them. At least, that’s where the legitimate self-help authors have gone, having geared their advice toward executives, or else they face relegation to the “New Age” section.

All of this leads me to believe that a desire for fame (or perhaps notability or notoriety would be a better word) is one of the defining characteristics of our era. But what do you think? Am I taking too much from the visibility angle? Do you think self-help does reside primarily in the business section now? And are you as alarmed by people casually discussing how to “brand” themselves as I am? I’d love to discuss this with you, so please leave a comment below if something is on your mind!


Paris: The City of Light, Love, and Atrocious Service

February 18, 2010

On the BBC yesterday, columnist Emma Jane Kirby described the customer service experience in Paris – cab drivers refusing to take her (on crutches after breaking a leg skiing) because she was a “cripple,” vendors refusing to assist customers by selecting their produce, and restaurant waiters refusing to answer to anything other than “Monsieur.” It is a sorry picture indeed. “The customer is not always right,” she writes – as though this is acceptable behaviour among those seeking to earn money in a sinking economy.

What is the reason for this particularly French brand of incivility? Apparently, it dates back to the French Revolution. “The revolution of 1789,” Ms. Kirby writes, “has burned the notion of equality deep into the French psyche and a proud Parisian finds it abhorrently degrading to act subserviently.” Americans in the service industry, on the other hand, use their first names and seek to “give us ‘good folks’ [i.e., patrons] a great time.” Friendliness? Promptness? An enjoyable customer experience? Heaven forbid!

I wonder that the author of this article points to the French Revolution as the origin of French servers desiring equal status to their patrons, and yet contrasts their service with that in America. Didn’t the United States have a similar revolution, with similar aims and results? Indeed, I doubt that many countries have the ideas of equality and of a strong work ethic so firmly ingrained in their culture as the United States. I have already posted about the fact that Americans work longer hours, are more productive, and make more money than Europeans. Granted, there are numerous problems that have resulted from the shift from an old, European artisan kind of work (like the article’s French grocer who carefully selects the right avocado for when the customer will use it, perhaps?) to the modern, Fordist division of labour in the American corporation. However, in the area of customer service, the Anglo-Saxon idea of the customer calling the shots clearly wins the day – in theory and in profits.

I suspect that the two countries have diverged in this way for very different reasons. In fact, I think the lingering resentment displayed by the French servers described in this article is more a holdover from the ancien régime than something created by its overthrow in 1789. It is a modern parallel to the old, chafing class consciousness, and reflects the old divisions of education, upbringing, and geography. The French lower classes – both urban and rural – were overtaxed and undervalued for most of French history. They felt as though they were ignored and treated as though their opinions meant nothing, particularly in Paris, where most of the upper classes lived. That resentment was at the heart of the French Revolution. America, in contrast, began (ostensibly) as a society of equals, in which farmers and the urban working class had as much right to participate politically as the intellectual and social elites. There were – absolutely – de facto class divisions, but in a nation of immigrants, everybody had to start from close to nothing. In comparison to the old European system of birth determining all, the American system was, from the start, a meritocracy in which hard work was prized above all.

And it’s no coincidence that the eminent management thinker Peter Drucker (himself an immigrant to the US) was famous for arguing both for good, old-fashioned hard work and for putting the customer first. They are closely related. I object to the idea of “Joe,” the American server in the article, who (the implication is) simpers his way obsequiously through the course of a meal with no genuine pride in his work, solely seeking a fat tip. In North America, great customer service is a source of pride in itself. It is also a growing trend for companies to (re)focus on the customer. In the January-February 2010 issue of the Harvard Business Review, Roger Martin describes this trend as a new, customer-focused stage of modern capitalism. Are the French so out of sync with modern management theory as to willingly slight their customers?

Quite frankly, I find the idea of French customers being mere “irritants” appalling. Does the service ethic not apply to everyone? Should Nicolas Sarkozy refuse to serve his country if we do not refer to him as “Monsieur”? Is he inferior because he ‘serves’ his voters? Is Thierry Henry inferior to his audience/ticket holders because he ‘serves’ them by bringing in goals (and controversial World Cup qualifying berths)? No. These are their jobs. Few can afford to serve none.

On a personal note, I have always found customer service in the United States to be exceptional, in every regard.  It’s a large contributor to the overall pleasant feeling I have when I’m there – and for tourists, that feeling is invaluable. Perhaps the Parisians have something to learn from the American tourists they dislike so much.


A Woman’s Work Is To Be More Visible

January 27, 2010

I have often been struck, while watching Battlestar Galactica and reading the wonderful Harry Potter books, by the gender equality in the professional world that is a matter of course, and unremarked upon. Male and female Viper pilots, aurors, admirals and politicians are roughly equal in numbers in these imagined/future worlds – a striking contrast to our own.* Can anyone imagine Top Gun with a woman in the cockpit? Will America ever have a female five-star general? (There is currently only one female 4-star general, and this is indeed an admirable rank for her to have achieved considering that women are excluded by law from combat jobs, which is how one usually attains this rank.) And why is it that this picture of 31 world leaders features only 3 women (and zero men willing to make “bold” fashion statements by wearing something other than a black suit)?

*NB – post historical is unable to find comparable statistics for the gender ratio within the auror population in the muggle world.

I think we would be hard pressed to find many modern, moderately liberal thinkers who really believe that women and men do not demonstrate equal abilities in school, in the workplace, and in life, when given equal opportunities at the start. So what is causing the gap between early achievement and long-term achievement in management, politics, and other fields traditionally dominated by men?


Job Titles That Last and Why Higher Education Stresses Efficiency Over Fulfilment

January 20, 2010

My current life plan is to find a job that (among other things) has a title that would resonate through the ages. I have this theory that if you can see some iteration of yourself in the past it provides an anchor that grounds what you do in age-old tradition, wisdom and experience – even if you choose to throw it all out the window, you become another chapter of the profession, adding to its history. For example, teaching has varied slightly in its methods and certainly in who constitutes its pupils, but in effect the basic idea is the same: pass on your knowledge or the collected wisdom in a certain area to other (hopefully apt) pupils. Same with politics (helping to set out the framework by which a society lives), or law (advocating on behalf of someone else), or medicine (healing people). The people who practise the work have changed, but not the work itself.

Workforce Analytics Advisor? Strategic Competitive Profit Returns Consultant? Glorified Corporate Meeting Room Booker and Assistant Calendar Organizer? Not so much.

It’s a classic extension of Marx’s theory about the division of labour…