A Three-Pronged Approach to Saving Humanities Departments

October 29, 2010

So you graduated with a humanities degree. Well, what are you going to do with that?

I really, really hate this question. There are only 3 answers that make sense to the people who ask it:

  1. I’m going to teachers college/law school.
  2. I’m going to grad school (be careful – this one only staves off the questions for another few years and then they come back louder and more persistently than ever).
  3. I have no idea. I just wasted the last four years of my life. Yep, I’m unemployed, bitter, and poor.

For many humanities majors, the trouble with life is that it doesn’t end with university – unless you seek to become a professor and stay in one for the rest of your life, which is a whole different story that I’m not going to talk about today. In reality, most humanities majors will not apply their deep knowledge of the sea battles of 1812 or the role of family in Hegel’s Philosophy of Right in their day-to-day jobs. Many do not even want to. And when the many, many people who ask the question above come calling, they feel they must either defend their choice of degree because it makes them “well rounded” and “interesting,” or denounce it as useless in helping them find employment.

So a lot of commentators think this means humanities programs are useless, and call for eliminating French departments or combining Comparative Literature departments with a whole host of others to save on administration costs. I’m not going to get into why this is a bad thing; I think that’s fairly obvious and, besides, I write about it all the time. Instead, I’m going to advance a theory about how to fix it.



Make Money First: The Trouble With Meritocracies

October 19, 2010

For a while now, I’ve been trying to put together a post about the value of polymaths in modern society. Two hundred or even a hundred years ago, such people would have needed no defenders. What could be more valuable or intrinsically rewarding than being interested in everything and interesting to others? Yet today, polymaths are often seen as dilettantes, unable to focus enough to get serious about something and get a job. There is work, and then there are hobbies, and one should learn to tell the difference and divide one’s life into segments. Few careers reward diversity of knowledge. Fewer still pay well. My tentative title was “Great Careers for Polymaths,” but the idea made me queasy. Why, I asked myself, do I need to justify having multiple interests with the language of making money?

Because, I realized, we value wealth first. What I mean by “first” is that the goal is to be “secure” financially before seeking career satisfaction, getting in shape or getting married. Wealth is the elusive gateway to a complete life, but many mistake it for a complete life in and of itself.



How Gen Y Can Reinvent Work-Life Balance

May 4, 2010

It’s May again, that exciting time of year when newly-minted college graduates venture out into the world and attempt to find a job. Or perhaps go to Europe and attempt to find themselves instead until the hiring freezes are lifted.  What will increase their chances of success?

It seems as though it’s getting harder and harder just to get onto the bottom rung of the “career ladder” (a term which, as someone who works in HR, I can tell you is on its way out as an inappropriate metaphor for the working world – think less in terms of defined rungs and more in terms of the moving staircases in the Harry Potter movies: you never know where you’ll end up). What happened to slogging through a terrible entry-level job booking meeting rooms and fetching coffee, paying one’s dues in order to move up to a better job in a year or two? Is that still necessary, or have things changed?

Well, as it turns out, a lot of things have changed. Many articles have been written about them: an economic slump that has meant declining hiring rates and more people being let go; baby boomers who were supposed to be leaving the workforce to live out their golden years on pensions we’re paying for, but who are not; a glut of “over-qualified” university graduates with little practical experience (which, as we all know, entry-level coffee-making jobs require) driving up competition for the few full-time jobs that are out there; and organizational structures that are getting flatter, with fewer roles at the top. So the situation now is that one can work making coffee and booking meeting rooms for three or four years and perhaps find there’s no promotional pot of gold at the end of the rainbow, or find that it’s still a few years out.

So where does that leave new graduates? If “paying your dues” was the baby boomer way to climb the corporate ladder (which actually existed then), what happens now? As my favourite career blogger, Penelope Trunk, once wrote: paying dues is out; that kind of lifestyle doesn’t allow for real growth or balance at work, because it forces new recruits to work ridiculous hours doing menial tasks. (It also sets a precedent that’s hard to follow once you have commitments outside of work.)

What’s better? In theory, doing many different things to acquire enough experiences to figure out what we really want to do over the long term. One of the advantages new grads have is the freedom to move around and go where the jobs are. But the trouble with this theory is that, given the way the job market is structured now, we need to be very sure of which jobs we want, specialize early, and be prepared to slog it out for several years gaining “relevant experience” in our field. There is little room now for dilettantism, or for having jobs “on the side.” Everything is a career choice.

Take the classic “job on the side” for everything from aspiring writers to rock stars: teaching. Teaching used to be the kind of thing that anybody could do (and there were, accordingly, great teachers and some not-so-great teachers in the mix). Now students are fighting tooth and nail to get a place at teacher’s college, often resorting to attending a school in a different country. And once they graduate, the job market looks terrible – there is a two-stage application process even to be considered for a supply teaching job.  And don’t even get me started on academia as a career.

So despite the fact that it’s better to do different things, we’re now seeing a kind of apprenticeship model reborn, with high entrance requirements to every guild. Career experts say that Gen Yers will have something like ten different careers in their lives – but to manage that, we’ll need transferable skills, and we’ll need to know very well how to market them. In practical terms, this means that job-hopping, or even industry-hopping, is key to proving all the different places in which one’s skills have been useful. It’s a kind of paradox in which focus and diversity of experience battle for dominance.

One solution might be to have multiple income streams, or to get experience with various combinations of paid and unpaid work. (Or maybe to start a blog and wind up with a movie or book deal out of it.) Like the realization that your romantic partner can’t be everything to you, we’re now seeing the idea that your main job can’t be everything either, from a remunerative or skills-building perspective. (The idea that a job by itself can make you happy in life is a fallacy we exposed several years ago.) This trend is called having a “portfolio career”: using a functional skill to diversify one’s revenue streams.

We’re used to seeing this with careers in fields like music, where a conductor will (for example) have a community choir, a church gig, some wedding performances on the go, and a few students all at the same time. When one revenue stream dries up, he or she will pick up another. But it’s new for accountants, or for those who might want to mix traditional employment (at a major corporation, say) with self-employment. The key is diversity within a specialization: having skills that people will pay for and capitalizing on them in several different ways.

It also means that members of this generation will have to live with more uncertainty about their careers. Perhaps this is the price we’ll pay for more control over the skills we use and how we spend our time day-to-day. Does this signify a shift back to a pre-industrial time where people could choose how much they worked? Not fully, I’m sure, but it may be the beginning of a new, hybrid system where workers can control their output and work to their real interests more. Maybe this is the new “work-life balance.”

If, that is, all these new grads ever manage to get hired into that first job.

What do you think? Will you try to mix paid and unpaid work? Do you plan on job-hopping or industry-hopping? Do you anticipate that many members of Gen Y will choose to have multiple/multifaceted careers? Or is this a trend that will only affect a small subset of the population? Is it better to work a terrible (paying) job for three years or to get lots of volunteer experience instead?


What is History Writing Now?

April 27, 2010

People reach post historical all the time by searching for odd little historical, philosophical and political science-related phrases. Given the obscure nature of many of these terms to those not deep within postcolonial or imperial studies, I assume they’re doing research for some paper or project. I wonder if they feel they can trust what they read. Am I a reliable source? Are my ideas sound? Can one cite a blog, or is this an even bigger research no-no than citing Wikipedia?

If it is, why? Consider this blogger: I have a graduate history degree from a good school, which, for many, constitutes formal “training” in the discipline.  I know how to cite sources and (hopefully) construct a logical and well-supported argument. Does this make me “qualified” to comment on things? Does being qualified today require being intelligent, well-trained, and peer-reviewed (in the traditional sense), or does it come from an even more democratic approvals process based on sheer number of readers? Would having six million hits to my blog make me a “qualified” opinion leader? Or do I need to have published six books through a university press that only 6 000 people will ever read in order to be a real “expert”?  And is either something to which I should aspire?

These questions have far-reaching implications for me as I go through the process of deciding whether to continue on with studying history as a career, or do something else entirely – something more practical, that would affect people more directly than a well-researched book in an obscure field and a few impassioned lectures about Lord Curzon and the Raj for a dwindling number of undergraduates who don’t care. Because it’s very important to me that I influence the way people think, not in a creepy mind control kind of way but by presenting a fresh perspective that makes them reconsider the world around them and how things work within it.

I’m not sure academic writing is the best way to do that: its scope is too narrow, and its audience consists of people who are already predisposed to thinking from many angles, and who likely know a lot about the subject already. Traditional academic writing is also very dry. It connects with the reader by being persuasive, offering a sourced argument with little personal point of view. Blogs and new media, in contrast, connect with readers because they cover current events and are often based on personal biases or feelings. They are inherently populist: the vast majority of bloggers want others to read their blogs, and so they talk about things that appeal to a large audience – fashion, entertainment, celebrities, popular political news – and the vast majority of people who read blogs read about exactly those topics. But does this make bloggers experts in their fields? And does it translate to “academic” subjects like history?

One of my main goals for post historical is to bridge this gap with a forum that is flexible enough to talk about current events and timeless philosophical questions at the same time, yet with a focus that isn’t so personal or academically specialized as to be unappealing to a broad audience outside the strict historical discipline. One might call this “accessible” writing, though as I wrote in my last post, “accessible” can be a bit of a loaded term. What matters most to me is making an impact in a way that is direct and tangible, which is why the thought of another history degree and a life as a struggling academic is slightly off-putting at times. It’s very clear what such a life could do for me: I’d be a recognized expert in my field; I wouldn’t have to get out of bed torturously early every morning to go to another soul-crushing corporate meeting; I’d be able to have great chats over coffee with fellow bright people and give speeches about things like maps; I could help out engaged students by giving them interesting research suggestions; and I would generally get to run around having people think I was a big smartypants. Clearly, these things all sound fantastic. But what would a life like that do for others, even if I did manage to actually get a job out of it (which these days, as my fellow blogger and history professor trivium points out on his excellent blog, almost nobody does)? How would it contribute to my big life goal of being a respected public intellectual who makes people think in exciting new ways?

I don’t mean to criticize academics, who are generally brilliant, insightful, creative people. It’s the system that is at fault, a system that encourages people to go to school for 10 years with absolutely no hope of finding employment of any kind at the end of it, a system that encourages killing trees by publishing books nobody cares about, and a system that has created the popular feeling that it is so removed from the everyday that it serves only to train its own. I fear academia is becoming so specialized that it just doesn’t have the impact, or the scope, or the popular appeal, to be taken seriously. When the people who make the most money and important decisions all have MBAs and law degrees, humanities majors are in for some trouble. Actually, we’re all in trouble because we’re losing out on diversity of thought and experience – big time.

As I’ve written before, I think great writing is all about having a conversation, which necessitates a connection between readers and writers. One of the great things about blogs, and Wikipedia, and other new media is that the connection – and the feedback, via the comments or revisions – is immediate, and the process of forming consensus iterative. This is when history and philosophy are really exciting (and this is why I love to receive comments and feedback from readers, particularly when you disagree or want to point out something I’ve missed). Traditional academic writing just isn’t set up to react quickly enough to changes in events, or popular feeling.

So, to paraphrase the great E.H. Carr, what is history writing now? One would think that it would adapt to the changing relationship between reader and writer, from words sent down from a lofty perch in an ivory tower to those that are picked over in comments forums around the world. It hasn’t. And we’ve all lost something in the process.  The Economist ran an article today about how this election (in Britain) is bound to be a landmark one, and yet has no landmark book or philosophy written about the popular mood to match it, or to spur discussion, as was the case in 1945, 1964, 1979 and 1997. (I was particularly excited to see that the article cited one of my historian idols, Linda Colley, as having written a significant work from which Tony Blair drew inspiration in 1997.)

Can it be that nobody has written anything groundbreaking in the past five or ten years that bears mention? Or is it that the political audience is too fragmented – or too busy writing their own blog posts – to notice? Is there still a place for the academic as a public intellectual, or has academic writing been pushed to the fringes of literate society by virtue of being irrelevant to everyday concerns? And if academia is on the fringes, who is in the centre?

I suppose we can all take comfort in the fact that there is still the expectation of something by the intelligent people who read and write for publications like The Economist. There is an intellectual void that will always need filling, by academics or writers or particularly insightful and far-reaching bloggers. The question for the next few years, it seems, is whether those who step up to fill it will have new job titles, and if so, what they will be.


Jargon and Power: Why “Touching Base” Equals Linguistic Imperialism

April 26, 2010

I’ve always thought that jargon was just another way to measure inclusivity. Newcomers to the corporate scene are often barraged with inscrutable acronyms, and people who want to “touch base” and “connect” in order to decide on “actionable next steps.” Other favourites of mine are the ever-present “deck,” otherwise known as a PowerPoint presentation in which one expands five sentences into thirty slides with swirling slide transitions, and the “ask” [n.], which, from what I’ve been able to discern, is a way to cut down on the syllables required to say “request.” Efficiency indeed.

In academia, it’s even worse. It seems that no book or article can be taken seriously until the author has proven his or her credentials by name-checking every obscure phrase that has been written on a subject. This practice serves only to repeat the same tired debates ad nauseam, with little new to add beyond increasing specialization – something I’ve attacked at length before.

Considering how pernicious the Plain Language Movement considers it, however, there is shockingly little popular or academic treatment of the subject of jargon. Perhaps it is because, as New Left academic Peter Ives says in his fantastic 1997 article “In defense of jargon,” “jargon is only jargon for those who don’t use it.” Maybe we like to be inscrutable because it makes us feel more intelligent. Or maybe the world is changing so quickly these days that we need something familiar to hold onto, and clichéd language represents a security blanket of sorts.

The ways in which jargon has evolved seem to support this theory. In “‘As Per Your Request’: A History of Business Jargon,” Kitty Locker writes that jargon has eras, identifying the pre-1880s, 1880s-1950s, and post-1950s as distinct periods in business communication. (Given that the article appears in a relatively obscure academic journal and was published in 1987, it obviously doesn’t touch the Internet age, and so I imagine the author would have to add another era for the post-1990s and all of the tech speak we use now.) But if we consider that the 1880s-1950s (when jargon use was apparently at its peak) saw the rise of corporate America, and with it an emphasis on professionalism and specialization, we can see the early roots of corporate-style conformity. And today there is just as much human need for conformity, but more arenas from which to choose one’s allegiance: corporate, social, technological, generational, geographical, etc.

Locker argues that corporate jargon and ‘stock phrases’ came about primarily because new employees tended to copy old correspondence, either in style or in actual phraseology. Often letters doubled as legal documents, and so the terminology had to be fairly set. Then, from the 1920s onward, American firms were interested in improving business communication, with big companies often having a person or department who monitored it and tried to get everyone to use the same words and phrases. (O, that I could have the job of whipping corporate employees’ communications into shape! Alas, cost cutting.)

Today, I suspect jargon use comes less from official processes than from subtle attempts to reinforce unofficial corporate/academic norms and hierarchies with new employees. Using jargon – in the form of acronyms, company-specific words, or highly technical language – creates a sense of inclusivity among workers, which is exactly why, if senior executives/group leaders ever thought about it, they would have a vested interest in keeping it around. It is a badge of honour even today for new recruits to master the new group’s/company’s lingo.

Interestingly, Locker points out that companies have had little success in eliminating jargon even when they have tried. A bank in the 1960s tried to freshen up its letters by taking out the standard greetings and salutations, and received numerous complaints from customers who were having trouble recognizing the letters for what they were. As she amusingly quotes, “the value the reader places on the distinctiveness of a business letter can easily be overestimated.” (Indeed.) And it is a daring academic who braves the censure of his or her peers by not mentioning what Foucault thought about the issue, or how “post-x” something is. (One might wonder if s/he even had an advanced degree.) It seems that there is comfort in the conventionality of jargon for both user and receiver.

I wonder if this emphasis on conventionality spreads beyond the walls of corporations and academia. Familiarity and belonging are powerful emotions, after all, and it takes a lot more effort to be fresh and original than to retreat into the comfort of clichéd words and phrases. It is often easier to be anonymous than to be articulate.

Jargon may also have more sinister undertones. Peter Ives argues that most of the jargon we use today (he was writing in 1997) originated in the right-wing military/political/business elite. It seems we are endorsing a pro-capitalist, individualist language, because the section of society that uses such words also happens to have the means to diffuse its particular linguistic preferences more broadly.

By this logic, even our exhortations to “speak plainly” in language that is “accessible” can be read as elitist, because, as Ives asks, who gets to determine what “accessible” is? Democracy? If so, Chinese would be most accessible. Instead, we assume that “plain English” wins out, and enforce that presumption upon everyone else. Such is the stuff of linguistic imperialism.

It seems language is inextricably tied to power structures, existing hierarchies, and even imperialism. So next time someone asks you to “touch base” later, consider that by deciding just to “talk” instead, you’re standing up for the little guy.


The Educated Class and Its Discontents

April 13, 2010

In a recent Special Report on Germany in the Economist, the traditional German system of education, while excellent at producing great engineers and skilled trade workers, came under criticism for its rigidity and unfairness. In Germany, ten-year-olds are marked out for a career of manual labour (skilled or otherwise), white-collar work, or the bureaucratic/professional work that comes after university, and sent to separate schools accordingly. Ten is too young, critics argue, to give a child a direction for life – a direction that becomes difficult to change later on, given guild-like labour markets that prohibit entry into professions without the right qualifications. And many complain that Germany does not have equality of opportunity: family background is more likely to determine test scores and social status in life in Germany than in any other country.

With any talk of equality of opportunity, that old aspirational myth of moving between classes comes up again: the Horatio Alger, or perhaps Will Hunting, story of a genius saved from poverty and raised into a different class by good education, mentoring, or his own perseverance. Because it is about class. Germans (and the writers of the Economist) are not so much concerned about eventual income distribution, which is quite fair, as about having the opportunity to do something else: move up the social ladder.

Focusing on class seems to be a very Old Europe thing. Only in Europe do we see the holdover of a very, very privileged elite (or aristocracy) with old family wealth, and a poor or working class that never really seems to shrink outside of meddling with statistics – and isn’t going to, because those within it take pride in being working class. A recent article on class and politics in Britain in the Economist seems to describe the six established statistical class divisions as essentially fixed. David Cameron must appeal to the same middle-class voters as Margaret Thatcher, who appreciated their aspirations to “improve their homes and their lives; to get gradually better cars, washing machines and televisions; to go on holiday in Spain rather than Bournemouth.” Hardly a rapid rise to the upper echelons of power – really just a desire to keep up with what is expected of being “middle class.”

In fact, it seems the most common way of achieving a material increase in living standards is immigration. The quality of life is much higher in “New World” countries like Canada and Australia because the basic cost of living is lower, while health care and education are still available at the same high standard, or higher. It’s hard not to notice that eight of the ten cities ranked “most liveable” by the Economist last year were in Canada, Australia, and New Zealand.

And there is more opportunity for movement between classes in the New World (a term I’ll keep using despite the fact that it makes me sound like Columbus, because I can’t think of a better one), not least because there is less emphasis on “class” in general as something that matters, at least explicitly. The class system of North America has less of a focus on income and history and more on the combination of these with other factors, such as education. My theory is that because New World societies were formed based on merit, and evolved with much less distinction based on income or family wealth (since most everyone was a poor immigrant upon arrival), education and occupation became the primary means of separating out the kind of people with whom one should associate.

The North American system is thus designed to provide more equality of opportunity. In theory, all have the same access to education, even, in some ways, up to the university level. It is a noble goal, and higher education is certainly more accessible in Commonwealth countries and the US than in continental Europe, as this 2005 study ranking university enrollment in developed countries shows.

But the result of our comparatively open and well-attended university system has been a generation or two of liberal arts and natural science graduates who spend ten years flailing around the entry-level job market before eventually settling into corporate middle management in a completely unrelated field somewhere, making essentially the same money they would have had they been pre-classified at age ten, as they are in Germany. Most look back fondly on the days they spent at university, but more for the social connections they made than the time spent reading Cicero. And we, as a society, have trouble finding enough people to sell us mortgages or build our houses, because there aren’t really university programs that teach those skills. Universities have become training grounds for the “middle class” as a whole – including the low end of white-collar work – instead of training grounds for occupations where they actually provide valuable preparation, that is, the “upper middle class” work of medicine, law, academia and the like.

If nothing else, we North Americans are certainly losing efficiency with all of this finding ourselves that comes after attaining our university qualifications. We’ve also created a society in which having a B.A. means you’re under-qualified for many jobs – either in experience, or because everyone else applying also has an M.A., or the college-level diploma that is all the job really requires. It isn’t going to change, though, because we value two things too highly: our “right” to attend school (especially university) for as long as we want to, and the class position that doing so will get us.

True, recently there has been a real push by the government and colleges to recognize skilled labour and professional work as viable career options for high school graduates to consider, and one often hears flippant comments about the world needing more plumbers and electricians, who “actually make a fair bit of money.” (Reality check: this website puts a plumber’s average hourly wage at $24 in Toronto, which over a year works out to about $47 000. This is around what your average white collar worker earns, at least at first, and a plumber doesn’t carry the same student loan debt.)
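For what it’s worth, a quick back-of-envelope check does reproduce that annual figure, assuming a 37.5-hour week and 52 paid weeks (my assumptions, not numbers from the wage listing):

\[
\$24/\text{hour} \times 37.5\ \text{hours/week} \times 52\ \text{weeks} = \$46\,800 \approx \$47\,000
\]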

But while the logic of matching skills to actual jobs may have (almost) caught up, the overall effect on what class one will end up in has not. Doctors and lawyers are still far more likely to associate with white collar workers who have attended university than electricians who earn the same amount, because education and occupation are still important class signifiers.

What would it take to change these biases? And would changing them reverse the trend of hiring managers requiring ever more degrees of someone hired to answer telephones and make photocopies? Is there a happy medium between the German and North American systems, with mobility between classes and equality of opportunity, but more cultural acceptance that skilled trades and professional work are respectable ways to earn a living? I’m not sure – but for all that, I would still struggle to recommend that anybody give up learning about politics or history or biology and instead learn about practical data models in order to secure a job. We are fortunate to have the privilege of being able to buy those three or four (or more) years of time to learn. I would advise anybody who asked to enjoy it while it lasts, because there’s plenty of time for uninspiring desk work later, if they so choose.


Suitable – For Men Only

March 11, 2010

Clothing is a funny thing. Some people argue that it means nothing, and is a mere distraction from what lies underneath (figuratively speaking). Many others argue that it sends critical messages about its wearer, and obsess over what those messages are.

The most polarizing issues are always related to women: everything from whether Hillary Clinton’s sensible trouser suits make her qualified or matronly to whether followers of Islam should be permitted/forced to wear clothes that cover their faces or hair. I once heard a model claim that all fashion is women’s fashion, and that we only let men borrow it periodically. It was a joke, but one that implies that the control lies in the hands of women. I believe that it is the opposite, and that because they are free from all the attention, it is really men who have the power in this regard.

I wrote a paper a few years ago about how men’s fashion in the nineteenth century was instrumental in shifting feelings of “otherness” from those of class to those of gender. That is, the key differentiators in society before the 1800s were class-based, and reflected in clothing styles. After the 1800s, the key differentiators were between the sexes. Now, before all of you political historians tune out because you think I’m going to start using wacky postcolonial/postmodern/psychoanalytic/feminist arguments, let me say this: what people wear, and especially what they wear to work, speaks volumes about the values of the society in which they live.

(And, for what it’s worth, most of this post will be about men anyway.)

The nineteenth century was notable for spawning the first modern ideas about working: there was a real middle class for the first time, and it generally participated in a public sphere of manufacturing and commerce. Trade was no longer considered dirty by the upper classes; instead, it was England’s “nation of shopkeepers” that was leading the charge of modernity and Empire, and entrepreneurs were raised to the level formerly attained only by military men and the aristocracy. For the first time, hard work and professional expertise had respect, and this sense of respect bonded men together. Of course, it also separated them from women, who were rarely if ever allowed to participate in this glorious public work – they had to stay at home and raise children (which, of course, isn’t work at all, right? It’s pure joy! That’s why women don’t get paid for it!).

Thus arose the suit. Ah, the suit. That most modern uniform that signifies utilitarianism, seriousness, and piety (through its emphasis on black exterior and white collared shirt overlay, like priests!) all at once. The package that is so simple, easy and flattering that men (and those who see them) don’t even have to think about it. The modern suit was so revolutionary, after so many years of tights and funny short pants and ruffs and wigs, that one eminent historian of fashion has said that since its adoption, women’s fashion has been reduced to its imitation.

Because before the suit, all fashion was men’s fashion. Think ducks: men had to be ostentatious and showy while women merely had to be pure and able to produce offspring. And after, it was only women who had clothing that was complicated, deceptive, and silly. (Don’t even get me started on the kinds of mishaps that could occur while wearing a hoop skirt.)

So what’s changed? Is the suit still master of the professional clothing universe? I think it still represents all of the above (with the possible exception of piety) and is still the defining answer to the question of what is appropriate to wear to work. Of course, there are signature looks (Steve Jobs and his black turtlenecks, Richard Branson’s lack of ties, “Casual Fridays”), but these are remarkable precisely because they stand out from the norm. The suit is so powerful because it is a uniform: it gives the wearer immediate currency in the professional world without his having to say a word about it. But women aren’t included: if a woman wears a suit proper, she stands out. If she wears a pantsuit, she stands out for being too much like Hillary Clinton. If she wears something more feminine, she stands out for that too – perhaps for overly expensive designer elitism, à la Sarah Palin. If she doesn’t wear a suit, she is unprofessional – or worse. Whatever she wears, she stands out. If you don’t believe me, check out this picture of world leaders and tell me who stands out to you.

In casual wear, of course, it doesn’t matter – the separation between work and fun is clear, and so no value judgement about competence attaches. And besides, everyone, of all classes and both genders, wears jeans. But overall very little has changed in the professional world: the classes may mingle, but the genders remain distinct.

So what? you may ask. Clothing doesn’t actually change how competent (or incompetent) a person is. Of course it doesn’t – but isn’t it interesting that as a society we still can’t get past using the outside packaging as an excuse for our real opinions? Without all of the discussion about pantsuits, would Hillary still be considered “traditional” and a “feminist”? And don’t even get me started on shoes…

What do you think? Does it matter to you what people in positions of power are wearing? Do you respect a suit more than a skirt? Do you think clothing enslaves us? If so, how do we escape?