The rise of lottery professions and why it’s so hard to get a decent job

May 4, 2014

In early September 2002, a young singer named Kelly Clarkson won the inaugural season of a new reality competition called American Idol. Her first single set a new record for the fastest rise to number 1, vaulting past the previous record holders from 1964, the Beatles. Her rise to fame was characterized as meteoric, the kind of rags-to-riches story so beloved in America, and one that would be repeated, with more or less success, over the next 12 seasons and in several other similar contests.

Kelly Clarkson is fabulously talented, and also the beneficiary of a windfall. After years of struggling, she rose to the top of what has traditionally been known as a “lottery profession,” one in which there are many aspirants, very few of whom succeed, while the rest do menial jobs in the hope that one day their “big break” will come.

Often it never does. There are thousands of talented singers who never appear on our TV screens and never get Grammy awards because they are unlucky. They don’t win the professional “lottery.” (And in some cases it is a literal lottery: I’ve been to Idol auditions in Canada that have random draws of ticket stubs to determine who is even allowed an audition at the first stage.)

The concept of a “lottery profession” is usually applied to the performing arts – dance, acting, singing – and the literary ones – novel- and poetry-writing – as well as sports and other fields in which aspirants need to be exceptionally talented and also, to an extent, distinctive. And in these areas it has only become more difficult to succeed over the last hundred years.

The Poor Poet (Der Arme Poet), by Carl Spitzweg


Breakaway: how the best of the best get more market share

Chrystia Freeland writes in Plutocrats of how communications technology and economies of scale have made famous people even more famous. In the nineteenth century and before, a singer’s reach (and therefore income) was limited to those who could afford a seat in a theatre; today, she can make money from records, huge live shows and merchandise. An early twentieth-century soccer player’s income was limited to what spectators paid to see the match – which is likely one of the reasons we don’t hear about many famous professional athletes from before the twentieth century – while today his face is on Nike ads and jerseys and he earns millions per season through licensing deals that let fans watch him get yellow-carded on television.

And where the limited reach of a theatre or soccer pitch allowed a greater number of quite talented individuals to succeed, the limitless reach of television and the internet allows the über-talented to divide greater spoils amongst a much smaller number. Why bother going to see the local hotshot when you can watch Lionel Messi?


A Moment Like This: the lottery gets bigger

The trouble is, artists and athletes aren’t the only ones risking their livelihoods on a proverbial lottery ticket anymore. There are more new “lottery professions” all the time, often emerging out of professions that were once solidly middle class, able to support a family on a good salary with benefits. To give but a few examples:

  • Investment banking and stock trading, formerly quite boring, now require numerous credentials (CFA, MBA, etc.), and a good network in the right places, to break into;
  • Legal work is now frequently outsourced to the developing world, with intense competition for fewer and fewer spots at top firms in the West;
  • Tenured positions in academia, as I’ve written about at length, are quickly being eliminated, leaving little hope for current Ph.D. holders;
  • Aspiring fire fighters have a 15% chance of acceptance at the local fire training academy, and most fire-fighting professionals hold second jobs;
  • Medicine, nursing and social work programs accept fewer applicants for even fewer jobs, despite growing demand for these professionals, with employers instead hiring temporary foreign workers;
  • Even teaching, that bastion of middle-class professional employment, is a tough job to get these days, and, as anyone who has done any teaching will tell you, it ain’t a glamorous or lucrative gig.

Some of these changes are the effects of globalization, of course, which has pulled many in the developing world into the middle class even as it has displaced work from North America and Europe. The result is intense competition for what are really quite banal professions with long hours and few perquisites. After all, we are talking about work here, and while many of us can hope to enjoy what we do, most people still think of a job more as a way to make money than as a calling.


People Like Us: what happens to the “winners”

Who holds the winning tickets? Extraordinarily talented, hardworking and lucky people like Kelly Clarkson and Lionel Messi. Those who inherit wealth. And those who are connected in the right ways. This is a self-perpetuating circle, with fame and money increasingly intertwined.

Success in one area seems to imply expertise in another, which is why we have a rise in parenting and home-making literature from people made famous by playing characters on TV and in movies. Famous actors try their hand at everything from making wine to formulating foreign policy. Just look at Dancing with the Stars, that barometer of cross-professional fame: this season alone it has anointed comedians, “real housewives,” and Olympic gold medallists, and will make money for them, ostensibly as dancers. Previous contestants include politicians, scientists, and athletes galore.

Science! …and cross-promotion.

The links here are fame and money, and while using either to get what you want in a new realm is nothing new, the potential reach of both (and the opportunity to make more of each) has increased exponentially. And there is a new opportunity for unprecedented fame from the patronage of modern plutocrats, as witnessed by the pantheon of celebrity chefs, celebrity dog trainers, and celebrity litigators. The über-rich want the best, and the best take a disproportionate slice of the industry pie.


Thankful: the rise of corporate patronage

So what happens to all those quite talented people who would have played to full theatres two hundred years ago? (Apart from making YouTube videos, that is.)

I’ve been thinking for years now that the “high” arts (theatre, ballet, classical music, dance) depend on wealthy patrons for survival, much as they did before these became popular attractions in the modern period. Those patrons today are largely corporate sponsors, instead of wealthy individuals, and the companies get cultural cachet and corporate social responsibility bonus points while the performers gain a living.

The trend goes beyond the arts. In Silicon Valley (and elsewhere in the US), corporations and wealthy benefactors are extending their philanthropy beyond traditional areas of giving. Mark Zuckerberg sponsors New Jersey school districts. Mike Bloomberg helps municipalities with their tech budgets. The Clinton Global Initiative finances green retrofits in the built environment. As the public sector falls apart, we become more dependent on the proclivities of wealthy people and the companies they run, for better or worse.

Your discretionary income at work!


Don’t Waste Your Time: what happens to everyone else

Those without a good corporate job or corporate patronage can still have interesting weekends. The last twenty years have seen a rise in hobby culture. Not just for hipsters anymore, farming, knitting, and brewing now count as hobbies, as it becomes harder and harder to actually make any money doing them. Assembly-line economics prompted a decline in bespoke items in favour of cheaper, ready-to-use/ready-to-wear equivalents, and with it the near-demise of artisan production. Hence, hobby culture has taken over. Many people today have side businesses in what were once considered main income streams, such as making crafts (e.g. through Etsy), photography (helped by the rise of Pinterest and Instagram) or self-publishing. I suspect this trend will only grow as 3D printing becomes more popular.

And for everyone else holding tickets and waiting for their numbers to come up, there is retail. The old stereotype of underemployed actors waiting tables persists because it is still true, and some are servers forever. In some industries and some places (for example, grocery cashiers in Toronto), service jobs are a means to an end, some spare cash earned while in school. In others, like much of the United States and suburban areas generally, people work in retail and/or service (the largest category of employment in North America) because they have no other option.

The result is a proliferation of companies pushing a “service culture,” a movement toward glorifying the customer experience everywhere from fast food to discount clothing stores. And while there is a long history of service as a noble profession (for example, in high-end restaurants), and giving clients what they desire is a laudable goal, claiming a service mandate while maintaining a pay gap between customer-facing employees and top management of 20, 80 or 200 times is deceitful, the false empowerment of the economically disenfranchised.

All of the above trends reflect a growing inequality in the workforce, one that becomes ever-more entrenched. Inequality is a major hot-button issue in politics at the moment, and a number of initiatives have been proposed to combat it, including raising the minimum wage. The long-term success of any solution, however, requires recognizing that the ability to earn a living can’t depend on holding a winning ticket.



The Brands That Still Matter

February 13, 2014

Dannon Oikos’s nostalgic Super Bowl spot was a great advertisement for both the French multinational and its new yogurt product.

But will knowing Oikos is a Dannon product make consumers want to purchase it? Or will they turn instead to nutritional content, probiotic count, or price to make their decision? How strong is this brand?

How strong, these days, is any brand?

What I need right now is some good advice

Well, it depends. An excellent piece in the New Yorker this week explores “the end of brand loyalty” and whether this spells the decline of an age in which brands were useful shorthands for purchasing everything from baked beans to luxury sedans. In an era in which the customer review is king, all companies must compete product by product, it says, even established giants. The field is open for upstarts and smaller rivals who can win over a market on the strength of a single, well-reviewed product.

It’s easier than ever for young companies to establish a foothold with a signature product: think Crocs, which have expanded from those foam clogs to flip flops and winter gear, creating a whole new hideous/comfortable footwear market. What propelled Crocs to fame was the strength of customer testimonials saying that it really was worth the price and the look to get that level of comfort.

The same trends that allowed Crocs to happen also signal the decline of major brands. When we have so much information at the click of a button, the promise of a consistent level of quality – which is really all a brand is – becomes less important than the facts: actual product reviews. Why trust a company to make things you know you’ll love when you can trust other users to tell you their opinions instead? It’s true: the level of trust in a product’s brand as a shorthand for making a good purchasing decision is at its nadir.

However, the decline of product brands has led to the rise in service brands, particularly those giving advice. Booking a holiday? The competition for who gives the best advice on hotels, restaurants and attractions seems to have been decisively won by TripAdvisor. Purchasing a book? Bypass the publishing house and read the reviews on Amazon, and then let the site recommend a choice for you. Looking for a good movie? Hardly anybody makes decisions about movies based on the studios that produce them, but Netflix can tell you what to watch based on what you’ve seen before.

These are all Internet-based examples, because the advice industry has moved online for the most part, but brick-and-mortar service brands have also maintained their strength amid the fall of brand loyalty for products. Banks, for example, are judged on the selection of products they have curated for their customers, but more importantly on how they advise their clients, particularly in the higher-end, higher-margin businesses of wealth management and institutional and corporate banking. Consulting firms continue to prosper through economic slowdowns because they can advise on both growing revenue (in good economic climates) and streamlining expenses (in bad). And it all began with the likes of Consumer Reports, J.D. Power, and other ranking agencies, which built their reputations on being the ones who choose the products that matter and whose advice you can trust.

The service brand becomes personal

Those who host the platforms that enable others to recommend products – the information aggregators and analysts – are poised to be the big winners of the near economic future. And this extends to individuals as well, which explains the push in the last ten years to develop “personal brands.” I’ve written before about how this makes many feel a bit icky, and yet if we think of skills as “products,” and analytical ability as “service,” it makes sense to have a personal brand that emphasizes how you think and relate to others as opposed to what you know. (This is why most personal brands focus on a combination of attitude and experience, e.g. Oprah’s signature empathy which resulted from her life experiences.)

Skills can be learned and degrees earned by many individuals, just like many companies can manufacture clothing. They are interchangeable. But proof of being able to think well, in the form of awards, complementary experiences, and attitudes, expressed through a self-aware brand, is unique.

This is likely why LinkedIn has moved to a model that goes beyond listing skills and abilities to providing references (“recommendations” and “endorsements”) to indicate past performance, and “followers” to show how popular one’s ideas are. These serve the exact same function as the ranking and number of reviews a destination has on TripAdvisor.

No doubt this has contributed to the large number of individuals wanting to strike out on their own. At a recent networking meeting I attended, 100% of attendees were looking to become independent personal nutritionists, career or life coaches, or consultants. They didn’t want to sell things; they wanted to sell themselves and their advice.

A strong brand – a personal one – is essential for this kind of career change, and part of creating a strong brand is ensuring consistency. Working for an organization whose values don’t align with yours – even if you are doing the same thing you’d want to do independently – is a brand mismatch.

All of this highlights another key similarity to traditional product brands: service brands, once established, have a grip on market share. Most companies would prefer to have an accountant at an established firm do their taxes over a sole proprietor. TripAdvisor has few competitors in the travel advice industry, which is why travel agencies are faring so poorly. The barriers to entry are high, and name recognition and brand still count for a lot.

My advice to newcomers: time to call up Uncle Jesse to make an ad for you and get some brand recognition.


Academia Shrugged? or, Why You Should Just Quit Your Ph.D. Now

July 27, 2011

Grad school and academia as a potential career have taken a real beating in the media lately. It seems the world has finally woken up and smelled the (three-day-old, re-used) coffee that is all grad students can afford. The bottom line is that humanities students should run, not walk, away from a life of debt and uncertainty, and a “dream job” that will never quite pan out.

In an article for Slate.com, William Pannapacker, himself a professor at a liberal arts college, proposes a few steps to fix graduate school in the humanities. Some of what he advises – such as providing networking opportunities and internships, and recognizing that it may be better to keep one’s passion for obscure fields of study as a hobby – is similar to what I proposed in my own post on a Three-Pronged Approach to Saving Humanities Departments.

But I was really intrigued by his addition of a final, “nuclear option”: quit. In his words:

Just walk away. Do not let your irrational love for the humanities make you vulnerable to ongoing exploitation. Do not remain a captive to dubious promises about future rewards. Cut your losses, now. Accumulate work experiences and contacts that will enable you to support yourself, have health coverage, and something like a normal life. Even the more privileged students I mentioned earlier—and the ones who are not seeking traditional employment—could do a lot of good by refusing to support the current academic labor system. It exists because so many of us who care about the humanities and higher education in a sincere, idealistic way have been passively complicit with the destruction of both. You don’t have to return to school this fall, but the academic labor system depends on it.

Wow. A group of highly intelligent, capable individuals upon whom “the system” depends but who are scorned by it decide one day to “go on strike” in the hopes of seeing said system implode and leave behind a twitching lump of ingrates suddenly begging them to return and save them.

This sounds familiar. Where did I read about that recently? Oh, yes – in Atlas Shrugged, Ayn Rand’s infamous treatise on objectivism. In it, the heroes grapple with their own superior morality in a world of incompetent ingrates and eventually come to realize they are complicit in the very system that condemns them for their unchecked ambition and capitalistic greed. (Of course, unchecked ambition and capitalistic greed are positive attributes in Rand’s heroes.) So, one by one, they go on strike by withdrawing to a hidden, magical place where labour is justly rewarded, and nobody ever gives anybody anything they haven’t worked for, while they watch the world crumbling around them without their input.

(There, I’ve saved you from slogging through 1000+ pages of libertarian/objectivist bluster that would probably outrage and offend anyone who believes in silly things like equality of opportunity and altruism.)

But putting aside the absurd pairing of tweed-jacketed academics and Wall Street “fat cats,” let’s think a minute about the implications of this Randian proposition for academics. Would it work? As Pannapacker points out, there is always the possibility of having a day job with health care and indulging in one’s “irrational love for the humanities” as a hobby. As he says, “more and more people are learning [that] universities do not have a monopoly on the ‘life of the mind.’”

Maybe. But I think universities should at least have a competitive edge on it, or else they stand to become exactly what vocationalists want them to be: training for jobs that exist today and have clear mandates and required preparation. This would certainly be the case if all the most brilliant liberal arts minds suddenly decided to be brilliant elsewhere in the world.

Because if not universities, then where? Will we have to start paying people to hear about their ideas? Will we have the same level of interaction if everyone is decentralized and off thinking interesting thoughts in different industries? How will we prepare people to think innovatively, and prepare them for the jobs of tomorrow that don’t have clear mandates or preparatory courses?

The whole point of a university is that it is a hub, a place a little apart from the rest of the world (yes, perhaps even an ivory tower) where people can reflect on the “big questions” that may not be directly related to the pressing fads of the moment. What happens when this education becomes more decentralized? Can we trust that individuals will independently seek out all of the different perspectives they’re likely to get in an undergraduate humanities course?

I reflect on what Stanley Fish wrote in the New York Times a few weeks ago: basically, that academics, and by extension universities, should just abandon trying to be relevant and focus instead on academic inquiry for the sake of it alone. I think that would be unwise. Knowledge for the sake of knowledge is a great thing, and we need independent places (i.e. not ones funded by corporations) that will ensure that people continue to seek it. But relevance is important too, and while it should not be the only goal, it needs to be a goal.

In short, the current academic system needs to be refined from within, not by walking away and shrugging off all its problems. (Besides, academic types don’t have a magical land in Colorado where we can retreat and live out our ideals of hard work and free love and no taxes.) Professors and administrators could start by being honest about the reality of life as a grad student, i.e. mostly unemployed and without the health coverage Pannapacker so enjoys having. And they should stop denigrating non-academic career choices, framing them instead as a continuation of the path of intelligent, creative thinking, not a deviation from it.

And then we – all of us – can start changing the way we view “amateur” non-academics outside the system, and invite them in. Let’s not exclude people with insider jargon and inaccessible writing. Let’s make a place inside the ivory tower for people who think about the “big questions” of life outside of it, so we can examine the practical implications of our ideas. Let’s show the vocationalists that “academic” is not a dirty word but one that can bring significant insight and positive change to the world outside universities, as well as in its libraries.

Let’s ask people to help us hold up the world, instead of just dropping it.


Tiger Moms, Drop-Outs, and Credentialism (oh my!)

May 31, 2011

Now here’s a controversial news item: Peter Thiel, famous for having founded PayPal and investing early in Facebook, and now a billionaire, is paying young entrepreneurs to drop out of school. His Thiel Foundation has just named 24 “fellows” under the age of 20 who will receive $100 000 each and mentoring from high-powered, successful entrepreneurs in order to launch profitable ventures. They are all drop-outs (of college or high school), a requirement for the prize.

His logic is that many would-be students of elite schools would be better off going right out into the world to generate “significant wealth,” rather than learn about the theories behind what others have said and done and invented. And while I would never blindly advocate that anyone drop out of school, given the prevailing societal opinion about education and the very real value of exposure to new ways of thinking, his initiative is perhaps a useful antidote to those who do blindly advocate more schooling as the solution to all of society’s ills. Education is a wonderful thing – I would even say that it is the key to solving many of the world’s great scourges, such as intolerance, authoritarianism, and the solid grip of misinformation. In a way, Thiel is saying that it is the ideas, and the work to make them successful, that count – not the name of one’s alma mater (or even the existence of one).

Credentialism – in the form of collecting degrees and designations from acclaimed institutions – has become a powerful shorthand for measuring societal status. It is clad in an aura of meritocracy, because in theory only the best are able, and would choose, to obtain a degree or three at the world’s (and especially America’s) finest educational institutions. But, as with all shorthands, a focus on credentials alone as a stand-in for intellectual or societal worth is insufficient and at times unfair.

The education situation in many developed countries is dire. Every year, millions of the world’s best students vie for a place in one of the mere hundreds of top institutions, trying to best each other in any way possible. A recent issue of the Atlantic explores the phenomenon in depth as part of an extended look at the “tiger mom” debate brought about by the now-infamous book by Amy Chua (there is a good write-up on it here, in case you live under a rock and missed it). Much of the furore over the book was caused by the implicit challenge of Chua’s winner-take-all style of parenting. In refusing to give in to her daughters’ tears, frustration, exhaustion, and in some cases disturbing behaviour (biting the piano?), Chua claims she paved the way to their happiness by allowing them to know what they were capable of. More recently, her eldest daughter’s acceptance to Harvard has renewed the wave of anxious hand-wringing by “Western” parents who think they aren’t pushing their children hard enough to get into good schools and assure their futures.

But are the hours of heartache, rebellions and tooth marks on family instruments worth it? Is pushing a child to his or her limit, encouraging activities like building orphanages in Africa, chairing the local youth orchestra, and volunteering as an assistant surgeon on weekends in order to secure a spot at the Ivies the key to lifelong success and happiness? Is it even likely to yield a coveted admission letter? Not really, according to what Caitlin Flanagan writes in response to Chua’s book:

Elite-college admissions offices drive professional-class parents crazy because in many respects they do not operate as meritocracies. Consider, for example, those students admitted via one of the two programs that stand as strange mirror opposites: those that give preferential treatment to the sons and daughters of alumni, and those that extend it to the children of unrepresented minorities. The latter practice suggests that generations of injustice and prejudice can be redressed by admission to a fancy college, the former that generations of inclusion and privilege demand their own special prize; the two philosophies would seem to cancel one another out, but each has its place in the larger system.

In fact, when you account for all of the “hooked” seats in the freshman class—spaces specifically set aside for kids who have some kind of recruited talent or family connection or who come from an underrepresented minority group—you accomplish, at the most selective colleges, two things: you fill a large percentage of the class (some researchers believe the figure is as high as 60 percent), and you do so with kids whose average grades and scores are significantly lower than your ideal. Now it’s time to swing a meritocracy into place; the caliber of the class is at stake. All of the unhooked students are now going to be thrown into a hypercompetitive pool, the likes of which the layperson can’t imagine. As daunting as the median grades and test scores of the typical Princeton admittee may appear, those statistics have taken into account all of the legacies and volleyball players and rich people’s children who pushed the averages down.

Sounds terrifying, doesn’t it? And what’s more, there is a growing pile of literature that argues it isn’t worth it. These days, people go to university for four main reasons:

  1. To attain practical/vocational knowledge that will tangibly help them get a job.
  2. To attain theoretical or other knowledge that will expand their minds in an area of interest.
  3. To please their parents/society/employers who consider a post-secondary education to be a mandatory status symbol. The better the reputation of the school, the better the status symbol.
  4. To make connections with peers and professors.

The main benefits of a “top-tier” education, as opposed to one from a large public American or Canadian school, lie in the last two, status and connections. Sharing a room with a future Mark Zuckerberg or getting to vacation on the yachts of the rich and famous must be worth the price of admission, right?

William D. Cohan thinks not, writing in the New York Times that getting into an Ivy League school is a “Pyrrhic victory,” the outcome being monstrous student debt (from fees of $50 000+ per year) and only slightly better-than-average job prospects in a glum economy. Many other American schools have astronomical fees, and even relatively cheap Canadian educations place graduates in debt. There is also, as I referred to above, the non-monetary cost of an education at a prestigious school. Whole families are swept up in the hyper-competitive race to the top, where lazy summer vacations and boredom and play are replaced with summer volunteer trips to Kenya, SAT prep courses, and the endless repetition of mastering a musical instrument.

But the saddest part is that it may all be for naught. A sister article in the same Atlantic issue as the above quotation charts the potential life course of many products of tiger-led households:

Harangued by my own Tiger Dad, I grew up believing in crack math skills and followed—at least initially—a stereotypical Chinese path of acing my tests; getting into the world’s most prestigious science university, Caltech (early admission, no less); majoring in the hardest, most rarefied subject, physics … And then what? Almost 50 years old now, some 30 years after graduation, I look at my Caltech classmates and conclude that math whizzes do not take over the world. The true geniuses—the artists of the scientific world—may be unlocking the mysteries of the universe, but the run-of-the-mill really smart overachievers like me? They’re likely to end up in high-class drone work, perfecting new types of crossword-puzzle-oriented screen savers or perhaps (really) tweaking the computer system that controls the flow in beer guns at Applebee’s. As we know, in this tundra-like new economy, even medical degrees, and especially law degrees, may translate into $250,000 of unrecoverable higher-education debt and no job prospects, despite any amount of hard work and discipline.

The reality, of course, is that there is life after graduation, and I imagine that a lot of students and parents who sacrifice their lives perfecting their viola performance and polishing their resumes will get there and wonder what the hell happened — and what to do next. The same is true for all graduates who feel lost after school, and who may have underplayed their social and entrepreneurial skills in favour of tailoring their lives to academic pursuits that will not help them once they have their degrees. And so I support Peter Thiel’s initiative because it addresses the fact that it takes more than a few letters from any school to achieve success in life.


Knowledge and Power in a Skeptical, Connected World

March 18, 2011

Who do we listen to, and why? In an age when we can find any information quickly, what does it take to be a voice that rises above many others? What kind of power does this represent?

I read in the latest edition of the Harvard Business Review that in 2011 companies are anticipating an increased focus not just on broadly saturating target markets with Facebook ads and silly “viral” videos, but on targeting “influencers” as part of their “social media” strategies. These are the individuals who shape culture and get other people on board with new trends and ways of thinking. Oprah is an influencer. Radiohead are influencers. Steve Jobs is an influencer. And a lot of random bloggers, tweeters, and other social media characters whom you’ve never heard of are influencers, and they are going to be targets of corporations because they are both cheaper and perceived (perhaps) as more authentic shills than their more famous counterparts.

You can be sure that by the time something gets anointed an HBR trend to watch, it has already set the Internet abuzz. Further research on “measuring influence” yielded far more twenty-first-century social media examples than any others. It seems that organizations have (finally!) learned that a “social media strategy” on its own is of little benefit without real, grassroots endorsement. However, I’m more interested in what “influence” looked like in the past, before it morphed into a social media concept to be made into the next corporate buzzword, and in what characteristics have stayed with perceived “influencers” since.

Influence, it seems, is a tricky thing to quantify, or even define. An article I discovered about the role of influence in economic history discusses how it is closely related to communication, but can range from mere impression to outright force in the strength it implies. The other critical factors in determining long-term influence were time and space. The example given was Saint Thomas Aquinas, whose ideas were central to much medieval thought (throughout the Latin-speaking world, at least) but are relatively inconsequential today.

Influence and Power – and Money

Influence, as the article points out, is closely related to power. One of the concepts that has stayed with me since learning it in an Organizational Behaviour class years ago is the distinction between the kinds of power wielded by individuals. They can have positional power, stemming from one’s role as, say, a manager or a parent or some other official, likely formalized, figure of authority; or they can have personal power, stemming from an individual’s character or beliefs and likely more informal in nature. The difference between them parallels that between practical/mental authority and emotional authority, and the general consensus is that emotional authority goes much further in influencing others because it does not rely on a (potentially temporary) and wholly external power differential the way practical authority does.

When I consider what influence looked like in the past, it seems there was little distinction between the two types of power mentioned above. Perhaps the theory I just articulated is fallout from our comparatively recent fixation on merit over birth status as a rationale for power. Indeed, the ideas (and the names associated with them) that have survived best throughout history to influence many others have always been backed by great financial power. Take religion, for example, which has been perpetuated by wealthy organizations that held positional power in their communities. The familiar expression about history having been written by the victors speaks to the tendency of dominant individuals, families or states to justify their authority with historical precedent. And most of the theories in every field that are still with us today were dreamed up by men with solid financial backing and the ability to spend large amounts of time reading and philosophizing. (Even Marx lived off the generosity of his bourgeois co-author, after all.)

But today that is changing – to an extent. YouTube, Twitter and other media that celebrate memes and all things viral can make ordinary people famous astonishingly quickly. Such fame is often fleeting and of dubious value to society, but savvier types can sometimes parlay their sudden name recognition into the more lasting sort of influence (Justin Bieber, anyone?). This can happen because influence is magnetic and self-perpetuating. Mommy bloggers who are already widely read and respected are natural candidates to push brand-name diaper bags or whatever else new mothers supposedly need and want. That corporations want to latch onto such people is hardly surprising – they are merging their corporate power with bloggers’ influence in new markets, and the bloggers in turn want to increase their own profile through association (or maybe just get free products).

Self-perpetuating influence applies to companies as well. The new techie term for this concept is “network effects” – as the Economist defined it recently, “the more users [services like Facebook, eBay, etc.] have, the more valuable they become, thus attracting even more users.” Whereas in the past money and power begat more of the same, today we can add hits and click-throughs to the mix.

Knowledge Brokering from Darwin to Wikipedia

The common link between these people and corporations is the way they treat knowledge. They are what the corporate world now refers to as “knowledge brokers,” a title that refers to the ability to clarify and share information with different audiences or spheres, and to determine the common elements between, say, Paul Revere, corporate marketing, and the AIDS epidemic. Knowledge brokering (and a bit of luck) is what separates widely-read bloggers from those who write solely for themselves (whether they want to or not). It is the ability to write things that people find interesting and useful. The CIA is investing heavily in such people after a series of incidents that demonstrated how segregated and impotent its different bodies of knowledge were.

Knowledge brokering is more than simply aggregating (though smart aggregators of information are helpful too). It is the ability to analyze and draw connections while becoming a trusted conduit of information. Knowledge brokers are perhaps an antidote to the pervasive and growing tendency to overspecialize, because they connect many specialists and their ideas with a broad audience. They are the reason we know about Darwin’s ideas. Or Jesus. Or celebrities’ latest faux pas. Wikipedia is one giant knowledge broker, with an army of volunteers – knowledge brokers in their own right – mobilized on its behalf. That is power.

But what makes us listen to them? I suspect the key is authenticity. Our era is defined by a lingering distaste for corporate marketing disguised as something else, and a keen sense for detecting it. Perhaps the main difference between influencers of the past and those of today lies in the type of power they wield, as I outlined above. Personal power – like that wielded by bloggers and Oprah – is seen as more trustworthy because it lacks an agenda (whether or not this is true). Positional power is usually distrusted simply because of what it is. We only listen to Steve Jobs because we truly believe he has our best interests – in being cool and technologically savvy, regardless of the product – at heart. In contrast, many Americans discount everything Obama says because they believe he merely wants to increase his own power and impose his secret socialist agenda on an unwilling populace.

Is this a reflection of our philosophical allegiance to free-market democracy? Are influence and power of all kinds just the ability to get people to like and trust you? If so, many corporations are going to need a lot more than “influencers” on their side.

Food for thought: How do those with positional power gain credibility? Is this knee-jerk anti-authoritarian mindset in society as prevalent as I say it is? Do people who seek to perpetuate their influence by getting behind corporations somehow weaken their own authority (i.e. do they lose their ‘cred’)? Hm.

MARGINALIA: Though I did not explicitly link to it in this post, the Economist’s Intelligent Life ran a fascinating piece recently on The Philosophical Breakfast Club, a group of four Victorian scientists who were definitely knowledge brokers (and nifty polymaths) and who were key influencers in their time. I’d recommend reading it.


It Takes a Village: Why Not Outsource Childcare?

March 14, 2011

The 100th Anniversary of International Women’s Day last week got me thinking about how glad I am not to be Betty Draper. Yet despite our advances, the promise of happier people – which of course includes happier families – has not been borne out. The feminist movement has made great strides toward equality, but often at the expense of children, many of whom now grow up in an environment with no parents at home. We could debate at length why so many families feel the need to have two working parents (is it that corporations no longer pay a “family wage”? or have standards changed and now families believe they need more things, bigger houses, etc.?), but it would not alter the fact that most families have not replaced a father working all the time – and a mother at home – with two parents each working half the time. Throw in a divorce rate hovering around 50% in the Western world, and single parents who have no choice but to work long hours, and the result is millions of children with almost no parental direction for much of the time, let alone quality time with two parents.

One of the enduring themes of this blog is the increasing over-specialization of work, study, and entertainment, but I have yet to touch on the arena of parenthood. So allow me to play Jonathan Swift for a moment with my own modest proposal: outsourcing childcare to those who can do it efficiently and – most important – effectively.

Why not outsource parenting? We seem to have made most of the rest of our lives as efficient as possible. Instead of each of us owning farms that grow all our own food, we have created supermarkets and other supercentres that sell not only food but everything from pharmaceuticals to car tires. Millions of office drones sit in cubicles doing the white-collar equivalent of screwing a bolt into a chassis over and over for eight or more hours a day, the epitome of over-specialized corporate work.

And childcare itself has changed from the days of one parent teaching her young how to get on in life. Public schools were established 1 000 years ago to teach Latin to poor children who could not afford private tutors. Today it is a legal requirement in most countries that children spend their weekdays in classrooms full of other children. (And most do: the latest statistics for homeschooled children that I could find put the number at only about 3% in the United States.) We have already outsourced the majority of education to professional teachers, from the fundamentals of literacy and numeracy to advanced calculus and classic literature.

At an even more basic level, many working parents outsource childcare to day cares, nannies or relatives. Crèches, the forerunner of modern day care, were established in France in the 1840s near factories so working women could drop their children off there during the day. Today they are everywhere. As the percentage of working women (in Canada) aged 25-54 rose from around 50% in the 1970s to over 80% today, there was an accompanying rise in the number of children in non-parental care.  In 2002, 54% of Canadian parents had someone else look after their children during the day, up from 42% in the mid-nineties. In the U.S., almost two-thirds of pre-schoolers are in non-parental child care.

So outsourcing our parenting – if I can be forgiven for using such a cold, economic term – is certainly palatable to the majority of parents, at least some of the time. And there is most definitely a broader need for it, though a less quantifiable one. I needn’t go into the many social ills connected with a lack of parental influence, attention, or role-modelling during childhood, as these are well known.

There are many bad parents out there, but while we are quick to want to get rid of other minders who are ineffective, like teachers or nannies, social and biological conventions dictate that it is a lengthy and difficult process to “fire” parents. Leaving children exposed on mountaintops or in the care of a nunnery (in which something like 80% of the unfortunates dropped off died anyway) has gone out of fashion in developed countries, except in certain safe havens like Nebraska, so instead they remain with bad parents, or in foster care, which for most is not the optimal solution. Even parents who love their children can make bad child-rearing decisions with the best of intentions.

But what if the default option for raising children, like public schooling, was communal (or private) care by qualified parent-like figures? The right to “home parenting” (like home schooling) could be awarded only to those who are qualified to practice it, with regular supervision by a central body. Consider: specialist “parents” rearing children in groups is hardly a radical idea. The old African proverb about a child needing more than one knee, or the much more famous one that serves as the title of this post, indicates that our modern way of raising children is little more than a hiccough in the trajectory of human history.

Most parents raise only a few children, but almost all say that it gets easier the more they have, as they build experience and knowledge. Specialized parent substitutes would have the benefit of raising perhaps tens of children, and, what’s more, they would love it, because it would be their career of choice. Children would also have the benefit of a diversity of tried-and-true, centrally vetted and approved child care methods, culled from what has been proven to work well internationally and throughout history – call it a “best practice” approach to parenting. Just think of what costs could be reduced or eliminated in a society with a higher proportion of well-adjusted children – everything from healthcare (therapy and counselling) to policing and incarceration costs.

Clearly, this is not likely to happen anytime soon, and I no doubt open myself up to charges of everything from heartless communism to wanting to run state finances into the ground by proposing elaborate centralized childcare schemes such as these. But consider: we wouldn’t trust spinal surgery to someone who has never done it before and who would spend half the time we’re in the operating theatre off in corporate meetings somewhere else or on his BlackBerry. We wouldn’t want an unqualified engineer building a bridge we have to drive over, especially one laying the foundations on almost zero sleep. Yet we allow complete amateurs to raise their own children armed with little more than evolved instinct and maybe a copy of Dr. Spock. Does that really make more sense?


Minimum Impact, Maximum Time, and the Goodness of Work

February 10, 2011

Is ambling antithetical to success? Is a life of purpose the only path to happiness? And is Gen Y really all that different from previous generations in wanting meaningful work?

On Marx, Meaning, and Materialism

I think often on Marx’s theory of alienation: namely, that under the capitalist system of increasing specialization, workers become alienated from the fruits of their labour, and from their own capacity as workers to produce things and grow in doing so. Instead of seeing work as an end in itself, and gaining fulfilment from seeing the fruit of one’s labour go from raw materials to completed items, according to Marx work had become but a means to an end as workers were increasingly slotted into automated lines of production. Instead of creating the whole shoe, they would nail in a piece of the sole, as it were, with no satisfaction in seeing the “end-to-end process” (as we might say in today’s corporatenewspeak).

Certainly, with the rise of industrialization, Fordist assembly lines and globalization, the idea of work as a means to an end gained popularity as a way to describe life in the twentieth century. And in some ways, this was acceptable. In the 1930s, one was fortunate to have a job at all – any job. One did not pick and choose. The generation after that (those ubiquitous Boomers) observed their parents’ work ethic and adopted it without thinking, as a means to gain material prosperity. Nice cars, big houses, creature comforts, holidays in Boca Raton, and well-educated children became status symbols, ends worth working for. A life of middle management drudgery and rarely seeing one’s children was, for many, an acceptable trade-off.

But we expect so much more from our work today. Making a living, and a living that will support the lifestyle we’re used to, is mere “table stakes” (more corporatenewspeak). Because, with good education and attentive parenting and the opportunity to develop our skills as children, we have so many options for a career. Consequently, we expect much, much more out of the time we spend at work. (And before someone brings up 40% unemployment among global youth, yes, the recession has, to an extent, made Gen Ys a little less choosy – but only for now.)

The theory of work as an end in itself – and a means to happiness and fulfilment – has important research to back it up. A study out of California a few years ago remarked on the importance of hard work and purpose in achieving happiness in life. The conclusion is worth quoting at length:

A central presumption of the ‘‘American dream’’ is that, through their own efforts and hard work, people may move towards greater happiness and fulfillment in life. This assumption is echoed in the writings of philosophers, both ancient and modern. In Nicomachean Ethics, Aristotle (1985) proposed that happiness involves engagement in activities that promote one’s highest potentials. And, in the Conquest of Happiness, Bertrand Russell (1930/1975) argued that the secrets to happiness include enterprise, exploration of one’s interests, and the overcoming of obstacles. …Our data suggest that effort and hard work offer the most promising route to happiness.

Wow. Good work, it seems, is the answer to all our problems. The only thing left to do is find work that contains enough meaty, purposeful, interesting content – related to our skills, of course, and with excellent “work-life balance” and good benefits – to meet our needs. Simple!

But is this expectation reasonable?

Really, it’s a wonder anybody finds jobs like this, let alone the majority of people. Even Marx’s (clearly idealized) autonomous, cottage industry shoe-makers (or soldiers, or second sons forced into trade…) no doubt achieved very little of this all-encompassing fulfilment through their work. Yet today we pile the expectations on our jobs. While there are certainly those out there who caution that work will not make anybody happy all on its own, the prevailing narrative remains that fulfilling work is the surest route to happiness. Consider: it’s just not socially acceptable for anyone able to participate in the “knowledge economy” to opt out and instead choose to make money solely as a means to an end with no other agenda – let alone anyone under 30. Do you know anyone? And do they want the situation to be permanent?

Minimizing Impact: Lowering our expectations? Or relieving the pressure?

While I was vacationing in the vineyards of Mendoza (rewards for a life of corporate drudgery?), I got to thinking meta thoughts about what people tend to expect from life. We use a lot of language today that revolves around impact. We want to “make a splash.” We long to stand out in interviews, on dates, and in applications. People everywhere seek to be famous for something (anything! Jersey Shore, anyone?) or to leave a legacy, something that will let current and future generations know they existed as individuals, and left something behind. Modern society refers to the more noble side of this feeling as the desire to change the world, whether through volunteering, winning a Nobel Prize or raising well-adjusted children. We have, as I have pointed out before, a strong bias to action which makes us want to do good and make things “better.” Most of us put a lot of pressure on ourselves, a vague kind of weight that is associated with the Victorian ideal of the innate goodness of work and the possibility of having a hand in making a better future. The idea of finding work that allows us to, as the above-quoted study notes, “promote [our] highest potentials,” is tied up in this pressure.

At the same time we are acutely aware that life is, as an honorary TED talk I watched recently put it, fragile and vulnerable – and short. (This fact creates a very un-Hobbesian empathy, the talk argued, not only for those with whom we share blood ties, but also for other humans, other creatures, and the biosphere generally. Worth watching.) It is little wonder that, with the perception of the sand in the hourglass ever running out, we feel pressed for time, overwhelmed, and run off our feet. We try to make every moment count. We multi-task and are always tied to a communication device of some kind. Most things are done for a purpose: we educate ourselves in order to gain employment, money and “success”; we sleep and eat for our health; we watch our health to extend our lives (so we can keep doing it all longer). It has often been noted with bitter irony that with all the myriad time-saving devices we employ on a daily basis, we find ourselves busier than ever before. Trying to do things in the minimum amount of time has not made us happy.

So I decided to try an experiment in reverse-thinking. What if we sought to – even just for a day – minimize our impact, and maximize the amount of time we spent doing things? What would this look like? What does “counter-urgency” feel like in practice? Would it lessen the pressure?

Experiments in living “Slow”

I suspect that it would in many ways resemble the slow movement, which has grown exponentially in popularity recently in response to the speed of life and destruction of the environment and local communities in the name of convenience. It must also be a response to the pressure of the purposeful life. The slow movement includes slow food, which is (in contrast to fast food) grown locally, often organically, and savoured. Slow reading is similar, and involves savouring text instead of skimming or summarizing, or any other kind of speed-reading I learned about in university.

A minimum-impact day would also result in fewer outputs (and here I use a very corporatenewspeak word deliberately). We would do purposeless things: ambling with no direction, daydreaming, journaling, writing poetry, reading fiction. There would be no book club to report to. No destination. Poetry, lyrics and plays could be memorized for the sake of the words themselves, lines savoured like chocolates instead of potential “gobbets” to drop into future conversations or recall on trivia nights.

Sadly, my brief experiment in slowly minimizing my impact was a failure: I wanted outputs. I wanted to write about it, to share it on this blog. I wanted to tie it into my life’s work and be fulfilled by it.

I sense I would not be unique in feeling this way. Is our desire for impact innate, or learned? Here we have contradictory evidence. An article in the Economist a few months ago referred to a study concluding that the desire for good, hard work actually isn’t all that innate, particularly in Britain. But if it is learned – if it is part of the Marxist legacy that says fulfilling work is an end in itself – how do we handle the pressure of finding such fulfilment?

Perhaps the idea of work-as-end is a way to rationalize the short time we have on Earth, and that we spend most of it working. But are we destined not to find all we seek in our jobs? Is it possible to use work only as currency to “buy” time for our true passions? Should we seek to maximize the good in our work (whether employment at all, a means to material comfort and status, or even autonomous shoe-making) — even if we hate it? Do you amble purposelessly?

I’d love to hear your thoughts…