History Through Rose-Coloured Glasses

November 12, 2014

Rarely have there been so many meanings so definitively associated with the same colour.

From the innocence of childhood to the sexy, all-night glow of Las Vegas neon, pink has a colourful and controversial history associated with noble and common, demure and gaudy, masculine and feminine. And it wasn’t even known as “pink” (in English) until the late 1600s, centuries after its purported opposite — blue — really arrived on the scene, both linguistically and in the popular consciousness.

Madame de Pompadour, mistress to King Louis XV

Some have argued that pink’s “golden” age was in the eighteenth century, when it was the mode for high-fashion ladies of the French court. At that time, of course, they were among the only people who could afford the expensive dyes that coloured the fabrics they wore. Madame de Pompadour, mistress to King Louis XV, popularized pink amid a bevy of other pastels that were favoured in the Rococo period.

Pink continued to be associated with the rich and royal until the twentieth century, when chemical dyes allowed for its more widespread use in clothing that could be washed repeatedly without the colour fading or washing out. It was also around this time that pink transitioned from being largely a pastel hue associated with the innocence of children to a more bold, exotic shade. The new dyes allowed for the creation of deeper and darker versions of pink that spread around the world in the fashions of the 1920s.

The new and the neon

Buildings started to be sheathed in rose around the same time. In the 1920s and 30s, at the height of the Art Deco movement, vivid colours emerged as an alternative to the drab sameness and deprivation of depression-era interiors. A splash of bright paint could change the tone of a whole room. And with a focus on modern, technologically-enabled streamlining of form, the architecture and products of this age contrasted both with the ornate and intricate styles from earlier in the century and the contemporary countertrends of European functional Mies Van der Rohe-style block modernism.

Pink on pink at the Hotel De Anza, a classic example of Art Deco in San Jose, California

Art Deco was colourful and accessible — and immensely popular. This was particularly the case in America, where, as architectural historian Robert M. Craig puts it, “Art Deco was jazzy, bright, sexy, loud, and visually appealing.” It was everywhere: from department stores to movie theatres to the new motels that had sprung up all over the country to provide for a growing motoring class.

Pink walls and pink fashions were a way to stand out and be noticed, and thus the colour was increasingly used in advertising, from splashy storefronts to the neon signs that dominated the landscape starting in the 1920s. In this way pink came to be associated with both the egalitarianism of commerce and material things: stylish perfume bottles, vacation homes in South Beach, new living room walls. Marilyn Monroe wore a now-iconic pink dress in the 1953 film Gentlemen Prefer Blondes. Elvis’s famous pink Cadillac, purchased in 1955, was seen as the height of post-war luxury and is featured at Graceland.

Gentlemen Prefer Blondes (in pink) — Marilyn Monroe in the 1953 movie poster.


Flight of the pink flamingos 

Pink is everywhere in California, as it is in many places where there are beaches, single-story construction, and a touch of the exotic. It is the colour of soft sunsets (thanks to Rayleigh scattering, which scatters the shorter blue wavelengths out of sunlight’s long path through the atmosphere, so mostly the longer red-orange wavelengths reach the eye) and of flowering plants. And in its heyday in the 1950s, it represented the triumph of modernism and new frontiers.
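The sunset aside can be made concrete with Rayleigh’s inverse-fourth-power law; a minimal sketch (the wavelength values are illustrative round numbers):

```python
# Rayleigh scattering strength varies as 1 / wavelength^4, so shorter
# (blue) wavelengths are scattered out of sunlight's long path through
# the atmosphere at sunset, leaving the direct light red-pink.

def relative_scattering(wavelength_nm: float, reference_nm: float = 650.0) -> float:
    """Scattering strength of a wavelength relative to red light (~650 nm)."""
    return (reference_nm / wavelength_nm) ** 4

# Blue light (~450 nm) scatters roughly 4.4 times more strongly than red:
print(f"{relative_scattering(450.0):.1f}x")
```

The fourth power is what makes the effect so lopsided: a wavelength only ~30% shorter scatters more than four times as much.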

Then its meaning shifted again. From being the bright colour of the future, it became the gaudy holdover from a bygone age. The lights of Las Vegas started to look a bit too commercial, too fake. Pink houses now stand out, “island[s] of whimsy in a sea of drab conformity,” and as such aren’t always viewed positively by the neighbours. Gradually pink started to represent the Miami Vice-like excesses of the 1980s or the wastefulness of neon tube lighting, first patented almost 100 years ago.

Nothing symbolizes the pink backlash more than the popular conception of lawn flamingos. Elegant and exotic, flamingos can be found across the globe in warm and wet areas, from India to Chile. The first plastic pink flamingo lawn ornament was created in 1957 and was a smash hit. But by the late 1960s, the negative image of the plastics industry and the “unnatural” look of giant pink birds on the lawn led to a spiralling decline in their popularity. Now, of course, they are popular again, an ironic wink and nod to the kitsch of an earlier time.

Gentlemen prefer … pink?

This was not, however, the greatest reversal in the popular perception of pink. It is perhaps surprising today to imagine that pink was for most of its history considered a very masculine colour. Contrasted (as it always is) with blue, pink was seen as the more stimulating and active shade, appropriate for clothing young boys, while the soft daintiness of blue was thought more appropriate for young girls (think: Cinderella’s dress at the ball). Pink remains a symbol of strength to this day in Japan, where it is associated with cherry blossoms, said to represent fallen warriors.

In nineteenth-century Britain, when military might was shown with red uniforms, boys wore pink as a kind of lesser red. And let’s not forget that the standard map of the British Empire is coloured pink, symbolizing the strength and breadth of British power, from the Cape to Cairo and Whitehorse to Wellington. The old pink maps cemented the idea of empire in the popular consciousness of the time, creating what Linda Colley, my favourite scholar of the British Empire, has termed “a sense of absolutely uncomplicated, uncompromising power.”

Imperial Federation Map of the British Empire, 1886, by John Charles Ready Colomb

Pink now, of course, is considered near-exclusively feminine. It is often used idiomatically to refer to women’s or gay rights issues, as in “pink-collar” work, or “the pink economy.” Marketers have reinforced this image for almost seventy years, both shaping tastes in colour and hewing to common perceptions of them. Pink became a target of the 1970s feminist backlash against the confines of gendered clothing: as women started to dress in a more unisex and stereotypically masculine way, pink was eschewed. As an interesting overview in the Smithsonian notes, there was a time in that decade when even major retailers such as Sears, Roebuck didn’t sell pink baby clothes, for girls or boys.

Living in a material world

2011 Color of the Year, “Honeysuckle”


The shift toward the ownership of colour could be said to have begun with Pantone’s codification of colours for matching purposes in the early 1960s. In recent colour analyses of brands, pink is considered warm, sensitive and nurturing, and is commonly used in products or campaigns targeted at women, such as Cosmopolitan and Victoria’s Secret. And that most enduring lightning rod of femininity, Barbie, naturally has her own shade. Barbie pink (Pantone 219C) has been associated with everything Barbie from the very beginning, including a fuzzy pink bathroom scale released in 1965 that was permanently (and controversially) set to 110 lbs.

Love in pink. Photo courtesy of Flickr user Chris Goldberg.

And yet pink remains an aspirational colour, just as it was when Madame de Pompadour wore it at the French court. In 2011, Pantone chose Honeysuckle (18-2120), a bright variation of classic pink, as its Color [sic] of the Year, citing its “confidence, courage and spirit to meet the exhaustive challenges that have become part of everyday life.” It is a colour for the zeitgeist, a necessary perk in the dark days of our latest recession, with its many pink slips. According to Leatrice Eiseman, Pantone’s Executive Director, “In times of stress, we need something to lift our spirits. Honeysuckle is a captivating, stimulating color that gets the adrenaline going – perfect to ward off the blues.”

So often viewed in opposition to something, pink can nonetheless be understood as a world unto itself. Whether seen as high or low, kitschy or elegant, soft or strong — or all of the above — it seems doubtful we’ve reached peak pink. Who knows what it will signify next?


The rise of lottery professions and why it’s so hard to get a decent job

May 4, 2014

In early September 2002, a young singer named Kelly Clarkson won the inaugural season of a new reality competition called American Idol. Her first single set a new record for fastest rise to number 1, vaulting past the previous record holders from 1964, the Beatles. Her rise to fame was categorized as meteoric, the kind of rags-to-riches story so beloved in America, and one that would be repeated, with more or less success, over the next 12 seasons and in several other similar contests.

Kelly Clarkson is fabulously talented, and also the beneficiary of a windfall. After years of struggling, she rose to the top of what has traditionally been known as a “lottery profession”: one with many aspirants, very few of whom succeed, while the rest do menial jobs in the hope that one day their “big break” will come.

Often it never does. There are thousands of talented singers who never appear on our TV screens and never get Grammy awards because they are unlucky. They don’t win the professional “lottery.” (And in some cases it is a literal lottery: I’ve been to Idol auditions in Canada that have random draws of ticket stubs to determine who is even allowed an audition at the first stage.)

The concept of a “lottery profession” is usually applied to the performing arts – dance, acting, singing – and the literary ones – novel and poetry-writing – as well as sports and other fields in which aspirants need to be exceptionally talented and, to some extent, distinctive. And in these areas it has only become more difficult to succeed in the last hundred years.

The Poor Poet (Der Arme Poet), by Carl Spitzweg

 

Breakaway: how the best of the best get more market share

Chrystia Freeland writes in Plutocrats of how communications technology and economies of scale have made famous people even more famous. In the nineteenth century and before, a singer’s reach (and therefore income) was limited to those who could afford a seat in a theatre; today, she can make money from records, huge live shows and merchandise. An early twentieth-century soccer player’s income was limited to the gate receipts from those who paid to see the match – likely one of the reasons we don’t hear about many famous professional athletes from before the twentieth century – while today his face is on Nike ads and jerseys and he earns millions per season through licensing and broadcast deals, even as television viewers watch him get yellow-carded.

And where the limited reach of a theatre or soccer pitch allowed a greater number of quite talented individuals to succeed, the limitless reach of television and the internet allow the über-talented to divide greater spoils amongst a much smaller number. Why bother going to see the local hotshot when you can watch Lionel Messi?
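The broadcast argument above can be sketched as a toy “superstar economics” model; every name and number here is invented purely for illustration:

```python
# Toy model of winner-take-all markets: with capacity-limited venues,
# every decent performer earns a living; with broadcast reach, the
# whole audience converges on the single most talented act.

def venue_incomes(talents, seats=500, ticket=20):
    """Pre-broadcast era: each performer fills one local theatre."""
    return {name: seats * ticket for name in talents}

def broadcast_incomes(talents, audience=1_000_000, ticket=20):
    """Broadcast era: the entire audience watches only the best."""
    best = max(talents, key=talents.get)
    return {name: audience * ticket if name == best else 0
            for name in talents}

performers = {"local hotshot": 90, "superstar": 95}
print(venue_incomes(performers))      # both earn the same modest sum
print(broadcast_incomes(performers))  # the star takes everything
```

Note that the talent gap in the sketch is tiny; it is the distribution technology, not the talent differential, that produces the extreme income gap.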

 

A Moment Like This: the lottery gets bigger

The trouble is, artists and athletes aren’t the only ones risking their livelihoods on a proverbial lottery ticket anymore. There are more new “lottery professions” all the time, often emerging out of professions that were once solidly middle class, able to support a family, with good salary and benefits. To give but a few examples:

  • Investment banking and stock trading, formerly quite boring, now require numerous credentials (CFA, MBA, etc.) and a good network in the right places to get into;
  • Legal work is now frequently outsourced to the developing world, with intense competition for fewer and fewer spots at top firms in the West;
  • Tenured positions in academia, as I’ve written about at length, are quickly being eliminated, with little hope for current Ph.D. holders;
  • Firefighters have a 15% chance of acceptance at the local fire training academy, and most firefighting professionals have second jobs;
  • Medicine, nursing and social work programs accept fewer applicants for even fewer jobs, despite growing demand for these professionals, as employers hire temporary foreign workers instead;
  • Even teaching, that bastion of middle-class professional employment, is a tough job to get these days, and, as anyone who has done any teaching will tell you, it ain’t a glamorous or lucrative gig.

Some of these changes are the effects of globalization, of course, which has pulled many in the developing world into the middle class even as it has displaced work from North America and Europe. The result is intense competition for what are really quite banal professions with long hours and few perquisites. After all, we are talking about work here, and while many of us can hope to enjoy what we do, most people still think of a job more as a way to make money than as a calling.

 

People Like Us: what happens to the “winners”

Who holds the winning tickets? Extraordinarily talented, hardworking and lucky people like Kelly Clarkson and Lionel Messi. Those who inherit wealth. And those who are connected in the right ways. This is a self-perpetuating circle, with fame and money increasingly intertwined.

Success in one area seems to imply expertise in another, which is why we have a rise in parenting and home-making literature from people made famous by playing characters on TV and in movies. Famous actors try their hand at everything from making wine to formulating foreign policy. Just look at Dancing with the Stars, that barometer of cross-professional fame: this season alone it has featured comedians, “real housewives,” and Olympic gold medallists, all making money ostensibly as dancers. Previous contestants include politicians, scientists, and athletes galore.

Science! …and cross-promotion.

The links here are fame and money, and while using either to get what you want in a new realm is nothing new, the potential reach of both (and opportunity to make more of each) has exponentially increased. And there is a new opportunity for unprecedented fame from the patronage of modern plutocrats, witnessed by the pantheon of celebrity chefs, celebrity dog trainers, and celebrity litigators. The über-rich want the best, and the best take a disproportionate slice of the industry pie.

 

Thankful: the rise of corporate patronage

So what happens to all those quite talented people who would have played to full theatres two hundred years ago? (Apart from making YouTube videos, that is.)

I’ve been thinking for years now that the “high” arts (theatre, ballet, classical music, dance) depend on wealthy patrons for survival, much as they did before these became popular attractions in the modern period. Those patrons today are largely corporate sponsors, instead of wealthy individuals, and the companies get cultural cachet and corporate social responsibility bonus points while the performers gain a living.

The trend goes beyond the arts. In Silicon Valley (and elsewhere in the US), corporations and wealthy benefactors are extending their philanthropy beyond traditional areas of giving. Mark Zuckerberg sponsors New Jersey school districts. Mike Bloomberg helps municipalities with their tech budgets. The Clinton Global Initiative finances green retrofits in the built environment. As the public sector falls apart, we become more dependent on the proclivities of wealthy people and the companies they run, for better or worse.

Your discretionary income at work!

 

Don’t Waste Your Time: what happens to everyone else

Those without a good corporate job or corporate patronage can still have interesting weekends. The last twenty years have seen a rise in hobby culture. No longer just for hipsters, farming, knitting, and brewing now count as hobbies, as it becomes harder and harder to actually make any money doing them. Assembly-line economics prompted a decline in bespoke items in favour of cheaper, ready-to-use/ready-to-wear equivalents, and with it the near-demise of artisan production. Hence, hobby culture has taken over. Many people today have side businesses that were once considered a main income stream, such as making crafts (e.g. through Etsy), photography (helped by the rise of Pinterest and Instagram) or self-publishing. I suspect this trend will only grow as 3D printing becomes more popular.

And for everyone else holding tickets and waiting for their numbers to come up, there is retail. The old stereotype of underemployed actors waiting tables persists because it is still true, and some are servers forever. In some industries and some places (for example, grocery cashiers in Toronto), service jobs are a means to an end, some spare cash earned while in school. In others, like much of the United States and suburban areas generally, people work in retail and/or service (the largest category of employment in North America) because they have no other option.

The result is a proliferation of companies pushing a “service culture,” a movement toward glorifying the customer experience everywhere from fast food to discount clothing stores. And while there is a long history of service as a noble profession (for example, in high-end restaurants), and giving clients what they desire is a laudable goal, claiming a service mandate while maintaining a pay gap between customer-facing employees and top management of 20, 80 or 200 times is deceitful, the false empowerment of the economically disenfranchised.

All of the above trends reflect a growing inequality in the workforce, one that becomes ever-more entrenched. Inequality is a major hot-button issue in politics at the moment, and a number of initiatives have been proposed to combat it, including raising the minimum wage. The long-term success of any solution, however, requires recognizing that the ability to earn a living can’t depend on holding a winning ticket.

 


The Brands That Still Matter

February 13, 2014

Dannon Oikos’s nostalgic Super Bowl spot was a great advertisement for both the French multinational and its new yogurt product.

But will knowing Oikos is a Dannon product make consumers want to purchase it? Or will they turn instead to nutritional count, pro-biotic content, or price to make their decision? How strong is this brand?

How strong, these days, is any brand?

What I need right now is some good advice

Well, it depends. An excellent piece in the New Yorker this week explores “the end of brand loyalty” and whether this spells the decline of an age in which brands were useful shorthands for purchasing everything from baked beans to luxury sedans. In an era in which the customer review is king, all companies must compete product by product, it says, even established giants. The field is open for upstarts and smaller rivals who can win over a market on the strength of a single, well-reviewed product.

It’s easier than ever for young companies to establish a foothold with a signature product: think Crocs, which have expanded from those foam clogs to flip flops and winter gear, creating a whole new hideous/comfortable footwear market. What propelled Crocs to fame was the strength of customer testimonials saying that it really was worth the price and the look to get that level of comfort.

The same trends that allowed Crocs to happen also signal the decline of major brands. When we have so much information at the click of a button, the promise of a consistent level of quality – which is really all a brand is – becomes less important than the facts: actual product reviews. Why trust a company to make things you know you’ll love when you can trust other users to tell you their opinions instead? It’s true: trust in a product’s brand as a shorthand for making a good purchasing decision is at its nadir.

However, the decline of product brands has led to the rise in service brands, particularly those giving advice. Booking a holiday? The competition for who gives the best advice on hotels, restaurants and attractions seems to have been decisively won by TripAdvisor. Purchasing a book? Bypass the publishing house and read the reviews on Amazon, and then let the site recommend a choice for you. Looking for a good movie? Hardly anybody makes decisions about movies based on the studios that produce them, but Netflix can tell you what to watch based on what you’ve seen before.

These are all Internet-based examples, because the advice industry has for the most part moved online, but brick-and-mortar service brands have also maintained their strength amid the fall of brand loyalty for products. Banks, for instance, are judged on the selection of products they have curated for their customers but, more importantly, on how they advise their clients, particularly in the higher-end, higher-margin businesses of wealth management and institutional and corporate banking. Consulting firms continue to prosper through economic slowdowns because they can advise on both growing revenue (in good economic climates) and streamlining expenses (in bad). And it all began with the likes of Consumer Reports, J.D. Power, and other ranking agencies, which built their reputations on choosing the products that matter and offering advice you can trust.

The service brand becomes personal

Those who host the platforms that enable others to recommend products – the information aggregators and analysts – are poised to be the big winners of the near economic future. And this extends to individuals as well, which explains the push in the last ten years to develop “personal brands.” I’ve written before about how this makes many feel a bit icky, and yet if we think of skills as “products,” and analytical ability as “service,” it makes sense to have a personal brand that emphasizes how you think and relate to others as opposed to what you know. (This is why most personal brands focus on a combination of attitude and experience, e.g. Oprah’s signature empathy which resulted from her life experiences.)

Skills can be learned and degrees earned by many individuals, just like many companies can manufacture clothing. They are interchangeable. But proof of being able to think well, in the form of awards, complementary experiences, and attitudes, expressed through a self-aware brand, is unique.

This is likely why LinkedIn has moved to a model that goes beyond listing skills and abilities to providing references (“recommendations” and “endorsements”) to indicate past performance, and “followers” to show how popular one’s ideas are. These serve the exact same function as the ranking and number of reviews a destination has on TripAdvisor.

No doubt this has contributed to the large number of individuals wanting to strike out on their own. At a recent networking meeting I attended, 100% of attendees were looking to become independent personal nutritionists, career or life coaches, or consultants. They didn’t want to sell things; they wanted to sell themselves and their advice.

A strong brand – a personal one – is essential for this kind of career change, and part of creating a strong brand is ensuring consistency. Working for an organization whose values don’t align with yours – even if you are doing the same thing you’d want to do independently – is a brand mis-match.

All of this highlights another key similarity to traditional product brands: service brands, once established, have a grip on market share. Most companies would prefer to have an accountant at an established firm do their taxes over a sole proprietor. TripAdvisor has few competitors in the travel advice industry, which is why travel agencies are faring so poorly. The barriers to entry are high, and name recognition and brand still count for a lot.

My advice to newcomers: time to call up Uncle Jesse to make an ad for you and get some brand recognition.


Sparkling Water or Water Lilies? The Comfort vs. Beauty Problem

January 29, 2014

First things first: posthistorical is back! I am very excited to be blogging again. The world seems much the same: Obama, Harper, and Merkel have won more elections; politicians everywhere squabble over ridiculously trivial things and generally accomplish nothing; we collectively still spend way too much time on Facebook. And yet much has changed: this blogger now lives in the Golden State instead of the True North Strong and Free, and with a government-enforced sabbatical now has a lot fewer excuses not to post frequently.

It’s also 4 years (ish) since I started posting on this blog, and that means the exciting quadrennial spectacle of nationalism that got many of my juices flowing last time (otherwise known as the Winter Olympics) will soon be upon us. Once more, in Russian! More to come.

But first!

A dichotomy for the ages

One of the things that started me on blogging again was a rush of ideas I encountered while re-reading Aldous Huxley’s Brave New World for a book club meeting. I will likely tease out a number of themes and their repercussions in the modern world in future posts, but the one that resonated most strongly with me was the dichotomy that one of the book’s main characters presents between truth & beauty and comfort & happiness. To have beauty and truth, he reasons, one needs to endure political and emotional instability, heartbreak, misery and terror – basically, opportunities to show mercy, wisdom and courage in the face of overwhelmingly bad odds. Happy and comfortable people have no need to rise above their situations in such a manner.

But who would choose discomfort and misery, given the choice?

The general trend of world history has been toward comfort, both in a material way and in the sense of social stability. If the nineteenth century was the century of engineering and industry, the twentieth century was the century of comfort. It was the century of spandex, widespread air conditioning and La-Z-Boy. More people than ever before were lifted out of poverty, and industrialization led to middle-class (or relative middle-class) comfort worldwide.

The number of people who choose sneakers over high heels or jeans and t-shirts over Little Lord Fauntleroy suits seems to back up comfort’s victory over beauty. And from the range of falsehoods – from “spin” to blatant lies – evident in government, advertising and many other areas, truth doesn’t seem to do very well either.

Have we already made the choice? And if so, is this progress?

The truth/beauty vs. happiness/comfort dichotomy mirrors the idea of moral vs. technological progress. Some thinkers, such as John Gray, whose anti-humanist work Straw Dogs I’ve written about before, believe that technological progress is in theory limitless, but that our moral progress as humans is essentially stalled. Nuclear technology, to use an example he gave, while a huge technological boon that can supply power to millions, has simultaneously allowed us to wipe cities off the map, a more efficient killing machine than had ever been known before.

Systematic discrimination

Perhaps truth and beauty – or moral progress, if we can equate the two – have seemingly lost out to comfort and happiness – technological progress – because the large-scale systems that largely control our lives have focused mainly on the latter. Take governments: funding for truth and beauty (whatever that would look like) will almost always come second to funding for hospitals, police, and even infrastructure – that is, the necessary building blocks for a comfortable life. The Brave New World character I mentioned earlier also points out that rule by the people leads to an emphasis on comfort and happiness over truth and beauty – certainly, this is the credo of America: “life, liberty, and the pursuit of happiness,” not, incidentally, the pursuit of truth. Comfort, or at least freedom from harm and repression, was the first priority of the revolutionaries.

I went back to examine some other modern revolutionaries. Re-reading The Communist Manifesto, I discovered that the aims of Communism also begin with comfort before proceeding to truth, even if the ideals contained within the movement are based on so-called universal truths. Guaranteeing subsistence was the first step, through a radical change in property rights, the tax system, and so on, followed by universal education (i.e., the pursuit of truth and beauty).

The other large system that governs our lives, free-market capitalism, is also geared toward profits, which can more easily be made from comfort than from beauty. This is why Procter & Gamble, which sells deodorant and diapers, made US$81 billion in 2013, while the New York Times, winner of more Pulitzer Prizes than any other newspaper, struggles each quarter to make a profit. Perhaps this also explains the existence of the phrase “starving artist.”

First things first

There may be a way to see a positive outcome in this. Perhaps it is not so much a dichotomy between truth/beauty and comfort/happiness, as a ladder, or hierarchy, if you will. Perhaps, like ol’ Maslow said, we focus first on satiating the need for food, clean water and safety before striving for self-actualization.

Now, we all know how much I love Maslow (and so does everyone else, apparently, because this is by far my most read post). But this theory would disagree with Huxley’s characters, who imagine that it is either a comfortable, drugged out existence devoid of anything so confusing and challenging as truth, OR starving artists capitalizing on their misery and discomfort by creating beauty, that is, skipping straight to the top of the hierarchy.

I posit this theory: those who can truly move to the self-actualization stage can only do so because they feel their more basic needs have already been met. This is true even though they live in the same world as those more susceptible to advertising campaigns which introduce needs we never knew we had (for the new iPhone, elective rhinoplasty, or gluten-free dog food, for example). Maybe it’s just that those seeking truth and beauty seem deprived and miserable to those who couldn’t imagine taking their places.

Our need for comfort will stay the same as our definition of comfort changes; perhaps those who can be comfortable enough already, without soma and shiny new things, can have their truth/beauty cake and eat it too – happily.


Knowledge and Power in a Skeptical, Connected World

March 18, 2011

Who do we listen to, and why? In an age when we can find almost any information quickly, what does it take to be a voice that rises above many others? What kind of power does this represent?

I read in the latest edition of the Harvard Business Review that in 2011 companies are anticipating an increased focus not just on broadly saturating target markets with Facebook ads and silly “viral” videos, but on targeting “influencers” as part of their “social media” strategies. These individuals are the people who shape culture and get others on board with new trends and ways of thinking. Oprah is an influencer. Radiohead are influencers. Steve Jobs is an influencer. And a lot of random bloggers, tweeters, and other social media characters whom you’ve never heard of are influencers, and they are going to be targets of corporations because they are both cheaper and perceived (perhaps) as more authentic shills than their more famous counterparts.

You can be sure that by the time something gets elevated to the level of an HBR trend to watch, it has already set the Internet abuzz. Further research on “measuring influence” yielded far more twenty-first-century social media examples than any others. It seems that organizations have (finally!) learned that a “social media strategy” on its own is of little benefit without real, grassroots endorsement. However, I’m more interested in what “influence” looked like in the past, before it morphed into a social media concept and the next corporate buzzword, and in what characteristics have stayed with perceived “influencers” since.

It seems it is a tricky thing to quantify, or even define. An article I discovered about the role of influence in economic history discusses how it is closely related to communication, but can range from mere impression to outright force in the strength it implies. The other critical factors in determining long-term influence were time and space. The example given was Saint Thomas Aquinas, whose ideas were central to much medieval thought (throughout the Latin-speaking world, at least), but are relatively inconsequential today.

Influence and Power – and Money

Influence, as the article points out, is closely related to power. One of the concepts that has stayed with me since learning it in an Organizational Behaviour class years ago is the distinction between the kinds of power wielded by individuals. They can have positional power, stemming from one’s role as, say, a manager, a parent, or some other official and likely formalized figure of authority; or they can have personal power, stemming from an individual’s character or beliefs and likely more informal in nature. The difference between them parallels that of practical/mental authority vs. emotional authority, and the general consensus is that emotional authority goes much further in influencing others because it does not rely on a (potentially temporary) and wholly external power differential the way practical authority does.

When I consider what influence looked like in the past, it seems there was little distinction between the two types of power mentioned above. Perhaps the theory I just articulated is fallout from our comparatively recent fixation on merit over birth status as a rationale for power. Indeed, the ideas (and names associated with them) that have survived best throughout history to influence many others have always been backed by great financial power. Take religion, for example, which has been perpetuated by wealthy organizations that held positional power in their communities. The familiar expression about history having been written by the victors speaks to the tendency of dominant individuals, families or states to justify their authority with historical precedent. And most of the theories in every field that are still with us today were dreamed up by men with solid financial backing and the ability to spend large amounts of time reading and philosophizing. (Even Marx lived off the generosity of his bourgeois co-author, after all.)

But today that is changing — to an extent. YouTube, twitter and other media that celebrate memes and all things viral can make ordinary people famous astonishingly quickly. Such fame is often fleeting and of dubious value to society, but savvier types can sometimes parlay their sudden name recognition into the more lasting sort of influence (Justin Bieber, anyone?). This can happen because influence is magnetic and self-perpetuating. Mommy bloggers who are already widely read and respected are natural candidates to push brand-name diaper bags or whatever else new mothers supposedly need and want. That corporations want to latch onto such people is hardly surprising – they are merging their corporate power with bloggers’ influence in new markets, and the bloggers in turn want to increase their own profile through association (or maybe just get free products).

Self-perpetuating influence applies to companies as well. The new techie term for this concept is “network effects” – as the Economist defined it recently, “the more users [services like facebook, eBay, etc.] have, the more valuable they become, thus attracting even more users.” Whereas in the past money and power begat more of the same, today we can add hits and click-throughs to the mix.
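The “network effects” idea the Economist describes is often formalized as Metcalfe’s law (my gloss, not the article’s), which values a network by the number of pairwise connections it can support. A minimal sketch, assuming a constant value per connection:

```python
def metcalfe_value(users: int, value_per_connection: float = 1.0) -> float:
    """Estimate a network's value as the number of possible pairwise
    connections, times an assumed constant value per connection."""
    return value_per_connection * users * (users - 1) / 2

# Doubling the user base roughly quadruples the value, which is why
# each new user makes the service more attractive to the next one.
ratio = metcalfe_value(2_000) / metcalfe_value(1_000)
print(round(ratio, 2))  # roughly 4
```

The quadratic growth is the self-perpetuating part: every user added raises the value of joining for everyone who hasn’t yet.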

Knowledge Brokering from Darwin to Wikipedia

The common link between these people and corporations is the way they treat knowledge. They are what the corporate world now refers to as “knowledge brokers,” a title that refers to the ability to clarify and share information with different audiences or spheres, and to determine what the common elements are between, say, Paul Revere, corporate marketing, and the AIDS epidemic. Knowledge brokering (and a bit of luck) is what separates widely-read bloggers from those who write solely for themselves (whether they want to or not). It is the ability to write things that people find interesting and useful. The CIA is investing heavily in such people after a series of incidents that demonstrated how segregated and impotent its different bodies of knowledge were.

Knowledge brokering is more than simply aggregating (though smart aggregators of information are helpful too). It is the ability to analyze and draw connections while becoming a trusted conduit of information. Knowledge brokers are perhaps an antidote to the pervasive and growing tendency to overspecialize, because they connect many specialists and their ideas with a broad audience. They are the reason we know about Darwin’s ideas. Or Jesus. Or celebrities’ latest faux pas. Wikipedia is one giant knowledge broker, with an army of largely volunteer contributors, knowledge brokers in their own right, mobilized on its behalf. That is power.

But what makes us listen to them? I suspect the key is authenticity. A lingering distaste and a keen sense for corporate marketing disguised as something else define our era. Perhaps the main difference between influencers from the past and those of today lies in the type of power they wield, as I outlined above. Personal power – like that wielded by bloggers and Oprah – is seen as more trustworthy because it lacks an agenda (whether or not this is true). Positional power is usually distrusted simply because of what it is. We only listen to Steve Jobs because we truly believe he has our best interests – in being cool and technologically savvy, regardless of the product – at heart. In contrast, many Americans discount everything Obama says because they believe he merely wants to increase his own power and impose his secret socialist agenda on an unwilling populace.

Is this a reflection of our philosophical allegiance to free-market democracy? Are influence and power of all kinds just the ability to get people to like and trust you? If so, many corporations are going to need a lot more than “influencers” on their side.

Food for thought: How do those with positional power gain credibility? Is this knee-jerk anti-authoritarian mindset in society as prevalent as I say it is? Do people who seek to perpetuate their influence by getting behind corporations somehow weaken their own authority (i.e. do they lose their ‘cred’)? Hm.

MARGINALIA: Though I did not explicitly link to it in this post, the Economist’s Intelligent Life ran a fascinating piece recently on The Philosophical Breakfast Club, a group of four Victorian scientists who were definitely knowledge brokers (and nifty polymaths) and who were key influencers in their time. I’d recommend reading it.


Coffee vs. Alcohol: A better brew?

February 28, 2011

Almost everyone enjoys a good brew, but some brews are more acceptable than others, it seems. Around the world, coffee consumption has far outstripped that of alcoholic beverages, with around 2.9 pounds of beans (roughly 30 litres brewed) consumed per person, on average, in one year. Compared with an average worldwide consumption of 5 litres of alcohol per person, per year, it seems we are much more inclined to hit a Starbucks than a bar on an average day.
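The pounds-to-litres equivalence above can be sanity-checked with a rough conversion; the dose of coffee per cup and the cup size below are illustrative assumptions, not figures from any source:

```python
LB_TO_KG = 0.4536

beans_kg = 2.9 * LB_TO_KG               # ~1.32 kg of beans per person per year
grams_per_cup = 10                      # assumed dose of ground coffee per cup
litres_per_cup = 0.227                  # assumed ~227 ml brewed cup

cups = beans_kg * 1000 / grams_per_cup  # ~132 cups per year
brewed_litres = cups * litres_per_cup   # ~30 litres per year
print(round(brewed_litres))             # roughly 30
```

With those assumptions, 2.9 pounds of beans does indeed come out to about 30 litres in the cup.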

Global average alcohol consumption


Coffee is also a critically important trading commodity, second only to oil in terms of dollar value globally. I won’t get into the cultural influence of Starbucks, Tim Hortons and the like, but the impact on consumers and on the business world has been significant – much more so than any individual brand of alcohol in recent history.

Coffee is a relatively modern beverage. There is no Greek god of coffee, like there is of wine (though if there were, no doubt he would be a very spirited half-child of Zeus who enjoyed bold flavours, waking up early, and being chipper). The first evidence of coffee drinking as we know it today is generally placed in the fifteenth-century Middle East. Evidence of wine and beer consumption, in contrast, dates to 6000 BC and 9500 BC, respectively, or even earlier. Yet for such a young contender, coffee’s rise in popularity has been impressive.

No doubt this rise in Europe related in part to the appeal of the exotic, as with chocolate and other goods newly arrived from distant lands. It is also likely that, like sugar, coffee was just tasty and appealing in its own right, and those who tried it liked it and wanted more. And certainly there is the social aspect, the rise of coffeehouse culture across France and Britain in the eighteenth century, which brought together politics, business and social interaction in a public forum as never before. The purported offspring of the coffeehouses, such as the stock market, French Enlightenment ideals, and even democracy, were significant. In a TED talk I watched recently, author Steven Johnson slyly remarked that the English Renaissance was curiously closely tied to the changeover from imbibing large amounts of depressants to large amounts of stimulants with the rise of the coffeehouse (go figure).

The best part of waking up?

Today, it seems that coffee has generally been linked to a host of other caffeinated beverages that are considered “good” (such as tea and cola) and alcohol has been linked with commodities that are “bad” and “unhealthy” (such as drugs and cigarettes). Why? Perhaps it is because colas, tea and coffee are unregulated, entirely legal, and (to a point) even considered safe for children, while the opposite can be said of alcohol, drugs and cigarettes.

Is the association fair? Hardly. While the dangers of addiction may be greater for the latter group, and public drunkenness more severely chastised than public hyperactivity, coffee and sugary colas (as fantastic as they are) are hardly the healthiest choices of beverages.

I suspect it is something else, something in the inherent nature or promotion of coffee that makes it seem less threatening than alcohol. Coffee suffers from none of the religious ordinances forbidding its consumption the way alcohol does (though, interestingly, coffee was also banned in several Islamic countries in its early years). It has also never endured the smug wrath of teetotalers or wholesale prohibition.

Alcohol is generally placed into the realms of evenings and night-times, bars, and sexy movies, while coffee is the drink of busy weekday mornings, weekends with the paper, and businesspeople. Both are oriented toward adults, but coffee is in some ways more socially acceptable. Consider the difference between remarking that you just can’t get started in the morning without your coffee versus saying the same about your morning shot of whiskey. Similarly, asking someone out for a drink connotes much more serious intentions than asking someone for a coffee. And vendors are catching on: in Britain, many pubs are weathering the downturn in business caused by the recession and changing attitudes by tapping into the morning market of coffee drinkers.

Worldwide annual average coffee consumption (courtesy of ChartsBin)


I wonder if the trend toward increased coffee consumption comes at the expense of alcohol. I also wonder if it mirrors the general cultural shift toward an American orientation. The global dominance of Starbucks and other coffee shops seems to me to be supplanting the role of the local pub or licensed hangouts of the old world with a chirpy kind of Americanism and a whole new roster of bastardized European terms and ideas like “caramelo” and “frappuccino.” The New York Times backs up the idea of American dominance, noting that the U.S. makes up 25% of global coffee consumption and was a primary instigator of the takeover of coffee shop chains. Yet coffee is also extremely popular in Europe (especially in Scandinavia, as fans of Stieg Larsson would be unsurprised to discover) and even Japan.

Is this another case of American cultural colonialism, whereby traditions from Europe are adopted, commercialized, and re-sold to captive populations who want to tap into a small piece of American corporate and social culture? Or is the global interest in coffee indifferent to American opinion?

Reading the tea leaves (coffee grounds?) to tell the future of consumption

Will coffee culture continue to increase in popularity, eventually supplanting the role of alcohol in social meetings? Two factors are worth considering here. The first is that while demand for alcoholic beverages in the developed world is shrinking, there is a growing interest in all kinds of alcohol (and especially wine) in emerging markets. Take, for instance, the rise of wine as a drink of choice and status symbol in China and Hong Kong as disposable incomes have grown. A similarly proportioned increase in coffee consumption there could be monumental – will it occur?

The second factor is the great cost of producing coffee. Putting aside the fact that most coffee is produced in comparatively poorer countries than those that refine, sell, and consume the finished product, the environmental cost is staggering. Waterfootprint asserts that every cup of coffee requires 141 litres of water (mostly at the growing stage). Compare this figure with 75 litres for a similarly sized glass of beer and 120 litres for the average glass of wine, and it would seem that a rise in coffee culture at the expense of alcohol could be disastrous for the environment.
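Taking those per-serving figures at face value, a back-of-the-envelope comparison makes the gap vivid (the one-serving-a-day habit is an illustrative assumption of mine):

```python
# Litres of embedded water per serving, per the figures cited above
water_per_serving = {"coffee": 141, "wine": 120, "beer": 75}
servings_per_year = 365  # assume one serving a day

annual = {drink: litres * servings_per_year
          for drink, litres in water_per_serving.items()}
for drink, total in annual.items():
    print(f"{drink}: {total:,} litres of water per year")
# A daily coffee habit embeds nearly twice the water of a daily beer.
```

So a drinker who swaps a nightly beer for a daily coffee roughly doubles the water footprint of their brew of choice.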

Do the above statistics figure largely in the minds of those who drink any of the above beverages? Likely not. But all of these costs might – and likely will – in time affect production, and the economics of supply and demand will come into play, changing the equation once more and making it even harder to determine which is the better brew.


A Three-Pronged Approach to Saving Humanities Departments

October 29, 2010

So you graduated with a humanities degree. Well, what are you going to do with that?

I really, really hate this question. There are only 3 answers that make sense to the people who ask it:

  1. I’m going to teachers college/law school.
  2. I’m going to grad school (be careful – this one only staves off the questions for another few years and then they come back louder and more persistently than ever).
  3. I have no idea. I just wasted the last four years of my life. Yep, I’m unemployed, bitter, and poor.

For many humanities majors, the trouble with life is that it doesn’t end with university – unless you seek to become a professor in one for the rest of your life, which is a whole different story that I’m not going to talk about today. In reality, most humanities majors will not apply their deep knowledge of the sea battles of 1812 or the role of family in Hegel’s Philosophy of Right in their day-to-day jobs. Many do not even want to. They aren’t able to respond to the many, many people who ask the question above without feeling as though they have to either defend their choice of degree because it makes them “well rounded” and “interesting” or denounce it as useless in helping them find employment.

So a lot of commentators think this means humanities programs are useless, and call for eliminating French departments or combining Comparative Literature departments with a whole host of others to save on administration costs. I’m not going to get into why this is a bad thing; I think that’s fairly obvious and, besides, I write about it all the time. Instead, I’m going to advance a theory about how to fix it.
