Mapping in Three Dimensions

December 14, 2010

It’s standard fare these days to think of maps as political tools, each with its own particular bias or bent. From Mercator’s projection to Greenwich Mean Time, what we see in maps is often exactly what we have been conditioned to see by generations of cartographers and a tradition of Western mapping that prioritizes specific views, divisions, and perspectives. J.B. Harley, a preeminent scholar of maps, characterized cartography as a language, with the cartographer in a significant position of power over the viewer. Yet maps are changing. No longer static lines on a canvas, they can now be seen and experienced with modern tools like GPS and Google Maps. In this first post in my series on modern mapping, I’m going to zero in on a specific element of maps and how we read them that is changing rapidly: perspective.


A delightful aerial-view map of the St. George Campus in 1932 (from the University of Toronto Archives)

Consider: when you think of a map, you likely think of an aerial, “bird’s eye” view of a terrain, probably subdivided by political boundaries into nations, states, provinces, or other political entities. Land is probably marked as green. Water is blue. It’s probably centred on Europe or North America. This marks you (and me) as students of fairly recent Western-style cartography. Representing the world through political divisions dates to the rise of the nation-state, but the predominance of the aerial view is also a legacy of the nineteenth century, and grew with the rise of scientific cartography. Imperial cartographers, often at the forefront of the discovery and seizure of territory, derived much of their sense of superiority over the peoples who inhabited the lands they discovered from what they perceived as their more advanced cartographic knowledge.

To geographers who could explore, measure, name and represent large areas of land in two dimensions, ownership was the logical next step, and a natural right. Organizing territory in this way made sense, and particularly reinforced (for example) the British obsession with characterizing and arranging things. Their power was often exerted even by the act of mapping itself. Imperial mapmakers used tools and methods that colonial populations rarely understood, which, in the minds of the cartographers, legitimized the modernizing imperial mission. To the British, a nation that could not identify its own resources, borders and population through mapping could legitimately be colonized by one that could, as various historians from Ian Barrow to Anne McClintock (of Freudian Imperial Leather fame) have explored. To map using scientific methods was to differentiate colonizer from colonized and project Western “progress” onto the landscape – and its peoples – through logical and rational classification and categorization.

The Great Trigonometric Survey of India

A view of India during the "Great Trigonometric Survey" of the nineteenth century, at the zenith of British imperial scientific cartography (from the Utah Education Network)


The irony, of course, is that science has caused the imperial map of two dimensions to become almost obsolete for practical purposes. Of course, people still pore over maps, hang them on their walls as artefacts, and even sometimes use them to find their way, but the way we actually use maps has shifted.

Native populations in what would later become colonial territories (for example, in Africa) had their own ways of categorizing and describing territory. Boundaries ran along tribal lines, and were dynamic and flexible as tribal lands changed. Rivers – often unexplored to their full extent, in the absence of a British fixation with discovering their mouths and sources – could have a variety of names as they passed through different stages of their existence, from a spring to rapids to a wide oasis on a dry savannah that shifted with the season. Geographical markers made sense and were referenced in terms of local context and use, not neat aerial classification at a high level. Above all, land was experienced in three dimensions, as humans really see it, not as birds do. Native peoples were closer to the lands they lived on, without the distance of science, and experienced their fluctuations and nuances deeply.

Today we appear to want maps to be more this way. With distance may come power, but with experience comes understanding. Last weekend I was repeatedly tripped up by a new GPS system that depicted the area I was driving through in three dimensions. Apparently this form of navigating is far more popular than its bird’s-eye equivalent: drivers can feel more a part of the territory through which they are driving – and there is the added benefit of a textual overlay of street names and important markers.

Mapping UofT in Three Dimensions

And there is little distance at all with applications like Google Earth and Google Maps. With Street View, people can experience geographies without ever having been there. They can effect an instant déjà-vu, getting to know a territory before ever arriving: familiarity without experience. Mapping has returned to three dimensions.

But what are the costs? I can’t help but think of two articles I read in the last year, one of which described the “barbell” effect of living in a city, where we tend to know the areas in which we live and work, with little knowledge of the neighbourhoods in between. The other referred to the rise of GPS as sounding a death knell for getting lost, a tragedy because it forestalls ever getting to know an area to which we are not explicitly travelling. (Clearly this author had never experienced a faulty GPS that landed him in a sketchy parking garage 40 minutes from the movie theatre he had intended to visit, as I have.) Nonetheless, being lost can be an experience of discovery. But how often do we look to explore areas on Google Maps with which we aren’t already familiar?

We now have a scientific three-dimensional view, in some ways the best of both worlds. But it may be that we end up with neither: losing the overall sense of continuity that a scientific, small-scale map brings, while also losing the sense of connectedness and local context that comes with intimate knowledge of a certain small piece of territory.

I started this post talking about bias and manipulation by cartographers. With the variety of perspectives we can simultaneously employ now, much of that manipulation has disappeared. What we choose to see and seek out is now up to us, and the bias has become less the cartographer’s than our own. We’ll learn in time which one obscures more.

This post is part one of a three-part series on the past, present and future of mapping. Stay tuned.


Secrets and Lies — and Google

November 29, 2010

An almost hysterical paranoia permeates the air these days concerning the information that is being collected about us and used, often without our knowledge, to help make our decisions. What am I talking about? The secret of success for everything from Google searches to your mortgage approval process: algorithms.

And secret they are. Much of the fear about them is that they are “black boxes” about which we know little, and yet they are making decisions for us every day. In the process, some worry, they are taking away our capacity for decision-making and automating processes in which human input may be necessary to correct inconsistencies or mistakes. An extended report in The Globe & Mail last week examined the impact such incomprehensible and inaccessible mathematical formulas can have: according to the data collected, buying floor protectors at Canadian Tire might signal a conscientious borrower; late-night website browsing may indicate anxiousness and, in combination with a host of other minor activities, derail a credit application.
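To make the mechanics concrete, here is a deliberately toy sketch (in Python) of how such a formula might combine weak behavioural signals. Every signal name and weight below is invented for this post; it describes no actual lender’s model.

```python
# A toy behavioural credit score: sum the weights of observed signals.
# All names and weights are hypothetical, for illustration only.

SIGNAL_WEIGHTS = {
    "bought_floor_protectors": +0.50,  # read as a proxy for conscientiousness
    "late_night_browsing":     -0.25,  # read as a proxy for anxiousness
    "frequent_address_moves":  -0.50,
    "long_employment_tenure":  +0.75,
}

def score(observed_signals: set[str], base: float = 0.0) -> float:
    """Add up the weights of whichever signals were observed."""
    return base + sum(SIGNAL_WEIGHTS[s] for s in observed_signals
                      if s in SIGNAL_WEIGHTS)

applicant = {"bought_floor_protectors", "late_night_browsing"}
print(score(applicant))  # 0.25 -- individually trivial acts tip the balance
```

The point is only that acts trivial on their own, once summed, can tip a decision one way or the other, and the borrower never sees the weights.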

Google is another example: it uses complex algorithms to filter information to find exactly what it thinks we need, or, as its mission statement says, to “organize the world’s information and make it universally accessible and useful.” It also provides us with ads, of course, based on our search history and preferences and in theory tailored to our needs. Even online dating websites such as OkCupid and eHarmony make extensive use of algorithms to predict who will make a good match. The information that comes out of such sites is a fascinating look at the likes and dislikes of a broad cross-section of the population.
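For the curious, OkCupid has publicly described its match percentage as, roughly, the geometric mean of two satisfaction scores: how well each person’s answers satisfy the other’s importance-weighted preferences. Here is a simplified sketch of that arithmetic; the question, weights, and helper names are mine, not theirs.

```python
import math

# Each person's preferences map a question to (wanted_answer, importance).

def satisfaction(answers: dict, prefs: dict) -> float:
    """Fraction of importance-weighted preferences that the answers satisfy."""
    earned = sum(imp for q, (wanted, imp) in prefs.items()
                 if answers.get(q) == wanted)
    possible = sum(imp for _, imp in prefs.values())
    return earned / possible if possible else 0.0

def match_percentage(ans_a, prefs_a, ans_b, prefs_b) -> float:
    # Geometric mean: one person's enthusiasm cannot carry the match alone.
    return 100 * math.sqrt(satisfaction(ans_a, prefs_b) *
                           satisfaction(ans_b, prefs_a))

a_answers, a_prefs = {"smoker": "no"}, {"smoker": ("no", 5)}
b_answers, b_prefs = {"smoker": "no"}, {"smoker": ("no", 5)}
print(match_percentage(a_answers, a_prefs, b_answers, b_prefs))  # 100.0
```

The geometric mean is a deliberate design choice: a pairing where one side is thrilled and the other indifferent scores lower than one where both are moderately satisfied.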

The formulas used are secret, of course, in order to protect the competitive advantages of the organizations they serve. What surprises me is why there is such intense fear of them, these unknown equations that guide our choices. We are not forced to click on any of the links Google serves up. We’re not even forced to use Google as our search engine. If we want a local plumber, we can always use the Yellow Pages, where prominence is determined by advertising payments. Is this any better?

Perhaps it is the lack of control that is so terrifying. Because algorithms filter information for us, there is an unimaginable amount that we just never see. We literally don’t know what we don’t know. Somehow this seems more sinister than the way it used to be, when we were all relatively more ignorant, perhaps because, through the Internet, we are now aware that there is far more information out there than we will ever see.

Does Google have a sinister hidden agenda? One would think that such a thing would go against its “don’t be evil” code of conduct. Does OkCupid? Likely not, but in filtering information to satisfy our (perceived) needs and wants, argues Alexis Madrigal in this month’s Atlantic, algorithms can serve to maintain the status quo – or even prevent shifts in societal norms:

By drawing on data about the world we live in, [algorithms] end up reinforcing whatever societal values happen to be dominant, without our even noticing. They are normativity made into code—albeit a code that we barely understand, even as it shapes our lives.

Madrigal goes on to say that Google, OkCupid and their ilk give us only “a desiccated kind of choice,” and that we need to break the patterns by choosing against type. We need to make ourselves less predictable, to click unexpected links and choose unexpected partners, presumably in order to ensure that society in general doesn’t stagnate. Don’t trust The Man and all that.
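Madrigal’s worry is easy to see in miniature. Below is a toy recommender, entirely hypothetical and standing in for no real service, that ranks content purely by what a reader has clicked before; by construction, the never-clicked can never surface.

```python
from collections import Counter

def recommend(catalogue: list[str], history: list[str]) -> list[str]:
    """Rank categories by past clicks; the never-clicked sink to the bottom."""
    counts = Counter(history)  # categories absent from history count as zero
    return sorted(catalogue, key=lambda c: counts[c], reverse=True)

click_history = ["politics", "politics", "sports", "politics"]
catalogue = ["politics", "sports", "science", "poetry"]
print(recommend(catalogue, click_history))
# ['politics', 'sports', 'science', 'poetry'] -- 'poetry' never rises
# unless the reader deliberately clicks against type.
```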

The paranoia that unseen and unchecked forces are predicting – even controlling – our behaviour seems to be spreading even faster than fear of Yemeni terrorists. I think it relates back to our deepening cynicism and distrust toward all large organizations. Believing in anything at all is seen by many as a mug’s game. Trust in governments declines further with each revelation of how they conceal the truth from citizens, tap their phone lines, or watch their goings-on. People now, on average, trust NGOs (even ones affiliated with large government organizations) much more than governments themselves, and certainly more than the politicians and bureaucrats who staff them. Faith in organized religion has plummeted amid endless sex scandals officially acknowledged too late (if at all), refusals from the highest levels to acknowledge the damage done by outdated policies, and values that diverge from those of most Westerners on gay marriage, reproductive rights, and female clergy.

I’ve written before about what apathy and extreme cynicism look like in modern society. I neglected to mention an obsession with knowing the “truth,” even if part of us believes that truth to be fictional or compromised. Hence the enduring popularity of the “exposé,” tabloid journalism, insider specials, and now WikiLeaks, the non-profit whistle-blower organization that is making news (again) this week with the release of thousands of diplomatic cables sent by US ambassadors. Despite pleas from the White House not to release the information (potentially jeopardizing thousands of lives, and undermining US diplomacy and counter-terrorism efforts), the obsession to reveal won out, and the cables were posted anyway.

Why? Secrets may not be entirely benign, but what seems to be missing from the discussion is the idea that their release may not be either. In an age of over-sharing, of laying open our most personal thoughts for the world to see, is even the necessary secrecy of diplomacy unwelcome? It has fallen victim to the public’s need to know anything and everything — or else there must be some ominous conspiracy at play. In democracies, utter transparency seems to be the only option palatable to citizens, and we are unnerved when it isn’t available, so we turn to (often illegal) means of obtaining information, such as WikiLeaks.

It seems we are experiencing a seismic shift in the way we use and desire information. Should we expect it to be entirely accessible at all times, to all people? Knowledge is power, as they say, and everybody wants more. The irony, of course, is that everybody also wants privacy: WikiLeaks, for example, will not disclose its sources, or its founders. One wonders how long they can expect to keep that a secret.


Encore, Encore! On Music and Unpredictability

November 18, 2010

I attended a remarkable performance at the Toronto Symphony Orchestra last night, and after a partial standing ovation, I was surprised to discover that we would be treated to an “encore” of sorts. Naturally, as is now the custom, it was not a repeat of anything we had just heard, but a different piece entirely. I recalled other acts I’ve seen where encores were welcome and increasingly popular (George Michael, for example, did three, each containing songs better and better-known than the last). There were others where encores were notably absent, and the audience felt almost as though they hadn’t had their money’s worth from the evening.

I assumed it was a growing trend, this repeated-encore thing, perhaps showing my bias of believing my contemporaries far sillier than our ancestors in putting up with and propagating it. Some research, however, has proven me wrong on this score. According to Oxford music historian Michael Burden, “encore” performances in fact date to the early eighteenth-century Italian opera circuit in London, when audiences would call for a repeat of an aria by a particularly good prima donna or primo uomo – sometimes right after the initial performance of the piece itself. This meant that audiences who had already heard the main theme twice (as per the common ABA da capo form of such music) would ask for it again, sometimes multiple times, with increasing ornamentation from the singer each time. It all got to be a bit much for some opera-goers, fatigued by performances that already ran increasingly long, sometimes until one o’clock in the morning. (No doubt this was especially hard on those who only attended the opera for fashion’s sake.) It also became too much for many singers, who would often become exhausted and even have to take an extended break to rest their voices. Yet those who did not capitulate could be punished, sometimes for years, with hissing from audience members and derision in the fashionable papers.

Thus, a tradition was invented. It appears we are now able to exercise some restraint in our calls for “encores,” and yet we still expect them. It is part of the performance, the elaborate dance between the musicians on stage and the audience. We are all performers now – we play our parts as appreciative audiences with the requisite ovation, even perhaps the standing sort – over the course of an evening. It can be tiresome, all this pageantry, when one might simply prefer to attend a concert, hear the lovely music, pay due appreciation, and depart. (And please feel free to debate with me in the comments section whether you believe standing ovations to be too common and expected – as I do – or audiences too stingy if they fail to leap to their feet – as I’ve heard.)

But the pageantry is now one of the only defining features of live music, encores included. The music is usually not new to us, as it was to eighteenth-century opera-goers. We can hear it whenever we like. So why attend a concert in person when we all have access to world-class recordings of any imaginable piece we would want to hear at the click of a button? Why bother with the expense, the inconvenience of travelling to and fro, the irritation of listening to hacking coughers rattling lozenge wrappers in the seats behind us, when we can simply enjoy the same music in surround sound with sub-woofer enhancements from the comfort of our own homes?

It’s the unpredictability, the multi-sensory experience, the feel of being in the audience. Pick-and-choose music downloading programs like iTunes (and of course Napster, LimeWire, and the like) have brought the recording industry to its knees. They’ve also hampered artists’ ability to choose how their music will be enjoyed (the sequencing of tracks on an album, for instance). But it is the appeal of live music, with its surprises, unpredictability and interactivity, that will ensure the continuation of the music industry. Differentiation will come in the form of the unexpected, even if we as audiences expect some kind of extras to make attending worth our while. We will ask musicians to push the limits of how we experience music. After all, as Burden points out, the whole idea of an encore is “not simply to hear it again, but by definition, … to hear it differently.”

We might as well expect more of them. Encore.


Make Money First: The Trouble With Meritocracies

October 19, 2010

For a while now, I’ve been trying to put together a post about the value of polymaths in modern society. Two hundred or even one hundred years ago, such people would have needed no defenders. What could be more valuable or intrinsically rewarding than being interested in everything and interesting to others? Yet today, polymaths are often seen as dilettantes, unable to focus enough to be serious about something and get a job. There is work, and then there are hobbies, and one should learn to tell the difference and divide one’s life into segments. Few careers reward diversity of knowledge. Fewer still pay well for it. My tentative title was going to be “Great Careers for Polymaths,” but the idea made me queasy. Why, I asked myself, do I need to justify having multiple interests with the language of making money?

Because, I realized, we value wealth first. What I mean by “first” is that the goal is to be “secure” financially before seeking career satisfaction, getting in shape or getting married. Wealth is the elusive gateway to a complete life, but many mistake it for a complete life in and of itself.



Historical Crime Fiction, Agency, and the Contemporary Mindset

September 30, 2010

I am a big reader of crime novels. From Agatha Christie to Simon Kernick to the ubiquitous Stieg Larsson, they delight me with their rapid plot progression and suspenseful chapter endings, a skill I’ve never been very good at in my own writing. Nonetheless, in my daydreams, I have envisioned a series of my own that would combine two of my principal interests: crime novels and historical fiction. It would ideally contain all the charm of the nineteenth-century novel, with the plot development of a modern thriller. Call it Pride & Prejudice & Warrant Cards.

I have occasionally forayed into the world of historical crime fiction, and been disappointed with what I’ve read each time, less because of the lack of CSI-type technology (and have you also noticed that computer hackers of some kind are now essential to move the plot along?) than because the books are such unrealistic representations of the eras they claim to inhabit. Admittedly, I am more of a stickler for historical accuracy than most, but it strikes me that there are some major barriers to writing a character in the past that most authors overcome simply by ignoring them. Moreover, being a feminist of sorts, I would insist that the protagonist of my series be female. This choice, of course, compounds the difficulty – a number of alterations would need to be made to maintain accuracy, all of which would seriously compromise my character’s agency.



The Necessity of Traditions

September 20, 2010

A few weeks ago, I had the wonderful, life-changing experience of being a bride. The positive (and stressful) aspects of the wedding are obvious, but I was surprised by the constant presence of the past – my own, my family’s, and that of those around me – throughout the planning process. It seemed as though the wedding and associated events almost ran themselves. I discovered that much of what weddings are about is set, if not by law then by the dictates of formality and tradition. From the timing, content, and participants in the ceremony to the many, many superstitions about the day (such as the irritatingly ubiquitous “something old, something new, something borrowed, something blue”), weddings are almost like pre-determined packages, complete with expected players who must attend, observe, or otherwise provide their input.

Being a happily modern woman, I had (and exercised) the option of picking and choosing to suit my taste — no three-tiered fruit cake or garter, but large bunches of flowers and the fancy white dress were okay — but I felt throughout the process that unless I protested something and deliberately carved a new and different path, it was considered to be included by default.

It was a powerful reminder of the role that tradition plays in our lives, especially at those times considered particularly significant. Traditions are things to fall back on. They are the unspoken way things are, promoting a shared understanding among their followers. They provide comfort and consistency across generations and cultures, which is reassuring to all those they impact.



The Rise and Fall of the Grand Narrative

August 12, 2010

Those of you who read my blog regularly will know how frequently I lament the increasing specificity required of academic writing, and how it threatens to render the profession obsolete for lack of readership or general interest in the subject matter. My thoughts were echoed in a recent book review which, in discussing the life of Hugh Trevor-Roper, a prominent historian, remarked that he could never be the great academic he wanted to be – an E.P. Thompson, or a Thomas Macaulay, or an Edward Gibbon – for two key reasons. The first was the passing of the “grand narrative” approach to history, which is now seen as unprofessional, or worse, imperialistic in the Marxist teleological sense. The second was his being British: as the article notes, “By Trevor-Roper’s day … Britain had become too insignificant to provide the subject of a grand narrative of progress in the style of Macaulay.” The only nation that could conceivably produce historians claiming to write the story of its own empire today would be the United States, and those who do are usually right-wing polemicists who garner little respect in academic circles.

It’s true that the grand narrative has its drawbacks, as I’ve written before. Huge swaths of history that don’t fit in can be glossed over or ignored entirely in order to weave a tight story. And the grand narrative remains a common way for writers to (consciously or otherwise) impose a single, usually Western, trajectory upon world events that can be interpreted as modern intellectual imperialism. But it remains an anchoring lens through which historical events can be contextualized and patterns examined, and is usually more interesting than a narrow study. So what has caused the violent turn away from the grand narrative?  Is it justified?
