Archive for the ‘identity’ Category

Look at your phone. Go on, look at it. What is it?

It’s a clock. It’s a text-messaging glass slab. It’s a dynamically updating map/tracking device. It’s a ticket. It’s a late-night magazine. It’s an alarm clock. It’s a camera, photo album and publishing platform. It’s a gaming device, newsfeed(s), and a tether keeping work with you 24 hours a day.

Your laptop: it’s forty tabs open at once, word processing documents, music libraries (if you’re old), an EVEN BETTER gaming device, a TV and movie-watching platform, an audio editing suite, and, uh, other forms of entertainment.

You use these devices for dozens of different purposes, out of convenience and functional capacities. What I want you to think about is who you are in each of those purposes, and for whom you are in those purposes.

One of the most intriguing findings from my dissertation research (read it! become a member of a tiny club!) lo these four years ago was the degree to which students segregated audiences by medium. As I put it, they “use different communications technologies in their interactions with social, familial and academic audiences, in part as a manner of combatting the context collapse taking place on social network sites and mediated communications generally.” More directly: they talked to their friends via text message and Facebook message, called their parents on the phone, and only and ever talked to their professors in person and via email. That was, as they say, interesting, and something worthy of further study.

Well: I didn’t. But while the particular practices have shifted in the intervening time, these behaviors are no less intriguing or worthy of study and contemplation.

Cross-medium behavioral research is rare for a number of reasons. It’s expensive, difficult, time-consuming, methodologically fraught, ethically fraught. But I think the main limiting factor is that in any given moment, the incentives for any organization or individual performing research are to answer their central questions, as quickly/cheaply as possible. For an advertising firm: how did a given campaign deliver on KPIs as promised to the client? For an academic researcher: how does X behavior bear on my hopefully-tenure-securing line of research? For a membership organization: what were the A/B test results on a fundraising solicitation?

And to be crystal clear, this is NOT a problem solved by “Big Data.” Few but the most world-spanning organizations have the capacity to iteratively formulate hypotheses, expand data collection across boundaries, and act on findings. And the evidence suggests that even those world-spanning organizations don’t really know what to do with their endless reams of data. But, really, that’s neither here nor there: if you aren’t inside one of the world’s larger walled gardens of behavioral data, you’re still left with the same question. Namely: just who are your users, and who (and when, and how) are you to your users?

One of the foremost issues is attention. There are two ways of looking at attention: as something to maintain, and as something to be acquired. From your perspective, dear reader, you of course want to maintain sustained attention – on relationships, on work, on engaging culture. An advertiser, on the other hand, wants to capture your attention. Chartbeat – which makes a fantastic suite of products for publishers, that I’ve used and enjoyed – is part of a tech vanguard that recognizes this. As they put it:

Online publishers know clicks don’t always reflect content quality.

But research shows more time spent paying attention to content does.

Advertisers know click-through rates don’t matter for display or paid content.

Research shows 2 things matter for getting a brand’s message across: the ad creative and the amount of time someone spends with it.

The Attention Web is about optimizing for your audience’s true attention.

From their perspective, attention equals quality, and a shift to focusing on quantifying attention means better quality content (oh and also more clients). It’s a compelling thesis – but then, it is your attention that they’re selling, to advertisers. Others are more interested in selling your attention to, well, you:

As our computing devices have become smaller, faster, and more pervasive, they have also become more distracting. The numbers are compelling: Americans spend 11 hours per day on digital devices, workers are digitally interrupted every 10.5 minutes, with interruptions costing the U.S. economy an estimated $650 Billion per year. That’s a lot of distraction.

Device makers have largely turned a blind eye to this issue, building distractions into the very devices we need for work. We address this challenge with tools that simply and effectively reduce digital distractions. Our software interrupts the habitual cycle of distraction associated with social media, streaming sites, and games.

Attention is basically an adversarial dynamic: your devices and the advertiser-supported content therein yelling at you while you struggle to maintain concentration. For many or most of us, managing our relationships with these digital communicative prostheses is exactly that – a struggle. It’s not a struggle without benefits, but nor is it one without costs – study after study shows the costs to both productivity and personal health and well-being of a consistently-interrupted existence.

A central part of this struggle is creating a hierarchy – either explicit or implicit – of attention. When do you respond to a text message? It depends when you receive it, and from whom. Do you return an email? Again: who sent it, work or personal, and when was it received? And then: what do you read, or listen to? That also depends – how did you get there? A link from a friend, an immediately-forgotten source on your social media timeline, through a series of unreproducible clicks? The depth, length, and quality of the attention devoted depend on all these factors and more – but I believe it’s impossible to understand the meaning of a given interaction without looking at how these hierarchies are created.
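To make the idea concrete, here’s a purely illustrative sketch of how such an implicit hierarchy might be modeled as a scoring function over sender, medium, and arrival time. Every category, weight, and cutoff below is a hypothetical of mine, not a finding from the research discussed above:

```python
# Illustrative sketch of an implicit attention hierarchy: score an
# incoming message by who sent it, over what medium, and when.
# All weights and categories here are hypothetical.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Message:
    sender: str        # e.g. "partner", "friend", "boss", "stranger"
    medium: str        # e.g. "text", "email", "social"
    received: datetime

# Hypothetical weights: closer relationships and more personal media
# tend to get attended to sooner.
SENDER_WEIGHT = {"partner": 3.0, "boss": 2.5, "friend": 2.0, "stranger": 0.5}
MEDIUM_WEIGHT = {"text": 2.0, "email": 1.0, "social": 0.5}

def attention_priority(msg: Message) -> float:
    """Higher score = respond sooner; late-night arrivals are discounted."""
    score = SENDER_WEIGHT.get(msg.sender, 1.0) * MEDIUM_WEIGHT.get(msg.medium, 1.0)
    if not (9 <= msg.received.hour < 22):   # arrived outside waking hours
        score *= 0.5
    return score

inbox = [
    Message("stranger", "social", datetime(2014, 3, 17, 14, 0)),
    Message("partner", "text", datetime(2014, 3, 17, 14, 5)),
    Message("boss", "email", datetime(2014, 3, 17, 23, 30)),
]
ranked = sorted(inbox, key=attention_priority, reverse=True)
```

The point isn’t the particular weights but the shape of the thing: priority is a joint function of relationship, channel, and timing, never of message content alone.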

Read Full Post »

Earlier this week I went to an excellent discussion put on by danah boyd and her Data & Society Research Institute, entitled “Social, Cultural & Ethical Dimensions of ‘Big Data.’” Right off the top, I have to give major kudos to danah for organizing a fantastic panel that incorporated a great combination of voices – who, not for nothing (indeed, for a lot) were not just a bunch of white dudes (only one white dude, in fact) – from across different disciplines and perspectives. I’ll do a brief play-by-play to set the table for a couple of larger thoughts.

Following a rigorously on-message video from John Podesta and fairly anodyne talk (well, except for this) from Nicole Wong from the White House Office of Science and Technology Policy, danah led off with introductory remarks and passed off to Anil Dash, who served excellently as moderator (mostly by staying out of the way, as he made a point of noting). Alondra Nelson from Columbia University was first up, giving an account by turns moving, terrifying, and engaging on the state of play and human consequences flowing from DNA databases – both those managed by law enforcement and the loopholes that allow privately-managed data repositories to skirt privacy protections. She was followed by Shamina Singh from the MasterCard Center for Inclusive Growth, who provided several on-the-ground examples of working with governments, NGOs, and poor people to more efficiently deliver social benefits. In particular, she focused on a MasterCard program to provide direct transfers of cash to refugee populations, cutting out the vastly inefficient global aid infrastructure network.

Singh was followed by Steven Hodas from the New York City Department of Education, who laid out an illuminating picture of the lifecycle of data in education systems, the ways in which private actors subvert and undermine public privacy, and – not just a critic – offered a genuinely thought-provoking new way of thinking about how to regulate dissemination of private information. The excellent Kate Crawford batted cleanup, discussing predictive privacy harms and what she called “data due process.” Dash facilitated a very long and almost entirely productive audience question and discussion session (45 minutes, at the least), and I left with many more things on my mind than I entered with. I’d had the privilege of listening to eight different speakers, each from a background either subtly or radically different from one another. Not once did a speaker follow another just like them, and no small value came in the synthesis from those differing perspectives and those of the audience.

This week also saw the relaunch of FiveThirtyEight.com under its new ESPN/Disney aegis. It was launched with a manifesto from founder Nate Silver, entitled “What the Fox Knows,” which is a bit meandering but generally positions FiveThirtyEight in opposition to both traditional journalism and science research, based on some fairly blithe generalizations of those fields. What it doesn’t quite do, oddly for a manifesto, is state just what FiveThirtyEight is for, other than a sort of process and attitudinal approach. Marx (or even Levine/Locke/Searls/Weinberger) it ain’t.

Silver has come in for no small criticism, and not just from his normal antagonists. Emily Bell laid out the rather less-than-revolutionary staffing makeup of the current raft of new-media startups, led by Ezra Klein, Glenn Greenwald, and Silver. And Paul Krugman detailed some rather serious concerns about Silver’s approach:

you can’t be an effective fox just by letting the data speak for itself — because it never does. You use data to inform your analysis, you let it tell you that your pet hypothesis is wrong, but data are never a substitute for hard thinking. If you think the data are speaking for themselves, what you’re really doing is implicit theorizing, which is a really bad idea (because you can’t test your assumptions if you don’t even know what you’re assuming.)

These two critiques are not unrelated. Bell called out Silver for his desire for a “clubhouse,” and rightly so, because groupthink clubhouses – whether of insiders or outsiders – are the most fertile breeding grounds for implicit theorizing. Krugman revisited and expanded his critique, saying:

I hope that Nate Silver understands what it actually means to be a fox. The fox, according to Archilochus, knows many things. But he does know these things — he doesn’t approach each topic as a blank slate, or imagine that there are general-purpose data-analysis tools that absolve him from any need to understand the particular subject he’s tackling. Even the most basic question — where are the data I need? — often takes a fair bit of expertise.

Which brings me around to the beginning of this post. The value in Monday’s discussion flowed directly from both the diversity – in professional background, gender, ethnicity – and the expertise of the speakers present. They each spoke deeply from a particular perspective, and while “Big Data” was the through-line connecting them, the content which animated their discussion, approach, and theorizing was specific to their experience and expertise. The systems that create data have their own biases and agenda, which only discipline-specific knowledge can help untangle and correct for. There is still no Philosopher’s Stone, but base metals have their own stories. Knowing their essential properties isn’t easy or quick, but little is easy that’s of lasting and real value.

Read Full Post »

Following on news from the Guardian that Facebook saw a nearly 2% decline in active UK users over the holidays, I thought I’d briefly cover some of the implications of this news, from my perspective.

  • Obviously this has been coming for quite a while in core markets. As the Guardian notes, in the UK Facebook has 53% market penetration, second only to the US at 54%; in terms of gross users, the US has 169M, Brazil 65M, India 63M. Clearly the play is banking on further expansion in the latter two markets – but that proposition is tenuous, both because of the fast-growing but still-smaller middle classes there, and because,
  • Facebook still doesn’t get mobile. Its apps are still only-OK in terms of usability, and as witnessed by the Instagram terms-of-service clusterfcuk – which resulted in more than 50% decline in users – Facebook has a fairly poor understanding of the mobile user. Which is especially unfortunate for its future expansion in emerging markets – e.g., Brazil, India – as connectivity there is primarily through mobile devices, and not desktops.
  • Facebook as a public company has always been a questionable proposition, as its whole model of ad-rate-growth-driven-by-traffic-growth-driven-by-user-growth is inherently untenable given that… at a certain point you run out of users. Add to that the fact that every social network site so far has seen long-term time-on-site decline among its core users. Basically: if you’ve been shorting $FB, you’ve got to be feeling pretty good right now.
  • Facebook as social utility isn’t going anywhere anytime soon. Too many people, content providers, websites, and the general infrastructure of the Web have too much locked in for that to happen. But there are various ways that it can evolve from here. I’m still convinced that long-term there will be a competitive market for identity hosting, and that Facebook’s best move is to get in front of that in both setting open standards and providing a premium service; but we shall see.
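That “you run out of users” dynamic can be sketched with a toy logistic-growth model – the numbers below are illustrative assumptions of mine, not Facebook’s actual figures. Growth is proportional both to the current user base and to the remaining addressable market, so the growth rate the ad model depends on collapses as penetration nears saturation:

```python
# Toy logistic-growth model of user acquisition: new users are
# proportional to both current users and the remaining addressable
# market, so the growth *rate* falls as penetration rises.
# All figures are illustrative, not Facebook's actual numbers.

def quarterly_growth_pcts(users: float, market: float, rate: float, quarters: int):
    """Return each quarter's user growth as a percentage of the prior base."""
    pcts = []
    for _ in range(quarters):
        new_users = rate * users * (1 - users / market)
        pcts.append(100 * new_users / users)
        users += new_users
    return pcts

# Start at 10% penetration of a hypothetical 1,000M-person market,
# with a 30%-per-quarter intrinsic growth rate.
g = quarterly_growth_pcts(users=100.0, market=1000.0, rate=0.30, quarters=20)
# Early quarters grow near 27%; by quarter 20 growth is under 1%.
```

A business priced on sustained traffic growth looks very different at the left end of that curve than at the right.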


Read Full Post »

Success and Failure

From an excellent Telegraph story on the cratering of the denim trade in China:

In one desolate room, a former factory boss sat on a stool in shame: having lost all of his family’s money, he was too ashamed to return home for the Chinese New Year holiday.

And from a recent episode of Planet Money:

the U.S. system makes it easier for people to start over, and to keep their financial lives going. Our financial system is set up to embrace failure.

I can’t really separate myself from national identity, here – I’m an American through and through, and can neither deny my cultural immersion nor a certain degree of chauvinism on this point. I think it’s great that US bankruptcy laws (though less good than they used to be) allow people, by and large, to start again after things fall apart. Indeed, in Silicon Valley failure is often a badge of pride.

So I won’t make a normative judgment on this cultural difference but rather note that I think this is another example of entity versus incremental theories of self in action. To the extent that people can define their professional successes or failures not as self-confirming or -indicting evidence of an essential self but rather as a series of events from which they can learn and improve regardless of outcome, I think that’s a good thing.

Read Full Post »

Your Voices, Our Selves

One of the best ongoing investigations of thought and the universe is Radiolab, a show produced at WNYC by Jad Abumrad and Robert Krulwich (no small point of pride to me, both Oberlin grads). One of their very best and most mind-blowing episodes came a couple months back, called “Words.” I’d recommend you listen to the show in its entirety, and there are dozens of strands I could pull out and discuss all day. For now, I’d like to focus on the (intentionally) provocative claim made by Charles Fernyhough, a writer and psychologist at Durham University (UK):

“I don’t think very young children do think.”

Spinning this out in a later podcast led to (to my total delight) an in-depth discussion of L.S. Vygotsky’s theories of self and child development, especially on the internalization of speech – learning to refer to oneself as one refers to others. The podcast focuses on non-normative variations in development – how sometimes, people internalize not just their voice but other voices as part of their internal monologue. Or dialogue. This can in its worst instantiations lead to things like schizophrenia, which is bad.

But I’d like to move one degree further, and think about how these issues relate to ideas of ourselves, and to our shifting media consumption and discussion habits.

Contra the much-discussed Death of Reading, the media landscape today in fact represents the apogee of reading in all of human history. More people are literate today than ever before, and they consume more written text than ever before. That they do not all do so through a printed medium called “book” or “newspaper” is beside the point, as is the fact that they also watch television. Words are being consumed and produced in simply staggering amounts, and a great deal of many people’s days – in the developed world and less-developed countries alike – is spent internally consuming and producing them.

What is the effect, then, of all these internal words on our own personal monologues? What is the effect, in particular, of the chatter of social media, where the voice is not our construction of anonymous authority (or not) from some Media Source but people that we know, whose actual – both written and spoken – voices we are familiar with?

One of to my mind the most elegant definitions of self (also referenced in “Words“) is that it is nothing more than a continuous story we tell: one thing happened, then another, then another, all in the same voice, and that’s how I’m me. Schizophrenia and similar disorders are so terrifying because that basic premise is violated – all of these voices are competing for attention, and it becomes impossible to determine what is real, or who you are.

Pulling all of these threads together, then, the question becomes: what happens when the story of ourselves becomes, in part, the story of others? When the “I” is spending so much time with the “we” and the “they” inside our skulls? As a purely personal anecdote, I do know that while I know more specific and timely things than I used to, source attribution is often murky. Did I hear that on the radio, or when talking to a friend? Did I think it myself, or read a blog? Does it matter?

This is not a new question or problem, entirely – the tension between individualism and communitarianism stems from the same dynamic. But the scale of this shift in our internal voices is unprecedented, as is the breadth of effect in the day-to-day lives of people in our technologically-mediated culture. While I tend to eschew both Utopian and Dystopian readings of technology’s effects on us (the Internet being, like Soylent Green, made of people), I do think that it’s worth considering (agnostically) what the longer-term effects of a society-wide shift in the kinds of internal voices we maintain might entail. Probably a big deal.

Read Full Post »

“The world is changed… much that once was is lost, for none now live who remember it.”

I’ve lately had the sensation of living in the future – not the future of robots and flying cars (both in still-tragic short supply) but the future of my life, the future of something New and Different. This has caused me, in turn, to consider just what it is that is new or different, and just what is meant by Future, Past and Present.

We are all of us the epic heroes of our own tales, the centers of action and narrative direction, the most dramatis of personae. So it is fairly obvious to see why my internal librettist would determine this to be a turning point in the action: some months of great preparation leading to a grande moment, followed by a change of scene and a journey into the unknown. The curtain falls, the screen fades to black, End of Book Two – resume in medias res some months or years along when sufficient interest and tension has built along my next act.

Human that I am, I look for patterns to justify this perception, and believe that I have found them. From where I stand now, the 2000s look like a migraine-filled interregnum – citizen of a country making awful decisions, resident of a planet trundling into irreparable change, confused twentysomething unsure of my place in the world or in myself. The Bush years, even while ongoing, always had the eerie unreality of a dream state. That they were succeeded by the election as President of a black man named Barack Hussein Obama was no less hallucinatory, even if I have the pictures on my cell phone to prove it.

And now awake, and the dream was more and less true for good and bad, but we must live with what was wrought through sleepwalkery. I am an adult (or something like it) in this world after kidhood in the pleasant-smelling 1990s, but even while history spins around again the wheel’s not quite the same for its rotation. Anti-government zealots were just as crazy and well-funded in the bad old days of the Arkansas Project and Tim McVeigh, but today’s wingnuts are self-consciously the stars of their own reality television shows and the media an ever-more-efficient conduit for that effluent.

But then there’s always authoritarians, aren’t there, no matter the names they use or the shirts they wear. My villains of copyright maximalization, seedbank patent-squatters and cynical political operatives sure seem to be wearing black hats: everyone does in silhouette.

I can’t really worry about that, though – can’t have access to more than one subjectivity, can’t have the cut-shot pan-over Cinemascope wide angle. Acting As If is the best I can manage.

So for me, right now, I’ve arrived in the future. Things change always, but a period of flux is over and a new dynamic will be the setting for our action over the next little while. It’s a world where the benefits of communications technology accrue in innumerable ways to increasingly huge numbers of the world’s people, but where material economic growth will remain stagnant for the foreseeable future – especially for those of us who already have more than our fair share (but not those with way, way, way more than their fair share). It’s a world where despite these unmistakable improvements to our everyday lives (all of us: next year or the one after, more than half of the citizens of Earth will be able to call each other on the phone; soon after, two out of three, and we haven’t even begun to think about How This Changes Everything), the main task of my professional career and political life will be fighting a rearguard action against Know-Nothings who reject a rationalist worldview: people for whom evidence is bias or proof of its opposite. It’s a world where the institutions – national and international – that have done such a good job getting us Here (for good and for ill), are terribly ill-suited to getting us to some better definition of There. Some of those will get better: many will get worse.

But here we are, innit? And what is my role in this world, this Future? I’ll greatly enjoy figuring that out.

Read Full Post »


Tom Shales notes what was unavoidable in last night’s Super Bowl:

An oddly recurring theme had to do with men asserting their masculinity, or attempting to assert it, as well as the perpetual male fear of emasculation. In an ad for a very portable television called FloTV, a man was seen being dragged through a torturous shopping trip by his girlfriend while sportscaster Jim Nantz ridiculed him… [this in particular disappointed me – shame, Jim Nantz]

Men in their underwear kept popping up — in a Coke ad, a man sleepwalks in the wilderness, clad in boxer shorts and a T-shirt. His odyssey ends only after he finds a cold bottle of Coke.

An ad for Dockers was keyed to the mantra “I wear no pants!” and featured men in their underwear romping around aimlessly. A funny ad for Career Builder.com, depicting the notion of Casual Friday run amok, showed men and women, most of them anything but physically fit, spending a day at the office in their undies.

Men and their traditional roles were also mocked, but somehow also celebrated, in ads introducing Dove for Men, a line of toiletries. A man raced through a recitation of the chores and good deeds he had obediently done to the tune of Rossini’s “William Tell Overture,” once the theme of “The Lone Ranger” on radio and TV.

An ad for Dodge Charger called the muscle car “Man’s Last Stand” after depicting a supposedly put-upon male who listed all the nice things he did for his female mate. Were these ads for a post-feminist age? They seemed to have a retro appeal — for better and worse. Probably worse.

Not coincidental are these numbers:

The top red line is unemployment among workers with less than a high school education; dark yellow is male unemployment; light yellow is female unemployment; and purple is unemployment for those with college educations or more.

Even the recovery will likely stabilize employment patterns more along these lines than along previous ones – the industries recovering first (service, health care) are traditionally and disproportionately female, whereas the industries hardest hit (construction, manufacturing) are traditionally and disproportionately male. Contra a lot of doomsaying, manufacturing is fine in the US – we just make more stuff with many fewer jobs than we used to, and so even with huge manufacturing growth we’ll have a yet more robust sector with fewer jobs than before (see, e.g., Chris Anderson’s recent piece on distributed manufacturing).

Construction has been hit hard by the inflation and then rapid popping of the housing bubble, and we shouldn’t want those particular jobs to return. But there’s plenty of stuff to build: repairing and improving our electric grid and crumbling bridges, sewers and other basic infrastructure. And then there’s “green jobs,” new energy generation, etc.

But when anyone talks about what the jobs of the 21st Century are going to be, it’s all about the “knowledge economy”, science and research: jobs that require education. And the numbers aren’t on men’s side here, either.

So yes, there’s “something out there” that advertising firms (who are not dumb) are picking up on, a reworking of previous patterns of gender roles in our new economy. Backlash always comes first. And a lot of people talk about how to “fix” the problem of boys/men being left behind. But that presumes that it is a problem, an assumption which takes the patriarchal status quo ante as some combination of natural, just and correct.

I’m not arguing that a society where increasing numbers of men are un- or under- educated and employed is a good or desirable thing: it’s pretty clear that over the long term that leads to undesirable outcomes including but not limited to violence and reactionary political movements (the above set of trends is most definitely one of the things fueling the Tea Parties). But more women in greater positions of economic and political power in this country would be a good thing – and given the macroeconomic trends, it seems likely to be part of our future.

Read Full Post »
