
Archive for the ‘identity’ Category

Look at your phone. Go on, look at it. What is it?

It’s a clock. It’s a text-messaging glass slab. It’s a dynamically updating map/tracking device. It’s a ticket. It’s a late-night magazine. It’s an alarm clock. It’s a camera, photo album and publishing platform. It’s a gaming device, newsfeed(s), and a tether keeping work with you 24 hours a day.

Your laptop: it’s forty tabs open at once, word processing documents, music libraries (if you’re old), an EVEN BETTER gaming device, a TV and movie-watching platform, an audio editing suite, and, uh, other forms of entertainment.

You use these devices for dozens of different purposes, out of convenience and because of their functional capacities. What I want you to think about is who you are in each of those purposes, and for whom you are in those purposes.

One of the most intriguing findings from my dissertation research (read it! become a member of a tiny club!) lo these four years ago was the degree to which students segregated audiences by medium. As I put it, they “use different communications technologies in their interactions with social, familial and academic audiences, in part as a manner of combatting the context collapse taking place on social network sites and mediated communications generally.” More directly: they talked to their friends via text message and Facebook message, called their parents on the phone, and only and ever talked to their professors in person and via email. That was, as they say, interesting, and something worthy of further study.

Well: I didn’t. But while the particular practices have shifted in the intervening time, these behaviors are no less intriguing or worthy of study and contemplation.

Cross-medium behavioral research is rare for a number of reasons. It’s expensive, difficult, time-consuming, methodologically fraught, ethically fraught. But I think the main limiting factor is that in any given moment, the incentive for any organization or individual performing research is to answer its central questions as quickly and cheaply as possible. For an advertising firm: how did a given campaign deliver on the KPIs promised to the client? For an academic researcher: how does X behavior bear on my hopefully-tenure-securing line of research? For a membership organization: what were the A/B test results on a fundraising solicitation?
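To make that last example concrete, here’s a minimal sketch – with entirely hypothetical counts, run through a bog-standard two-proportion z-test – of the kind of narrow, fast, cheap question that wins out: did variant B of a fundraising email out-convert variant A?

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B split; returns (z, two-sided p)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))   # standard error of the difference
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # normal tail probability via erf
    return z, p_value

# Hypothetical send: 10,000 recipients per arm, 120 vs. 152 donations.
z, p = two_proportion_z(conv_a=120, n_a=10_000, conv_b=152, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # z = 1.95, p = 0.05 or so
```

It answers the organization’s one question and nothing else – which is exactly the incentive problem.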

And to be crystal clear, this is NOT a problem solved by “Big Data.” Few but the most world-spanning organizations have the capacity to iteratively formulate hypotheses, expand data collection across boundaries, and act on findings. And the evidence suggests that even those world-spanning organizations don’t really know what to do with their endless reams of data. But, really, that’s neither here nor there: if you aren’t inside one of the world’s larger walled gardens of behavioral data, you’re still left with the same question. Namely: just who are your users, and who (and when, and how) are you to your users?

One of the foremost issues is attention. There are two ways of looking at attention: as something to maintain, and as something to acquire. From your perspective, dear reader, you of course want to maintain sustained attention – on relationships, on work, on engaging culture. An advertiser, on the other hand, wants to capture your attention. Chartbeat – which makes a fantastic suite of products for publishers that I’ve used and enjoyed – is part of a tech vanguard that recognizes this. As they put it:

Online publishers know clicks don’t always reflect content quality.

But research shows more time spent paying attention to content does.

Advertisers know click-through rates don’t matter for display or paid content.

Research shows 2 things matter for getting a brand’s message across: the ad creative and the amount of time someone spends with it.

The Attention Web is about optimizing for your audience’s true attention.

From their perspective, attention equals quality, and a shift to focusing on quantifying attention means better quality content (oh and also more clients). It’s a compelling thesis – but then, it is your attention that they’re selling, to advertisers. Others are more interested in selling your attention to, well, you:

As our computing devices have become smaller, faster, and more pervasive, they have also become more distracting. The numbers are compelling: Americans spend 11 hours per day on digital devices, workers are digitally interrupted every 10.5 minutes, with interruptions costing the U.S. economy an estimated $650 Billion per year. That’s a lot of distraction.

Device makers have largely turned a blind eye to this issue, building distractions into the very devices we need for work. We address this challenge with tools that simply and effectively reduce digital distractions. Our software interrupts the habitual cycle of distraction associated with social media, streaming sites, and games.
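Mechanically, the “attention” both of these pitches trade on is usually computed with some variant of a heartbeat: the page pings home while the reader is active, and the gaps between pings are credited up to a cap. The sketch below is my own illustration of that general approach – the interval and cap values are assumptions, emphatically not Chartbeat’s (or anyone else’s) actual implementation.

```python
from typing import List

PING_INTERVAL = 15   # seconds between heartbeats from an active tab (assumed)
MAX_CREDIT = 30      # a gap longer than this counts as attention lost (assumed)

def engaged_seconds(ping_timestamps: List[float]) -> float:
    """Sum the gaps between successive pings, capping each at MAX_CREDIT."""
    total = 0.0
    for prev, curr in zip(ping_timestamps, ping_timestamps[1:]):
        total += min(curr - prev, MAX_CREDIT)
    return total

# A reader who pinged steadily, drifted away, then briefly returned:
pings = [0, 15, 30, 45, 300, 315]
print(engaged_seconds(pings))  # 90.0 seconds credited, not 315
```

The point of the cap is that an open-but-abandoned tab earns no credit – attention, not mere presence, is the commodity.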

Attention, then, is basically an adversarial dynamic: your devices and the advertiser-supported content therein yelling at you while you struggle to maintain concentration. Many or most of us are at this stage of managing our relationships with our digital communicative prostheses: it is, in a word, a struggle. It’s not a struggle without benefits, but nor is it one without costs – study after study shows what a consistently-interrupted existence costs in both productivity and personal health and well-being.

A central part of this struggle is creating a hierarchy – either explicit or implicit – of attention. When do you respond to a text message? It depends on when you receive it, and from whom. Do you return an email? Again: who sent it, is it work or personal, when did it arrive? And then: what do you read, or listen to? That also depends – how did you get there? A link from a friend, an immediately-forgotten source on your social media timeline, through a series of unreproducible clicks? The depth, length, and quality of the attention devoted depend on all these factors and more – but I believe it’s impossible to understand the meaning of a given interaction without looking at how these hierarchies are created.
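As a thought experiment, here’s what one slice of such a hierarchy might look like when forced into explicit form. Every sender, channel, weight, and threshold below is a hypothetical stand-in for preferences most of us apply tacitly – a sketch of the idea, not a claim about how anyone actually decides.

```python
from dataclasses import dataclass

# Hypothetical weights: who is writing, and over what medium.
SENDER_WEIGHT = {"partner": 3.0, "boss": 2.5, "friend": 2.0, "unknown": 0.5}
CHANNEL_WEIGHT = {"text": 2.0, "call": 2.5, "email": 1.0, "social": 0.5}

@dataclass
class Message:
    sender: str
    channel: str
    hour: int  # 0-23, local time of receipt

def respond_now(msg: Message, threshold: float = 2.0) -> bool:
    """Score a message by sender and channel; a late-night arrival raises the bar."""
    score = SENDER_WEIGHT.get(msg.sender, 0.5) * CHANNEL_WEIGHT.get(msg.channel, 1.0)
    if msg.hour >= 23 or msg.hour < 7:
        score -= 2.0
    return score >= threshold

print(respond_now(Message("boss", "email", hour=14)))     # True  (2.5 * 1.0)
print(respond_now(Message("unknown", "social", hour=2)))  # False (0.25 - 2.0)
```

The interesting part is everything the toy model leaves out – mood, context, history – which is precisely why these hierarchies resist easy measurement.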

Read Full Post »

Earlier this week I went to an excellent discussion put on by danah boyd and her Data & Society Research Institute, entitled “Social, Cultural & Ethical Dimensions of ‘Big Data.’” Right off the top, I have to give major kudos to danah for organizing a fantastic panel that incorporated a great combination of voices – who, not for nothing (indeed, for a lot) were not just a bunch of white dudes (only one white dude, in fact) – from across different disciplines and perspectives. I’ll do a brief play-by-play to set the table for a couple of larger thoughts.

Following a rigorously on-message video from John Podesta and a fairly anodyne talk (well, except for this) from Nicole Wong of the White House Office of Science and Technology Policy, danah led off with introductory remarks and passed off to Anil Dash, who served excellently as moderator (mostly by staying out of the way, as he made a point of noting). Alondra Nelson from Columbia University was first up, giving an account – by turns moving, terrifying, and engaging – of the state of play and the human consequences flowing from DNA databases, both those managed by law enforcement and the privately-managed data repositories that exploit loopholes to skirt privacy protections. She was followed by Shamina Singh from the MasterCard Center for Inclusive Growth, who provided several on-the-ground examples of working with governments, NGOs, and poor people to deliver social benefits more efficiently. In particular, she focused on a MasterCard program to provide direct cash transfers to refugee populations, cutting out the vastly inefficient global aid infrastructure.

Singh was followed by Steven Hodas from the New York City Department of Education, who laid out an illuminating picture of the lifecycle of data in education systems, the ways in which private actors subvert and undermine public privacy, and – not just a critic – offered a genuinely thought-provoking new way of thinking about how to regulate dissemination of private information. The excellent Kate Crawford batted cleanup, discussing predictive privacy harms and what she called “data due process.” Dash facilitated a very long and almost entirely productive audience question and discussion session (45 minutes, at the least), and I left with many more things on my mind than I entered with. I’d had the privilege of listening to eight different speakers, each from a background either subtly or radically different from one another. Not once did a speaker follow another just like them, and no small value came in the synthesis from those differing perspectives and those of the audience.

This week also saw the relaunch of FiveThirtyEight.com in its new ESPN/Disney incarnation. It launched with a manifesto from founder Nate Silver, entitled “What the Fox Knows,” which is a bit meandering but mostly positions FiveThirtyEight in opposition to both traditional journalism and scientific research, based on some fairly blithe generalizations about those fields. What it doesn’t quite do, oddly for a manifesto, is state just what FiveThirtyEight is for, beyond a sort of process-and-attitude approach. Marx (or even Levine/Locke/Searls/Weinberger) it ain’t.

Silver has come in for no small criticism, and not just from his normal antagonists. Emily Bell laid out the rather less-than-revolutionary staffing makeup of the current raft of new-media startups, led by Ezra Klein, Glenn Greenwald, and Silver. And Paul Krugman detailed some rather serious concerns about Silver’s approach:

you can’t be an effective fox just by letting the data speak for itself — because it never does. You use data to inform your analysis, you let it tell you that your pet hypothesis is wrong, but data are never a substitute for hard thinking. If you think the data are speaking for themselves, what you’re really doing is implicit theorizing, which is a really bad idea (because you can’t test your assumptions if you don’t even know what you’re assuming.)

These two critiques are not unrelated. Bell called out Silver for his desire for a “clubhouse,” and rightly so, because groupthink clubhouses – whether of insiders or outsiders – are the most fertile breeding grounds for implicit theorizing. Krugman revisited and expanded his critique, saying:

I hope that Nate Silver understands what it actually means to be a fox. The fox, according to Archilochus, knows many things. But he does know these things — he doesn’t approach each topic as a blank slate, or imagine that there are general-purpose data-analysis tools that absolve him from any need to understand the particular subject he’s tackling. Even the most basic question — where are the data I need? — often takes a fair bit of expertise.

Which brings me around to the beginning of this post. The value in Monday’s discussion flowed directly from both the diversity – in professional background, gender, ethnicity – and the expertise of the speakers present. They each spoke deeply from a particular perspective, and while “Big Data” was the through-line connecting them, the content which animated their discussion, approach, and theorizing was specific to their experience and expertise. The systems that create data have their own biases and agenda, which only discipline-specific knowledge can help untangle and correct for. There is still no Philosopher’s Stone, but base metals have their own stories. Knowing their essential properties isn’t easy or quick, but little is easy that’s of lasting and real value.

Read Full Post »

Following on news from the Guardian that Facebook saw a nearly 2% decline in active UK users over the holidays, I thought I’d briefly cover some of its implications, from my perspective.

  • Obviously this has been coming for quite a while in core markets. As the Guardian notes, Facebook has 53% market penetration in the UK, second only to the US at 54%; in terms of gross users, the US has 169M, Brazil 65M, and India 63M. Clearly the play is betting on further expansion in the latter two markets – but that proposition is tenuous, both because of the fast-growing but still-smaller middle classes there, and because,
  • Facebook still doesn’t get mobile. Its apps are still only-OK in terms of usability, and as witnessed by the Instagram terms-of-service clusterfcuk – which resulted in a more than 50% decline in users – Facebook has a fairly poor understanding of the mobile user. Which is especially unfortunate for its future expansion in emerging markets – e.g., Brazil, India – as connectivity there is primarily through mobile devices, not desktops.
  • Facebook as a public company has always been a questionable proposition, as its whole model of ad-rate-growth-driven-by-traffic-growth-driven-by-user-growth is inherently untenable given that… at a certain point you run out of users (a back-of-envelope sketch follows this list). Add to that the fact that every social network site so far has seen long-term time-on-site decline among its core users. Basically: if you’ve been shorting $FB, you’ve got to be feeling pretty good right now.
  • Facebook as social utility isn’t going anywhere anytime soon. Too many people, content providers, websites, and the general infrastructure of the Web have too much locked in for that to happen. But there are various ways that it can evolve from here. I’m still convinced that long-term there will be a competitive market for identity hosting, and that Facebook’s best move is to get in front of that in both setting open standards and providing a premium service; but we shall see.
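The back-of-envelope on “you run out of users,” promised above: the user counts are the Guardian figures; the population denominators are my own rough estimates, so treat the output as illustration rather than analysis.

```python
# Facebook user counts (millions) from the Guardian figures quoted above;
# the total-population figures (millions) are rough estimates I've added.
markets = {
    "US":     (169, 314),
    "Brazil": (65, 199),
    "India":  (63, 1_240),
}

for name, (users, pop) in markets.items():
    print(f"{name}: {users / pop:.0%} of the entire population already signed up")
```

The US comes out at 54% of every person in the country, children included – another way of saying that the growth story has to come from somewhere else.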


Read Full Post »

Success and Failure

From an excellent Telegraph story on the cratering of the denim trade in China:

In one desolate room, a former factory boss sat on a stool in shame: having lost all of his family’s money, he was too ashamed to return home for the Chinese New Year holiday.

And from a recent episode of Planet Money:

the U.S. system makes it easier for people to start over, and to keep their financial lives going. Our financial system is set up to embrace failure.

I can’t really separate myself from national identity here – I’m an American through and through, and can deny neither my cultural immersion nor a certain degree of chauvinism on this point. I think it’s great that US bankruptcy laws (though less good than they used to be) allow people, by and large, to start again after things fall apart. Indeed, in Silicon Valley failure is often a badge of pride.

So I won’t make a normative judgment on this cultural difference, but rather note that I think this is another example of entity versus incremental theories of self in action. To the extent that people can define their professional successes or failures not as self-confirming or -indicting evidence of an essential self but rather as a series of events from which they can learn and improve regardless of outcome, I think that’s a good thing.

Read Full Post »

Your Voices, Our Selves

One of the best ongoing investigations of thought and the universe is Radiolab, a show produced at WNYC by Jad Abumrad and Robert Krulwich (no small point of pride to me: both are Oberlin grads). One of their very best and most mind-blowing episodes came a couple of months back, called “Words.” I’d recommend you listen to the show in its entirety, and there are dozens of strands I could pull out and discuss all day. For now, I’d like to focus on the (intentionally) provocative claim made by Charles Fernyhough, a writer and psychologist at Durham University (UK):

“I don’t think very young children do think.”

Spinning this out in a later podcast led to (to my total delight) an in-depth discussion of L.S. Vygotsky’s theories of self and child development, especially on the internalization of speech – learning to refer to oneself as one refers to others.  The podcast focuses on non-normative variations in development – how sometimes, people internalize not just their voice but other voices as part of their internal monologue. Or dialogue. This can in its worst instantiations lead to things like schizophrenia, which is bad.

But I’d like to move one degree further, and think about how these issues relate to ideas of ourselves, and to our shifting media consumption and discussion habits.

Contra the much-discussed Death of Reading, the media landscape today in fact represents the apogee of reading in all of human history. More people are literate today than ever before, and they consume more written text than ever before. That they do not all do so through a printed medium called “book” or “newspaper” is beside the point, as is the fact that they also watch television. Words are being consumed and produced in simply staggering amounts, and a great deal of many people’s days – in the developed world and in less-developed countries alike – is spent processing those words internally.

What is the effect, then, of all these internal words on our own personal monologues? What is the effect, in particular, of the chatter of social media, where the voices are not our constructions of anonymous authority (or not) from some Media Source but those of people we know – people whose actual voices, both written and spoken, we are familiar with?

One of the most elegant definitions of self, to my mind (also referenced in “Words”), is that it is nothing more than a continuous story we tell: one thing happened, then another, then another, all in the same voice, and that’s how I’m me. Schizophrenia and similar disorders are so terrifying because that basic premise is violated – all of these voices are competing for attention, and it becomes impossible to determine what is real, or who you are.

Pulling all of these threads together, then, the question becomes: what happens when the story of others becomes part of the story of ourselves? When the “I” is spending so much time with the “we” and the “they” inside our skulls? As a purely personal anecdote, I do know that while I know more specific and timely things than I used to, source attribution is often murky. Did I hear that on the radio, or when talking to a friend? Did I think it myself, or read it on a blog? Does it matter?

This is not a new question or problem, entirely – the tension between individualism and communitarianism stems from the same dynamic. But the scale of this shift in our internal voices is unprecedented, as is the breadth of effect in the day-to-day lives of people in our technologically-mediated culture. While I tend to eschew both Utopian and Dystopian readings of technology’s effects on us (the Internet being, like Soylent Green, made of people), I do think that it’s worth considering (agnostically) what the longer-term effects of a society-wide shift in the kinds of internal voices we maintain might entail. Probably a big deal.

Read Full Post »

“The world is changed… much that once was is lost, for none now live who remember it.”

I’ve lately had the sensation of living in the future – not the future of robots and flying cars (both still in tragically short supply) but the future of my life, the future of something New and Different. This has caused me, in turn, to consider just what it is that is new or different, and just what is meant by Future, Past and Present.

We are all of us the epic heroes of our own tales, the centers of action and narrative direction, the most dramatis of personae. So it is fairly obvious to see why my internal librettist would determine this to be a turning point in the action: some months of great preparation leading to a grand moment, followed by a change of scene and a journey into the unknown. The curtain falls, the screen fades to black, End of Book Two – to resume in medias res some months or years along, when sufficient interest and tension has built for my next act.

Human that I am, I look for patterns to justify this perception, and believe that I have found them. From where I stand now, the 2000s look like a migraine-filled interregnum – citizen of a country making awful decisions, resident of a planet trundling into irreparable change, confused twentysomething unsure of my place in the world or in myself. The Bush years, even while ongoing, always had the eerie unreality of a dream state. That they were succeeded by the election as President of a black man named Barack Hussein Obama was no less hallucinatory, even if I have the pictures on my cell phone to prove it.

And now awake, and the dream was more and less true for good and bad, but we must live with what was wrought through sleepwalkery. I am an adult (or something like it) in this world after kidhood in the pleasant-smelling 1990s, but even while history spins around again the wheel’s not quite the same for its rotation. Anti-government zealots were just as crazy and well-funded in the bad old days of the Arkansas Project and Tim McVeigh, but today’s wingnuts are self-consciously the stars of their own reality television shows and the media an ever-more-efficient conduit for that effluent.

But then there are always authoritarians, aren’t there, no matter the names they use or the shirts they wear. My villains – copyright maximalists, seedbank patent-squatters, cynical political operatives – sure seem to be wearing black hats: everyone does in silhouette.

I can’t really worry about that, though – can’t have access to more than one subjectivity, can’t have the cut-shot pan-over Cinemascope wide angle. Acting As If is the best I can manage.

So for me, right now, I’ve arrived in the future. Things change always, but a period of flux is over and a new dynamic will be the setting for our action over the next little while. It’s a world where the benefits of communications technology accrue in innumerable ways to increasingly huge numbers of the world’s people, but where material economic growth will remain stagnant for the foreseeable future – especially for those of us who already have more than our fair share (but not those with way, way, way more than their fair share). It’s a world where despite these unmistakable improvements to our everyday lives (all of us: next year or the one after, more than half of the citizens of Earth will be able to call each other on the phone; soon after, two out of three, and we haven’t even begun to think about How This Changes Everything), the main task of my professional career and political life will be fighting a rearguard action against Know-Nothings who reject a rationalist worldview: people for whom evidence is bias or proof of its opposite. It’s a world where the institutions – national and international – that have done such a good job getting us Here (for good and for ill) are terribly ill-suited to getting us to some better definition of There. Some of those will get better: many will get worse.

But here we are, innit? And what is my role in this world, this Future? I’ll greatly enjoy figuring that out.

Read Full Post »

Masculinity

Tom Shales notes what was unavoidable in last night’s Super Bowl:

An oddly recurring theme had to do with men asserting their masculinity, or attempting to assert it, as well as the perpetual male fear of emasculation. In an ad for a very portable television called FloTV, a man was seen being dragged through a torturous shopping trip by his girlfriend while sportscaster Jim Nantz ridiculed him… [this in particular disappointed me – shame, Jim Nantz]

Men in their underwear kept popping up — in a Coke ad, a man sleepwalks in the wilderness, clad in boxer shorts and a T-shirt. His odyssey ends only after he finds a cold bottle of Coke.

An ad for Dockers was keyed to the mantra “I wear no pants!” and featured men in their underwear romping around aimlessly. A funny ad for Career Builder.com, depicting the notion of Casual Friday run amok, showed men and women, most of them anything but physically fit, spending a day at the office in their undies.

Men and their traditional roles were also mocked, but somehow also celebrated, in ads introducing Dove for Men, a line of toiletries. A man raced through a recitation of the chores and good deeds he had obediently done to the tune of Rossini’s “William Tell Overture,” once the theme of “The Lone Ranger” on radio and TV.

An ad for Dodge Charger called the muscle car “Man’s Last Stand” after depicting a supposedly put-upon male who listed all the nice things he did for his female mate. Were these ads for a post-feminist age? They seemed to have a retro appeal — for better and worse. Probably worse.

Not coincidentally, there are these numbers:

[Chart: the top red line is unemployment among workers with less than a high school education; dark yellow is male unemployment; light yellow is female unemployment; and purple is unemployment for those with a college education or more.]

Even the recovery will likely stabilize employment patterns along these lines rather than the previous ones – the industries recovering first (service, health care) are traditionally and disproportionately female, whereas the industries hardest hit (construction, manufacturing) are traditionally and disproportionately male. Contra a lot of doomsaying, manufacturing is fine in the US – we just make more stuff with many fewer jobs than we used to, and so even with huge manufacturing growth we’ll have a yet more robust sector with fewer jobs than before (see, e.g., Chris Anderson’s recent piece on distributed manufacturing).

Construction has been hit hard by the inflation and then rapid popping of the housing bubble, and we shouldn’t want those particular jobs to return. But there’s plenty of stuff to build: repairing and improving our electric grid and crumbling bridges, sewers and other basic infrastructure. And then there’s “green jobs,” new energy generation, etc.

But when anyone talks about what the jobs of the 21st Century are going to be, it’s all about the “knowledge economy,” science and research: jobs that require education. And the numbers aren’t on men’s side here, either.

So yes, there’s “something out there” that advertising firms (who are not dumb) are picking up on, a reworking of previous patterns of gender roles in our new economy. Backlash always comes first. And a lot of people talk about how to “fix” the problem of boys/men being left behind. But that presumes that it is a problem, an assumption which takes the patriarchal status quo ante as some combination of natural, just and correct.

I’m not arguing that a society where increasing numbers of men are un- or under-educated and -employed is a good or desirable thing: it’s pretty clear that over the long term that leads to undesirable outcomes, including but not limited to violence and reactionary political movements (the above set of trends is most definitely one of the things fueling the Tea Parties). But more women in greater positions of economic and political power in this country would be a good thing – and given the macroeconomic trends, it seems likely to be part of our future.

Read Full Post »

Context and Power

Charles Platt, guest-blogging at BoingBoing, shared the following:

The picture above is of me, finishing my shift at the world’s largest retailer. How did I move from being a senior writer at Wired magazine to an entry-level position in a company that is reviled by almost all living journalists?

It started when I read Nickel and Dimed, in which Atlantic contributor Barbara Ehrenreich denounces the exploitation of minimum-wage workers in America. Somehow her book didn’t ring true to me, and I wondered to what extent a preconceived agenda might have biased her reporting. Hence my application for a job at the nearest Wal-Mart.

The job was as dull as I expected, but I was stunned to discover how benign the workplace turned out to be. My supervisor was friendly, decent, and treated me as an equal. Wal-Mart allowed a liberal dress code. The company explained precisely what it expected from its employees, and adhered to this policy in every detail. I was unfailingly reminded to take paid rest breaks, and was also encouraged to take fully paid time, whenever I felt like it, to study topics such as job safety and customer relations via a series of well-produced interactive courses on computers in a room at the back of the store. Each successfully completed course added an increment to my hourly wage, a policy which Barbara Ehrenreich somehow forgot to mention in her book.

Somehow that kind of news is never as popular as denunciations of the free market written by professional handwringers such as Barbara Ehrenreich.

Charles Platt, like the vast majority of WalMart’s management – senior corporate headquarters, store-located, everything – is a white man. Perhaps this difference between himself and Ehrenreich crossed his mind and he chose to ignore it, or maybe it never came up. But to point out the very obvious: white men have a massively disproportionate amount of power in the United States. Whites generally have lower rates of unemployment and higher rates of compensation; white men have higher rates of compensation and lower rates of unemployment than white women. Non-Hispanic white men comprise ~34% of the U.S. population and, the era of Obama notwithstanding, continue to control the vast, vast majority of U.S. wealth and political, cultural, [FILL IN THE BLANK] power.

Platt isn’t a racist for not acknowledging these issues, nor a misogynist, nor do I claim any particular valor for foregrounding my own white male privilege. It’s a marker of how ingrained and powerful that privilege is that an intelligent guy like Platt – a man who’s made his bread as a writer and cultural observer for going on 40 years – could be quite so blind to it. The most revealing turn of phrase in his account is when Platt acknowledges that “Somehow [Ehrenreich’s] book didn’t ring true to me.” Well, of course it didn’t – that’s kind of the point. It’s not about Platt – it’s about the two-thirds of the population (more, when you take into account class distinctions) who aren’t Platt and don’t have his built-in racial and/or gender advantages.

More in-depth treatment of other issues of race and achievement later this week, but this kind of set me off.

Read Full Post »

Education

This morning has been chock-a-block full of exciting and hopeful news on the K-12 education front.

First, I was excited to hear “controversial” Washington, D.C. Chancellor of Schools Michelle Rhee on the Diane Rehm Show (you can listen to the whole hour there). Suffice to say, the moribund state of the DC schools demanded some kind of radical action, and Rhee has been undaunted in moving forward in the teeth of sometimes-strong opposition from some elements of the teachers’ union. I’m extraordinarily skeptical of any school “reform” efforts that have as their first priority some sort of union-busting (either de jure or de facto) measures, and I find much of the school reform movement to be basically a politically-motivated attack on teachers’ unions. That being said, the pathetic performance of DC education and too many shameful actions undertaken or defended by the DC teachers’ union have lost them the benefit of the doubt, at least for now. Rhee makes a convincing case for performance pay and against purely experience-based granting of tenure. I wasn’t entirely blown away by Rhee – she made a pretty lazy rhetorical defense against a caller accusing her of creeping school privatization, noting only that she was also getting attacked by charter school advocates. But overall, I remain very excited for the new direction Rhee is charting for DC’s public schools.

Following almost directly on that was President-elect Obama’s announcement of Arne Duncan as the Secretary of Education. Steven Levitt is excited and that’s a pretty good indicator as far as I’m concerned:

He is smart as hell and his commitment to the kids is remarkable. If you wanted to start from scratch and build a public servant, Arne would be the end product.

Sounds pretty good! Of course, Secretary of Education is a position that has highly variable levels of influence depending on who’s President, but given Obama’s record at the state and Senate level, combined with having school-age children, and the frequency with which he talks in an impassioned way about education issues (including at today’s presser), it would seem to be the case that Duncan will be pretty well empowered.

Finally, I stumbled across this article at the Washington Post, detailing how my K-12 school system, Montgomery County (MD) Public Schools, is doing away with the “gifted” classification:

The label of gifted, as prized to some parents as a “My Child Is an Honor Student” bumper sticker, is about to be dropped by the Montgomery County school system.

Officials plan to abandon a decades-old policy that sorts second-grade students, like Dr. Seuss’s Sneetches, into those who are gifted (the Star-Belly sort) and those who are not. Several other school systems in the region identify children in the same manner. But Montgomery education leaders have decided that the practice is arbitrary and unfair.

Two-fifths of Montgomery students are considered gifted on the basis of aptitude tests, schoolwork, expert opinion and parents’ wishes. Officials say the approach slights the rest of the students who are not so labeled. White and Asian American students are twice as likely as blacks and Hispanics to be identified as gifted.

The article doesn’t explicitly mention Dweck’s and Steele’s research, but the implications of “gifted” versus non- for theories of self and stereotype threat are hard to ignore. Telling kids that they’re “gifted” leads to entity theories of self, which leads to difficulty later; telling kids that they’re “not gifted” leads to academic disengagement; and kids seeing for themselves the racial and ethnic disparities in gifted classification leads to the formation and reinforcement of pernicious stereotypes about intelligence and academic performance. A change in semantics isn’t going to change all of this, but it’s an absolutely excellent first step.

Though to be fair, this is not the best example:

Montgomery officials say their formula for giftedness is flawed. Nearly three-quarters of students at Bannockburn Elementary School in Bethesda are labeled gifted, but only 13 percent at Watkins Mill Elementary in less-affluent Montgomery Village are, a curious disparity given that cognitive gifts are supposed to be evenly distributed.

As a proud alumnus of Bannockburn Elementary School, I profess to exactly zero surprise that we have so thoroughly whipped those little slackers from Watkins Mill.

Kidding! Kidding! Doing away with gifted-ness is an excellent thing, no caveats.

Read Full Post »

What is Facebook for?

Alice Marwick directed me to an interesting analysis of Facebook’s redesign, which posits that,

Facebook’s new design, as many of us have been noting since the company began testing it months ago, seems to emphasize features also seen in trendy new web services favored by us self-styled “early adopter” types.

Mark Slee of Facebook, in talking about the redesign, says:

The profile is very personal; it’s important to us that everyone have control over their own profile. Along those lines, once you’ve published stories or posted content, you can adjust the size to promote the things you care about most, and demote the stories you don’t find as interesting.

And in discussing the redesign with my friend BC yesterday, he noted that the redesign eliminated clutter, and that the news update had eaten the rest of Facebook: no more static pages. Taken together, it’s a clear change in stance for Facebook. Just as they’d re-adjusted to saturation within college campuses by opening it to everyone, they’re now positioning themselves in response to the rise of micro-blogging services like Twitter by centralizing the News Feed in its presentation – trying to corral those users they see spending more time and energy elsewhere.

I think this may be a key misstep by Facebook – because as “hot” as Twitter and similar services are, their actual user bases are still very, very small (a few percent of Facebook’s, even with explosive growth), if amplified by disproportionate representation in early-adopter communities like bloggers. But it’s not a shock that Facebook would be looking for a solution to some problem, at this point in its history: as my colleague Fred Stutzman noted last November, Facebook’s blessing is also its curse:

Ego-centric social network sites all suffer from the “what’s next” problem. You log in, you find your friends, you connect, and then…what? Social networks solve this problem by being situationally relevant. On a college campus, where student real-world social networks are in unprecedented flux, Facebook is a social utility; the sheer amount of social information a student needs to manage as they mature their social networks makes Facebook invaluable.

… What happens when a social network is no longer situationally relevant? Use drops off.

…Try as they might, once ego-centric social networks lose situational relevance, it’s pretty much impossible for them to retain their status.

…The coolest tools, the best exclusive media – these are only “fingers in the dam” to keep users in non-situationally relevant spaces.

This clarifies the problem. For many users past the initial high-use, friend-finding phase – whether at college or in the post-college working world – Facebook’s situational relevance as an ego-centric social network, based on one’s connections to other individuals, is not primarily about finding out what your friends are doing. Rather, it’s a low-involvement way of tracking where they are and where they go, and of keeping in touch with them. Ongoing research that Fred and I are conducting shows that even among current college students, the intensity of Facebook use and identification follows the familiar pattern of decline over time, even if the site is not abandoned entirely.

And Facebook is caught in a bind because, having both accepted venture capital infusions and sold off a sliver to Microsoft, it now has a very particular interest to keep chasing: increased profit growth. For Facebook, this means, exclusively, increased advertising revenues. synecdochic has a long, detailed analysis of why this is a bad/hopeless place to be for social media enterprises (which are slightly different from social network sites, but similar lessons apply here), but long story short: you can’t squeeze blood out of a rock, and chasing increased ad revenue with a user base whose use is already declining is a very self-defeating proposition.

The vast majority of Facebook users are still Digital Natives who’ve never had (high school or) college without also having Facebook. For them, it’s not simply a case of Facebook becoming passé – it’s a matter of their changing social needs, of shifting situational relevance. As they move into different social contexts – college, work, new cities – there may be bursts of activity where they add and approve new friendships, but it won’t be a “place” to spend time in the same way. Having a new Facebook where they can “adjust the size [of items] to promote the things you care about most, and demote the stories you don’t find as interesting” is beside the point if your relevant social reality is mostly taking place elsewhere – indeed, it’s just more things to ignore.

For all the worries about Digital Natives’ social lives moving to endless hours in front of the computer screen, preliminary research is showing very different effects: “a strong association between use of Facebook and… social capital.” Facebook, and social media generally, are a way to connect with friends, but not the place to connect with friends: that still happens mostly IRL, and ultimately it may be the case that after a “Facebook phase” of socially connecting, offline socialization may increase in aggregate over time (though that’s pure speculation on my part, for now). But the bottom line is that already-experienced Facebook users aren’t going to take on the characteristics of techie “early adopters,” and they aren’t going to go back to “hanging out” on Facebook. Redesigning layout to foreground the News Feed won’t change that.

Why are these issues worth such thorough examination? I believe that as Digital Natives especially come of age in a world of networked publics, where increasingly even offline actions are archived or accessible online, it’s important to follow the shape that the infrastructures of these publics take. While ultimately I do not believe that this redesign of Facebook will achieve the desired goal of restoring high usage levels among long-time users, it will, without a doubt, create a very different experience for newer users of Facebook (which has and will retain a central role in the social lives of millions of young people). For them, Facebook will become mostly about watching other people, and seeing what they do. Will they like it? Will it turn them into voyeurs, or make them more susceptible to suggestion and peer pressure? Will they identify more naturally in groups rather than as individuals? As always – more questions on which only time will tell.

(cross-posted at Digital Natives)

Read Full Post »
