Human Freedom

Lost in Friday-news-dump-land among all the election mishegas was this:

Reversing a longstanding policy, the federal government said on Friday that human and other genes should not be eligible for patents because they are part of nature. The new position could have a huge impact on medicine and on the biotechnology industry.

The new position was declared in a friend-of-the-court brief filed by the Department of Justice late Friday in a case involving two human genes linked to breast and ovarian cancer.

“We acknowledge that this conclusion is contrary to the longstanding practice of the Patent and Trademark Office, as well as the practice of the National Institutes of Health and other government agencies that have in the past sought and obtained patents for isolated genomic DNA,” the brief said.

Regardless of whatever happens on Tuesday, this is a huge win for the future of human freedom and well-being. As we further our knowledge of genetics, leading to even greater advances in potential human wellness, those windfalls should not be the property of any individual or corporation, but rather should accrue to humanity in general. This one ruling won’t ensure that, but it reflects a necessary and welcome shift towards a more basically just future in what will be one of the most important industries and areas of development of this century.

Cell Phones and Polling

As Nate Silver reported the other week:

Pew Research issued a study suggesting that the failure to include cellphones in a survey sample — and most pollsters don’t include them — may bias the results against Democrats. Pew has addressed this subject a number of times before, and in their view, the problem seems to be worsening. Indeed, this is about what you might expect, since the fraction of voters who rely on cellphones is steadily increasing: about 25 percent of the adult population now has no landline phone installed at all.

Clearly, this is a major problem in survey research — and one that, sooner or later, every polling firm is going to have to wrestle with. What isn’t as clear is how much of a problem it is right now.

He goes on to cover several of the key issues that are specific to this case and time, but I’ll focus for a minute on the larger-scale issues. I’ve talked about some of these ideas before, and indeed we were talking about cell-phone undercounting on the Dean campaign in 2003 and the Kerry campaign in 2004 (not, as it turned out, the biggest problem in either of those cases). But as Nate says: this is a major problem that sooner or later everyone is going to have to deal with; it’s just a question of when.

Will that be this year? Hopes of Democrats aside, probably not – or at least, not provably, given the substantial problems in constructing likely-voter screens this cycle. But when the dust settles and the post-election analyses are done, all the pollsters are going to have to take a good, long look at their numbers and at the actual results, and, through the lens of Pew’s findings, begin to (or further) adjust their approaches. Because by 2012, an even larger share of the voting-age population will be living in cell-phone-only households, due both to the continued abandonment of landlines by older demographics and to the maturation of millions more who’ve never had a landline (and mostly never will).

This isn’t an impossible problem, but it’s also not solvable with a silver bullet. Polling, like any sort of research, is going to need to become more multi-modal, faster-thinking and -responding, in order to reflect anything like a generalizable sample of the population. This means working harder, thinking more and understanding better the ways in which all different sorts of people use different kinds of communications technologies.
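
To make that concrete, here’s a minimal, purely illustrative sketch of one piece of the adjustment: post-stratifying a sample by phone status. Every number below is made up for the example (they are not Pew’s figures, and real pollsters weight on many more dimensions); the point is only to show the mechanics of how leaving out cell-only households can shift a topline by a couple of points.

    # Illustrative sketch of post-stratification by phone status.
    # All shares and splits below are hypothetical, not Pew data.

    # Assumed population shares by phone status
    population_share = {"landline": 0.75, "cell_only": 0.25}

    # Assumed Democratic support within each stratum
    support_dem = {"landline": 0.47, "cell_only": 0.55}

    # A landline-only sample effectively gives cell-only voters zero weight
    unweighted = support_dem["landline"]

    # Post-stratified estimate: weight each stratum by its population share
    weighted = sum(population_share[s] * support_dem[s] for s in population_share)

    print(f"Landline-only estimate: {unweighted:.1%}")        # 47.0%
    print(f"Phone-status-weighted estimate: {weighted:.1%}")  # 49.0%

With these assumed numbers, the landline-only poll runs about two points low for Democrats – the direction (if not necessarily the size) of the bias Pew describes.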

Your Voices, Our Selves

One of the best ongoing investigations of thought and the universe is Radiolab, a show produced at WNYC by Jad Abumrad and Robert Krulwich (no small point of pride to me: both are Oberlin grads). One of their very best and most mind-blowing episodes came a couple of months back, called “Words.” I’d recommend you listen to the show in its entirety, and there are dozens of strands I could pull out and discuss all day. For now, I’d like to focus on the (intentionally) provocative claim made by Charles Fernyhough, a writer and psychologist at Durham University (UK):

“I don’t think very young children do think.”

Spinning this out in a later podcast led, to my total delight, to an in-depth discussion of L.S. Vygotsky’s theories of self and child development, especially the internalization of speech – learning to refer to oneself as one refers to others. The podcast focuses on non-normative variations in development – how sometimes people internalize not just their own voice but other voices as part of their internal monologue. Or dialogue. This can, in its worst instantiations, lead to things like schizophrenia, which is bad.

But I’d like to move one degree further, and think about how these issues relate to ideas of ourselves, and to our shifting media consumption and discussion habits.

Contra the much-discussed Death of Reading, the media landscape today in fact represents the apogee of reading in all of human history. More people are literate today than ever before, and they consume more written text than ever before. That they do not all do so through a printed medium called “book” or “newspaper” is beside the point, as is the fact that they also watch television. Words are being consumed and produced in simply staggering amounts, and a great deal of many people’s days – in the developed world and in less-developed countries alike – is spent consuming and producing words internally.

What is the effect, then, of all these internal words on our own personal monologues? What is the effect, in particular, of the chatter of social media, where the voice is not our construction of some anonymous (or not-so-anonymous) authority from some Media Source, but that of people we know, whose actual voices – both written and spoken – we are familiar with?

One of the most elegant definitions of self, to my mind (also referenced in “Words”), is that it is nothing more than a continuous story we tell: one thing happened, then another, then another, all in the same voice, and that’s how I’m me. Schizophrenia and similar disorders are so terrifying because that basic premise is violated – all of these voices are competing for attention, and it becomes impossible to determine what is real, or who you are.

Pulling all of these threads together, then, the question becomes: what happens when the story of ourselves becomes the story of our selves? When the “I” is spending so much time with the “we” and the “they” inside our skulls? As a purely personal anecdote, I do know that while I know more specific and timely things than I used to, source attribution is often murky. Did I hear that on the radio, or when talking to a friend? Did I think it myself, or read it on a blog? Does it matter?

This is not a new question or problem, entirely – the tension between individualism and communitarianism stems from the same dynamic. But the scale of this shift in our internal voices is unprecedented, as is the breadth of effect in the day-to-day lives of people in our technologically-mediated culture. While I tend to eschew both Utopian and Dystopian readings of technology’s effects on us (the Internet being, like Soylent Green, made of people), I do think that it’s worth considering (agnostically) what the longer-term effects of a society-wide shift in the kinds of internal voices we maintain might entail. Probably a big deal.

“The world is changed… much that once was is lost, for none now live who remember it.”

I’ve lately had the sensation of living in the future – not the future of robots and flying cars (both in still-tragic short supply) but the future of my life, the future of something New and Different. This has caused me, in turn, to consider just what it is that is new or different, and just what is meant by Future, Past and Present.

We are all of us the epic heroes of our own tales, the centers of action and narrative direction, the most dramatis of personae. So it is fairly obvious to see why my internal librettist would determine this to be a turning point in the action: some months of great preparation leading to a grande moment, followed by a change of scene and a journey into the unknown. The curtain falls, the screen fades to black, End of Book Two – to resume in medias res some months or years along, when sufficient interest and tension have built for my next act.

Human that I am, I look for patterns to justify this perception, and believe that I have found them. From where I stand now, the 2000s look like a migraine-filled interregnum – citizen of a country making awful decisions, resident of a planet trundling into irreparable change, confused twentysomething unsure of my place in the world or in myself. The Bush years, even while ongoing, always had the eerie unreality of a dream state. That they were succeeded by the election as President of a black man named Barack Hussein Obama was no less hallucinatory, even if I have the pictures on my cell phone to prove it.

And now awake, and the dream was more and less true for good and bad, but we must live with what was wrought through sleepwalkery. I am an adult (or something like it) in this world after kidhood in the pleasant-smelling 1990s, but even while history spins around again the wheel’s not quite the same for its rotation. Anti-government zealots were just as crazy and well-funded in the bad old days of the Arkansas Project and Tim McVeigh, but today’s wingnuts are self-consciously the stars of their own reality television shows and the media an ever-more-efficient conduit for that effluent.

But then there’s always authoritarians, aren’t there, no matter the names they use or the shirts they wear. My villains of copyright maximalization, seedbank patent-squatters and cynical political operatives sure seem to be wearing black hats: everyone does in silhouette.

I can’t really worry about that, though – can’t have access to more than one subjectivity, can’t have the cut-shot pan-over Cinemascope wide angle. Acting As If is the best I can manage.

So for me, right now, I’ve arrived in the future. Things change always, but a period of flux is over and a new dynamic will be the setting for our action over the next little while. It’s a world where the benefits of communications technology accrue in innumerable ways to increasingly huge numbers of the world’s people, but where material economic growth will remain stagnant for the foreseeable future – especially for those of us who already have more than our fair share (but not those with way, way, way more than their fair share). It’s a world where despite these unmistakable improvements to our everyday lives (all of us: next year or the one after, more than half of the citizens of Earth will be able to call each other on the phone; soon after, two out of three, and we haven’t even begun to think about How This Changes Everything), the main task of my professional career and political life will be fighting a rearguard action against Know-Nothings who reject a rationalist worldview: people for whom evidence is bias or proof of its opposite. It’s a world where the institutions – national and international – that have done such a good job getting us Here (for good and for ill) are terribly ill-suited to getting us to some better definition of There. Some of those will get better: many will get worse.

But here we are, innit? And what is my role in this world, this Future? I’ll greatly enjoy figuring that out.

Planet Money reports:

Phones running Google’s Android operating system outsold the iPhone in the first quarter of this year. What’s more, BlackBerry phones outsold both iPhones and phones running Android.

BlackBerry phones, which run an operating system from Research In Motion, had 36 percent market share, according to NPD group, a research company. Android phones (including the widely advertised Droid) had a 28 percent share. And iPhones, which run Apple’s own operating system, had a 21 percent share.

It was the first time Android phones outsold iPhones.

…and iPhones will never again outsell Android phones, until and unless Google renames Android. Here’s why.

The iPhone is a great tool, and as Charles Stross has been pointing out for a while, Apple is making a big bet that their future is not as a hardware company with ancillary software but as a platform company with ancillary hardware. Because Apple is Apple, they want to control this platform totally, so they make the hardware that runs the platform, and they control the price point. This works up to a point, but simple economics on both ends (theirs and consumers’) dictates that there’s necessarily a ceiling on both the number of iPhones they can make and the number of people who might buy them. Apple survives by making those two numbers as similar as possible, but its share is never going to approach 100% – or even 50. Twenty percent market share is pretty substantial, but I wouldn’t anticipate Apple’s share of any market getting bigger than that.

Research In Motion’s continued dominance of the smartphone market is pretty impressive, and they’ve wisely kept their sights firmly focused on doing one thing and doing it well: making a business- and email-centric device that just plain works, one that its users stick with through multiple generations and structure their digital lives around. The BlackBerry appears to have staying power, but with a substantial caveat: it’s a perfect device for email (and texting) but not for Web 2.0 and the social web. That’s fine – there will always be a business and power-user market – but it’s tough to see RIM’s market share increasing much beyond where it is now (I’d expect it to shrink and then hold steady at a lower level), because as my research shows, young people don’t really have email as the central communications method of their digital lives. Phones are central, texting especially so, and the social web after that. Email is for professors and the professional world, so for those who head that direction, BlackBerrys are in their future.

But Android really is the future of the mobile environment, over the next several years.  Like the Apple ecosystem, it’s an app-heavy and social-web-facilitating environment, but unlike Apple and RIM, Google is happy to let anyone (on any mobile network) make phones that run its OS – and thus experience the mobile web how Google would like you to do so. Which is, for the moment at least, preferable: no censorship in its app store, and a wide (and widening) range of hardware choices on your preferred mobile carrier.  Anything that fights against the Web turning into a set of walled gardens, I can heartily endorse. Android will also push prices down for all smartphones and for access to the mobile web by offering experiences comparable to the iPhone and Blackberry without the single-system lock-in, and that’s (clearly) preferable, too.  While the jury is still out on Google’s entrance into the hardware world, it’s not as important as their introduction and support of an open and high-quality platform for the Web to move to mobiles without intermediation and without sacrificing its values and variety.

Forcing the Party Line

When telephones were first invented, you didn’t just call a number and get a person on the other end: usually, first you’d talk to an operator, who would then connect you to a local loop where the desired party resided.  There were special rings within each loop to distinguish who was getting the call, but if someone else on your loop or the loop you were calling wanted to listen in, you couldn’t stop them.  This was a function of cost – it was pretty expensive to get a residential telephone line before the federal government guaranteed universal access and then deregulated the phone companies.
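
As a side note, a tiny, purely illustrative model of that arrangement (the households and ring codes are invented for the example) makes the privacy problem obvious: on a shared loop, the caller can address a particular household, but can’t control who else hears.

    # Purely illustrative model of a shared party line: one circuit per loop,
    # calls distinguished only by ring pattern, and any subscriber can listen in.

    class PartyLine:
        def __init__(self, subscribers):
            # ring code -> household, e.g. "two short" -> "Bakers"
            self.subscribers = dict(subscribers)

        def ring(self, ring_code, message):
            intended = self.subscribers.get(ring_code, "unknown party")
            print(f"Ringing {intended} ({ring_code}): {message!r}")
            # Everyone else on the loop can pick up the same call
            others = [n for c, n in self.subscribers.items() if c != ring_code]
            print(f"Also audible to: {', '.join(others)}")

    loop = PartyLine({
        "one long": "Andersons",
        "two short": "Bakers",
        "short-long-short": "Carlsons",
    })
    loop.ring("two short", "The doctor says come by at noon")

The same shape comes up again in the Facebook discussion below: a system where the audience for any given message is the whole loop, not just the party you meant to reach.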

This was, it’s pretty well agreed, a bad system notwithstanding the excellent fodder it produced for light farce.  The residential system that replaced it was pretty problematic, too, leading as it did to:

a 70 year or so period where for some reason humans decided it was socially acceptable to ring a loud bell in someone else’s life and they were expected to come running, like dogs.

So, not the best either, even though it too did produce some great songs.

The growth of electronically-mediated and time-shifted communications may have a mixed record on a lot of issues, but it’s an unambiguous good in terms of individual control over the method, mode, and timing of one’s responses to communications. Communications where the sender is unsure of the extent of the audience, or where the receiver is potentially forced into confrontation, are not beneficial for either the clear conveyance of meaning or social cohesion.

Which is why Facebook’s recent actions are both troubling and perplexing. By making all connections public for all users, Facebook is blurring the audience and forcing potential confrontations (between managed identities, between work and personal lives, and so on) for everyone who uses it. The shift in Facebook’s privacy settings takes as its central premise that the advances in telephone communications of the past century were a bad idea. It is forcing all of its users into an always-on global party line, where the conversations are transcribed and sold to all interested parties. That’s not good.

Digital technologies give us the ability to talk to basically whomever we want (and only them) whenever we want (and only then). That Facebook would consider these to be bad things is deeply weird, and makes a compelling case against using it as a central mode of communication.

Police States are Bad Ideas

This is why:

Lower Merion School District employees activated the web cameras and tracking software on laptops they gave to high school students about 80 times in the past two school years, snapping nearly 56,000 images that included photos of students, pictures inside their homes and copies of the programs or files running on their screens, district investigators have concluded.

In most of the cases, technicians turned on the system after a student or staffer reported a laptop missing and turned it off when the machine was found, the investigators determined.

But in at least five instances, school employees let the Web cams keep clicking for days or weeks after students found their missing laptops, according to the review. Those computers – programmed to snap a photo and capture a screen shot every 15 minutes when the machine was on – fired nearly 13,000 images back to the school district servers.

If authorities have the ability to behave badly, some of them always will. Which is why stuff like this is especially bad:

The MPAA and RIAA have submitted their master plan for enforcing copyright to the new Office of Intellectual Property Enforcement. As the Electronic Frontier Foundation’s Richard Esguerra points out, it’s a startlingly dystopian work of science fiction. The entertainment industry calls for:

  • spyware on your computer that detects and deletes infringing materials;
  • mandatory censorware on all Internet connections to interdict transfers of infringing material;
  • border searches of personal media players, laptops and thumb-drives;
  • international bullying to force other countries to implement the same policies;
  • and free copyright enforcement provided by Fed cops and agencies (including the Department of Homeland Security!).

The Fourth Amendment has been gutted pretty extensively over the past generation, but if “unreasonable search and seizure” has any meaning at all, it should mean that neither the government nor private corporations should be legally empowered to constantly monitor our activities through our own computers.

I’ve thought for a while that the coming divide in our politics is not going to be one of conservatism versus liberalism, but of authoritarianism versus a politics of individual liberty. News like this reinforces that belief, as does the acrimony of our current political climate in the US. More on the latter, later, but I’ll reiterate the main point: it’s a bad idea to give authorities unlimited surveillance powers, because they will always, always be abused.