Saturday, January 30, 2010

on homophobia, classism, and the politics of rape: Don Belton and Bloomington's Pride Film Festival

I want to talk about Don Belton.

Belton, you may remember, was the Indiana University professor who was found stabbed to death in his home two days after Christmas. He was the gay Indiana University professor; his killer, ex-Marine Michael Griffin, has not only confessed but has explained his motive for stabbing Belton:

The former military man told police that Belton, who was openly gay, sexually assaulted him in front of his girlfriend, while they were both intoxicated on Christmas Day. And because the assistant professor of English refused to "show remorse," Griffin stabbed him to death, according to court documents.

Bloomington's LGBTQ community was hit fairly hard by the news of Belton's death. In part, this is because Belton was well liked; and in part, this is because the killing repeats the old message that nobody wants to be reminded of: It's (still) not safe to be gay in America.

A web site called "Justice for Don Belton" was built. Vigils were held. Press releases (here, here) were circulated mourning Belton's death and noting the loss to the IU community. And this year's Pride Film Festival, an annual LGBTQ event held in downtown Bloomington, has been dedicated to Belton's memory.

All of this for someone who has been officially accused of sexual assault.

If this were a hetero situation, and the killer were a woman who claimed to have killed a man after two incidents of sexual assault, there would be no vigils. There would be no websites. There would be no film festival dedicated to the dead man's memory. And rightly so: After centuries of struggle, we have finally started to evolve into a society that does its best to side with the alleged victim in cases of sexual assault. We aren't a society that does its very best, of course, and you know, we sort of keep having to have the same conversation every time it comes up: Rape is not about sex. It's about power. And women who accuse a man of literal rape have been subjected to metaphorical rape by a court system that embraces a blame-the-victim mentality. And so on. But we're trying, and we're getting better at having these conversations.

And of course this isn't a hetero situation, and the gender, power, and sex issues don't map. We pretty much don't believe that Belton was a rapist or that Griffin was a victim; we believe--and, to be clear, I believe--that Belton was brutally murdered, and that the motive was homophobia. Homosexuality is a deep threat to heteronormative culture, to the status quo. It's dangerous and terrifying and the most insecure among us believe it must be blotted out. With violence, if needs be.

Belton's death is a reminder that no matter how far we've come, we're still a society that cannot guarantee the safety of its marginalized members. Bloomington was recently named America's 4th gayest city by the Advocate, which confuses me but let's go with it for now. And this year's Pride Film Festival, which is unfolding at the Buskirk-Chumley Theater this very weekend, has drawn hundreds, if not thousands, of beautiful, joyous, and celebratory LGBTQ and LGBTQ-friendly community members. But all it takes is misreading one person, or showing up at the wrong bar at the wrong time, or acting a little too gay, or even just holding your partner's hand in public, and the Great Lie starts to unravel. It's not safe to be gay in America. It's not even always acceptable to be gay in America.


This isn't to say the reaction of the LGBTQ community to Belton's death is completely ick-free. There is the issue of classism. Part of the reason we don't believe Griffin is that Belton was so cultured. He was well educated. He was a writer. He was a professor, for godsake. He couldn't have possibly raped someone. I mean, just look at his picture:

[photo of Don Belton]

Here's Griffin, an ex-Marine, 23 years old:

[photo of Michael Griffin]

Leaving aside issues of race--not because I think we should leave those issues aside, but because I'm not qualified to talk about race--we craft a narrative around Belton and Griffin, and it's a narrative that points to deep class assumptions that hover above issues of gender and sexual orientation. It's the same sort of narrative that frames, for example, the story of Tiger Woods and his multiple mistresses ("Cocktail waitresses! Pancake servers! Why's Tiger rooting around in the trash?!?"), our attitudes toward celebrities ("Britney Spears--you can take the girl out of Hicksville, but...."), and the political decisions that undergird our social structure.

It's easier and simpler to use Belton's murder as a touchstone for conversations about the state of gay rights in America. In fact, this story, like all stories worth telling, is far more complicated and multithreaded. And like all stories worth telling, the work of interpreting its details is far less clear-cut than it seems at first blush.

Wednesday, January 27, 2010

RIP Howard Zinn

Best known, I suppose, for his A People's History of the United States, Howard Zinn was a relentless force for change. He fought for the poor, the underclass, the underprivileged, and the underheard.

In his 2002 book, You Can't Be Neutral on a Moving Train: A Personal History of Our Times, Zinn describes his own awakening into an awareness of the deep inequities built into American society. He writes:
As I began to realize, no pitifully small picket line, no poorly attended meeting, no tossing out of an idea to an audience or even to an individual should be scorned as insignificant.

The power of a bold idea uttered publicly in defiance of dominant opinion cannot be easily measured. Those special people who speak out in such a way as to shake up not only the self-assurance of their enemies, but the complacency of their friends, are precious catalysts for change.

Howard Zinn was 87 years old. Even so, the time we got to have him seems pitifully brief.

Sunday, January 24, 2010

new technologies bore the crap out of me.

Despite what you may have heard, I'm not really all that into new technologies.

Typically, I find out about new technologies long after they're already old news. This is a constant source of shame for me. ('Hey,' I said in late 2009, 'this cloud computing thing sounds interesting. What is it?') As much as I would like to join the ranks of early adopters, I simply lack the constitution. ('Now, what's this DROID thing I've been hearing so much about this year? Oh, it's been around since 2007? Well, who makes the Droid? Oh, it's a mobile operating system and not actually a phone? Can you tell me again what a mobile operating system is?') My buddy Steve, who likes to find out about new technologies when they're still in prototype form, regularly subjects me to conversations I don't really understand about technologies that don't make sense to me. (Here I would insert a sample conversation if any single representative discussion had made enough sense to me to stick in my memory.)

Technologies bore me. I don't care about 3-D printers or 3-D TVs. I'm not interested in transparent computer screens. I don't want to know how my analog watch was made, and I don't care how light bulbs--even the new, energy-efficient ones--are manufactured.

Though I don't care about how things are made, I am interested in finding out how things work. This is a subtle but important distinction. I want to learn how lean, mean websites are built, and I want to build some of my own, even though I have absolutely no idea how my computer is able to translate Hypertext Markup Language (HTML) into artifacts that humans can interpret. I don't know what a "3G network" is or how my new Droid phone uses it to give me constant access to its features, but I do want to know how to set up my phone so I can quickly access contact information for the people I communicate with the most. I would also like to know how to set up my calendar for weekly views instead of daily views.

It's not the technology that interests me, but its uses. And as long as I'm thinking about uses for a technology, I might as well think about how to manipulate its features to support practices that meet my needs.

Clay Shirky, despite his recent unfortunate foray into gender politics, is actually pretty smart when he's talking about things he's qualified to discuss. In his 2008 book Here Comes Everybody, he wrote that "communications tools don't get socially interesting until they get technologically boring." And he's absolutely correct: For socially interesting things to happen, widespread adoption of a technology is the first requirement, complemented by widespread use of the technology. The automobile has led to a reshaping of our roads, our communities, our attitudes toward travel--has, in short, become socially interesting--because its novelty has long since worn off as car ownership has inched toward universal. Cellphones have supported uprisings, protests, revolutions, because we've had them around long enough to know how to leverage them for purposes for which they were not originally intended.

In general, I've made uneasy peace with my apathy toward new technologies, with one caveat: It's early adopters who get the earliest, if not the most, say in how technologies are taken up as they become more widespread. And early adopters tend to be young, white, male, college-educated, and affluent. Which is fantastic for them but not so great for people whose needs and interests don't align with the needs and interests of the rich-young-educated-white-guy demographic.

Still, you just can't do things your body wasn't meant to. I don't guess I'll ever be able to force myself to care about 3G networks, but it's easy enough to start thinking about the social implications of a tool that's 3G-enabled and pocket-sized. Now we're talking about the possibility of near-limitless access to information and communication: the building blocks for fostering and supporting civic engagement, community participation, and the chance to dance up alongside those early adopters, join them for a while, and modify the music to make a tune that's easier to dance to.

Now we just need to figure out a way to get everyone dancing. We start by lowering the actual cost of admission (this is one way that early adopters help support the social revolution: They pay $499 so you don't have to!), then we start pounding down the perceived cost of admission:

  • technological Woody Allen-ism, the fear of trying a new tool lest you look stupid and/or cause irreparable harm to yourself or others;
  • technological Pangloss Syndrome, the perception that the uses built into the tool are the best possible uses for that tool; and
  • technological Morpheus Syndrome, the sense that uses for a tool have already been predetermined anyway, so even if there might be better uses we might as well just stick with destiny.

And--hey!--I think I just gave myself fodder for my next three blog posts.

Friday, January 22, 2010

george w. bush reaches one rotting hand from the grave

I believe campaign finance reform is the most important issue in contemporary American politics. The unfettered ability of corporate and interest-based groups to maintain or remove politicians is the reason behind every evil, bigoted, and short-sighted law our legislative bodies pass. It's the reason behind the slow pace of the political machine. It's the reason so many Americans feel there's no point in voting since they don't really have much of a say in who wins anyway. (And they're right to feel so.)

Now the Supreme Court of the United States has cleared the way for lobbyists and Big Business to grab even more control of our laws and our land by ruling, 5-4, to overturn campaign finance laws that limited what corporations and other organizations could spend to influence elections.

President Obama said the ruling was "a major victory for big oil, Wall Street banks, health insurance companies and the other powerful interests that marshal their power every day in Washington to drown out the voices of everyday Americans."

Obama is right. This is a dark, dismal day for American politics. The Supreme Court's ruling has plunged us back into the cesspool of corruption that we've spent a century trying to clamber our way out of.

The Bush administration, with its installation of a pair of corrupt, right-wing, and deeply shortsighted justices, can claim credit for this ruling. History will label the Bush tenure the last gasp of an era of brazen corruption, greed, and reckless narcissism. For those of us who loathed George W. Bush and his lackeys from the get-go, that day cannot come soon enough.

It also won't feel anything at all like sweet revenge. We're all mired in the shit now, regardless of who we voted for.

Thursday, January 21, 2010

"Math class is tough!" a few thoughts on a problematic metaphor for learning

Academics, and especially academics who think about culture (which is to say, more or less, all academics), seem to really like metaphors and similes. Here's one that made me mad this week.


Jim Greeno: Learning how to participate is like being in a kitchen.
Situativity theorist Jim Greeno, in "Number Sense as Situated Knowing in a Conceptual Domain," considers how people develop conceptual models for participating in disciplinary communities (what he calls "conceptual environments"). He explains that
knowing how to construct models in a domain is like knowing how to work in an environment that has resources for a kind of constructive activity, such as a woodworking shop or a kitchen.

A shop or a kitchen has objects, materials, and tools that can be used to make things. Knowing how to work in such an environment includes knowing what objects and materials are needed for various constructive activities, knowing where to find those objects and materials in the environment, knowing what implements and processes are useful for constructing various things, knowing how to find the implements, and knowing how to use the implements and operate the processes in making the things that can be made.

In constructing conceptual models, the ingredients are representations of specific examples of concepts.... We can think of the conceptual domain as an environment that has representations of concept-examples stored in various places. Knowing where to find these, knowing how to combine them into patterns that form models, and knowing how to operate on the patterns constitute knowledge of the conceptual domain. The representations of concept-examples have to be understood in a special way. They are not only objects that are drawn on paper or represented in the mind. They are objects in the stronger sense that their properties and relations interact in ways that are consistent with the constraints of the domain.

This example would be fine if everybody agreed on a.) where everything belongs in a kitchen; b.) what everything in the kitchen should be used for; c.) what activities afforded by the kitchen are most appropriate; and d.) whether the kitchen is appropriately and effectively designed.

Let's say, just for kicks, that the cabinets are made of glass and have been installed at just the right height for someone who is, say, at least 5 feet 9.2 inches tall. I'm 5'3". If I want to get to the materials I need, I'm going to have to find something to stand on.

If there's nothing to stand on (and if most people who use the kitchen stand around 5 feet 9 inches, there would be no reason to keep stepstools or the like around), I might try to climb up onto the counters. I might try to find some sort of utensil--a spatula, maybe, or a wooden spoon--to help me access the ingredients I need. If I'm really desperate, I might try to throw things in an effort to shatter the glass cabinet.

To an outside observer, none of the above activities would appear appropriate in the kitchen setting. The spatula is made for cooking, not for prying open cabinets. And shattering glass cabinets--that's just destructive.

You see my point, I hope.

Then there's the unavoidable issue of choice of metaphor. Greeno offers a kitchen or a woodworking shop, which we might say is a nice way to offer one example for each gender! But though it's true that Greeno doesn't take it a step further to prescribe who gets to enter which type of space, the gendered nature of the examples is undeniable. These examples are not neutral, just as the practices that occur in the examples are not benign, at least not always, and not for everybody.

Metaphors do lots of good work for us; indeed, it may be that our entire culture rests on a bed of shared metaphors. As Bonnie Nardi and Vicki O'Day write in their 2000 book Information Ecologies: Using Technology with Heart,

Metaphors are a useful form of shorthand.... But it is important to recognize that all metaphors channel and limit our thinking, as well as bring in useful associations from other contexts. That is the purpose of a metaphor, after all--to steer us to think about the topic this way rather than some other way.

What are you doing? Stop--stop throwing soup cans at the cabinets! You're liable to break something!

To which you respond: I never liked tomato soup much anyway. And I sure as hell hate glass cabinets. Good riddance, you say, even as you're being hustled out of the kitchen. Good--

And that's when you realize they've shut the door behind you. Maybe even locked it. See what kind of trouble metaphors get us into?

Monday, January 18, 2010

technologies as sleeping policemen: or, how I learned to stop worrying and...

Nicholas Burbules and Thomas Callister worry for us. Or, at least, they were worried, over 10 years ago when they offered up their take on new technologies in a paper called The Risky Promises and Promising Risks of New Information Technologies for Education. Among their concerns: that too many people adopt a "computer as panacea approach" to new technologies. This is uniquely problematic in education, they argue, where
(r)ather than acknowledge the inherent difficulty and imperfectability of the teaching-learning endeavor, rather than accept a sloppy pluralism that admits that different approaches work in different situations—and that no approach works perfectly all the time—educational theorists and policy makers seize upon one fashion after another and then try to find new arguments, or new mandates, that will promote widespread acceptance and conformity under the latest revolution.

As problematic as the "computer as panacea" approach is, it pales in comparison to the relativistic "computer as neutral tool" approach, the one that has people saying that any technology can be used for good or for evil. Burbules and Callister explain that:

this technocratic dream simply errs in the opposite direction from the first. Where the panacea perspective places too much faith in the technology itself, the tool perspective places too much faith in people's abilities to exercise foresight and restraint in how new technologies are put to use; it ignores the possibilities of unintended consequences or the ways in which technologies bring with them inherent limits to how and for what purposes they can be used. A computer is not just an electronic typewriter; the World Wide Web is not just an on-line encyclopedia. Any tool changes the user, especially, in this instance, in the way in which tools shape the conception of the purposes to which they can be put. As the old joke goes, if you give a kid a hammer they'll see everything as needing hammering.

They prefer a middle approach, which assumes that a simple cost-benefit analysis fails to account for the possibility that benefits and costs are highly dependent on perspective. They offer as proof the history of antibiotics, which through widespread use greatly decreased humanity's likelihood of dying from bacterial infection but in the process led to the emergence of drug-resistant forms of bacteria. ("That is a very bad thing," they write.)

Though it's fairly simplistic to compare new information technologies to antibiotics, I'll go with the analogy for now, mainly because I agree with the authors' effort to problematize attitudes toward new technologies. It's perhaps more accurate to consider the social effects of antibiotics: They have led to a general increase in life expectancy, but in the process have enabled imperialistic societies (e.g., the United States) to effectively colonize cultures, communities, and countries worldwide. In the same way, new technologies offer unprecedented access to information, communities, and tools for mobilization, but they simultaneously support new forms of colonization, both across and regardless of national borders.

Which brings me to the metaphor of technologies as sleeping policemen.

The sleeping policeman: In America, we call it a "speedbump." It looks like this:

[photo of a speed bump]

The speedbump's intended effect is to get drivers to slow the hell down, and it's commonly used in neighborhoods and suburban areas with lots of kids. And it does get people to slow the hell down, primarily because they have no choice. There are also tons of unintended effects: Parents feel more comfortable letting their kids play outside. And, as this post points out, kids playing outside tend to get to know each other better. They--and, by extension, their parents--connect with other neighborhood residents, and everybody feels more connected: "Parents come to know the nearby children. And, inevitably, they come to know those childrens’ parents. They begin trading favors like driving children around. They become neighborly."

There are potential negative effects, too. Using sleeping policemen to slow drivers down changes driving practices in unintended ways. When a driver hits the last speedbump, she hits the gas and jets on down the road. This might increase the risk of an accident just beyond the range of the speedbumps. Drivers may choose to avoid areas with speedbumps, thereby increasing traffic through other areas--even, potentially, nearby neighborhoods whose streets lack speedbumps. And when a driver is not forced to monitor her own driving practices, the decision to simply drive more slowly in neighborhoods is taken away from her, thereby increasing the possibility that she will not adopt slower driving as a general practice.

Still, I think we can all agree that the benefits outweigh the costs. Nobody sees the speedbump as a panacea, and I don't imagine many people see the speedbump as a neutral technology.

So why do we worry so much more about the emergence and increasing ubiquity of new media technologies than we do about sleeping policemen or antibiotics?

One reason is that it's easier to see new media technologies as actors that shape our practices than it is to see how speed bumps and antibiotics have shaped us.

Actors: Any person or tool that exerts force upon any other person or tool, thereby shaping its use or practice. In Actor-Network Theory, everything is a potential actor, everything a potential actant.

Speed bumps act upon cars, drivers, kids, parents, neighborhood dynamics. Antibiotics have acted upon people, policies, government spending, and attitudes. We live longer now. We therefore reshape our lives, our goals, and our relationships to others. It's all very chaotic and complicated, because our reshaped attitudes in turn act upon our use of antibiotics. Everything mediates everything.

Because new media technologies have emerged and been adopted so quickly, their role in reshaping thought and action--and even, it's becoming clear, physiology--is clear, even if the outline of how this reshaping is shaking out remains quite fuzzy. New technologies as sleeping policemen: They shape not only how we drive, but how we think about driving. We move them, we reshape them, we add more or take a few away, we develop cars with better suspension...and it goes on down the rabbit hole.

Sunday, January 17, 2010

I'm kind of appalled by Clay Shirky

You may have read Clay Shirky's recent post, "a rant about women." You may also have read, heard, or participated in the chaos and conversation that sprang up around it. And rightly so, given this representative chunk of Shirky's post:
Remember David Hampton, the con artist immortalized in “Six Degrees of Separation”, who pretended he was Sydney Poitier’s son? He lied his way into restaurants and clubs, managed to borrow money, and crashed in celebrity guest rooms. He didn’t miss the fact that he was taking a risk, or that he might suffer. He just didn’t care.

It’s not that women will be better off being con artists; a lot of con artists aren’t better off being con artists either. It’s just that until women have role models who are willing to risk incarceration to get ahead, they’ll miss out on channelling smaller amounts of self-promoting con artistry to get what they want, and if they can’t do that, they’ll get less of what they want than they want.

There is no upper limit to the risks men are willing to take in order to succeed, and if there is an upper limit for women, they will succeed less. They will also end up in jail less, but I don’t think we get the rewards without the risks....

And it looks to me like women in general, and the women whose educations I am responsible for in particular, are often lousy at those kinds of behaviors, even when the situation calls for it. They aren’t just bad at behaving like arrogant self-aggrandizing jerks. They are bad at behaving like self-promoting narcissists, anti-social obsessives, or pompous blowhards, even a little bit, even temporarily, even when it would be in their best interests to do so. Whatever bad things you can say about those behaviors, you can’t say they are underrepresented among people who have changed the world.

There are enough smart people out there responding to this piece that I don't need to add more noise to the cacophony. But I do want to speak up as a dues-paying member of Women Who Run With The Arrogant Self-Aggrandizing Jerks And Sometimes Behave Like Arrogant Self-Aggrandizing Jerks Themselves.

Did I mention I'm a dues-paying member?

Because it's not easy to self-promote. It's not easy to stand up and say things that might be seen as stupid--or worse, dismissed because they come from a woman. It's not easy to announce to people "You should listen to me because I am awesome and the work I do is also awesome." It's not easy, in part because it takes extreme confidence (or at least something that looks to other people like confidence) to stand up and ask for attention, respect, recognition; and it's also not easy because the backlash is often so great, and simultaneously so subtle, that it sometimes feels like a one-step-forward, two-steps-back kind of deal.

My experience, in academia anyway, is that the tradeoff is this: If you want respect, authority, and platforms for broadcasting your ideas to a wider public, you have to self-promote; and if you succeed in gaining respect, authority, and platforms for speaking it's often at the cost of personal and professional relationships.

Let me say it more clearly: If you're a woman and you want to be heard, especially in academia, you have to knock on every door, announce your presence to everyone, and holler your qualifications at everyone in earshot. And if you do it right, people will hate you.

It will be harder to get daily work accomplished, because your colleagues will be stiff and formal with you. Male colleagues will challenge your knowledge and authority, and if that fails they will simply demean you in front of others. Female colleagues--and this is the really painful part--will shrink from you because in speaking so loudly, you've drowned out their voices. Some women, in an attempt to ally themselves with the people in charge, will also attempt to challenge and demean you.

(Another way to gain respect and authority, by the way, is to ally yourself with the people in charge, who in this case are primarily white guys. The backlash that comes out of this type of effort, though, is that you risk losing your place at the table the minute you misbehave. Then you have to come grovelling back, apologizing with downcast eyes, and take what scraps you can.)

Sure, men who self-promote risk hostility and resentment--but it's a different kind of hostility and resentment than what women experience. As members of the dominant cultural group, men who self-promote may be seen as a threat to specific people, but they certainly don't represent a threat to the established social order.

Women who are aware of the social positioning of women as a non-dominant group (and not all women are aware of this positioning, which is fine but sort of sad) develop a complex relationship to the decisions they make in crafting their public personae. They may engage in the kind of "arrogant, stupid" behavior that Shirky says is the best way to get ahead, but they do so knowing that some people (including, apparently, Shirky) will see this as "behaving more like men." They may choose to self-promote far less aggressively than Shirky would probably find useful, and to either accept that they will have trouble getting heard or find platforms for speaking where less self-promotion, less arrogance, is perfectly okay.

Or--and this is what lots of women, including me, do--they may adopt multiple identities, more identities, with more complicated politics, than those that men choose or are forced to adopt, in order to manage the competing demands on their behavior. This is not, lest I be misunderstood, the kind of identity cultivation that allows people to say "I have multiple identities! I'm an academic, and I'm also a mother, and I'm also a sister, and I'm also a friend." This is something much more complicated: It's "I'm this sort of academic-mother-sister-friend in this type of context, and I'm this sort of academic-mother-sister-friend in that type of context, and I'm this sort of academic-mother-sister-friend in that type of context with this person removed" and so on.

I don't quite know how to end this post, except to say that lots of people I like and respect think that Shirky is right on the money. And to add that my opinions are mine alone and not necessarily representative of all women, and that--and this is really important--I'm speaking from a position of relative privilege, since I'm a white, well-educated woman. I'm also thin, young, and not in any way physically disabled. I can't imagine how much more complicated this gets for someone who's even one step further removed from the dominant group than I am.

Monday, January 11, 2010

on ageism, sexism, and bad behavior: what we can learn from Dave Winer

Over at scripting.com, ageism is becoming an issue for Dave Winer.

Here's how it went down, in Winer's own words:

Earlier today I was listening to Talk of the Nation on NPR and heard an interview with Keli Goff from the Huffington Post. The interview started with an explanation that linked Reid's embarassing words (about Obama's race) to his age. She went out on a limb, way too far, although later in the interview she walked it back a bit.
This led to an afternoon of heated exchanges on Twitter. Lots of nasty stuff was said about people of my age, most of them untrue. What troubles me is that there is no general acceptance for insults based on race, religion or gender, but age-based insults have no taboo.

Dave Winer got attacked on Twitter today, no doubt about it. But what Winer doesn't point out here is that he gave as good as he got: He came out absolutely swinging, excoriating Goff and smacking back at anyone who disagreed with him, insulted him, or--and you can imagine the temptation was just too great for some of Winer's followers--notified him that he's too old to know what he's talking about. Here's a clip of Winer's Twitter feed:

[screenshots of Dave Winer's Twitter feed]

More than one person responded to Winer with some version of this:

[screenshot of @miniver's reply]

I have to admit, it's kinda tough to disagree with @miniver.

Before I go on, I want to cop to my own bad behavior with respect to ageism: In the past, and on this very blog, I have offered up Rupert Murdoch as "further proof of why old people should not be allowed to run media conglomerates." That was blatant ageism, pure and simple, and it was wrong, and it's not okay, even when used as a rhetorical device. (In retrospect, I should have offered up Rupert Murdoch as further proof of why hopelessly avaricious people should not be allowed to run media conglomerates.) I am sorry. I promise to try harder from here on out to avoid such wrong-headed attitudes and discriminatory language.

Now then.

In some ways, ageism is similar to sexism in that it's brutally apparent to those who are the victims of it, even if others (non-victims) don't see how a person might take offense. (I imagine, but don't know for sure, that the same comparison could be made to other forms of prejudice--I'm just sticking with what I know best here and leaving the rest to others who know better than I.) People who react with anger to sexism, as to ageism, are treated like they just have their panties on a little too tight. "It's just a fact that women are better at raising children." "It's just a fact that older people don't understand the digital revolution." Both disempower the target. Both are destabilizing. And both are treated as socially acceptable in lots of situations where everyone should know better. (By the way, here's the mp3 of the Talk of the Nation conversation--Winer's right to take issue with what's clearly blatant ageism.)

Oh, but while I'm at it, I should also mention that women are exposed to sexism throughout their lives, so they're used to it and develop strategies for coping with it as it happens and afterward. Ageism is perhaps so startling and frustrating precisely because its victims are experiencing a prejudice that's entirely new to them. Of course, women who are the targets of ageism get hit with the double-disempowerment that comes with not only being female but being an old (read: asexual and therefore irrelevant) female. I can't imagine how the prejudice gets compounded when the target of ageism is nonwhite, nonstraight, or otherwise out of the mainstream.

Winer self-identifies as white and he's presumably straight (though a significant web presence has insinuated that he is, among other things, gay--about which more in a second), which perhaps explains his extreme outrage at the ageism directed straight at him today. (I agree with Anil Dash, who offered Winer this advice via Twitter:

@davewiner The way you are saying what you're saying is undoing the argument you're trying to make. Take a deep breath, come back in a bit.)


If you've rarely, or never, experienced the prick of arbitrary bigotry, then the first prick stings perhaps all the more deeply, scalds all the more powerfully. I still remember my first encounter with sexism, when my fourth-grade teacher told me I couldn't climb the playground swingsets to unwind the swings because I was a girl. It never gets less galling. It's just that we do our best to get better at responding, in word and in deed. The fourth-grade me, surprised, gave in and went inside. The 32-year-old me, by contrast, might climb the swingset anyway, in direct defiance of her teacher. (The 32-year-old me may, incidentally, actually be worse at getting what she wants.)

There's a side note to this story, an interesting one: Dave Winer apparently carries a reputation for bad behavior in online communities. I didn't know this until this evening, when I started researching Winer's backstory for this post. I started following him on Twitter because of his status as a pioneer in weblogging, and until today knew next to nothing else about him. But here's a sample of what I learned about how Winer responds to criticism:
  • Software developer and writer Mark Pilgrim decries Winer's propensity for personal attacks against those who criticize his work (here's "What's your Winer number?", an algorithm for determining how your experience of Winer's verbal abuse compares to the experiences of the hordes of others who have fallen victim to it, and here's a post where Pilgrim makes public Winer's response to the Winer number post--take a look at Winer's comments below the post).
  • Here's Matthew Ingram on Winer's response to public criticism of a post Winer wrote called "Why Facebook Sucks." When Stowe Boyd disagreed with Winer's post, Ingram writes, Winer apparently called Boyd "a creep" and "an idiot."
  • Here's Jason Calacanis, who names Winer as a friend but still offers his experience of "getting Winered" during a public presentation.

The list goes on. The ridiculous "gay" insinuation--well, that sort of bad behavior is what people resort to when they feel people in positions of power are acting in violation of the public trust--when they see arrogance, pettiness and rudeness from someone who has no reason to act so poorly.

There are at least two lessons to draw from the Winer / ageism story: First, that the worn grooves of prejudice and discrimination are so, so easy for humans, flawed as we are, to fall into, and that it is our responsibility to guard against taking that easy path; and second, that bad behavior in communities of practice is still not okay, no matter who you are. The difference these days, of course, is that reputation not only precedes you but follows behind you like a little yipping terrier. It's getting harder and harder to walk into a room you've never entered without everyone noticing the constant bark of that little dog.

Saturday, January 9, 2010

smacking down Jaron Lanier & 'World Wide Mush'

Normally, I wouldn't take on such a revered, well-credentialed authority as Jaron Lanier,* but his recent Wall Street Journal piece on why the movement toward collectivism, collaboration, and openness is doomed to failure leaves him cruising for a bruising.

"Most people know me as the 'father of Virtual Reality technology'," Lanier announces. (This boast is repeated in his presumably self-crafted or at least author-approved bio at the bottom of the article.) He was also, however,
part of a circle of friends who tried to imagine how computers would fit into the peoples' lives, including how people might make a living in the future. Our dream came true, in part. It turns out that millions of people are ready to contribute instead of sitting passively on the couch watching television. On the other hand, we made a huge mistake in making those contributions unpaid, and often anonymous, because those bad decisions robbed people of dignity. I am appalled that our old fantasies have become so entrenched that it's hard to get anyone to remember that there are alternatives to a framework that isn't working. 
 The "mistake" of making participation in online communities free and unpaid is only the first half of Lanier's frustration. He also hates what he calls the "collectivist" nature of online communities:
Here's one problem with digital collectivism: We shouldn't want the whole world to take on the quality of having been designed by a committee. When you have everyone collaborate on everything, you generate a dull, average outcome in all things. You don't get innovation.

If you want to foster creativity and excellence, you have to introduce some boundaries. Teams need some privacy from one another to develop unique approaches to any kind of competition. Scientists need some time in private before publication to get their results in order. Making everything open all the time creates what I call a global mush.
Lanier's first mistake is treating Wikipedia culture--or, as Stephen Colbert puts it, Wikiality--as a stand-in for all forms of collective problem-solving. Lanier of all people should know that the often anonymous, often trivial, and often problematic Wikipedia model is only one approach to collaboration, and one that works--and often works very well--for fairly low-stakes and longer-term goals. You don't cite Wikipedia in a White House briefing on Islam in Nigeria, for example, but you do cite Wikipedia in proving to your brother-in-law that Breakfast at Tiffany's was released at the very beginning of the 1960s, just like you said it was. (Did you get that, Dean? 1961.)

Lanier also finds Google Wave problematic because he sees it as another iteration of Wikiality; it "encourages you to blur personal boundaries by editing what someone else has said in a conversation with you, and you can watch each other as you type so nobody gets a private moment to consider a thought before posting."

"And if you listen to music online," he continues, "there's a good chance your listening will be guided by statistical analysis of Internet crowd preferences."

Ultimately, I suppose, and in true Wikipedia form, we all get the reality we expect to perceive. Lanier, in his mistrust of collective action, zeroes in on specific communities and specific features of specific resources that he believes prove that collectivism is a terrible, dehumanizing mistake. Along the way, he ignores the vast variety of successful projects of collective action, including:
  • This year's DARPA challenge winners, based out of Sandy Pentland's Human Dynamics Laboratory at MIT, who successfully crowdsourced the locations of 10 red balloons hidden across America in a flabbergasting 8 hours and 52 minutes (researcher Riley Crane points out that while the balloons themselves were an arbitrary target, the process by which the balloons were found might be applicable to more significant social problems);
  • The recent Netflix Prize, which crowdsourced the development of a more accurate system for identifying Netflix members' potential interests based on previous titles viewed;
  • And, oh yeah, the entire open source software movement, starring Linux, WordPress, Mozilla Firefox, Ubuntu, and a host of lesser characters playing supporting roles.

And those are just the easily identified successes of collective action. It's more challenging, but equally important, to identify and champion local or concentrated examples of collective action. When people learned of a proposed bill in Uganda that would make homosexuality a crime punishable by death, they spread the word. Bad press for any country, even worse when prominent Americans appear to be involved behind the scenes. Video of Rachel Maddow brutally smacking down "ex-gay" evangelist Richard Cohen, suspected to have worked in support of the law, was viewed nearly 60,000 times on YouTube. While the bill is moving forward through Uganda's Parliament, the international outcry makes it more likely that a.) the law, if passed, will go largely unenforced; b.) international relations with Uganda, with governments or NGOs, will be strained; c.) any other country interested in passing a similar law will think twice about garnering such negative press; and d.) prominent anti-gay American politicians and evangelicals will think twice about joining up with a similar cause.

Over on Twitter, Dean Shareski announced he had been charged hundreds of dollars by his cellphone carrier when his daughter began texting new friends without adding them to the company's My5 plan. As he explains in a recent post, Twitter saved him $764.13: His tweets about the issue were retweeted by followers, and the company noticed and erased the extra charges.

Small and large, the possibility for collective action is empowering people to resist injustice, call for change in politics and attitudes, and develop new tools to improve human lives. You don't get innovation? You can't foster creativity? I say bollocks to that.

Now for the second part of Lanier's argument, that the model of unpaid contribution has led to a general decline in quality and social and technological progress. He writes:
The "open" paradigm rests on the assumption that the way to get ahead is to give away your brain's work—your music, writing, computer code and so on—and earn kudos instead of money. You are then supposedly compensated because your occasional dollop of online recognition will help you get some kind of less cerebral work that can earn money. For instance, maybe you can sell custom branded T-shirts.

We're well over a decade into this utopia of demonetized sharing and almost everyone who does the kind of work that has been collectivized online is getting poorer. There are only a tiny handful of writers or musicians who actually make a living in the new utopia, for instance. Almost everyone else is becoming more like a peasant every day.
Yeah, sure, the economics of creativity are shifting--but what Lanier sees as a kind of serfication of the creative class (what? really? really? really?) others see as an equitable redistribution of socially valuable goods. John Brockman reminds us of the words of McLuhan himself:

McLuhan had pointed out that by inventing electric technology, we had externalized our central nervous systems; that is, our minds. (John) Cage went further to say that we now had to presume that 'there's only one mind, the one we all share.' Cage pointed out that we had to go beyond private and personal mind-sets and understand how radically things had changed. Mind had become socialized. 'We can't change our minds without changing the world,' he said. Mind as a man-made extension became our environment, which he characterized as 'the collective consciousness,' which we could tap into by creating 'a global utilities network.'

If you believe we are all part of one social mind, the world's mind (and remember that we get the reality we expect to perceive),  then it follows that the segmenting off of important ideas, the claiming ownership over ideas, the holding them separate from the rest of the social mind is the worst kind of sin: It is relentless and causeless self-abnegation, a sin against oneself.

It's true that some of the contributors to some of the projects I list here were paid for their participation; but they were paid for their time, and not for their ideas. The rewards for joining in a collaborative effort are not, despite Lanier's assertions to the contrary, reduced to either praise or money; the rewards are in contributing to a socially meaningful activity, in being a productive part of something larger than oneself. Indeed, Lanier's very argument that "millions of people are ready to contribute" even though their contributions are "unpaid, and often anonymous" shows that kudos and money are not the only motivators in collaborative spaces.

I'm an intermittent contributor to Wikipedia, and I do it without the expectation of money or kudos; I do it knowing nobody knows or cares about my contributions, except that what I add makes something neat a tiny bit better than it was before. There's value in that, to me and to the hordes of people who are perfectly comfortable contributing to collaborative projects anonymously and for free. Some things we do for money, after all, and some we do for love.

*jk jk jk I have absolutely no problem taking on such a revered, well-credentialed authority as Jaron Lanier.

Tuesday, January 5, 2010

on Cory Doctorow on how to say stupid things about social media

"There are plenty of things to worry about when it comes to social media," says writer Cory Doctorow in his fantastic Guardian piece, "How to say stupid things about social media." Social media environments, he continues,
are Skinner boxes designed to condition us to undervalue our privacy and to disclose personal information. They have opaque governance structures. They are walled gardens that violate the innovative spirit of the internet. But to deride them for being social, experimental and personal is to sound like a total fool.
Yet plenty of perfectly smart people who should know better say exactly the foolish kinds of things Doctorow rightly decries in his post. Mainly, lately, the stupid things have been leveled at Twitter: It's trivial. It's banal. It's too voyeuristic, or it's a weak imitation of real relationships, or--and this is the one that really gets me--I try to use it in smart, deliberate, consequential ways, even though lots of my followers don't.

In part, people who take stances like the above fail to see that the majority of the communication on sites like Twitter falls into the category of what Doctorow calls "social grooming." He writes:
The meaning of the messages isn't "u look h4wt dude" or "wat up wiv you dawg?" That's merely the form. The meaning is: "I am thinking of you, I care about you, I hope you are well."
Doctorow compares the "banality" of conversations on Twitter and Facebook to the conversations we have with coworkers. We ask a coworker if she had a good weekend, he writes, not because we care about how her weekend went but because we care about developing bonds with the people around us.

Yes, though that's only part of the answer. In choosing to communicate via Twitter, I'm not only saying "I am thinking of you, I care about you, I hope you are well," but I am also publicly announcing: "I am thinking of him, I care about her, I hope he is well." These announcements are interspersed with my Twitter interactions with people who are not close friends or even necessarily acquaintances--people I care about only in the most abstract sense. I follow just under 350 people, after all, and am followed by around the same number--far more than I am equipped to develop deep relationships with. And lots of people follow and are followed by far greater numbers than I.

The creaming together of the personal and the professional, the public and the private, means that 'trivial' social interactions in online social networks, however much they seem to replicate those that pepper our physical interactions, actually represent a new social animal whose form we have yet to fully sketch. We're all kind of blindly feeling our way around the elephant here. We who embrace social media technologies can scoff at the person who says an elephant is like a water spout after feeling only its trunk, or the person who has felt a little more and argues it's like a moving pillar topped off by a shithole, but we would do well to remember that in this parable, everyone who tries to describe the elephant, no matter how much of it he has touched, can only describe it by comparing it to objects he has previously encountered. Twitter is similar to a lot of things, but in the end it's its own elephant, identical to nothing else we've seen before.

This is why, as Doctorow points out, people rely on personal experience and therefore read Twitter and similar networks as trivial and banal instead of deeply socially meaningful. But it's also why we need to take care to treat the social meaning as different from that which emerges through other types of (digital or physical) social interactions.

Monday, January 4, 2010

the sleeping alone review of films: Avatar (3D)

summary: holy effing effety eff.


Avatar 3D: Holy effing effety eff. That is all.

Click here to find showtimes for Avatar at a theater near you.

Sunday, January 3, 2010

the sleeping alone review of films: The Curious Case of Benjamin Button

summary: Just cut out the middle man and watch Forrest Gump instead.

It’s a strange but true fact that I’m far more likely to post a review of a film I didn’t like than I am to post one of a film I enjoyed. There are, she protests, lots of reasons for this, but that’s a blogpost for another day. Right now I just want to tell you why The Curious Case of Benjamin Button is such a bad, bad movie.

I finally climbed aboard the Benjamin Button bandwagon in the dead week between the Christmas and New Year’s holidays. Since I’m not all that into wistful, you-can-do-it love stories, I was mainly motivated by a desire to see how the filmmakers dealt with the movie’s main conceit: A protagonist who ages backward, Mearth-style.


To the film’s credit, it refuses to shy away from the most potentially ridiculous aspects of this conceit. It’s as brazen in pointing an old-man-baby at the camera as it is in presenting, near the end of the film, a literally baby-faced old man. It does appear that we’re getting better at artificially aging people, too: Benjamin, played with consistent age-appropriate physicality by Brad Pitt, never once looks like a young, handsome man coated in plastic, which is what we've come to expect from any prematurely aged actors after decades of bad special effects. When the 20-something (but 60-ish-looking) Benjamin has an affair with a 40-something woman, he is depicted pitch-perfectly as a thin-haired, disheveled and declining man whose smile belies a boyish inner youth. The affair, and those that follow with comparatively younger women, seem to flow naturally from the radiance, youth, and energy that shine from Benjamin’s eyes and smile, if not his skin.


If you set aside the special effects, though, there isn’t much left. The plot is assembled around so many layers of unnecessary meta-story that the resulting mass of MacGuffins ends up feeling like a combination of pointless distractions and the unfortunate side effects of a poorly adapted novel. There’s an ancient lady dying in a hospital as a major hurricane threatens landfall. There’s the woman’s prodigal daughter, who has apparently returned home at the last minute to say her goodbyes. There’s a blind clockmaker, white, who marries a black Creole woman and whose only son is killed in a far-off war, and who therefore decides to build a clock that runs backward. He gives a short speech about wanting to turn time backwards, then he disappears, god knows why. There’s the black woman who raises Benjamin as her son, and who spends her entire working life employed at a retirement home, god knows why. And there’s a diary, penned by Benjamin Button, which is introduced as his Last Will and Testament. God knows why.

Now: organize all of the above details in order of novelty and interest, then cross off all but the two or three most boring and clichéd, and you’re left with the only details that actually end up mattering. Then, on top of the hot mess that presents itself as plot, there’s a lengthy and confusing mini-story wherein Benjamin travels by tugboat to Russia, meets a British woman whose husband is a spy, kind of falls in love with the woman and, it appears, carries on a kind of tepid affair of indeterminate length that ends for a vague set of reasons. The sole purpose of this entire intermission appears to be to shoehorn Tilda Swinton into the cast. Which, fair enough. But as my mom put it, It’s…kind of a long movie as it is.

So there you have it: Cool-ish special effects; confusing, slow, overcomplicated and irritatingly intricate plot. Add one more negative: In this film, people are treated as nameless diversions, as things that happen to people to advance the story. Early in the movie, an elderly resident of the old people's home tells Benjamin "We're meant to lose the people we love. How else would we know how important they are to us?"

Not only is the quote itself inane--love is deepened and refined by the inevitability of loss, but saying we're meant to lose people so we may know how much we love them is putting the cart miles before the horse--but the movie doesn't even care to try to bear this limp aphorism out. Characters come and go, flowing through Benjamin's life like a river, and it's never clear that the loss of any one of them is any more painful to him than the loss of any other. In fact, Benjamin seems to actively avoid developing deep relationships with people, and at a crucial moment he even walks away from someone he loves instead of having to deal with loss. Indeed, Benjamin himself muses that "It's funny how sometimes the people we remember the least make the greatest impression on us." In what universe could that even be possible?

This is a movie that wants it both ways: It wants us to believe that our lives are characterized by the people we love deeply, and it wants us to believe that our lives are characterized by how they are shaped by the people who flow across our paths like buffeting wind--all of them different and finally the same, not a one distinct from any other.

In the end, though, Benjamin Button's biggest crime is that it tries too hard to be Forrest Gump. But Gump did it better: neater special effects, tighter plot, better character development. And lest Forrest be accused of treating people like nameless diversions, I offer the following: Forrest’s excuse was his IQ. And besides, no matter how Forrest thought of the people who flowed through his life, there was always, for him, Jenny. We love Forrest, we root for him, because he loves Jenny, he pines for Jenny, he comes up with ways to pass the time until he can see Jenny again. Benjamin has Daisy, who this film would very much like to convince us is the deep and lasting, abiding love of his life. But at the risk of spoiling the movie, she’s really not. She's another diversion, another way to pass the time, another thing that happens to Benjamin. We're supposed to believe that Benjamin is the hero of the story, but in fact Daisy proves herself to be strong, resilient, flawed but kind. In other words, she's complex and layered, like all good movie heroes should be. Benjamin is flat and shiny, kind of like...a button, I suppose.

Daisy might very well be more of a hero than Jenny, but Benjamin is no Forrest. And this film, despite what you may have heard to the contrary, is not Oscar material.