Sunday, May 31, 2009

last voyage of Robert Falcon Scott

In 1912, English naval officer Robert Falcon Scott and his party of explorers died after a failed attempt to be the first to reach the South Pole. Scott's diary, recovered with the bodies of his companions, indicates that he was the last to die.


Left to Right: Wilson, Evans, Scott, Oates and Bowers


The first to die was Petty Officer Edgar Evans, who had suffered extreme frostbite and wandered away from camp. Scott wrote:

I was first to reach the poor man and shocked at his appearance; he was on his knees with clothing disarranged, hands uncovered and frostbitten, and a wild look in his eyes. Asked what was the matter, he replied with a slow speech that he didn't know, but thought he must have fainted. We got him on his feet, but after two or three steps he sank down again. He showed every sign of complete collapse. Wilson, Bowers, and I went back for the sledge, whilst Oates remained with him. When we returned he was practically unconscious, and when we got him into the tent quite comatose. He died quietly at 12.30 A.M. On discussing the symptoms we think he began to get weaker just before we reached the Pole, and that his downward path was accelerated first by the shock of his frostbitten fingers, and later by falls during rough travelling on the glacier, further by his loss of all confidence in himself. Wilson thinks it certain he must have injured his brain by a fall.


The second to die was Captain Titus Oates, whose physical deterioration from severe frostbite, hunger, and exhaustion left him barely able to walk. In the end he was such a burden to the other members of the expedition that he asked to be left behind. Here's how Scott's diary describes Oates's final hours:

Should this be found I want these facts recorded. Oates' last thoughts were of his Mother, but immediately before he took pride in thinking that his regiment would be pleased with the bold way in which he met his death. We can testify to his bravery. He has borne intense suffering for weeks without complaint, and to the very last was able and willing to discuss outside subjects. He did not - would not - give up hope till the very end. He was a brave soul. This was the end. He slept through the night before last, hoping not to wake; but he woke in the morning - yesterday. It was blowing a blizzard. He said, 'I am just going outside and may be some time.' He went out into the blizzard and we have not seen him since... We knew that poor Oates was walking to his death, but though we tried to dissuade him, we knew it was the act of a brave man and an English gentleman. We all hope to meet the end with a similar spirit, and assuredly the end is not far....


When a search expedition discovered Scott's body, along with those of his companions Lieutenant Henry Bowers and Dr. Edward Wilson, it also recovered Scott's diary. A copy of the last page, with a transcription, is included below.


We shall stick it out
to the end, but we
are getting weaker, of
course, and the end
cannot be far.
It seems a pity, but
I do not think I can
write more.
R. Scott
Last entry
For God's sake look
after our people.

Breaking: Sony Entertainment CEO Michael Lynton "cannot subscribe to the views of those online critics who insist that I 'just don't get it' "

Subhead: Sony Entertainment CEO Michael Lynton just doesn't get it.


Michael Lynton wants guardrails for the internet in the name of preserving creativity. At least, that's what he says he wants. If you read his recent piece in the Huffington Post, you quickly understand that what he really wants is to preserve his company's ability to profit from the creativity of others.

Lynton went viral after making the following assertion: "I'm a guy who sees nothing good having come from the Internet. Period." And in the HuffPost piece, he explains that he welcomes the "Sturm und Drang" that resulted from that statement, because it allows him to make the following point:

the major content businesses of the world and the most talented creators of that content -- music, newspapers, movies and books -- have all been seriously harmed by the Internet.


He's right, of course. But any attempt to roll back the appropriation, remix, and--sometimes--piracy practices enabled by new media will fail, and one big reason for this is that people like Lynton can't see that the internet simply cannot be regulated the way we've traditionally approached culturally transformative inventions.

Lynton compares the Internet to the national highway system developed under the Eisenhower administration. He explains the comparison thus:

Contrast the expansion of the Internet with what happened a half century ago. In the 1950's, the Eisenhower Administration undertook one of the most massive infrastructure projects in our nation's history -- the creation of the Interstate Highway System. It completely transformed how we did business, traveled, and conducted our daily lives. But unlike the Internet, the highways were built and operated with a set of rational guidelines. Guard rails went along dangerous sections of the road. Speed and weight limits saved lives and maintenance costs. And officers of the law made sure that these rules were obeyed. As a result, as interstates flourished, so did the economy. According to one study, over the course of its first four decades of existence, the Interstate Highway System was responsible for fully one-quarter of America's productivity growth.

We can replicate that kind of success with the Internet more easily if we do more to encourage the productivity of the creative engines of our society -- the artists, actors, writers, directors, singers and other holders of intellectual property rights -- yes, including the movie studios, which help produce and distribute entertainment to billions of people worldwide.


It makes sense for someone like Lynton to compare the internet to a literal highway--he, and many of his ilk, continue to think of the internet as an "information superhighway" that can be maintained and paid for via a simple system of tolls, speed limits, and regulations on what kinds of vehicles will be allowed to operate, and when, and by whom. This is precisely why the information superhighway metaphor has fallen into disuse by the majority of internet users: It simply does not apply to a system that is far more complex, and far less regulated and regulatable, than the metaphor suggests.

Mind-bogglingly, Lynton believes that "without standards of commerce and more action against piracy, the intellectual property of humankind will be subject to infinite exploitation on the Internet." He wonders:

How many people will be as motivated to write a book or a song, or make a movie if they know it is going to be immediately stolen from them and offered to the world with no compensation whatsoever? And how many people whose work is connected with those creative industries -- the carpenters, drivers, food service workers, and thousands of others -- will lose their jobs as piracy robs their business of resources?


Seriously? The head of one of the most new media-reliant entertainment companies in the world is so oblivious to the creativity that is enabled by social media that he really, honestly believes that the social practices that are emerging around these technologies are going to destroy humanity's creative impulse?

On the other hand, this is perhaps an apt approach for the head of a company that makes its bones by defining creativity as "stuff that can make money for whoever owns the rights to it (e.g., Sony Entertainment)."

Lynton would have us believe that he's in this fight for the good of mankind, that he and others like him are humanitarians along the lines of this cartoon I pirated from the internet:



Alternately, we might view his motives as more closely aligned with this cartoon I pirated from the internet:



Lynton wants us to know that he is not a Luddite, not "an analogue guy living in a digital world." I am fully convinced of that. I also believe that his impulse to set up internet guardrails is not quite as humanitarian as he himself might think. He seems to be confusing "creative impulse" with "the drive to make money off creativity." As anybody who's been paying attention for the last couple of decades knows, in the internet era these aren't the same thing. They aren't even in the same category. If that makes it harder for behemoths like Sony to survive by standing on the shoulders of the creative types they exploit, then so be it.

Saturday, May 30, 2009

headline: Hillary Kolos brings the awesome. Awesomeness ensues.

One of my favorite young media scholars is Hillary Kolos, a graduate student in MIT's Comparative Media Studies Program. Because I have had the great, great luck to get to work with her over the last year as part of my day job, I've had the joy of watching her blossom as a thinker, writer, and media scholar.

Recently, Hillary had a personal essay posted to Henry Jenkins's blog. The piece, "Bouncing Off the Walls: Playing with Teen Identity", focuses on her experience with gender and identity play through a consideration of how she decorated the walls of her bedroom as an adolescent. The piece is a gem. One tantalizing snippet:

As a teen, I used many resources to play with new identities. Fashion ads served as inspiration. My walls were a place to exhibit them. I did also, on occasion, leave my room where I had other experiences that helped shape the woman I am today. But having a space of my own to play and then reflect was very important to my process of identity formation. What seemed like goofing off at the time was actually a process of exploring who I thought I was at the time, as well as who I thought I should be.

My experience in my room is one of countless examples of how teens use their available resources to explore potential identities through play. This kind of play can happen in private, but often young people use media to capture their experiments and share them with others. In this way, they can gauge reactions and refine their performances. I used my walls to reach a limited audience, but today teens can easily reach millions of people online and receive feedback instantly on how they represent themselves. It will be interesting to see the new possibilities, as well as the new concerns, that emerge as teens use new resources to play with their identities online.

You can find the rest on Henry's blog here. As I mention in the title of this post, the piece is filled with awesome and well worth the read.

the MIT budget-crunch cheer

RAH RAH REE! KICK EM IN THE KNEE!
RAH RAH RASS! DON'T LET THE DOOR HIT YOU ON THE WAY OUT!


As an employee of the Massachusetts Institute of Technology, I regularly receive email communications from MIT President Susan Hockfield. Recently, I got an end-of-year message that made a strong attempt to put a positive spin on what's been a very difficult year for the Institute.

The letter starts by acknowledging the pressures of operating during an economic recession and pointing to successes in meeting those pressures--without mentioning, in this opening paragraph or anywhere else, the number of MIT employees who have been sacrificed in the name of this success. Hockfield writes:

Around the world and in every sector, fundamental economic assumptions have dramatically changed over the last eight months. At MIT, we responded swiftly to the evolving economic downturn. Last November, anticipating a dramatic decline in our endowment’s value, we set out a plan to reduce our $1 billion General Institute Budget by $150 million, or 15%, within two to three years. Thanks to extraordinary work in every MIT unit, we have achieved 5% cuts for Fiscal Year 2010 (FY10), which begins July 1, 2009; some units and departments have already reached or even exceeded the targets set out for them. In addition, we have in place a thoughtful, deliberate process to achieve the full $150 million reduction by FY12.

We can take enormous pride in the ongoing work across the entire Institute to reset our base budget for what may be a protracted period of slow economic growth.


It's tough, admittedly, to acknowledge the human cost of belt-tightening at the institutional level. And it's likely that this letter isn't the place for it. But in addition to the innovative cost-cutting approaches deployed by MIT (convening a 200-member task force, opening up an Idea Bank where faculty, staff, and students can submit and rank ideas for increasing MIT's operating efficiency), administrators relied on a tried-and-true approach to budget cuts: layoffs.

Layoff numbers are not readily available--in fact, they may not be available at all, as far as I can tell--but my own experience, and that of my colleagues, tells me the number is at least in the dozens and probably much higher. As far as we can tell, no faculty have been let go (though an Institute-wide pay freeze means that faculty and staff alike received no raises this year), which means the burden of these layoffs rests on the shoulders of administrative and support staff.

The tone of Hockfield's letter is "we did it, together." The little people who have fallen by the wayside in the "doing" of "it" get no mention, here or anywhere else. (As I wrote in an earlier post, previous letters from Hockfield take the same "we're just going to ignore the fact that we have to lay people off in order to meet our financial goals" approach.) This makes it mighty hard to get into the team spirit mode that Hockfield would like to see.

Things are tough all over, but that doesn't mean it's fair or right to pretend to the world that everybody at the Institute banded together in a joint innovative approach to cutting costs. In this respect, MIT is no different from any other bulky, expensive institution, as much as it would like to make the world believe otherwise.

Thursday, May 28, 2009

why you should invite me to your next party

(hint: because I will entertain your guests with talk of the social revolution)


I was at a party last week when someone asked me what I do for a living. I used the opportunity to engage in what, in retrospect, may have been an ill-timed impromptu pronouncement about the status of the social revolution.

It turns out I'll need to rethink how I use that phrase "social revolution," at least in mixed company, because a tubby drunk man wearing a confusing hat walked up to me and tried to steer the conversation toward war atrocities.

"You can't tell me," he bellowed, "that the atrocities that are happening during the Iraq War are any different from the ones that happened during World War II. It's just that we have more media coverage now."

As I wrote in an earlier post, this is what I've decided to call the Space Odyssey mistake. This particular kind of error is explained by Clay Shirky, who describes a scene from 2001 in which

space stewardesses in pink miniskirts welcome the arriving passenger. This is the perfect, media-ready version of the future--the technology changes, hemlines remain the same, and life goes on much as today, except faster, higher, and shinier.


Lately I've been finding Christopher Kelty's notion of a "recursive public" useful in thinking about what, other than hemlines, has changed. As Kelty describes it in Two Bits (available for download, online browsing, and modulation for free online),

A recursive public is a public that is vitally concerned with the material and practical maintenance and modification of the technical, legal, practical, and conceptual means of its own existence as a public; it is a collective independent of other forms of constituted power and is capable of speaking to existing forms of power through the production of actually existing alternatives.


More to the point, a recursive public is a group of people who exist outside of traditional institutions (governments, churches, schools, corporations) and, when necessary, use this outsider status to hold these entities in check. The engagement of these publics goes far beyond simply protesting decisions or stating their opinions. Kelty, writing about geek culture as a recursive public, explains it thus:

Recursive publics seek to create what might be understood, enigmatically, as a constantly “self-leveling” level playing field. And it is in the attempt to make the playing field self-leveling that they confront and resist forms of power and control that seek to level it to the advantage of one or another large constituency: state, government, corporation, profession. It is important to understand that geeks do not simply want to level the playing field to their advantage—they have no affinity or identity as such. Instead, they wish to devise ways to give the playing field a certain kind of agency, effected through the agency of many different humans, but checked by its technical and legal structure and openness. Geeks do not wish to compete qua capitalists or entrepreneurs unless they can assure themselves (qua public actors) that they can compete fairly. It is an ethic of justice shot through with an aesthetic of technical elegance and legal cleverness.


This is precisely the difference between 1945 and 2009. It's not just that we have more media coverage but that, as Shirky proclaims, everybody is a potential media outlet--everyone has the potential to join a recursive public, whether impromptu or planned.

In fact, the notion that we can all engage in reportage is perhaps a bit too simplistic, at least until we can adjust what we mean by "journalism." When Facebook users joined up in opposition to a change in Facebook's terms of service and successfully pressed administrators to rethink and reword the terms of service agreement, that was the work of a recursive public, loosely banded and easily disbanded once its purpose had been achieved (if necessary, it will quickly gather again in its virtual space and just as quickly disband). We don't recognize this as journalism, often don't even recognize it as civic engagement--but for those who joined this Facebook knotwork, it's certainly some kind of engagement. And what could be more civic-minded than fighting to define the uses of a public space?

The atrocities of war are approximately the same (though, as always, new technologies mean new modes of torture and murder). What's different is the following:



All in all, it was a good party. Near the end, someone produced a Donald Rumsfeld piñata. We were going to hoist it up and smash it, but it seemed kind of...irrelevant.

Wednesday, May 27, 2009

I almost joined the Mile High Club for bloggers.

Man, I love to blog. I love it so much that after nearly a week away recently, I started fantasizing about what I would blog about the first time I was able to get back online again. Last night, I even had a dream about it. I do not find this in any way dysfunctional.

I have to confess to having mixed feelings, however, upon learning that AirTran, my airline of choice because of its high-quality travel accommodations, was set to become the first airline to offer WiFi during its flights (in a predictable move, Virgin America, Delta, and other airlines are following AirTran's lead). Soon, I thought with anxiety, there will be no respite from the onslaught of technology. Soon there will be no place in the country where internet access is not readily available.

That was what I was thinking on the flight out. By the time I boarded the plane to return to Boston, I needed a blogging fix. It was only the Cheapo McCheaperson in me that impeded my intentions. (Why pay $9.95 for something that, as Jon Stewart reminds us, is free?) Instead, I listened to what I believe is the best Bonnie "Prince" Billy album, "Master and Everyone," on my iPod.



We sure do worry, don't we, about the invasion of technology into the spaces of our everyday lives? We're nostalgic for the days of coffee and the morning paper in a breakfast nook. We miss books you could touch, music that scratched--or, if you're a little younger, warbled or skipped--we miss Must-See TV that everybody saw. Those were the things, we are wont to say, that made us a culture: that made us cohere.

And this isn't all. We still tend to think of engagement with media as a passive experience, akin to watching too much TV or spending all night trying for the high score in Pac-Man. (Interesting, by the way, how staying up all night reading a book, which is in many ways far more passive than playing a video game, doesn't get the same dismissive eye-roll--or maybe the time of scorn for engagement with the printed word has passed out of collective memory.)

But a large portion--maybe even the entire portion--of engagement with new media is generative, civic, creative in nature. For all our anxiety over drowning in an ocean of stuff, it does appear that for people in possession of a certain set of dispositions, "getting online" is not drowning but waving.

Clay Shirky talks about the phenomenon of "cognitive surplus," moments of cultural shift so drastic that they rearranged our relationship to time, such that we had more cognitive energy than we had places to put it. Shirky says the sitcom was invented to handle this surplus and that during the industrial revolution, gin served this same purpose.

If Shirky is right in this, then he is also right that we are now finding ways to deploy our mental energies toward collaborative knowledge-building and collective meaning-making. What to some resembles watching hours on end of "Perfect Strangers" and "Full House" is actually something more akin to pitching, casting, and producing the shows themselves--no, even more than that, collaborating on reshaping the sitcom altogether. We don't yet know the outlines or boundaries of what the internet and social media will afford, and every time someone gets online our culture gets a brand new opportunity to edge a toe toward the outer limits.

Which is pretty neat, if you ask me.

Tuesday, May 19, 2009

I wish fake feminists would cut it out.

"You can't claim to be a feminist simply because you're a woman."--Julie Bindel



"There is no such thing as a bad feminist." --Jess McCabe

Being controversial may not always be fun, but it certainly guarantees that people will pay attention. This is exactly what happened with Double X, the new site launched by Slate earlier this month. Double X describes itself with a slight nod toward feminism without explicitly mentioning the dirty F-word itself:

Double X is a new Web magazine, founded by women but not just for women, that Slate launched in spring 2009. The site spins off from Slate's XX Factor blog, where we started a conversation among women—about politics, sex, and culture—that both men and women listen in on. Double X takes the Slate and XX Factor sensibility and applies it to sexual politics, fashion, parenting, health, science, sex, friendship, work-life balance, and anything else you might talk about with your friends over coffee. We tackle subjects high and low with an approach that's unabashedly intellectual but not dry or condescending.


Double X targets its demographic with both barrels smoking, presenting itself in pastels, pinks, and purples and offering stories on motherhood ("Why are moms such a bummer?"), breast cancer ("Enough with patenting the breast cancer gene"), and first-person, "it-happened-to-me" testimonials ("I Wanted to Be Blondie. Now I Write for Colbert").

By now, it may be clear that this is not your mother's feminism. The site is playful, mouthy, and just a little self-indulgent--normally exactly my cup of tea, except...well, if you were, say, a 30-ish, self-described feminist living an out-of-the-mainstream lifestyle, you might be a little worried.

This is not necessarily about topic choices--it's about the fights Double X has picked in its opening weeks. As this Guardian article by Amelia Hill and Eva Wiseman points out, Double X galloped out of the gate, chasing down and pummeling the popular site Jezebel. In "How Jezebel is Hurting Women", Linda Hirshman explains that

[t]he Jezebels are clearly familiar with the rhetoric of feminism: sexism, sexual coercion, cultural misogyny, even the importance of remembering women’s history. But they are also a living demonstration of the chaotic possibilities the movement always contained.... From removing the barriers to women working to striking down the criminal laws against birth control and abortion, feminism was first and foremost a liberation movement. Liberation always included an element of sexual libertinism. It’s one of the few things that made it so appealing to men: easy sexual access to women’s bodies. (And to their stories about sex, which helps explain why 49 percent of Jezebel’s audience is men.)

But unregulated sexual life also exposes women to the strong men around them, and here, the most visible of the Jezebel writers reflect the risks of liberation.... How can women supposedly acting freely and powerfully keep turning up tales of vulnerability—repulsive sexual partners, pregnancy, sexually transmitted diseases, even rape? Conservatives have long argued against feminism by saying women are vulnerable, and we need to take care of them. Liberals say there’s no justification for repressing sexual behavior.


The Guardian article highlights the conflict that has erupted out of this attack (Jezebel, of course, struck back, prompting a response from Double X...there is, so far, no end in sight), pointing to the heart of the issue: A struggle over how to define feminism in 2009.

It's a struggle that strikes close to my heart, as I see the term, if not the ideals, taken up in disheartening, even terrifying ways by friends and colleagues. Men calling themselves feminists push for "open," commitment-free relationships, since "that's what women want." Women shove their way to the top of corporations at the expense of (male and female) coworkers, and proclaim victory in the name of feminism.

For that matter: women who call themselves feminists push for "open," commitment-free relationships, and men shove their way to the top of corporations at the expense of (male and female) coworkers.

It's no wonder so many young women and men are so loath to consider sexism as an ongoing issue: Feminism has been co-opted in vile ways for the purpose of self-advancement. Why would anybody want to associate with a movement whose name is responsible for so much abominable behavior?

Feminism, at its heart, is not about political justification of personal behavior. At its very best, feminism is about setting aside petty personal interests and considering what's best for an entire culture--and considering the best approaches for making the kinds of changes that will enable this culture to emerge. The Double X-Jezebel debate threatens to obscure this larger point beneath vitriol and, on the part of anti-feminist observers, the most loathsome kind of schadenfreude.

Sunday, May 17, 2009

our rock stars are not your mother's rock stars

I saw this video for the first time yesterday while watching the season finale of "Fringe" on Hulu.



Our rock stars, the commercial explains, aren't like your rock stars.

Let me tell you a story: In my day job, I work with media scholar Henry Jenkins. Last year, I was sent to the South by Southwest Interactive Festival, where Henry and Steven Johnson were scheduled to hold a discussion at the front of an enormous room, which filled up fast. As Henry and I were chatting before the event, a woman walked up to Henry and girlishly asked, "would you mind if I got a picture of me standing next to you?"

"Sure," said Henry, as if this sort of thing happened all the time. (I later found out that it does.)

Here's Henry Jenkins:




What I like about the Intel commercial is that it points to an interesting characteristic of our culture: that when people are as immersed in a field as the Intel employees of the commercial clearly are, they have their own set of rock stars who aren't like "real" rock stars. Well, at least they don't look like rock stars. But they do share certain common features with pop icons: They are generally very, very talented; they have achieved something we fantasize about; and they are famous among people who care about their field.

I'm going to come clean and admit that I have my own rock stars, and I suspect that many of my readers do too. Here's the deal: After I list my rock stars, you have to list yours. But make sure to identify your field, so we can all know how and why these people are superstars to you. I chose to stick with 3 because I didn't want to look like some sort of crazy groupie, though of course I could go on.

Jenna's Rock Stars

field: education / new media / participatory culture / technology / creativity
1. Educational researcher, games expert, and social justice theorist Jim Gee
2. Technology writer Clay Shirky
3. Insane genius philosopher Bruno Latour


Your turn.

Saturday, May 16, 2009

Dan Choi is gay. So is "don't ask, don't tell."

It almost goes without saying that the Obama administration's decision to discharge National Guard 1st Lieutenant Dan Choi for being gay is perhaps the most scathingly lame thing Obama has done since taking office.

Choi was one of 38 West Point grads who publicly came out in March in support of a repeal of the military's 16-year-old "don't ask, don't tell" policy, which allows gay men and women to serve their country as long as they don't tell anybody about their disgusting sexual preferences. It also means that other military folks aren't allowed to ask. After publicly identifying as gay on the Rachel Maddow show in March, Choi--who studied Arabic languages while at West Point--received notice that he would be discharged. He says he will fight his dismissal and has so far done so publicly, presumably to the surprise of military officials. Maybe they figured it's so shameful to be gay that Choi would never take his fight so public (they perhaps forgot that he came out in a public forum in the first place). Maybe they figured they could easily swat him away, since gay men don't know how to fight anyway.

[Embedded video: The Daily Show with Jon Stewart, "Dan Choi Is Gay," thedailyshow.com]


You guys, it's 2009. The notion that homosexuality is a choice, that same-sex partnerships are an abomination, that sexual orientation is anybody's business or that we need to avoid making room for gay men and women to take their rightful place in the cultural, political, legal, and social life of our country--that's so 1998.

In the early 1900s, the NAACP waged a war against lynching, not just by arguing that beating, then stringing up, black men was atrocious behavior (if a person didn't accept this stance, there was no point in arguing) but also by building anti-lynching sentiment through simply stating the facts: keeping a running tally of lynchings across the U.S. via newspaper ads and banners in major cities. Eventually, the anti-lynching arguments became redundant--a critical mass of Americans had simply come to accept the premise that killing a human being based on skin color was abominable behavior.



There comes a point where words fail because they've reached all the people they possibly can. There comes a point where further argument is useless, and all you can do is wait for a critical mass to join together in overthrowing a hopelessly bigoted, hopelessly outdated stance. If the outrage over Choi's discharge and shifting sentiment toward the "don't ask, don't tell" policy are any indication, we may have reached this point with gay rights. Thank Christ--it's about time.

PSA: in support of post-punk laptop rap

Here's a new release from MC Lars, who calls himself a "post-punk laptop rap artist."



I found out about MC Lars's work via my day job as an educational researcher for Project New Media Literacies. I worked with literary scholar Wyn Kelley and media scholar Henry Jenkins, among others, to develop a teachers' guide for integrating new media literacy practices into the classroom while teaching Moby-Dick. Henry introduced us to an MC Lars song called "Ahab," and later, he posted an interview with Lars on his blog.

Here's the "Ahab" video, which makes me so very happy. My favorite part is when Queequeg dances, surrounded by the crew of the Pequod.

Tuesday, May 12, 2009

Oxford University is a lame parent wearing clothes it bought from Forever 21.

In the face of the revolution caused by emergent technologies and the practices they enable, brick-and-mortar universities often come off as the perplexed parent who wants to seem cool: They know the kids are doing something interesting with their time, they don't know exactly what it is, but they figure if they can get in on it they'll score points.

In this case, as this Guardian article explains, traditional universities are considering how to integrate social media--including the most popular sites like Twitter and Facebook--into their learning environments. The goal behind this is not only to keep up with the times, but also to prepare learners for the great big social networking world just outside university gates. Academics are right to pursue this goal, though they won't succeed unless they can wipe the perplexity and fear off of their faces.

In many ways, as the Guardian piece points out, the potential for networking, collaboration, and collective problem-solving made possible by new media results in an ethos that jibes nicely with that of academia at its finest. As education technologist Brian Kelly explains it, universities are aligned with the spirit of Web 2.0:
The social web is about openness and trust, which is a key part of what academic life is involved in, says Kelly. "Initially we did this in research, with open access work and making publications openly available, and now it's teaching and learning resources."

On the other hand, the internet poses a significant threat to the very cultural capital that makes universities essential: The practices that have emerged around new media technologies have resulted in a bottom-up approach to meaning-making. As the Guardian piece points out, "[t]here is still a question over whether a well-respected blog is the same as having peer-reviewed research articles," and new cultural practices have emerged so quickly that norms are yet to develop. Kelly puts it this way:
"We've had no time to develop a culture. Everyone knows how to answer a telephone but it takes time for those conventions to come about, and there are no conventions for cyberspace."

What academics may consider drawbacks are, to many, just the everyday features of what media scholar Henry Jenkins has labeled a "participatory culture": Everyone with a computer and Internet access becomes a potential media outlet, and everyone gets to join in on deciding what's culturally valuable and meaningful. The barriers are low, mentorship opportunities are rampant, and everybody feels like they can contribute if they wish.

In participatory cultures, research no longer needs to be peer-reviewed before it's published; information becomes more valuable the more widely it spreads; and the groups that grow out of the use of new technologies work alongside--and sometimes in resistance to--formal institutions. This flies in the face of traditional values espoused by formal universities, which make their bones by proving that the knowledge contained within their gates is better than the riff-raff out there. (For proof, see this article about the Guardian's ranking of UK universities, which relies on fairly traditional standards such as students' entrance exam scores, faculty/student ratio, and spending per pupil.)

Fortunately, all is not lost for the traditional university: There is still a role for formal education, even in a participatory culture--maybe especially in a participatory culture. Now more than ever, equipping students with the proficiencies and skills that enable ethical, responsible, and critical participation in new media is an essential element of a practical education.

For most of their existence, traditional universities have served as gate-keepers of knowledge: In order to gain access to the information kept behind their walls, you had to first get through the front gate. Young people who have grown up inside of a culture that makes most information available to most people, most of the time, are less and less willing to accept this elitist approach.

This doesn't mean, of course, that universities should drop admissions requirements. But it does mean new models are called for. Perhaps it's time to follow the approach of the Open University, MIT OpenCourseWare, and other initiatives that adhere to the principles of the open education movement--free exchange of ideas, educational materials, and pedagogies. Perhaps it's time to make the information available to those who make it through the gates available to those who can't get through, as well.

I say "perhaps," but readers of this blog know that I really mean "for sure." As in "For sure it's time to follow the approach of universities that adhere to the principles of the open education movement." In a participatory culture, trying anything less is just...embarrassing.

Monday, May 11, 2009

branding sleeping alone and starting out early

My colleague and fellow evil genius Nick Seaver has been helping me work through the issues related to branding. He provided me with this image to serve as a logo for sleeping alone and starting out early. I'm undecided on whether this aligns with the image I'm attempting to convey of a serious thinker with a sense of humor. I'd love to know what you think.

Sunday, May 10, 2009

this is cool: my BBC interview

The neatest thing that has happened to me to date was being interviewed on BBC radio. On the strength of the blogpost I wrote for the Guardian, I was invited to speak about Rupert Murdoch's intention to begin charging readers for access to online content.

My co-interviewee, Joshua Benton, is amazing: Here's his bio, from the Nieman Journalism Lab website:
Joshua Benton is director of the Nieman Journalism Lab. Before spending a year at Harvard as a 2008 Nieman Fellow, he spent 10 years in newspapers, most recently at The Dallas Morning News. His reports on cheating on standardized tests in the Texas public schools led to the permanent shutdown of a school district and won the Philip Meyer Journalism Award from Investigative Reporters and Editors. He has reported from 10 foreign countries, been a Pew Fellow in International Journalism, and three times been a finalist for the Livingston Award for International Reporting. Before Dallas, he was a reporter and rock critic for The Toledo Blade. He is a big nerd who started blogging when Bill Clinton was still president.

Joshua recently uploaded an mp3 of our conversation. You can access it here.

Awesomeness. Extreme awesomeness.

if I can't be free of bullies, I don't want to be part of your revolution

In an interesting show of poor timing, the New York Times celebrates Mother's Day by considering why female executives are such obnoxious bullies.

It turns out female bosses are perceived as bullies almost as commonly as male bosses are. A full 40 percent of workplace bullies are women, and 70 percent of the time, they choose women as their targets.

This, of course, comes as no surprise to most working stiffs out there. Bullying from bosses knows no gender and is therefore not constrained by it. But when it comes to an examination of why women are viewed as bullies, and how their "bullying" behavior compares to the behavior of male bosses, it gets a little complicated.

The Times first considers this phenomenon from a pure numbers standpoint. One reason women bully may be because it's still excruciatingly difficult for them to break into the upper echelons of the country's top corporations:
After five decades of striving for equality, women make up more than 50 percent of management, professional and related occupations, says Catalyst, the nonprofit research group. And yet, its 2008 census found, only 15.7 percent of Fortune 500 officers and 15.2 percent of directors were women.

The article also suggests, though, that gender stereotypes make us more likely to see a female boss as "overly aggressive" than we would a male boss engaging in the same kind of behavior.
Research on gender stereotyping from Catalyst suggests that no matter how women choose to lead, they are perceived as “never just right.” What’s more, the group found, women must work twice as hard as men to achieve the same level of recognition and prove they can lead.

Yes, okay, fair enough. But let's look at it another way: If assertive or aggressive female bosses are more likely to be perceived as bullies, then we can assume that the female employees who largely perceive themselves as targets are also victims of stereotypes. If female bosses are perceived as bitchy or pushy when they assert themselves too strongly, then female employees are likely to be perceived as whiny or gossipy for complaining about behavior that feels inappropriate, excessively spiteful, or unjust. This is why lots of woman-on-woman bullying, I believe, never gets reported. Or, if it does, it gets reported only when the working relationship is so bad that one or both women are probably on their way out.

Then there's the issue of the male heads of organizations, the people who often adjudicate bullying complaints. Many of these men self-identify as either feminist or sympathetic to the feminist movement. Many have done what they could to help their female employees advance. They know how hard it is to be a female boss, and because of this they're likely to support an embattled woman even in the face of multiple accusations of bullying. First, they may carry around that father complex, the one that makes them want to take care of the ladies who need them; second, their politics require them to defend the woman from charges of bullying because it's just so hard to be a female boss. This, we might say, is the soft bigotry of kneejerk feminism.

Fortunately, a change is on the horizon. The emergence of participatory cultures and new valued practices means we can and must develop new models for formal and informal organizations. Increasingly, effective collaboration, collective meaning-making, and the ability to tap into expertise that's distributed across networks of people and tools are far more important than being the single visionary of a company. The old, single-genius model is less and less relevant, and bosses--male or female--who adhere to this model will bully themselves right out of a job as the social revolution takes hold.

Thursday, May 7, 2009

brb smacking down Rupert Murdoch: Why a pay-per-view approach to online content won't work

**update: 5/09/09: I got featured in a BBC radio interview opposite Nieman Lab Director Joshua Benton! They called me a "young blogger for the Guardian." Neatness.**

You guys, I've mentioned in an earlier post that I got picked up by the Guardian, a UK newspaper, to write for their internet division. I feel like a young Della Frye, except for the part where she's stuck in the middle of a ridiculous movie and my life is really happening. I'm also shorter than she is.

The following post went live at the Guardian's Comment is Free section today! You can view it here. A lively debate has picked up quickly; it turns out those Brits get all up in arms when it comes to Rupert Murdoch. Who knew?


Here's the post. They edited the American English out of it and made some other tweaks; this is it in its original form.


In further proof of why old people should not be allowed to run media conglomerates, media magnate Rupert Murdoch recently announced that News Corporation's newspaper websites will begin charging for access within a year.

The move to charge for accessing online content is an effort to keep newspapers profitable amid declining subscriptions and ad revenues. Murdoch called the current model, in which newspaper websites offer their content for free, a "malfunctioning" model, and one that's unsustainable.

Murdoch also opposed the recent decision by the New York Times, the Boston Globe, and the Washington Post to work with Amazon to develop a version of the Kindle e-reader tailored for reading newspapers, magazines, and other periodicals.

The issue here is not whether the current model of offering free content to all is financially viable--clearly, it's not. The issue is the odious assumption implicit in Murdoch's stance: that centralized control of information flow is somehow better than the decentralized model embraced by the public.

Murdoch and others subscribe to the notion that leveling the playing field by offering free access to content was a regrettable mistake. Here's how the New York Times put it in a recent piece about the Kindle:

Perhaps most appealing about this new class of reading gadgets is the opportunity they offer publishers to rethink their strategy in a rapidly evolving digital world. The move by newspapers and magazines to make their material freely available on the Web is now viewed by many as a critical blunder that encouraged readers to stop paying for the print versions.


You know how the movement toward mass literacy was spearheaded by the church, in an effort to get the word of God into the hands, mouths, and minds of every citizen? I wonder if church officials called it a "critical blunder" when they figured out that learning how to read meant being able to make decisions about what to read, and when, and how. While it may not necessarily be true that mass literacy leads to a better society, it's certainly the case that if your power rests on the ability to tell people what to think, a little knowledge is a dangerous thing. And a lot of knowledge is explosive.

We're in the middle of a revolution made possible through the rapid spread of and relatively easy access to a vast store of human knowledge. Newspapers' decision to make their material available online for free--what print media types are now calling a "critical blunder"--was a crucial factor in making the revolution possible. I'm sure it seemed like a good idea at the time: For journalists, after all, news is important, and more access by more people to more news could only be better for everyone, right?

Clay Shirky writes that "it's not a revolution if nobody loses," and the first losers in this particular revolution were broadcast media outlets (TV, newspapers, magazines) and cultural elites whose social status relied on the ability to control who had access to the news, what stories they had access to, and what they did with that information.

Murdoch asserted that "[t]he current days of the internet will soon be over." If he's right, it will only be because a small handful of corporations own the vast majority of media outlets. My sense, though, is that he's wrong: Even if newspapers return to a pay-per-view model, the people will rise up against it and then roll right over it, making the same content available for free elsewhere online and developing new uses for social media that subvert the efforts of Murdoch and others.

If print media executives want to keep pace with the social revolution, they need to begin by letting go of the outdated assumption that their job is to first filter and then broadcast information for the public good. From now on, we'll decide what matters, thank you very much, and if newspapers know what's good for them, they'll do what they can to not get in the way.

Wednesday, May 6, 2009

I can't believe the Boston Globe is staying open

because now I not only have to change the titles of two recent blogposts, but I have to rail against the lameness of going through all this work to keep something open based purely on its nostalgia factor.

The announcement of the Globe's continued, if tenuous, existence came after parties on each side of the dispute--the New York Times Company, the conglomerate that owns the paper, on one side; union representatives on the other--offered up a series of hefty concessions. It seems likely, in fact, that the entire hullaballoo over the potential closure of New England's largest paper was intended to generate exactly the kind of public sentiment it garnered: outrage, disbelief, incredulity, and the stubborn insistence that a community isn't a community without a newspaper to bind it.

Nostalgia only takes an argument--or a money pit, even when it's black, white, and read all over--so far. Eventually you come up against some cold, hard truths: That newspapers aren't economically viable, and haven't been for years; that whatever statistics people may pull out, people just don't read newspapers anymore; and that even with the decrease in newspaper readership, we still have by most accounts a better informed and more civically engaged public than we've ever seen.

It's hard to see this, of course, if you depend on traditional models of "informed" and "civically engaged." Today's citizen is more likely to comment on a news story or publish a blogpost than to vote. Today's citizen is more likely to sign an online petition via Facebook or MySpace than to attend a rally. Today's citizen is more likely to post a video to YouTube than to write a letter to the editor of a local newspaper. These actions may not seem like "civic engagement" to the old-school journalist (cf. Russell Crowe in State of Play), but these activities are leading to a real shift in how and when the elites--corporations, school boards, government officials, and so on--both can and choose to listen to the people.

"Society doesn't need newspapers," says Clay Shirky. "What we need is journalism." Okay, sure, the Boston Globe will stick around, at least for now. But media moguls, analysts, and the general public need to stop asking what needs to happen to make newspapers viable. That's no longer a relevant question. The new question is: What journalistic models can replace the traditional newspaper model, and what effects will this have on the public? This is a question that's already being answered, if the so-called "media experts" would just put down their end-is-nigh megaphones and listen to what's going on around them.

Tuesday, May 5, 2009

thank goodness the Boston Globe is shutting down

or I'd have to smack it down big-time for this editorial arguing that we shouldn't standardize and measure achievement on so-called 21st-century skills. The op-ed offers further proof--as if we needed it--that the Globe's editorial board has no idea how the playing field has been utterly transformed by participatory culture.

The impetus behind the op-ed is a move by the state Department of Elementary and Secondary Education to put its money where its mouth is. The department recently awarded a $146 million contract to the designer of the MCAS, the standardized test mandated in the Commonwealth of Massachusetts under No Child Left Behind, and part of that money is earmarked for integrating assessment of 21st-century skills. This is a problem, as the Globe's editorial board will point out momentarily.

But first, it uses state MCAS scores as proof of public school rigor. As it explains,

Massachusetts stands apart in public education precisely because it created high academic standards, developed an objective measure of student performance and progress through the MCAS test, and required a passing grade in order to graduate. Students, as a result, rank at or near the top of standardized testing not just nationally but on tough international achievement tests in math and science. Any retreat from this strategy would be a profound mistake.


So to summarize: Massachusetts students are among the top in the nation because their achievement on standardized tests prepares them to...score well on standardized tests. It's an iconic example of circular reasoning: The MCAS is useful because it prepares students for future learning. How do you know? Because Massachusetts students do well on other standardized tests. What prepares them to do well on those tests? Doing well on standardized tests, of course.

Given the Globe's wholehearted genuflection at the altar of bubble tests, one wonders why this editorial might oppose integrating assessment of 21st-century skills alongside traditional subjects. It turns out the board's concern is less about whether we should measure 21st-century skills than about how doing so on the MCAS will affect test scores in general. As the editorial points out,
[s]tate education officials have done a generally poor job of defining 21st-century skills - which can include interdisciplinary thinking and media literacy - or explaining how to test them statewide.


The problem for the Globe, it turns out, is that if we develop mediocre assessment strategies it'll ruin the MCAS for all of us. Because 21st-century skills can only be measured subjectively, the Globe argues, an "objective" test like the MCAS is an inappropriate place to assess achievement. Instead,
MCAS testers should concentrate on accurately measuring math ability and reading comprehension, which surely correlate with a student's success in the workplace.


Let's set aside, just for now, the outrageous assumption that a standardized test could conceivably be considered "objective." Let's also set aside the assumption that a standardized test could "accurately" measure student ability in anything other than the weird and peculiar game of test-taking itself. Which leaves just one last question:

In what world can anybody argue that achievement in math and reading, without the accompanying facility with 21st-century proficiencies, prepares any learner for any workplace worth the energy of applying to in the first place?

It's such a weird argument to make, that literacy practices like reading, writing, and doing math can somehow be isolated from the 21st-century contexts that make them meaningful. It's like asking someone if she knows how to tie her shoe, then making her prove it by writing a detailed step-by-step description of how to do it. It's like asking someone to prove he can build a fire without ever asking what the fire is for: warmth, signaling for help, or burning the whole house down.

Same with math: Knowing how to "do" fractions doesn't mean a learner is equipped to, say, resize a .jpg for a blogpost.
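To make the fractions point concrete, here's a minimal Python sketch (the function name and the dimensions are my own, purely illustrative) of the arithmetic behind resizing an image to fit a blog layout. The "school" skill is multiplying by a fraction; the 21st-century skill is knowing which fraction to build and why.

```python
from fractions import Fraction

def resize_dimensions(width, height, target_width):
    """Compute new pixel dimensions for a target width, preserving aspect ratio."""
    scale = Fraction(target_width, width)  # exact scale factor, e.g. 400/1600 = 1/4
    return target_width, round(height * scale)

# A 1600x1200 photo, scaled down to fit a 400-pixel-wide blog column:
print(resize_dimensions(1600, 1200, 400))  # (400, 300)
```

The fraction itself is sixth-grade math; deciding that the column width, not the nice round number, is what determines the fraction is the part no bubble test measures.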

Arguing that we should keep 21st-century skills out of standardized tests in order to keep the tests objective is as lame as the argument that standardized tests are objective in the first place. Neither one makes any logical sense. Neither one gets you anywhere.

Awesomeness: Project New Media Literacies' spring conference: Learning in a Participatory Culture

There was awesomeness going on at MIT this weekend, as my colleagues and I at Project New Media Literacies put on a conference called Learning in a Participatory Culture.

If you've never planned a conference before, I can't say I recommend the experience--though when one goes well, as this conference did, the stress and exhaustion that pile on top of you in the lead-up suddenly turn into a fair trade-off. All day, my coworkers and I got to be surrounded by the smartest educators and educational researchers ever, and we got to hear them say all kinds of insanely awesome things.

As part and parcel of the pure awesomeness of the day, I scored two key personal / professional victories: First, I slam-dunked an opening presentation on design and development of Project NML's Teachers' Strategy Guide, garnering not one, not two, but three separate thumbs-ups from the people I most hoped to impress: My sensei Dan Hickey, my boss Henry Jenkins, and my close, close friend, colleague, and fellow Fireside Moonbat Katie Clinton. I only wish Katie had received more recognition for her contribution to the project--somehow, I've been given her share of the credit and I want to find a way to put it back where it belongs.

I've included a QuickTime version of my presentation below, though it admittedly loses something without the audio. I'll see what I can do about adding the audio in once we have it processed from the day.

(video: QuickTime recording of my opening presentation)

A second key victory was getting a back channel going for the day via a #NML09 hashtag on Twitter. We had set up a TweetGrid and the hashtag going into the conference but had no specific plans for supporting and integrating the technology. Before I gave my opening presentation, though, I offered up a quick tutorial on how to tweet using hashtags, and my colleagues and I spent the day monitoring and engaging in a rapidfire Twitter conversation that extended participation in really nice ways. As the man Henry Jenkins himself said to me midway through the day, the fact that we didn't need to plan for or organize participation in social media but that it worked anyway when the tools and the energies were in place proves something important about the nature of participatory culture.

This is the artifact of my tutorial:

Finally, I want to shout out to all the participants who made the conference such a roaring success. Energy, enthusiasm, and engagement were high from beginning to end. I don't have the words to articulate what an amazing experience it was.

Monday, May 4, 2009

Hurry up and say goodbye to the Boston Globe


It looks like the Boston Globe's days are numbered after down-to-the-wire negotiations between the Globe's union and its parent company, the New York Times Company, failed to produce a tenable solution to the budget crunch stemming from faltering subscriptions and ad revenue.

Readers of this blog know how I feel about newspapers' old-media attempts to remain viable in the midst of a new media revolution (hint: annoyed and dismissive). This doesn't, of course, mean that I'm a fan of media outlets shutting down entirely; after all, it's true that new media journalism models rely on the work of reporters in the field, gathering stories and reporting them via some platform. But the traditional approach to journalism--two to three major papers in every major city--is no longer viable and may be unnecessary in a culture where everyone's a potential media outlet. We don't yet know what kinds of journalistic models are sustainable, or what approaches to reporting will matter to those who are joining the social revolution.

In related news, my blog got noticed by an editor at The Guardian, who solicited me to write for an online section of the paper called Comment is Free. My first post, a condensed version of my review of State of Play, is up here. I don't find my thrill at the awesomeness of getting picked up by a major media outlet at all contradictory to my stance on print media.

RIP Augusto Boal

"Oppression is a relationship in which there is only monologue. Not dialogue."

Brazilian theater director, playwright, and activist Augusto Boal has died of respiratory failure at the age of 78.

Boal was best known for his work in establishing the "Theatre of the Oppressed," an approach to public performance intended to draw spectators into the action. In becoming active members of the live theatre, the public, the nominal "oppressed," have the chance to participate in and transform the conditions of their everyday lives.

Boal wrote:
TO (Theatre of the Oppressed) was used by peasants and workers; later, by teachers and students; now, also by artists, social workers, psychotherapists, NGOs... At first, in small, almost clandestine places. Now in the streets, schools, churches, trade-unions, regular theatres, prisons...

Theatre of the Oppressed is the Game of Dialogue: we play and learn together. All kinds of Games must have Discipline - clear rules that we must follow. At the same time, Games have absolute need of creativity and Freedom. TO is the perfect synthesis between the antithetic Discipline and Freedom. Without Discipline, there is no Social Life; without Freedom, there is no Life.

The Discipline of our Game is our belief that we must re-establish the right of everyone to exist in dignity. We believe that all of us are more, and much better, than what we think we are. We believe in solidarity.

The world got Augusto Boal for 78 years, and all it had to do was work to confront injustice and oppression. What a bargain.

 

All content on this blog has been relocated to my new website, making edible playdough is hegemonic. Please visit http://jennamcwilliams.com and update your bookmarks!