

You guys, I really really want to get accepted. Will you go to the site and vote for me?
You can read my proposal here.
an occasional blog on culture, education, new media, and the social revolution
Without any special authority to do so, may I please give you a homework assignment? Would you please blog about why you choose to be open? What is the fundamental, underlying goal or goals you hope to accomplish by being open? What keeps you motivated? Why do you spend your precious little free time on my blog, reading this post and this question? If each of us put some thought and some public reflective writing into this question, the field would likely be greatly served. The more honest and open you are in your response, the more useful the exercise will be for you and for us.
While I think everyone in the field of “open education” is dedicated to increasing access to educational opportunity, there is an increasingly radical element within the field – good old-fashioned guillotine-and-Molotov-type revolutionaries. At the conference I heard a number of people say that things would be greatly improved if we could just get rid of all the institutions of formal education. I once heard a follow-up comment, “and governments, too.” I turned to laugh at his joke, but saw that he was serious. This “burn it all down” attitude really scares me.
There won't be schools in the future.... I think the computer will blow up the school. That is, the school defined as something where there are classes, teachers running exams, people structured in groups by age, following a curriculum-- all of that. The whole system is based on a set of structural concepts that are incompatible with the presence of the computer... But this will happen only in communities of children who have access to computers on a sufficient scale.
Rejecting technological determinism should be a mantra in our professional conversations. It's really easy to get in the habit of seeing a shiny new piece of technology and just assume that we can dump it into an educational setting and voilà! miracles will happen. Yet we also know that the field of dreams is merely that, a dream. Dumping laptops into a classroom does no good if a teacher doesn't know how to leverage the technology for educational purposes. Building virtual worlds serves no educational purpose without curricula that connect a lesson plan with the affordances of the technology. Without educators, technology in the classroom is useless.
As we talk about the wonderfulness of technology, please keep in mind the complexities involved. Technology is a wonderful tool but it is not a panacea. It cannot solve all societal ills just by its mere existence. To have relevance and power, it must be leveraged by people to meet needs. This requires all of us to push past what we hope might happen and focus on introducing technology in a context that makes sense.
Just because many of today's youth are growing up in a society dripping with technology does not mean that they inherently know how to use it. They don't. Most of you have a better sense of how to get information from Google than the average youth. Most of you know how to navigate the privacy settings of a social media tool better than the average teen. Understanding technology requires learning. Sure, there are countless youth engaged in informal learning every day when they go online. But what about all of the youth who lack access? Or who live in a community where learning how to use technology is not valued? Or who try to engage alone? There's an ever-increasing participation gap emerging between the haves and the have-nots. What distinguishes the groups is not just a question of access, although that is an issue; it's also a question of community and education and opportunities for exploration. Youth learn through active participation, but phrases like "digital natives" obscure the considerable learning that occurs to enable some youth to be technologically fluent while others fail to engage.
I will admit that there is a certain irony about having to refer people to my blog for an exchange that started on Twitter but couldn't really be played out within the character limits of that platform. But then, note that armique's very first post had to be broken into two tweets just to convey the emotional nuances he needed. And that's part of my point.
From the start, I've questioned whether Twitter was the right medium for me to do my work. I've always said that as a writer, I am a marathon runner and not a sprinter. I am scarcely blogging here by traditional standards given the average length of my posts. Yet I believe this blog has experimented with how academics might better interface with a broader public and how we can expand who has access to ideas that surface through our teaching and research.
Most often, the retweets simply condense and pass along my original tweet. At best, I get a few additional words on the level of "Awesome" or "Inspiring" or "Interesting." So, insofar as Twitter replaces blogs, we are impoverishing the discourse which occurs online.
there is an awful lot of relatively trivial and personal chatter intended to strengthen our social and emotional ties to other members of our community. The information value of someone telling me what s/he had for breakfast is relatively low and I tend to scan pretty quickly past these tweets in search of the links that are my primary interests. And if the signal-to-noise ratio is too low, I start to ponder how much of a social gaffe I would commit if I unsubscribed from someone's account.
@jennamcjenna can someone link me to an article that tells me something completely mind-blowing? It doesn't matter what topic. 8:52 PM Jun 16th from web
@dizzyjosh: @jennamcjenna try http://bit.ly/eQf3m http://bit.ly/zCUQM http://bit.ly/Sh06v http://bit.ly/Ks9qG http://bit.ly/PgNqT
are on prominent display in any large “state of the art” classroom. Rows of fixed chairs often face a stage or podium housing a computer from which the professor controls at least 786,432 points of light on a massive screen. Stadium seating, sound-absorbing panels and other acoustic technologies are designed to draw maximum attention to the professor at the front of the room. The “message” of this environment is that to learn is to acquire information, that information is scarce and hard to find (that's why you have to come to this room to get it), that you should trust authority for good information, and that good information is beyond discussion (that's why the chairs don't move or turn toward one another). In short, it tells students to trust authority and follow along.
This is a message that very few faculty could agree with, and in fact some may use the room to launch spirited attacks against it. But the content of such talks is overshadowed by the ongoing hour-to-hour and day-to-day practice of sitting and listening to authority for information and then regurgitating that information on exams.
When I speak frankly with professors all over the world, I find that, like me, they often find themselves jury-rigging old assessment tools to serve the new needs brought into focus by a world of infinite information. Content is no longer king, but many of our tools have been habitually used to measure content recall. For example, I have often found myself writing content-based multiple-choice questions in a way that I hope will indicate that the student has mastered a new subjectivity or perspective. Of course, the results are not satisfactory. More importantly, these questions ask students to waste great amounts of mental energy memorizing content instead of exercising a new perspective in the pursuit of real and relevant questions.
[t]o write a competent brief the student has to be able to read the text being briefed in much the same way as the professor does.... Students are not taught these reading skills—the ones necessary to be able to write briefs—directly. Briefs are not, for instance, turned in to the professor; they are written for the students' own use in class.... One of the basic assumptions of law school is that if students are not told overtly what to do and how to proceed, this will spur them on essentially to teach themselves. Minnis argues that this assumption does not, however, work equally well for everyone. Many students from minority or otherwise non-mainstream backgrounds fail in law school.
An outbreak of zombies infecting humans is likely to be disastrous, unless extremely aggressive tactics are employed against the undead. While aggressive quarantine may eradicate the infection, this is unlikely to happen in practice. A cure would only result in some humans surviving the outbreak, although they will still coexist with zombies. Only sufficiently frequent attacks, with increasing force, will result in eradication, assuming the available resources can be mustered in time.
This is, perhaps unsurprisingly, the first mathematical analysis of an outbreak of zombie infection. While the scenarios considered are obviously not realistic, it is nevertheless instructive to develop mathematical models for an unusual outbreak. This demonstrates the flexibility of mathematical modelling and shows how modelling can respond to a wide variety of challenges in ‘biology’.
In summary, a zombie outbreak is likely to lead to the collapse of civilisation, unless it is dealt with quickly. While aggressive quarantine may contain the epidemic, or a cure may lead to coexistence of humans and zombies, the most effective way to contain the rise of the undead is to hit hard and hit often. As seen in the movies, it is imperative that zombies are dealt with quickly, or else we are all in a great deal of trouble.
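For the curious, the doomsday conclusion above is easy to see for yourself. Here's a minimal sketch of an SZR-style compartment model in the general spirit of the paper's basic model; the three compartments (susceptible, zombie, removed) and the biting/defeating/rising terms follow the standard zombie-SIR setup, but the parameter values and the simple Euler integration are my own illustrative choices, not the authors':

```python
# Toy SZR zombie-outbreak model (illustrative parameters, not the paper's).
#   S: susceptible humans, Z: zombies, R: removed (defeated, may rise again)
# Terms: beta*S*Z  = humans bitten and turned
#        alpha*S*Z = zombies defeated by humans
#        zeta*R    = removed who rise again as zombies
def simulate(S=500.0, Z=1.0, R=0.0,
             beta=0.0095, alpha=0.005, zeta=0.02,
             dt=0.01, steps=1000):
    for _ in range(steps):
        dS = -beta * S * Z
        dZ = beta * S * Z + zeta * R - alpha * S * Z
        dR = alpha * S * Z - zeta * R
        S += dS * dt
        Z += dZ * dt
        R += dR * dt
    return S, Z, R

S, Z, R = simulate()
print(f"Survivors: {S:.1f}, Zombies: {Z:.1f}, Removed: {R:.1f}")
```

With no quarantine or eradication strategy in the model, the zombie population overwhelms the susceptible one no matter how small the initial infection, which is exactly the "collapse of civilisation" result the paper summarizes. Note that the three derivative terms sum to zero, so total population is conserved; people merely change category, which is the bleak point.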
This is morning in America in the Internet age. After six to eight hours of network deprivation — also known as sleep — people are increasingly waking up and lunging for cellphones and laptops, sometimes even before swinging their legs to the floor and tending to more biologically urgent activities.
a relentless declaration that print is doomed, may be a problem in and of itself, making it easy to cast anyone who wants to save print as a Luddite.)
“For Twitter users, the outage meant no tweeting about lunch plans, the weather or the fact that Twitter was down.”
The site has played a big role in several recent news stories ranging from the social turbulence following the Iranian elections to the first picture of a plane that landed in New York’s Hudson River. Locally, the earliest headlines and photos after the tanker explosion on I-75 at 9 Mile were on Twitter....
Some use Twitter solely as an input channel, to update their Facebook status. Others value its conversational nature. I take advantage of both. Ultimately the biggest value of Twitter is a weeded out, faster connection to things I didn’t know before.
This is a normal sort of open source project. I’ll give you a minute to spot the women in the picture. Sorry, make that woman. She’s on the right. Can you see her?
If you’re working on a desktop app, recruit desktop users. If you’re writing a music sharing toolkit, recruit music lovers. Don’t worry about their programming skills. You can teach programming; you can’t teach passion or diversity.
"you don't just give up / you don't just let things happen / you make a stand / you say no //"
All content on this blog has been relocated to my new website, making edible playdough is hegemonic. Please visit http://jennamcwilliams.com and update your bookmarks!