Tuesday, April 25, 2006

Gilmore Girls Showrunners Won't Follow the Show to the CW

As a big fan of the WB's Gilmore Girls, I was saddened to read this interview with showrunners Amy Sherman-Palladino and Daniel Palladino, who detail their harrowing experience over the last six years and the network's refusal to meet their demands for a seventh season. It seems they've been working under year-to-year contracts all this time, without a dedicated staff director, forced to take on multiple tasks more efficiently delegated to other staff. Their reward for all this hard work is the CW (the UPN/WB mash-up network launching next fall) refusing to negotiate and dropping the show's creators for its next season.

Granted, this is a very one-sided perspective on the matter, but the result is perhaps the worst possible for fans of the series. GG may not have the high profile of other recent teen TV shows like Buffy the Vampire Slayer, but the show is consistently exceptional, and the Palladinos are among the most prolific writers on TV, having generated over 90 scripts for the show. That's not normal for television, which is typically far more collaborative. If the series can be said to have an author, it's certainly these two. Once again, television networks show a basic misunderstanding of the creative process.

Thursday, April 13, 2006

Class with Sherry Turkle: Evocative Objects and Childhood Development

Were children being raised smarter in the 80s? Have GUIs made us all intellectually lazy and less self-reflexive?

Sherry Turkle, MIT professor and author of seminal books about the psychological ramifications of technological change, visited my Media Theories and Methods class this week to talk about her work. In preparation, I read as much of the 20th-anniversary edition of The Second Self as I could. I found a lot to admire, and a lot that was relevant to my own work on adolescence narratives.

To summarize the book very quickly: it presents the results of her study of children learning the LOGO programming language in the early 1980s. She describes how children integrated their experiences of computers and computerized toys into their understanding of the human mind and of their own minds, and how computers pushed at the conventional boundaries of what we understand as alive and intelligent. In learning the abstract language of the machine, the children were necessarily articulating their own knowledge of themselves in a rather unprecedented way.
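For readers who never encountered LOGO: the children's programs were mostly short "turtle graphics" procedures, like REPEAT 4 [FORWARD 50 RIGHT 90] to draw a square. Here's a rough sketch of that idea in Python (my own illustration, not an example from Turkle's book) showing the kind of procedural abstraction the children were working with:

```python
import math

# A toy model of the LOGO turtle: a position and a heading that
# respond to FORWARD and RIGHT commands. This is an illustrative
# sketch, not LOGO itself.

class Turtle:
    def __init__(self):
        self.x, self.y = 0.0, 0.0
        self.heading = 0.0  # degrees; 0 means facing "up" the screen

    def forward(self, distance):
        # Move in the direction of the current heading.
        rad = math.radians(self.heading)
        self.x += distance * math.sin(rad)
        self.y += distance * math.cos(rad)

    def right(self, degrees):
        # Turn clockwise.
        self.heading = (self.heading + degrees) % 360

def square(turtle, side):
    # The LOGO classic: REPEAT 4 [FORWARD :side RIGHT 90]
    for _ in range(4):
        turtle.forward(side)
        turtle.right(90)

t = Turtle()
square(t, 50)
# Four sides and four right turns bring the turtle back to its
# starting point, facing its original heading.
print(round(t.x, 6), round(t.y, 6), t.heading)
```

The point Turkle draws out is that writing even a procedure this simple forces a child to make their intuitions about "drawing a square" explicit and mechanical, which is where the self-reflection begins.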

In her new introduction and epilogue to the book, she expresses disappointment that children are no longer taught programming, and that the graphical user interface pioneered by Apple, while pulling down many barriers to computer use, has made the way computers work far more opaque. Widespread adoption of the PC has come at the cost of a shallower engagement with these evocative objects. This is also a key concept in her follow-up book, Life on the Screen.

In my class, and, by sheer coincidence, in another class later that day where we discussed the very same book, many of my classmates, some of them computer programmers, disagreed strongly. One argument went that all computer languages are themselves abstractions; very few people program in machine code. Graphical and WYSIWYG interfaces have allowed users at every level to work more masterfully with computers as tools. In particular, modding communities around video games have allowed those without deep programming skills to participate in the creation of simulations. And, in fairness, Turkle herself worried that she was overstating her objections, that she had become like Joseph Weizenbaum, a curmudgeonly voice against technology.

But I believe her argument to be substantive, with reservations. Turkle's definition of programming seems to be very qualitative. Even though writing code is an abstraction, she differentiates between levels of abstraction. Coding in C is programming; writing HTML and designing furniture for Second Life aren't. While this seems subjective, that doesn't necessarily make it invalid. What seems to be important is that the level of abstraction be deep enough to provoke a comparison with mind. She describes coders using the language of programming, like "buffer" and "debugging," as metaphors for their mental processes. Is the "desktop" model as evocative? Do we say to ourselves, "wait a minute, I need to copy that to my clipboard" or "hold on while I open a new window"? Coding metaphors pull you into the machine; desktop metaphors push you back out into the world.

But the computer isn't the only "evocative object" out there, as I'm sure Turkle would agree. In particular, I would cite stories as both seductive and uncannily suggestive of mental processes, hence the fields of autobiographical memory and cognitive narratology. Storytelling, like coding, can put people into that space of meta-cognition, thinking about thinking. If anything, user-friendly computers, the Internet, and gaming have provided powerful new tools for telling and sharing stories.