updates @ m.blog

Encoding, Decoding, and… Monkeys?

What is it with monkeys and the web anyhow?

Lots of talk lately about Greasemonkey, which is essentially the functionality of a filtering proxy server built into a Firefox extension. It lets users write and share scripts that alter websites via pattern matching, typically in ways that privilege the reader’s experience (e.g. automatic redirection to the printer-friendly version of a NYTimes.com article, or removing useless ad content from a weather.com page). For those who run filtering web proxies such as Proxomitron this is nothing new at all, but moving the approach into a browser extension makes it far more accessible to less technically inclined users.
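To make that concrete, here is a minimal sketch of the kind of rewriting such a script performs, written in TypeScript for clarity (actual Greasemonkey scripts are plain JavaScript with a metadata header). The CSS selectors and the link pattern are invented for illustration, not drawn from any real site:

    // Sketch of a user script: strip ad containers and jump to a
    // printer-friendly view when the page advertises one. The selectors
    // below are hypothetical placeholders, not taken from any real site.

    // Remove anything marked as advertising clutter.
    document.querySelectorAll<HTMLElement>('.ad, .sponsored')
      .forEach((el: HTMLElement) => el.remove());

    // If the page links to a printer-friendly version, follow it automatically.
    const printLink = document.querySelector<HTMLAnchorElement>('a[href*="print"]');
    if (printLink) {
      window.location.href = printLink.href;
    }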

For several years now (beginning with my MA thesis at NYU), my research has focused on precisely this question: user-modified “reading” of technological documents via automated content mutation. Those of you who follow my work (yeah, right) will know that I spoke about the topic at MEA two years ago, where I predicted that tools for such methodologies would soon be integrated into the browser itself. Looks like that’s now precisely what is happening, as evidenced by the popularity of Greasemonkey.

These tools expose the fundamental nature of communication in an environment defined by the encoding and decoding of information into and out of standardized, interpreted formats. To quote from the abstract of my MEA 2003 paper:

Rather than being simply “read,” digital network streams are interpreted by a client’s software agent–allowing for the possibility of constructing divergent (cooked) meanings from a single homogeneous (raw) source. Thus, in digital networks, the point of experience (e.g., the visual document) is in many ways a constructed fiction—the “document” being composed of network streams that are only assembled into coherence in the space of the user’s own domain, taking place as a process within the reader’s personal computer at the moment of interaction. During this transformation from a computer-readable format to a human-readable format, there exists a large degree of interpretative freedom: the ability for the reader to influence how the raw data will be represented via exercising control over the decoding process.

That’s what tools like Greasemonkey do: they let the user exercise control over the decoding process and read a document the way they want. To put it another way: WYSIWYG is dead.

I expect to see this sort of feature integrated directly into the core feature set of mainstream browsers in the short term. Heck, even Microsoft has now integrated a popup blocker into MSIE.

The next step is moving from mere content mutation to the production of recombinant documents that pull freely from multiple heterogeneous sources to produce a single “perspective” for viewing. Think of it as a DJ remixing songs, only for the web, and in a persistent fashion. Increasing protocol standardization is going to be the glue that holds this together. It’ll happen, just you wait.
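As a rough sketch of what such a recombinant document might look like in code, the following TypeScript pulls from a handful of placeholder URLs and stitches the results into one view. The endpoints, the assumed JSON shape, and the merge rule are all hypothetical; the point is only that a single “perspective” can be assembled client-side from several independent streams:

    // Sketch: assemble one "perspective" from several independent sources.
    // The URLs and the JSON shape ({ title, body }) are placeholders.
    const sources: string[] = [
      'https://example.com/news.json',
      'https://example.org/weather.json',
    ];

    async function buildPerspective(urls: string[]): Promise<string> {
      // Pull every source in parallel...
      const parts = await Promise.all(
        urls.map(async (url) => {
          const res = await fetch(url);
          const data = (await res.json()) as { title: string; body: string };
          return `<section><h2>${data.title}</h2><p>${data.body}</p></section>`;
        }),
      );
      // ...and stitch the fragments into a single document.
      return `<article>${parts.join('\n')}</article>`;
    }

    // Example use in a browser context:
    // buildPerspective(sources).then((html) => { document.body.innerHTML = html; });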
