Thursday, June 07, 2007

The Manhattan Project

Rupert Murdoch’s announced offer to buy Dow Jones for $5 billion will be seen by future generations as a defining point in the degradation of the mainstream news media. For a man with a record of prescience and cunning, Mr. Murdoch’s offer shows he no longer gets it. The major news media have suffered substantial blows in the past decade. The 18-to-24 demographic would rather open their MacBooks than a hardcopy newspaper, and, as a result, newspaper advertising has declined significantly. It takes 100 online subscribers to make up for every print reader who cancels her subscription.

Mainstream media is rushing to catch up. The New York Times has launched TimesSelect, a section of its website offering exclusive web content. Murdoch’s News Corporation bought MySpace. The CEO of CBS claims he doesn’t care about CBS content being uploaded to YouTube, "as long as we get paid."

What these moguls do not understand is that the only solution to their troubles is complete investment in one realm or the other. Trying to straddle the hardcopy and software worlds only thins their efforts. In a column entitled “Final Thoughts About My Tenure and The Times’s Future,” Public Editor Byron Calame writes: “The Times’s effort to do more with the same size news staff and do it 24 hours a day, requires workload decisions that can affect quality, especially in editing […] With the expanding commitment to get stories online as soon as they are good enough to post, The Times will have to work very hard to keep the time pressure from eroding the quality of either the stories or the supplements." Quantity is increasing while quality stagnates.

Internet content in the past few years has shown a burst of creativity. WordPress and Blogspot make it faster to create a blog than it would take to, say, locate a soapbox. Comment threads on popular news sites frequently run into three digits. On Friday, May 18th, 2007, an Open Thread entry on the political blog DailyKos garnered 224 comments in a little over four hours, from 6:10 AM to 10:30 AM. While the quality of those threads may be trivial or even derogatory at times, that overwhelming responsiveness is more genuine than the comment posts on New York Times blogs, which seem to require a digital copy of your PhD dissertation listed alongside your username.

The recent proliferation and purchasing of user-generated websites bears a striking resemblance to the dot-com bubble of the 1990s, with one major difference. Instead of buying the companies behind websites, investors are simply buying the websites themselves. Those websites, such as YouTube, MySpace, and Flickr, offer not a business but frameworks and infrastructure filled by user-generated content. This difference shows an increasing democratization of the Internet now being challenged by encroaching corporate interest. The encroachment produces not fear so much as disappointment. The corporate model works well at establishing managerial, hierarchical, and multinational structures to execute mindless labor, but it unfortunately stifles the creativity necessary for web growth and interesting content. The headline "States Seek MySpace's Sex-Offender List" reflects a loss of the carelessness and spontaneity that formerly characterized the networking site.

There is an undeniable shift in the younger generations from centralized to decentralized forms of information. The rise of news aggregator sites and the decline of conventional news stem from a gradual public realization that the emperor is wearing no clothes. The digital manipulation of O.J. Simpson's mugshot on the cover of Time Magazine in 1994, the reporting circus on the eve of the 2000 presidential election, and the slanted, uncritical coverage of WMDs in Iraq have all shown the danger of human bias behind supposedly objective news outlets. Mike Conway of the University of Illinois sums it up best: "in a post-modern media environment, every communication zone—from opinion to hard news—has a spin."

So-called post-modern users are then put to a test. If we are to read news to perceive the happenings of the world objectively, which source would provide the least spin? Would it be the 32 major newspapers (including the New York Post, with a daily circulation of over 724,000), 34 major magazines, 10 international broadcast stations, 5 satellite television stations (including China’s STAR TV, with 300 million viewers in 54 countries), 14 American cable channels, 26 international cable channels, 20 major websites (including MySpace, with 100 million accounts, and “India’s Number One Entertainment Portal”), two publishing houses, and one record label owned by Rupert Murdoch? Or would it be Wikipedia, which showed its new reach as a news-reporting source during the Virginia Tech shootings, with over 2,074 contributors to the story and more than 750,000 visits to the main article in the first two days, an average of four visits per second?

A Shift.

Crucial to this shift from centralized to decentralized information sourcing is a system of checks and balances, a way of managing and checking the credibility of sources. A valuable aspect of conventional news media is accountability. Libel, plagiarism, and truthfulness can all be put to a known writer working under many supervisors in an actual workplace. With blogs, we do not have that luxury. Lord knows someone’s trying: the debate over the culpability of a blog moderator continues with no end in sight.

This is the challenge the future of information presents: to develop a decentralized medium that rises above inherent human biases by letting the user shape the sources and by providing a system of checks and balances. This is the goal of the Manhattan Project. The project will be a long-term, ongoing process of discussion concerning the development of such a medium.

A few things can be established from the start. First, the project must be a multi-platform web browser application. No other medium can encompass information and sources like a web browser. Mozilla’s Firefox has shown the power of an independent web browser, both in the security it provides and in the innovation coming from a non-profit foundation, although Mozilla receives significant financial contributions from Google in exchange for making Google its default search engine. One existing website offers an interesting, albeit limited, example of combining user-generated content, a unique interface, and visual graphics: it uses CSS thumbnail features to embed references that can be accessed simply by rolling a mouse over an icon. But it is limited by its status as a website; it lies within, rather than frames, the material viewed. Our proposed web browser will have an interface that can process the text body of a website's source, pick up on certain semantic and linguistic cues, and cross-reference them through search engines such as Google and databases such as Wikipedia. The user can then see the page through a filter, perhaps similar in effect to the Dashboard program on Mac OS X.
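As a rough illustration of the cross-referencing layer described above, consider a naive sketch: treat capitalized phrases as semantic cues and map each one to a Wikipedia lookup the browser's overlay could surface. The term extractor and URL scheme here are illustrative assumptions, not part of any existing browser.

```python
import re
from urllib.parse import quote

def extract_terms(text):
    """Naive semantic cue: runs of capitalized words (e.g. names, places)."""
    pattern = r"\b[A-Z][a-z]+(?:\s+[A-Z][a-z]+)*\b"
    return sorted(set(re.findall(pattern, text)))

def reference_links(terms):
    """Map each candidate term to a Wikipedia URL the overlay could show."""
    return {t: "https://en.wikipedia.org/wiki/" + quote(t.replace(" ", "_"))
            for t in terms}

page = "Rupert Murdoch offered five billion dollars for Dow Jones."
for term, url in sorted(reference_links(extract_terms(page)).items()):
    print(term, "->", url)
```

A real implementation would need far better entity recognition, but even this crude pass shows how a page's own text can seed the filter.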

One drawback to creating a browser like this has been bloat, a curse upon Firefox. Potential bloat can be offset by outsourcing some program execution to thin-client services, which the browser can access as soon as it is opened.
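A minimal sketch of what that thin-client offloading might look like: rather than bundling, say, a text summarizer into the browser binary, the browser ships the text to a remote service and degrades gracefully when the service is unreachable. The endpoint and JSON shape here are entirely hypothetical.

```python
import json
from urllib import request

# Hypothetical service endpoint; ".invalid" is a reserved non-resolving domain.
SERVICE_URL = "https://example.invalid/summarize"

def summarize_remotely(text, url=SERVICE_URL):
    """POST text to a remote service; return its summary, or None on failure."""
    body = json.dumps({"text": text}).encode("utf-8")
    req = request.Request(url, data=body,
                          headers={"Content-Type": "application/json"})
    try:
        with request.urlopen(req, timeout=5) as resp:
            return json.load(resp).get("summary")
    except OSError:
        return None  # service unreachable; the browser falls back quietly
```

The point is architectural: the heavy code lives on a server, and the browser stays a thin shell.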

A separate interface should be used to process RSS feeds. RSS feeds do not belong in the browsers of today, and there have been many attempts to reconcile this misfit. One web-based personal news aggregator, for example, works to make scanning RSS feeds easier, but the format is limited in that it uses a spreadsheet interface.
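Whatever interface ultimately houses them, the raw material is simple: RSS 2.0 is plain XML, and the titles and dates any reader scans can be pulled out with the standard library. The sample feed below is invented for illustration.

```python
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>Open Thread</title><pubDate>Fri, 18 May 2007 06:10:00 GMT</pubDate></item>
  <item><title>A Shift</title><pubDate>Thu, 07 Jun 2007 09:00:00 GMT</pubDate></item>
</channel></rss>"""

def feed_entries(xml_text):
    """Extract (title, pubDate) pairs from an RSS 2.0 document."""
    channel = ET.fromstring(xml_text).find("channel")
    return [(item.findtext("title"), item.findtext("pubDate"))
            for item in channel.findall("item")]

for title, date in feed_entries(SAMPLE_FEED):
    print(title, "-", date)
```

The challenge, as argued above, is not parsing the data but presenting it in something better than a spreadsheet.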

The RSS feed needs its own space and structure in which to operate; the spreadsheet or toolbar does not work. The struggle between RSS and website interfaces is similar to Apple’s struggle to mesh music information with a spreadsheet interface. Music information has traditionally been delivered by “the album,” which provided track names, musician information, lyrics, and art selected by the musicians. The digital revolution in music increased the reach of recorded music, but it dealt a massive blow to the album as a channel of information. Andrew Coulter Enright took on this discrepancy by designing CoverFlow, restoring one aspect of information sorely missed in the digital world.

Another example comes from Marumushi’s News Map, a Java application that uses Google tags on a story to show its prominence on the Internet: larger blocks mean more web references to a story, and the color scheme represents the story's age. One could envision flipping through RSS feeds much like flipping through the mail, with a color schema to show which source has been updated and XML formatting to extract the new post titles.
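The two visual encodings described above, reference count driving size and age driving color, reduce to a pair of small mapping functions. The thresholds and scale below are invented for illustration, not taken from News Map.

```python
def display_size(reference_count, base=12, cap=48):
    """More web references -> larger headline, up to a cap (in points)."""
    return min(base + reference_count // 10, cap)

def age_color(age_hours):
    """Newer stories render brighter; these buckets are illustrative."""
    if age_hours < 1:
        return "bright"
    if age_hours < 24:
        return "medium"
    return "dim"

# Invented sample stories: (headline, web references, age in hours)
stories = [("Dow Jones bid", 320, 2), ("Open Thread", 22, 30)]
for headline, refs, age in stories:
    print(headline, display_size(refs), age_color(age))
```

Scanning a feed reader built on such cues would feel less like reading a spreadsheet and more like sorting mail by heft and postmark.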

The Deal.

The Manhattan Project will be an ongoing process. Since its design is meant to foster decentralization, so should its programming and development. While maintaining an overall aesthetic, different aspects of the browser should have different teams working on the ideas and the actual programming. Monthly teleconferences supplemented by listservs should suffice for communication. This project will not adhere to the term open-source. Open-source has in the past meant community building with no goal other than efficiency of programming and maintenance. While it is wonderful for working out kinks, open-source lacks overall focus. The Manhattan Project will most likely be seen as an open-source endeavor, but it differs in having an overall objective: empowering the user and standing above the news media in order to produce the most accurate information possible in the most decentralized way possible.

In the 1980s, there was a debate over the future of computer systems: whether users would prefer the text-based interface of DOS or the graphical interface pioneered by the Macintosh and later adopted by Windows. Now, users are being presented with the choice of a new experience. On one hand is the experience of the personal computer as an end—a storage system of information held and accessed in rudimentary input-output fashion. On the other is the experience of the personal computer as a means—a portal to infinite amounts of information, thin-client software, and storage space. The Manhattan Project believes the latter is the more invigorating and promising user experience. Through its development, we hope to contribute important and meaningful tools for navigating the overwhelming flood of information and, by doing so, further empower the user.
