Thursday, July 19, 2007

Manhattan Project -- Formal Proposal.


This post adheres to:


Creative Commons Attribution-Noncommercial-Share Alike 3.0 License

You are free:
• to Share — to copy, distribute and transmit the work
• to Remix — to adapt the work
Under the following conditions:
• Attribution. You must attribute the work in the manner specified by the author or licensor (but not in any way that suggests that they endorse you or your use of the work).
• Noncommercial. You may not use this work for commercial purposes.
• Share Alike. If you alter, transform, or build upon this work, you may distribute the resulting work only under the same or similar license to this one.


Executive Summary
The decentralization of media and internet content demands new tools for receiving information. Previously elite media institutions have suffered economic and creative decline in recent years. The Manhattan Project is an ambitious endeavor to foster the development of tools for the proliferation of decentralized media while maintaining a level of quality control and fact-checking lacking in the blogosphere. The first two proposals, Junk Monkey and BKNi, are plug-ins for Firefox meant to test the viability of developing such media. It is our hope that the following proposal will inspire others to assist with the creative and technical process of developing crucial tools for the 21st-century user.



Introduction

Rupert Murdoch’s announced offer to buy Dow Jones for $5 billion will be seen by future generations as a defining moment in the degradation of the mainstream news media. For a man with a record of prescience and cunning, Mr. Murdoch’s offer shows he no longer gets it. Major news media has suffered substantial blows in the past decade. The 18-to-24 age demographic would rather crack open their MacBooks than a hardcopy newspaper, and, as a result, newspaper advertising has shown a significant decline. It takes 100 online subscribers to make up for every print reader who cancels her subscription.

Mainstream media is rushing to catch up. The New York Times has begun its TimesSelect section, offering exclusive web content on its website. Murdoch’s News Corporation bought MySpace. CBS’s CEO claims he doesn’t care about CBS content being uploaded to YouTube, "As long as we get paid."

What these moguls do not understand is that the only solution for their troubles is complete investment in one realm or the other. Trying to straddle the hardcopy and software world only thins their efforts. In a column entitled “Final Thoughts About My Tenure and The Times’s Future” Public Editor Byron Calame writes: “The Times’s effort to do more with the same size news staff and do it 24 hours a day, requires workload decisions that can affect quality, especially in editing […] With the expanding commitment to get stories online as soon as they are good enough to post, The Times will have to work very hard to keep the time pressure from eroding the quality of either the stories or the supplements." Quantity is increasing, while quality is stagnant.

The recent proliferation and purchasing of user-generated websites bears a striking resemblance to the dot-com bubble of the 1990s, with one major difference. Instead of buying the companies behind websites, investors are simply buying the websites themselves. Those websites, such as YouTube, MySpace, and Flickr, do not offer a large staff or human resources, but rather frameworks and infrastructure filled with user-generated, crowdsourced content. This difference shows an increasing democratization of the Internet being challenged by encroaching corporate interest. This encroachment produces not fear so much as disappointment. The corporate model works well in establishing managerial, hierarchical, and multinational structures to execute mindless labor, but it unfortunately stifles the creativity necessary for web growth and interesting content. The headline "States Seek MySpace's Sex-Offender List" reflects a loss of the carefree spontaneity that formerly characterized the networking site.

There is an undeniable shift in the younger generations from centralized to decentralized forms of information. This rise of news aggregator sites and decline of conventional news comes from the gradual public realization that the emperor is wearing no clothes. The digital manipulation of O.J. Simpson's mugshot on the cover of Time Magazine in 1994, the reporting circus on the eve of the 2000 Presidential election, and the slanted, wholly uncritical coverage of WMDs in Iraq have all shown the danger of human bias behind supposedly objective news outlets. It is summed up best by Mike Conway of the University of Illinois: "in a post-modern media environment, every communication zone—from opinion to hard news—has a spin."

So-called post-modern users are then put to a test. If we are to read news to objectively perceive the happenings of the world, what source would provide the least amount of spin? Would it be the 32 major newspapers (including the New York Post, with a daily circulation of over 724,000), 34 major magazines, 10 international broadcast stations, 5 satellite television stations (including China’s STAR TV, with 300 million viewers in 54 countries), 14 American cable channels, 26 international cable channels, 20 major websites (including MySpace, with 100 million accounts, and Indya.com, “India’s Number One Entertainment Portal”), two publishing houses, and one record label owned by Rupert Murdoch? Or would it be Wikipedia, which showed its new reach as a news reporting source during the Virginia Tech shootings, with over 2,074 contributors to the story and more than 750,000 visits to the main article in the first two days, an average of four visits per second?

Crucial to this shift from centralized to decentralized information sourcing is a system of checks and balances, a way of managing and verifying the credibility of sources. A valuable aspect of conventional news media is accountability. Questions of libel, plagiarism, and truthfulness can all be put to a known writer working under many supervisors in an actual workplace. With blogs, we do not have that luxury. This is the challenge the future of information presents: to develop a decentralized medium that rises above inherent human biases by letting the user shape her sources and by providing a system of checks and balances.

Concepts
The ideas presented for the Manhattan Project will be different approaches to aiding the decentralization of user-generated content. The term “Manhattan Project” will describe a concentrated, organized effort to cultivate innovative mediums for information processing.

We would like the first two proposals to be plug-ins designed for Firefox. A previous proposal described the formation of a complete web browser, but it was shelved for multiple reasons. Constructing a web browser seemed too demanding at this stage, while plug-ins are both more manageable technical projects and far easier to test, since they can be released and streamlined into existing browsers. This is not to say a browser has been completely panned. On the contrary, the discussion of a possible browser and its framework should be kept lively as production and development continue.

We imagine the market for these plug-ins will be near the end of the long tail. These are not meant to be immediately popular features, but rather ones introduced to a specific subset of internet users.

a) Junk Monkey (working name) – One of the more spectacular aspects of the internet experience that doesn’t get the attention it deserves is the View Source command. It’s like opening the back of a watch and observing the minuscule parts of a machine at work. Source code is a labor of love, and it’s an incredible thing to see the genome of a well-designed website.

This proposed plug-in would take the source code of a website, say The New York Times, and isolate the body text. The script would then analyze the text using linguistic and grammatical cues. For example, from the New York Times phrase “The Strathclyde police in Scotland,” one can isolate “Strathclyde” and “Scotland” and run a Wikipedia search on the geographic region. One can also take the author of an article, “Frank Rich” or “David Brooks,” and run a search to see previous articles, past history, and reporting reputation.

After mining potentially interesting items, the program would cross-reference each item against its relevant database. Wikipedia or Google could be used as a cure-all database, able to run searches on almost any proper noun as well as news stories. Sourcewatch and Media Matters are political databases that hold general information on different commentators and media figures. PubMed and Google Scholar are two great databases for searching for reports and articles.
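To make the idea concrete, the mining and cross-referencing steps might be sketched as follows. This is only an illustrative sketch in Python, not plug-in code; the function names, the stopword list, and the database URL patterns are all assumptions made for this example.

```python
import re

# Illustrative sketch only: Junk Monkey's mining step. The function
# names, stopword list, and database URL patterns are assumptions
# made for this example, not part of any shipped plug-in.

DATABASES = {
    "wikipedia": "https://en.wikipedia.org/wiki/Special:Search?search=",
    "google": "https://www.google.com/search?q=",
}

def mine_proper_nouns(text):
    """Crude proper-noun detection: capitalized words minus stopwords."""
    words = re.findall(r"\b[A-Z][a-z]+\b", text)
    stopwords = {"The", "A", "An", "In", "On", "For", "This", "Also"}
    return [w for w in words if w not in stopwords]

def cross_reference(text, database="wikipedia"):
    """Pair each mined item with a lookup URL in the chosen database."""
    base = DATABASES[database]
    return {noun: base + noun for noun in mine_proper_nouns(text)}

links = cross_reference("The Strathclyde police in Scotland made an arrest.")
```

A real implementation would need far better named-entity detection than a capitalization heuristic, but even this crude version shows the shape of the idea: each mined item is paired with a query against the user's preferred reference database.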

The program would work in a click-on/off style, similar to Dashboard for Mac OS X. If the user has a question, or wishes to learn more about the story, she can click on the program to see what links or follow-ups can be made from the story. To ensure flexibility, the user can choose which database she would like to use. This would be done on opening the program for the first time. The user would decide which database to use for proper noun searches (for example), choosing between Wikipedia, Google, Dictionary.com, and Ask.com. This would be similar to choosing blog preferences the first time one opens the web browser Flock.

Similar features do exist. CSS thumbnail previews produce snapshots of pages on certain hyperlink rollovers. The feature is innovative, but slow and intrusive on the web experience. It takes too much time to load the page, which can be especially annoying if the user accidentally rolls over a link and then must wait for the file, sometimes simply an advertisement, to load. The New York Times website allows users to double-click on a word or phrase, bringing up a new window with an Ask.com search along with other references within the New York Times. The hope would be to create a quicker, subtler combination of these two features.

b) BKNi (working name) – The RSS feed needs its own space and structure in which to operate; the spreadsheet or toolbar does not work. The struggle between RSS and website interfaces is similar to Apple’s struggle to mesh music information with a spreadsheet interface. Music information has traditionally been carried by the album, which provided track names, musician information, lyrics, and art selected by the musicians. The digital revolution in music increased the reach of recorded music, but dealt a massive blow to the album as a channel of information. Andrew Coulter Enright took on this discrepancy by designing CoverFlow, restoring one aspect of information sorely missed in the digital world.

BKNi will be a blog visualizer, a new interface experience. It involves creating a space built around frequently updated sites, reducing the time it takes to sort through the large amount of information feeds produce. The goal is to create a graphic interface that accommodates RSS feeds and updated blogs, streamlining news reading and making it more efficient.

On the information side, BKNi will offer maintenance options in several respects. Feeds can be sorted on a wide span of qualifiers, including New, Favorites, Least Visited, Most Visited, Most Recently Updated (sorted by comments), Most Frequently Tagged, Title, Author(s), Theme, and perhaps even Country. BKNi will also offer to drop feeds you haven’t visited in a specific period of time, say 90 days, and recommend feeds in your collection based on number of comments or tags.
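As a rough illustration of this maintenance logic, the sorting and pruning described above could be sketched like this in Python. The record fields and qualifier names are assumptions made for the example, not a BKNi specification.

```python
from datetime import datetime, timedelta

# Illustrative sketch only: BKNi's feed sorting and pruning logic.
# The record fields ("title", "visits", "updated", "last_visited")
# and the qualifier names are assumptions made for this example.

SORT_KEYS = {
    "title": lambda f: f["title"].lower(),
    "most_visited": lambda f: -f["visits"],
    "most_recently_updated": lambda f: -f["updated"].timestamp(),
}

def sort_feeds(feeds, qualifier="title"):
    """Order a feed collection by one of the supported qualifiers."""
    return sorted(feeds, key=SORT_KEYS[qualifier])

def drop_stale(feeds, now, days=90):
    """Keep only feeds visited within the last `days` days."""
    cutoff = now - timedelta(days=days)
    return [f for f in feeds if f["last_visited"] >= cutoff]
```

New qualifiers, such as sorting by comment count or tag frequency, would simply be additional entries in the key table; the interesting design work lies entirely in the graphic layer on top of it.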

On the graphics side, this would be the place to differ completely from any other aggregator available. Combining the idea of CoverFlow with Marumushi’s News Map, the graphic interface would make going through feeds similar to sorting through the mail, flipping through a Rolodex, or glancing at the daily paper. Perhaps with a Google Earth hack and traced IP addresses, a literally global view of the user’s feeds could be presented.



Structure

Structure for development within the Manhattan Project will be decentralized. Projects will be divided into collaboratives. For example, the Junk Monkey project has three separate collaboratives: Graphic Design, Programming, and Resources. Each collaborative will have a few people responsible for developing ideas and goals for its specific area. These areas can be posted at the wikispace (themanhattanproject.wikispaces.com) with an expected level of commitment, so that jobs requiring many hours of work can be discerned from those taking only a few moments. Near the end of development, heads from each of the collaboratives will meet to seam the different areas together, working to form an aesthetically unified finished piece.

Finance and Sponsorship

The Manhattan Project is currently seeking “Not-For-Profit” 501(c)(3) organizational status. Not-For-Profit status will allow the Manhattan Project to seek institutional sponsorship as well as incorporate the organization.

Institutional sponsorship will help the public prominence of the project. The Project has engaged Free Culture International to assist with development and advising. We are also in contact with George Soros’s Open Society Institute (OSI) and will apply for funding from OSI in September 2007.


Conclusion

In the 1980s, there was a debate over the future of computer systems: whether users would prefer the text-based interface of DOS or the graphic-based interfaces offered by the Macintosh and an operating system called Windows. Now, users are being presented with the option of a new experience. On one hand is the experience of a personal computer as an end—a storage system of information held and accessed in a rudimentary input-output fashion. On the other is the experience of the personal computer as a means—a portal to infinite amounts of information, thin-client software and storage space. The Manhattan Project believes this latter experience is the more invigorating and promising user experience. Through its development, we hope to contribute important and meaningful tools to help navigate the overwhelming information and, by doing so, further empower the user.



