Smoking and eating the flowers of the Thames 2014


NetMonster introduction

[Image: bush blair map]


This is a raw collection of NetMonster documents from around 2004 - as far as possible I left them unedited. NetMonster started after producing 9nine for the Waag Society and Imagine IC in 2003. It stopped being a live project in and of itself in 2008, but the core engine has been used in the Aluminium book and most recently Data Entry. In 2008 I became frustrated that people could never seem to get beyond the immediate aesthetics of the image. The image was treated like a physical experience of a landscape rather than being seen as constructed from code - a kind of constructivist network image.

NetMonster was written by Harwood with input from Matsuko Yokokoji, Richard Wright, Francesca De Rimini (who created the Trade NetMonster) and Matthew Fuller.

NetMonster was first developed in 2003/4 and started as a way to give networked expression to a time-slice of texts, images and links accumulated around a particular search term or terms.

“A NetMonster project starts with a set of key search terms. The Monster then starts to construct itself from the search results. This resultant "networked image" can then be navigated, edited and directed by its users using various browser based authoring features - allowing them to edit text, turn images off and on, etc. Based on these new settings, the NetMonster iterates another search and rebuilds the image. The result over many such iterations is a collaboratively built networked image which continuously changes and offers up new content and configurations. In this way the software autonomously generates thematic content, composes it and displays it and then allows its content and thematic focus to be modified by people after each stage.”
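To give a feel for that cycle, here is a minimal Python sketch of the loop described above - search, build the networked image, take the users' edits, then search again. The names (NetworkedImage, fetch_results, apply_user_edits) are made up for illustration and are not taken from the NetMonster code itself.

```python
# A minimal sketch of the iterative search -> build -> edit -> re-search loop.
# All names here are hypothetical stand-ins, not NetMonster's own code.

from dataclasses import dataclass, field

@dataclass
class NetworkedImage:
    terms: list                                   # current search terms
    texts: list = field(default_factory=list)     # scraped text fragments
    images: list = field(default_factory=list)    # scraped image URLs

def fetch_results(terms):
    """Stand-in for the web search / scrape step."""
    return [f"result for {t}" for t in terms]

def apply_user_edits(image, edits):
    """Users edit text, hide fragments, add or drop search terms."""
    image.terms = edits.get("terms", image.terms)
    image.texts = [t for t in image.texts if t not in edits.get("hidden", [])]
    return image

def iterate(image, edits):
    """One cycle: rebuild from the current settings, then take user edits."""
    image.texts = fetch_results(image.terms)
    return apply_user_edits(image, edits)

image = NetworkedImage(terms=["bush", "blair"])
image = iterate(image, edits={"terms": ["bush", "blair", "oil"]})
```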

Working on 9nine at the Waag Society I noticed that people would only come to the site to make a data dump, build a map (9 images etc) then leave. I thought that if you could scrape content that would be of interest to, challenge or contextualise the users' content, then this might bring them back to see what had turned up based on their uploading.

I was also interested in the way people 'read' the WWW a page at a time.

We were interested in scraping telephone numbers and email addresses, so that whatever was being constructed could phone people up or email them in a media frenzy - that's why we first developed Southend Soundbites, Telephone Trottoire etc.
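That kind of contact scraping can be sketched with plain regular expressions. The patterns below are illustrative only - they are not the ones NetMonster used and will miss plenty of real-world formats.

```python
# Rough sketch: pull email addresses and phone numbers out of scraped page text.
# The regexes are deliberately simple and only illustrate the idea.

import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def extract_contacts(text):
    """Return (emails, phone numbers) found in a block of page text."""
    emails = EMAIL_RE.findall(text)
    phones = [p.strip() for p in PHONE_RE.findall(text)]
    return emails, phones

page = "Call +44 20 7946 0000 or write to press@example.org for comment."
print(extract_contacts(page))
```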

NetMonster allows you to iterate through web pages, ripping them into editable text, storable images and further links to follow. The process begins by creating keywords, which are then used to search Google or a given URL in a few different ways.
The initial search returns text and shows you which words are used most frequently in association with the key phrase. You can then decide which of these words you want to associate with the keyword. When you search again it brings back the results in one of the following ways (sketched in code after the list).

1. 'all' returns the text found as one blob (good for a well-known subject).
2. 'sentences' returns every sentence containing the keyword and omits the rest.
3. 'associated' returns every sentence containing the keyword and any sentence containing a word you have associated with the keyword.
4. 'summary' returns a short summary of all the text in a search (bad for small texts).
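Here is a minimal Python sketch of those four text modes, plus the frequency step that suggests associated words, assuming the page text has already been scraped. The function names and the crude sentence splitting and summarising are assumptions for illustration, not NetMonster's own code.

```python
# Sketch of the four text modes and the associated-word suggestion step.
# All functions are illustrative stand-ins working on already-scraped text.

import re
from collections import Counter

def sentences(text):
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def mode_all(text, keyword):
    return text                                     # 1. the text as one blob

def mode_sentences(text, keyword):
    return [s for s in sentences(text)              # 2. only sentences with the keyword
            if keyword.lower() in s.lower()]

def mode_associated(text, keyword, associated):
    words = [keyword] + list(associated)            # 3. keyword or any associated word
    return [s for s in sentences(text)
            if any(w.lower() in s.lower() for w in words)]

def mode_summary(text, keyword, n=3):
    # 4. crude stand-in for a summary: the n sentences sharing the most
    # vocabulary with the whole text (a real summariser would do better)
    counts = Counter(re.findall(r"\w+", text.lower()))
    ranked = sorted(sentences(text),
                    key=lambda s: sum(counts[w] for w in re.findall(r"\w+", s.lower())),
                    reverse=True)
    return ranked[:n]

def associated_words(text, keyword, top=10):
    # Words that most often appear in the same sentence as the keyword,
    # offered to the user as candidate associations after the first search.
    hits = mode_sentences(text, keyword)
    counts = Counter(w for s in hits for w in re.findall(r"\w+", s.lower())
                     if w != keyword.lower())
    return [w for w, _ in counts.most_common(top)]
```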

Images are returned in one of the following ways (a rough sketch of the 'href' mode follows the list).

1. 'auto' (brings back images automatically from Google)
2. 'web' (do a text search on Google and follow all the links)
3. 'image' (do an image search on Google and follow all the links)
4. 'href' (specify a web page you wish to start with)
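The 'href' mode can be sketched roughly as follows: start from a given page, collect its images and the links to follow next. The code uses Python's standard urllib and html.parser purely for illustration; the 'auto', 'web' and 'image' modes would seed the same crawl from search results rather than a hand-picked URL.

```python
# Sketch of the 'href' mode: rip one page into image URLs and further links.
# Illustrative only - not NetMonster's own crawler.

from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class PageRipper(HTMLParser):
    """Collects <img src> and <a href> targets from one page."""
    def __init__(self, base_url):
        super().__init__()
        self.base = base_url
        self.images, self.links = [], []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and attrs.get("src"):
            self.images.append(urljoin(self.base, attrs["src"]))
        elif tag == "a" and attrs.get("href"):
            self.links.append(urljoin(self.base, attrs["href"]))

def rip(url):
    """Return (image URLs, links to follow) for a starting page."""
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = PageRipper(url)
    parser.feed(html)
    return parser.images, parser.links
```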

Our definition of an ‘image’ here is not just a picture but any sensory impression, percept or idea. Because of the multivalent, embedded and contingent nature of networked media, the image no longer has such a strong representational or poetic function of its own. So how does it now function in contemporary culture and how can we give it a new power?

The internet is the largest network of desire for wealthy countries. From net porn to eBay, these desires are usually encountered in a diffused and piecemeal fashion, one page at a time. How can we open up these unconscious desires to cultural intervention?

One of our approaches is to examine the context we can give the image. How much can be put under software control and how much can be gleaned from an existing context, its ownership or its accessibility?