Folkestonomy


Application Design


I've come up with a final application design I'm happy with and confident will work. Now to start implementing it all.

There will be numerous components in the system. What follows is a pretty loose description of each, but it will do the job.

Data Collector

This will be based in the collector box, running a piece of software on an Arduino-based ATMEGA microcontroller. The program will check for presses of the start button, signalling a listener application on the connected MacBook that the mapping has started. It will then wait for the stop button to be pressed, collect the data from across the eight 1-Wire networks it can be attached to, and transmit that information to the main computer along with a stop signal.
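
To make that a bit more concrete, the collector logic might look roughly like this. This is a minimal Arduino-style sketch only: the pin numbers, button wiring and the plain-text serial messages are placeholders rather than decisions, and it assumes the Arduino OneWire library for talking to the networks.

#include <OneWire.h>

const int START_PIN = 2;                              // start button (assumed wiring)
const int STOP_PIN  = 3;                              // stop button (assumed wiring)
const int NET_PINS[8] = {4, 5, 6, 7, 8, 9, 10, 11};   // one pin per 1-Wire network

// Report every device found on one network as a "TAG <network> <id>" line.
void scanNetwork(int index) {
  OneWire net(NET_PINS[index]);
  byte addr[8];
  net.reset_search();
  while (net.search(addr)) {
    Serial.print("TAG ");
    Serial.print(index);
    Serial.print(" ");
    for (int i = 0; i < 8; i++) Serial.print(addr[i], HEX);  // 64-bit 1-Wire ID
    Serial.println();
  }
}

void setup() {
  pinMode(START_PIN, INPUT_PULLUP);
  pinMode(STOP_PIN, INPUT_PULLUP);
  Serial.begin(9600);                                 // serial link to the MacBook listener
}

void loop() {
  if (digitalRead(START_PIN) == LOW) {                // start pressed
    Serial.println("START");
    while (digitalRead(STOP_PIN) == HIGH) delay(50);  // wait for the stop press
    for (int i = 0; i < 8; i++) scanNetwork(i);       // read whatever is plugged in
    Serial.println("STOP");
  }
  delay(50);
}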

Listener Application

This will be a background application on the MacBook, which will listen for signals from the data collector. Upon a 'start' it will create a new mapping, grab co-ordinates for the float's location from the GPS device connected to the MacBook, log the start time and create a unique ID for the map. When the stop signal is received it will log the information gathered from the various circuits into a database.
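
Roughly, the start/stop handling could look like this. It's a C++ sketch only: the serial device path is a guess, the port setup is glossed over, and readGps and saveMapping are placeholder stand-ins for whatever GPS and database code we end up using.

#include <ctime>
#include <fstream>
#include <iostream>
#include <string>
#include <utility>
#include <vector>

// Placeholder stand-ins: the real versions would query the GPS unit and
// write to the database; these just keep the sketch self-contained.
std::pair<double, double> readGps() { return {51.081, 1.166}; }
void saveMapping(const std::string &id, double lat, double lon,
                 std::time_t started, const std::vector<std::string> &tags) {
  std::cout << "map " << id << " at " << lat << "," << lon
            << " with " << tags.size() << " tags\n";
}

int main() {
  std::ifstream port("/dev/tty.usbserial");  // serial link from the collector (path is a guess)
  std::string line, mapId;
  double lat = 0, lon = 0;
  std::time_t started = 0;
  std::vector<std::string> tags;

  while (std::getline(port, line)) {
    if (line == "START") {                   // a new mapping has begun
      started = std::time(nullptr);
      mapId = std::to_string(started);       // crude unique ID; a real UUID would be better
      auto pos = readGps();
      lat = pos.first;
      lon = pos.second;
      tags.clear();
    } else if (line.rfind("TAG ", 0) == 0) { // one icon read from a 1-Wire network
      tags.push_back(line.substr(4));
    } else if (line == "STOP") {             // mapping finished: log it to the database
      saveMapping(mapId, lat, lon, started, tags);
    }
  }
}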

As we won't have a 'live' internet connection on the float, this application will also watch for an active internet connection and, once one is available, start synchronising data with the primary server where the website lives, uploading new maps and downloading any data that's changed on the website.
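
Spotting "we're back online" could be as crude as periodically trying to open a connection to the server, something like the sketch below (the hostname it gets called with would be ours, which isn't decided here):

#include <netdb.h>
#include <sys/socket.h>
#include <unistd.h>

// Returns true if a TCP connection to the web server can be opened.
bool serverReachable(const char *host, const char *port) {
  addrinfo hints{}, *res = nullptr;
  hints.ai_socktype = SOCK_STREAM;
  if (getaddrinfo(host, port, &hints, &res) != 0) return false;
  int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
  bool ok = (fd >= 0 && connect(fd, res->ai_addr, res->ai_addrlen) == 0);
  if (fd >= 0) close(fd);
  freeaddrinfo(res);
  return ok;
}

Called every minute or so, that's probably a good-enough signal for the listener to kick off the synchronisation.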

Display Application

This will be an application that runs on the laptop on the float, and also on the website for the project. It will work slightly differently in each context, but visually will be mostly identical. It will display the latest map available, as well as allowing exploration of the maps and augmentation of the data collected.

Web Synchronisation

This will be a backend application that will be triggered by the listener application, and handle the details of synchronising data with the main server.
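
The upload half of that exchange might look something like this. It's a sketch only, assuming the server exposes a plain HTTP endpoint that accepts a map as JSON; the URL is a placeholder and libcurl is just one possible way of doing the HTTP plumbing.

#include <curl/curl.h>
#include <string>

// Push one new map to the (hypothetical) server endpoint as JSON.
// Returns true if the server accepted it, so it can be marked as synced locally.
bool uploadMap(const std::string &mapJson) {
  CURL *curl = curl_easy_init();
  if (!curl) return false;
  curl_slist *headers = curl_slist_append(nullptr, "Content-Type: application/json");
  curl_easy_setopt(curl, CURLOPT_URL, "https://example.org/folkestonomy/api/maps");
  curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
  curl_easy_setopt(curl, CURLOPT_POSTFIELDS, mapJson.c_str());
  CURLcode res = curl_easy_perform(curl);
  long status = 0;
  curl_easy_getinfo(curl, CURLINFO_RESPONSE_CODE, &status);
  curl_slist_free_all(headers);
  curl_easy_cleanup(curl);
  return res == CURLE_OK && status == 200;
}

Downloads of changed data would run the other way, with the application keeping track of what it has already fetched so only the differences move across.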


Handling the input

Having worked out the scope of the language we may be dealing with (hundreds of icons in many combined groupings), the initial thoughts of RFID have gone out of the window: readers are too costly to implement multiple times, and multi-tag readers are expensive. RFID was an interesting possibility initially, as we thought we might be able to locate the tags within a 3D space, but that's also more expensive, and less accurate, than initially imagined.

Increasingly, however, the language seems to have developed into one of placing items together to make the map, and I've been thinking about these arrangements as some form of network. While searching around for ideas similar to RFID, I was reminded of iButtons. I'm going to look at how iButtons and 1-Wire Networks might be used to identify each of the components of the language.
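
The appeal is that every iButton (like any 1-Wire device) carries a factory-unique 64-bit ID, so identifying a component of the language could be little more than a lookup from ID to icon, along these lines (the IDs and icon names below are invented for illustration):

#include <cstdint>
#include <map>
#include <string>

// Map from a device's 64-bit 1-Wire ROM ID to the icon it represents.
// The IDs and names here are made up; the real table would be built up
// as buttons are fixed into the physical pieces.
std::map<uint64_t, std::string> iconTable = {
  {0x0100080012345628ULL, "road-closed"},
  {0x01000800ABCDEF10ULL, "diversion"},
  {0x010008005A5A5A01ULL, "meeting-place"},
};

std::string identify(uint64_t romId) {
  auto it = iconTable.find(romId);
  return it != iconTable.end() ? it->second : "unknown";
}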


Refining the language and gathering some initial data

A rough and ready mapping, clarifying my thoughts...no, really!

The last month has been spent going back and forth defining what information we are going to gather, and how we are going to use it to generate maps. I've also been working on how we might gather the data and get it into a computer, and then translate it into one of many possible views, as a map.

Kathrin, Andreas and I took a trip to Folkestone so I could have a look around, and so we could talk to some people about mapping and the space. They told us their stories of how their lives are involved in the creative side of Folkestone, and the conversations helped me think about how the pieces fit together into a map.


Storytelling by Signs, Language as a Network

The primary challenge for me is trying to work out how to turn a physical interaction by visitors into a set of data that can be used to produce maps of the cultural spaces in Folkestone.

From my earliest meetings regarding this I've always felt that the visitors are telling their stories. As the project has proceeded and the idea has become refined, the story has remained a core idea, with the refinement being the language used to describe the story, and the subject of the story we are asking to be told.

The initial brief has a strong concept for a visual language that would be used to tell the stories, based upon street signs and road-works. This would be a very physical process - not someone inputting data into a computer - and we wanted to keep the need for active digitisation to a minimum; the person who's looking after visitors should be focusing on the people, not a computer.

Early on this was imagined as a map of Folkestone that players told their stories on by placing markers, with the markers representing actions, connections and objects. This would allow the story to be laid out across Folkestone, creating each person's map within that space.

Looking into how this could be digitised, I started by researching ways of tracking the objects. Initial thoughts pointed at RFID, but whilst it's possible to track RFID tags in space, it wouldn't have been accurate enough to track the placement of objects on the map.

More recent discussions, including the Mapping Workshop at the Stanley Pickering Gallery, have moved us away from using the town map as the scope of the mapping tool, and towards a more abstracted storytelling. This may, or may not, be easier to build an input system for.

Breaking the story down into its components - essentially verbs and nouns, or items and contexts - has allowed me to view the stories as a simple set of star networks: core actions that are related to places and objects, overlaid with another structure, time, which is (generally) a line.
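
A rough sketch of the data structure that view implies might be something like the following (the names are mine, not a settled schema):

#include <string>
#include <vector>

// One "star" in a story: a core action linked out to the places and
// objects it involves, plus its position along the (roughly linear) time axis.
struct Action {
  std::string verb;                   // the core action, e.g. "met", "built"
  std::vector<std::string> places;    // contexts attached to the action
  std::vector<std::string> objects;   // items attached to the action
  int order = 0;                      // where it sits in time
};

// A whole story is then just its actions strung along that line.
using Story = std::vector<Action>;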

I can imagine ways of mapping this physically, but the challenge now is how to merge that with the overall ideas of the graphical language and the possible interactions, and then produce meaningful and interesting maps from the data collected.

