Sun 18 May 2008
Just published my roundup of the first day. I found it pretty difficult as I hadn’t taken any notes yesterday, so I’ll try taking some today. I’m going to try something a little different today: I’ll publish this article in the morning, then keep typing in it during the day. Unfortunately WordPress doesn’t seem to be autosaving, but I’ll click “Save” from time to time. I’ll also put something at the end of the article to mark it finished.
Micro, Nano, Pico formats
Talking about marking up the various formats with geo information: KML, GeoRSS, Atom. Formats are too big, they need to be smaller. Need to be able to store multiple locations and time information potentially. Discussing the best ways of linking to multiple locations, whether to place them as separate entries within Atom (which in old viewers will show the locations as separate entries), also whether we should insert links within text, or have separate links. Do those links link within a file, or to separate files? Probably the best place to look for updates on this part of the session is Andrew Turner’s blog.
Geohash, from geohash.org, is a simple way of giving bounding boxes or lat/lon coordinates. Easy to search within a bounding box, though I thought it would have issues as it’s based on predefined tiles – actually, apparently not, it does not use tiling.
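The encoding itself is simple enough to sketch. Below is my own minimal illustration of the usual geohash scheme (alternating longitude/latitude bisection, base-32 output) – a sketch for flavour, not the geohash.org reference code.

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash_encode(lat, lon, precision=9):
    lat_range, lon_range = [-90.0, 90.0], [-180.0, 180.0]
    bits, chars = [], []
    even = True  # even-numbered bits refine longitude, odd refine latitude
    while len(chars) < precision:
        rng, val = (lon_range, lon) if even else (lat_range, lat)
        mid = (rng[0] + rng[1]) / 2
        if val >= mid:
            bits.append(1)
            rng[0] = mid
        else:
            bits.append(0)
            rng[1] = mid
        even = not even
        if len(bits) == 5:  # every 5 bits become one base-32 character
            chars.append(BASE32[int("".join(map(str, bits)), 2)])
            bits = []
    return "".join(chars)

# Nearby points share a common prefix, which is what makes prefix
# searches within a rough bounding box cheap.
print(geohash_encode(57.64911, 10.40744, 11))
```

The prefix property is the interesting bit: truncating the hash just widens the bounding box.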
A guy from Xrosspath is talking about “Geotudes”, though they do seem rather similar to basic tiling mechanisms, quad-tree indexing and Morton numbers.
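For comparison, the quad-tree/Morton-number idea mentioned above interleaves the bits of x/y tile coordinates into a single Z-order index, so nearby cells usually get nearby numbers. A quick sketch:

```python
def morton(x, y, bits=16):
    """Interleave the low `bits` bits of x and y into one Z-order number."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)      # x occupies even bit positions
        code |= ((y >> i) & 1) << (2 * i + 1)  # y occupies odd bit positions
    return code

# The four cells of a 2x2 grid get codes 0..3, the classic "Z" visit order.
print([morton(x, y) for y in (0, 1) for x in (0, 1)])  # → [0, 1, 2, 3]
```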
Is geo privacy shit?
Mapufacture has enabled fire eagle to generate an RSS/KML feed. Andrew Turner thinks lots of people will decide “screw privacy”, Fire Eagle think that’s fine, so long as it’s a choice.
Keys to allow specific people to get access.
Privacy issues stifle innovation, and caused issues with Dopplr. Can we trust the government? Being near a criminal offence might cause you to become a suspect (or witness). Location blacklists. Nathan Eagle’s relationship mining – ContextAware. Regulatory frameworks. Inferring location through friends. We went through the de Bono six hats method. Here are the notes that Andrew Turner took during the discussion.
1. Platial – some maps that people have done.
2. State of the Map
5. Geocoding news (at the source)
6. NNDB Mapper
8. Home-brew 360-degree display
9. Quantum Nanotechnocracy?
10. Neighborhood map project
11. Image recognition game – Imagewiki
Winners of the talks were: 8, 6, 1 (in that order)
Google AppEngine GeoStore
This was mainly a roundup of the geo features available within the Google AppEngine.
Are Google and Microsoft killing the ecosystem?
Much discussion on the topic: are Google et al killing the “mom and pop” sized businesses? They are getting so big that no one can even start to compete with them because you can’t get near them. It was mentioned, though, that there wouldn’t have been the innovation there has been so far without Google releasing the free maps API in the first place. There was also a small discussion at the end about whether Google could release data – India data and map edits – that they have collected themselves. Unfortunately not much information was known about those, but the Googlers did seem open to doing something.
Well that’s the whole thing over, it was a great conference, I’m really glad I made it over. Some interesting discussions. As I mentioned above I kinda launched something, I’m going to get the tidying up sorted then post something further on that later.
Technorati tags: wherecamp, where, where2.0, where2008
Sun 18 May 2008
So we’re now halfway through WhereCamp. I decided not to live blog the sessions as there was just too much discussion going on to keep up, and I wanted to be involved with these discussions which was a little hard if I was taking notes all the time. I’ll give a quick roundup of the ones I went to here but for more notes (and some alternative choices) you can check out my colleague’s blog here.
This was a pretty interesting look at the ways of representing time-based map data. We discussed the use of sliders for filtering visible data, spirals for representing the passage of time and various other methods. Take a look at the wiki page I’ve linked to for more detail.
Mapmaking and Visualization with Processing
This session covered various visualizations that have been done using the Processing graphics toolkit. A few examples were shown together with a brief look at the source code, websites such as Cabspotting, Hackety and Obsessing.
I actually ended up in this talk by accident, I got lost on the way to “Kiev”. It was, nonetheless, very interesting to hear about the ways in which Burmese volunteers are trying to get aid to the people in Burma. A number of resources were also mentioned including Sahana Disaster Management System and Myanmar Cyclone Relief Donations.
Is 3D Shit
Steve Coast requested this session to discuss whether all the money being put into 3D visualization was really worth it. Discussion from a number of interested parties including people from Google and Planet 9. We basically decided that the people involved were generally investing for the future.
The location tracking session was actually organized by me, I’ve been tracking myself at various points for the past two years and have been building a site on-and-off for the past year so I was hoping to get together with some other people doing similar things. We managed to get people from Fire Eagle, Loki and iPoki. We had a good discussion about privacy, about ways to track and about requirements for accuracy but didn’t really come out with too many conclusions. I also kinda launched the site that I’ve been working on but it still needs some more work so I’ll do another blog post on that when I get a little more time.
Xrosspath is a new site that is intending to take location history for a number of users and compare them to find points in time that they have crossed paths. They’re also looking to link in world events and historical events to see what’s happened around you during your life. It was interesting how this session had links to earlier sessions including the 4D mapping and the location tracking sessions. It did seem like they should be linking into what other people are doing in location tracking. I definitely think their premise is valid though and it will be interesting to see what comes out of that.
I wussed out and didn’t stay overnight, I also didn’t get around to doing much hacking. After a couple of beers I was pretty tired. I did end up getting to bed earlier than I’ve managed for the whole trip and got a great night’s sleep. Hopefully that means I’ll be all ready to get hacking today!
Technorati tags: wherecamp, where, where2.0, where2008
Thu 15 May 2008
Talking about real world uses of our work, in terms of issues.
Grew up in Sudan and Kenya; now writes two blogs, “White African” and “AfriGadget”.
Excited about geotate.com – very lightweight, good in the field with unskilled people.
Also AfricaMap, very exciting as geolocation data was hard to find in Kenya.
Buglabs, any time you can hack hardware and software is seriously good. As more people use it, it’ll get cheaper and might get to 3rd world.
DIY Drones – great for crisis scenarios, how normal citizens can use it to help out
Illustrating that the tools we make are being used by people, perhaps with different backgrounds. It will be interesting over the coming years as GIS tools become easier to use, similar to what we’ve seen in the CMS area. We’re going to see something really big happen in the next few years.
Kenya Elections 2007 – issues and irregularities; the opposition leader didn’t agree with the outcome. What started as a political fracas devolved into an ethnic one. Showing slides of maps of polarities. People of differing descents were kicked out.
At the same time there was a media blackout – difficult, as there was no way to get information out. The only way to get messages out was SMS and phone calls; the outside world could only see in through blogs and social media. A Harvard law grad in Kenya got information out on her blog – kenyanpundit.com
* way for everyday kenyans to report incidents of violence
* create an archive of news and reports
* show where the majority of violence was happening
* Detailed geospatial data is hard to come by in Africa
* How much should it be web-based in a mobile phone culture?
* Mobile phones – getting a full report in 140 characters is not easy
* What data points do we need?
We’re not part of the humanitarian industry, so we don’t know what’s needed.
[Calendar of events]
Dec 27 elections
Dec 27 – 30 – Period of uncertainty
Dec 30 – Jan 1 – Media blackout.
Launched Ushahidi.com by end of January.
Only took a few days but it really worked, and allowed us to do something not everyone could do. A timeline of events – see events occurring in the field. Draggable timeline with events and updates.
Had the beginnings of crowdsourced crisis information; this was pretty new, but we were new to it ourselves so hadn’t realised. So now we’re deciding what to do next.
* The importance of mapping accuracy
* Data poisoning – what happens when your antagonist starts using it?
* Verification is difficult
* Clarify why it was created and make that inescapably obvious – this was for rough data, not for ICC
* Create a feedback loop to end user
“So, did it work?”
* Advocacy? – Yes, mainstream media was affected and brought attention to the situation
* Security? Probably not
* Monitoring? Probably not
* Information Gathering? Yes, pretty well.
Formed by Erik and four other Kenyans, also funded now.
Types of activist: beer activists, anarchists, people passionate about illegal immigrants, people passionate about immigrants.
Activism always has two sides, both sides are passionate.
“It turns out activists are just everyday people, most with limited technical acumen.”
Going to go through several activist sites.
Crisis in Darfur Google Earth application, from the US Holocaust Museum and Amnesty International.
Sudan ICC war criminals website – warrants are out for their arrests.
Access Denied map from global voices, online censorship maps in closed countries.
“Tunisian Prison Map” – applying transparency to prisons in Tunisia.
Bahrain land rights, showing the difference in quality of life between the haves and have-nots.
Operation Murambatsvina – Zimbabwe – showing land distribution; very heavy-handed.
Mapping election conditions in Zimbabwe – taking news data about heavy-handed government acts against ordinary civilians well before election time. Showing that there was a track record well before the election happened.
The Great Whale Trail Map
Planet action – different environmental causes around the world
“I love mountains” – US based, enter zip code or state or city and see how you’re connected to mountaintop removal.
Mapping for human rights violations vs Mapping for activism – two separate things.
The first is GIS/neogeo – “this is what happened”, used for taking criminals to court; the second is to create awareness of and buy-in to an issue.
“Think about how you can use your skills to help in a cause that is important to you”
We have the ability to affect issues miles away which we couldn’t do not long ago.
Enemies Around Every Corner: Mapping in an Activist World
Technorati tags: activist, where, where2.0, where2008
Thu 15 May 2008
Several companies have noticed the value in collaboration: OSM, GeoCommons, Google Base, others…
Structured databases mean they can collaborate worldwide.
Basics of freebase
Web-based database of community-entered data, managed by Metaweb.
Freebase’s data includes, but is not exclusively, spatial data.
Have seen three glue domains, business, people, location.
Began with points, 262,000 locations, exposing an API this summer. Mashups will be able to query freebase for spatial data in multiple formats as well as the other data.
Examples on openlayers, google maps, others.
“What is semantically structured data?”
* Strongly typed data
* Hardwired data relationships
Enter “olympic torch” into google, from results extract descriptive text, filter out the search term. Send results to engine that generates tag clouds, then show resulting tag cloud to people who don’t know the search term. Ask them to guess.
foundation cookbook bertolli shattuck gourmet california restaurant berkeley alice waters – What is Chez Panisse?
Semantic meaning gives a lot of information
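The tag-cloud experiment above is easy to sketch: count the words left over in result snippets once the search term and stopwords are removed. The snippets below are invented stand-ins, not real Google results.

```python
from collections import Counter
import re

def tag_cloud(snippets, search_term, top=10):
    """Rank the words in the snippets, dropping stopwords and the term itself."""
    stop = {"the", "a", "an", "of", "in", "and", "is", "to", "for"}
    drop = stop | set(search_term.lower().split())
    words = []
    for text in snippets:
        words += [w for w in re.findall(r"[a-z]+", text.lower()) if w not in drop]
    return Counter(words).most_common(top)

snippets = [
    "Chez Panisse is a gourmet restaurant in Berkeley, California",
    "Alice Waters, co-founder of the Berkeley restaurant Chez Panisse",
]
print(tag_cloud(snippets, "chez panisse"))
```

The highest-frequency leftovers (“berkeley”, “restaurant”, …) are exactly the clues the audience would guess from.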
Brains are very good at semantics, which classify or type data, and form relationships.
What were our brains doing? Noticing “Alice” and “Waters” were names; saw “co-founder” – that’s people. Maybe Alice Waters is a co-founder. Saw “restaurant” – that’s a business, started by a co-founder. Might have noticed references to place. Put them together and you get a restaurant in Berkeley, California, co-founded by Alice Waters – that’s Chez Panisse.
How to put this into a database?
Entries have multiple types of properties, and properties have relationships. The business entry has geospatial data, as does Alice Waters. Berkeley has its own properties, and these properties relate to Chez Panisse and Alice Waters.
If I went to Berkeley and wanted funding, perhaps these relationships could help me; Freebase can show them.
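As an illustration only – this is a toy of my own, not Freebase’s actual schema or API – the typed-entries-with-relationships idea might look like:

```python
# Each entry is strongly typed, and its properties point at other entries.
entries = {
    "chez_panisse": {"type": "business", "located_in": "berkeley",
                     "co_founder": "alice_waters"},
    "alice_waters": {"type": "person", "lives_in": "berkeley"},
    "berkeley":     {"type": "location", "state": "california"},
}

def related_to(place):
    """Find entries whose properties point at the given entry."""
    return [name for name, props in entries.items()
            if place in props.values() and name != place]

print(related_to("berkeley"))  # → ['chez_panisse', 'alice_waters']
```

Faceted browsing then amounts to filtering such results by `type`.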
“Relevance by faceted browsers”
Search in results for references to people, then kinds of people.
A Data Source to Make Mashups Correct, Complete, Relevant and Revisited
Technorati tags: mashups, freebase, where, where2.0, where2008
Thu 15 May 2008
First, what is GMM? Maps for mobile phones.
“My Location” shows blue dot with big circle giving idea of how accurate your location is.
Not GPS accurate, but very useful.
Why is “My Location” useful?
GPS-free service. Free! Saves battery; no problem with line-of-sight. Many applications benefit from this level of accuracy. No waiting for a first fix. Works across many carriers and network types.
* Collect geocontextual information along with a cell-id
* Cell Tower Identifier (cell-id)
* Location: GPS vs center of the map
Difficult to make it work across platforms and carriers – there is no unique ID across technologies/carriers. How do we get location? If you have a GPS cell phone, we collect that and the cell ID. We can also benefit from geo info like where you are looking at on the map. Anonymous – it just stores cell tower and location information, GPS or non-GPS.
We store this in our platform and run algorithms to figure out the location.
* 100s of different platforms – causes many issues
* Area of interest vs actual location
* Noisy data
* * Oklahoma points
* * GPS errors
* Towers in the water!
This approach can cause problems: if we’re all looking at SF maps, the Burlingame cell tower will be identified as being in SF. Wherever you are, we often think you’re in Oklahoma because that’s the “center” of the US on Google Maps. GPS points can also be wrong if you have bad signal. Sometimes we find we have towers in the middle of the ocean due to averaging – it can also be due to cell towers on oil platforms, which alters the accuracy. Sometimes that’s right.
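One plausible way to tame noisy data like this – my own illustration, not Google’s actual algorithm – is to aggregate the GPS fixes seen alongside each cell-id and discard fixes far from the cluster median before averaging, which filters out “Oklahoma point” style defaults:

```python
from statistics import median

def estimate_cell_location(fixes, max_deg=0.5):
    """fixes: (lat, lon) pairs observed while connected to one cell-id."""
    lat_med = median(lat for lat, _ in fixes)
    lon_med = median(lon for _, lon in fixes)
    # Keep only fixes close to the median in both axes.
    good = [(lat, lon) for lat, lon in fixes
            if abs(lat - lat_med) < max_deg and abs(lon - lon_med) < max_deg]
    return (sum(lat for lat, _ in good) / len(good),
            sum(lon for _, lon in good) / len(good))

fixes = [(37.78, -122.41), (37.79, -122.40), (37.77, -122.42),
         (39.83, -98.58)]  # last point is a bogus "centre of the US" default
print(estimate_cell_location(fixes))
```

The median step is what makes a single wildly wrong default geocode harmless.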
* GPS Clustering vs non-GPS
* Use data diversity to calculate accuracy
Have invested more time in analysing the data. Data collection has been growing exponentially, and it’s working around the world, including the Himalayas. This shows the approach can work; non-GPS data is providing a large amount of the data, more than GPS.
“Why doesn’t it work on my cellphone?”
We’re trying to get it working on as many as possible, some platforms don’t provide API to get cell ID. Some give part, so we do smart techniques to partly work it out. Some give multiple cell IDs. Others give full information.
* A balance between respecting user privacy and providing good useful functionality to the user
* How does My Location do this?
* * Anonymous: No PII, no session id
* * User has full control, can disable or enable it.
* Improve accuracy and coverage
* Continue improving security
* Enabling location for 3rd parties via Android, Gears (browser)
Can use Gears to enable it for your website or application, and build innovative location-based applications.
Google Maps for Mobile with My Location – Behind the Scenes
Technorati tags: google, mobile, map, my-location, where, where2.0, where2008
Thu 15 May 2008
I work at the Biodiversity Institute at the University of Kansas, with a background in geography, GIS, remote sensing and computing. Most recently worked on “Lifemapper” – creating an archive of ecological niche models: maps of where species might occur, based on where we know they’ve been collected by scientists. Also creating web services that expose the archive.
Showing what this looks like. Showing google earth. Showing specimen locations of a plant. Red parts are where we expect species to occur, yellow where we’ve found it, using ecological niche models. Can look and see that these came from Burke museum at Washington. Goals are archive of niche models, predictions.
Spatial data in OGC web formats, WFS coming soon, WCS for raster data, WMS too. No query analysis yet but coming in next month or so. Landscape metrics, fragmentation of habitat, comparison of habitats of different species, predicted future climates…
Also have on-demand models: niche modelling as a web service on a 64-node cluster. Anyone can use this. Our archive has limitations – no significant quality control; we assume that’s been done by the museums, but more could really be done. The on-demand service can be used by researchers with their own data, perhaps at higher resolution.
Niche modelling is becoming more popular because more specimens are becoming available. Environmental and occurrence data are put into the model, which calculates a formula; this is also projected onto a map to help visualise it.
Data is more available as there’s a massive effort to database and geotag it. It might be in paper catalogs, as that’s how it’s been collected for 300 years; collections are now being put into databases to digitise them. Also exposing data via the internet using standard protocols. Slide shows examples of 3 collections that, when put together, give a more powerful collection.
Several scenarios on economic development, regionalisation and environmental aspects, modelled by the Nobel Prize winners with Al Gore. We use multiple scenarios and compare them “apples to apples”.
Use this distribution data, put together with ecological variables through a modelling algorithm, to get an ecological niche model of where the species would occur. Using 15-20 variables. The model output goes through a projection tool to project it onto a map.
Specimen point data is used to create a model of the current environment using an algorithm, projected back to get the distribution in the native region. Done with climate models, you get the distribution after climate change. The significance is that, looking at a non-native region, you can see what areas might be vulnerable to invasion by the species after climate change.
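The envelope idea can be sketched in a few lines. This is a toy climate-envelope model in the spirit of the talk (real systems such as openModeller use much richer algorithms): a cell is “suitable” if every environmental variable falls inside the range observed at known specimen sites.

```python
def envelope(occurrence_env):
    """occurrence_env: list of dicts of variable -> value at known sites."""
    keys = occurrence_env[0].keys()
    return {k: (min(e[k] for e in occurrence_env),
                max(e[k] for e in occurrence_env)) for k in keys}

def suitable(cell_env, env):
    """A cell is suitable if every variable sits inside the observed range."""
    return all(lo <= cell_env[k] <= hi for k, (lo, hi) in env.items())

sites = [{"temp": 12.0, "rain": 800}, {"temp": 16.0, "rain": 1100}]
env = envelope(sites)
print(suitable({"temp": 14.0, "rain": 900}, env))  # inside the envelope
print(suitable({"temp": 22.0, "rain": 900}, env))  # too warm
```

Projecting the same envelope onto a future-climate grid gives the “after climate change” map described above.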
The archive is created with a pipeline: it constructs a request, which is passed to a 64-node cluster with web services in front; nodes retrieve environmental data using WCS and dispatch to openModeller, and the pipeline polls for status, retrieves the data and catalogs it.
Exposing data on website but also exposed on web services, can see in google earth.
Showing samples where we have the data in the database but don’t have lat/lon – we have around 80% of those. ~220 institutions are providing data, and within those about 600 collections: fish, mammals, etc.
Another way to access the data is to request it: an experiment consists of all the data plus the models and maps produced on top. It’s just URLs, so it can be accessed programmatically.
Other thing is “submit an experiment”, constructing search parameters and get a URL back with information for this experiment. Get really basic data back for it, shown projected on 4 different climate scenarios, current and 3 future ones. Showing metadata for the collection and other properties.
* Distributed computing
* Screen savers
* Competitive like SETI
* Captured wrong audience
People weren’t really interested in the topic, and we couldn’t handle the demands of the audience.
* Funded, cluster computing, OSS, standards
Lifemapper 2.0: Using and Creating Geospatial Data and Open Source Tools for the Biological Community
Technorati tags: lifemapper, niche-modelling, where, where2.0, where2008
Wed 14 May 2008
We come from a background, created by Larry Brilliant, of creating a global network for early detection and early response.
InSTEDD overview – launched 01/08, non-profit, funding from google.org, Rockefeller… agile, an “innovation lab”.
“From a faint signal to collective action”
A single event deemed noteworthy leads to collective action. It seems simple; it’s very complex. Big chart, many parts. Integrating approaches is a significant challenge. Collaboration is at the top of the stack, and the focus of the organisation is on the human integration aspects.
Must work with the capabilities available in many places – Iraq at temperatures of 117°F. “If you can make it work here, you can make it work anywhere”
We partner wherever we can with organisations working on technologies that may be useful, twitter, facebook, single use techs can be repurposed to other goals. We’re vendor agnostic, will use anything, and will innovate where we can.
To know these techs will work, we go where they are, to field labs. By failing quickly again and again we’ll create designs that can work when they’re needed and will be reliable.
Office just opened in Steung Treng Province, northern Cambodia. Sharing disease information across the border with Laos. What does usability mean in an environment like this?
Recent project: a set of libraries and applications – an OSS FeedSync implementation – combining technologies to create a mesh environment that provides data in low-bandwidth environments. Have implementations in C#, Java and others. Brought up an HTTP sync service at sync.instedd.com; we’re hosted on Google Code too.
https://sahana.instedd.org – Cyclone Nargis response. Been working very hard, very little sleep, work done in past few days is astonishing.
We realised that GeoChat, messages over SMS, wasn’t what was needed in Myanmar; what was really needed was translation into Burmese, where the Unicode encoding isn’t standardised – very difficult. What was needed was to organise rapid crowdsourcing over >11k lines: a Google spreadsheet of names, and getting feedback. This is true of slower-moving scenarios as well as fast-moving disasters – you can’t predict what will work. Very important to apply learnings.
A Google spreadsheet of translated text. Have used different platforms, but there wasn’t already a toolkit to solve the problem. The situation in these responses requires integrating many systems; there will never be one coordinating agency, and a reductionist approach just fails.
The Google Code project contains a utility that gives you a client to synchronise KML files with each other. Showing 3 KML files in Google Earth. Adding a placemark. Save it. It can be synchronised two ways with another file, figuring out the changes in the file – what is new, deleted, edited – and creating an XML representation of the changeset; the changes are then applied to the other file. People can edit items in the field, pass a thumbdrive around and collaboratively edit information. A guy put together an 11Mb file by himself using contributions from other people – this type of technology allows anyone to make updates and he can merge them in. Can also sync up to a service, currently hosted by InSTEDD but looking to do EC2 or S3 versions.
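The changeset idea is simple to illustrate. Here is a toy two-way KML comparison of my own (not InSTEDD’s actual FeedSync implementation): placemarks are keyed by name, and each side learns what the other has added.

```python
import xml.etree.ElementTree as ET

NS = "{http://www.opengis.net/kml/2.2}"

def placemarks(kml_text):
    """Map placemark name -> element for every Placemark in the file."""
    root = ET.fromstring(kml_text)
    return {pm.find(NS + "name").text: pm for pm in root.iter(NS + "Placemark")}

def changeset(mine, theirs):
    """Names added on each side, relative to the other file."""
    a, b = placemarks(mine), placemarks(theirs)
    return sorted(set(a) - set(b)), sorted(set(b) - set(a))

kml_a = """<kml xmlns="http://www.opengis.net/kml/2.2"><Document>
  <Placemark><name>Clinic</name></Placemark>
  <Placemark><name>Camp A</name></Placemark>
</Document></kml>"""
kml_b = """<kml xmlns="http://www.opengis.net/kml/2.2"><Document>
  <Placemark><name>Clinic</name></Placemark>
  <Placemark><name>Water point</name></Placemark>
</Document></kml>"""

print(changeset(kml_a, kml_b))  # → (['Camp A'], ['Water point'])
```

A real sync would also track edits and deletions with per-item version metadata, which is what FeedSync adds over this naive diff.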
Now showing an online RSS document keeping the KML information and versioning data – a decentralised versioning system. It can be put into a Google Spreadsheet and will still work. You need the right adapters, but we’re creating them. Would like to discuss with interested partners.
GeoChat – hook up multiple gateways to the website; they have a gateway for Twitter. You can send a text message with lat/lon and a message, or a location name, plus tags, and this will be displayed in a stream and can be output as KML or RSS. Showing it in Cambodia – a team moving about in Google Earth. Can click reply and reply directly, via the correct gateway; it will go to Twitter. Can also do group replies within a certain location or tag. All OSS and free.
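For a rough idea of what parsing such a message might involve (the exact format here is my assumption, not InSTEDD’s spec): an optional “lat,lon” prefix, free text, and trailing #tags.

```python
import re

def parse_geochat(sms):
    """Split an SMS into coordinates, free text and hashtags (toy format)."""
    msg = {"lat": None, "lon": None, "tags": [], "text": sms}
    m = re.match(r"\s*(-?\d+\.?\d*)\s*,\s*(-?\d+\.?\d*)\s+(.*)", sms)
    if m:  # message starts with a "lat,lon" pair
        msg["lat"], msg["lon"] = float(m.group(1)), float(m.group(2))
        msg["text"] = m.group(3)
    msg["tags"] = re.findall(r"#(\w+)", msg["text"])
    msg["text"] = re.sub(r"\s*#\w+", "", msg["text"]).strip()
    return msg

print(parse_geochat("13.52,105.97 Bridge out near town #roads #flood"))
```

Messages without coordinates simply come through with `lat`/`lon` unset, which is where a gazetteer lookup on the location name would slot in.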
InSTEDD: Humanitarian Collaboration Tales
Technorati tags: humanitarian, geochat, disaster, where, where2.0, where2008
Wed 14 May 2008
Want to talk about digital cities. Probably heard Geoff Zeiss talk about construction industry. Here to talk about the cities that we live in. We must change our ways in our cities to be sustainable.
Like other times we’ve moved from “Old Way” to “New Way”, e.g. Analog to Digital. We still pass contracts around using paper for instance.
We build physical models of buildings; we really need to get into virtual, intelligent models.
Talked yesterday about building information models that people can now share to find and fix expensive problems on the computer before going out on site. Communities are now sharing rich 3D and geospatial information models.
As we think about intelligent city models, I’ll share 4 ideas about Autodesk digital cities:
visual model, digital platform, improved workflow, a smarter way to plan.
Smart models, smart-alec models, they talk back to us.
Today’s 3D GIS models
3D visualization models are rarely used after the project, mainly just for visual images.
Tomorrow’s models are a reality today: a convergence of CAD, BIM and GIS, combining perspectives to create informed decisions. A Seattle animation shows what the city will look like with a tunnel or an elevated roadway. Allows people to improve on their city.
The Sacramento model shows the existing environment with a proposed high-speed rail line; these animations show the public what this will look like in a very non-technical way. The buildings come from the planning authority and are geospatially accurate. The visualization shows how it will work and allows us to get inside the model. Can be sure it will be accurate and trustworthy.
Also the ability to analyse and simulate to make informed decisions. Previously it would be artists creating models; in these models the cars are being driven by simulation. You can actually see how the transportation system will work – sustainability analysis, wind patterns, energy patterns. Rich, high-fidelity data sets brought together allow this analysis.
Pollution visualisation, flood simulation.
These digital cities allow richer integration and analysis, and let you simulate and predict operations. Saves money.
Wondering what to put in your digital city? Digital sustainability: we should continue to build these rich, high-fidelity models instead of using paper all the time, but we should also be reusing those models.
“Be a Model Citizen”
Technorati tags: digital, cities, visualization, simulation, where, where2.0, where2008
Wed 14 May 2008
Scott A. Hotes
WaveMarket helps people to manage location.
Problem of locating a handset is one of triangulation.
“Acquiring the information necessary to locate a wireless device typically requires close access to the underlying wireless operator.”
Two reasons – 1st question of privacy and security, don’t want just anyone knowing this information, lots of liability; 2nd reason, value, the information is very valuable.
Accessing a handset can sometimes be done without the carrier – e.g. RIM and Windows Mobile have built-in GPS, so a resident application can access location without interacting with the carrier – but the vast majority of cellphone GPSes use assisted GPS, which needs the carrier.
Getting information for doing cellular triangulation is very similar but different.
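For a flavour of what a simple tower-based estimate can look like – a common stand-in for true triangulation, and not WaveMarket’s actual method – a weighted centroid pulls the position estimate toward the towers with the strongest signal:

```python
def weighted_centroid(towers):
    """towers: list of (lat, lon, signal_weight); stronger towers pull harder."""
    total = sum(w for _, _, w in towers)
    return (sum(lat * w for lat, _, w in towers) / total,
            sum(lon * w for _, lon, w in towers) / total)

# Two towers, the first heard three times as strongly as the second.
towers = [(37.0, -122.0, 3.0), (37.2, -122.0, 1.0)]
print(weighted_centroid(towers))
```

Real implementations also fold in per-tower range estimates and timing data, which is exactly the information that is hard to get without carrier access.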
“Leveraging our experience with Family Locator: privacy expertise, technical integration, trust; in creating a developer environment.”
We’re taking that experience – the technical integration, the privacy expertise, the carrier partners – and exposing it for 3rd-party developers. That’s what Veriplace is all about.
Consider an application like locating friends on facebook. At some point the friend will need to opt in to the service. If the user isn’t the owner of the friend’s handset then interaction flow will need to pass to the account holder, perhaps the parent. This is the type of thing that the developer doesn’t want to handle. That’s what Veriplace handles for you.
Veriplace: Acquiring and Sharing Consumer Location
Technorati tags: veriplace, where, where2.0, where2008
Wed 14 May 2008
I’m usually at academic and arts conferences. I’m a media studies scholar, interested in the use and development of satellite technologies from the citizens’ point of view. I try to dream about what satellite development in the public interest might look like. This came from my interest in public-interest TV: what does a public-interest satellite look like?
Showing an artist based in Berlin who started installing physical Google markers in places.
Interested in satellite image orders by individual consumers. Dan Bollinger tried to get satellite images to show Survivor.
KFC requested an Ikonos image of their new logo, which they laid out huge in Nevada.
Use of GPS to “plot the personal”, and generate unique “movement signatures”.
The way that the planet is now being crisscrossed by satellite footprints and wireless footprints. How we don’t just map the world in terms of countries, states, blocks, but also coverage footprints which are sometimes more important.
“Cultures in Orbit” – Lisa Parks’ book.
“Part of my research has focused on specific uses of satellite imagery in the news media to represent global conflicts and events”
4 questions: how are images used to represent global conflicts? Where does the authority to use them come from?…
Satellite images showing alleged mass graves in Bosnia appeared in papers and the press after declassification. The problem is that the images were acquired during the atrocities: a safe haven was overrun, and 8,000 Muslim men were allegedly driven away and buried in mass graves. The problem was an overload of satellite information, and techniques were not good enough to make the images useful in a timely fashion. Detailed investigation of this timeliness.
Also looking into the imaging of refugees: requests were made for images to be released to show the situation in Rwanda. People began to pay attention.
War in Iraq, 2003-present – Colin Powell’s infamous presentation about WMD before the war began, and the use of the images in PowerPoint in the UN council chamber. A scathing critique of these images was given weeks afterwards. This compromises the ability of the US to use such images with credibility.
Showing Google Earth & USHMM “Crisis in Darfur [layer]” – interested in the shifting function and role of satellite image as it circulates in the popular culture. Data about activities happening in the region together with photos.
Looking at case studies over 10 years, there’s an eclipsing of the satellite imagery: in earlier media there was a focus on the image as the site of scrutiny; these days the satellite image becomes wallpaper and the closer views are privileged over it. Showing that the image is no longer interesting – it’s the zoom through to the detail. These alternative images may perpetuate bad images of e.g. Africa, whereas unfiltered satellite images did not do this so much.
As images become mass media, more and more citizens use them to understand the earth, but most are not interpreting the imagery and know little about it and its uses. Visual and technological literacy problems.
Citizens have a right to know how these are used.
Developers can help citizens by embedding metadata. It would be great to get the source, sensing instrument, infra-red/spectral bands, owner, date, orbital address and proprietary status. This helps people understand the imagery more effectively. Some of this is now being done by Google Earth. These graphics reveal how acquisition occurs – they show that satellites don’t hover, they pass over – and give a historical record of satellite imagery acquisition.
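The kind of embedded metadata being asked for could ride along in formats the tools already read. Here’s a sketch using KML ExtendedData on an image overlay – the field names are my own guesses at “source, sensor, date, owner”, not any standard schema:

```python
def overlay_with_metadata(image_url, metadata):
    """Build a KML GroundOverlay carrying provenance fields as ExtendedData."""
    data = "\n".join(
        f'    <Data name="{k}"><value>{v}</value></Data>'
        for k, v in metadata.items())
    return (f"<GroundOverlay>\n  <Icon><href>{image_url}</href></Icon>\n"
            f"  <ExtendedData>\n{data}\n  </ExtendedData>\n</GroundOverlay>")

kml = overlay_with_metadata("http://example.com/scene.png", {
    "source": "Ikonos", "acquired": "2008-05-01", "owner": "GeoEye"})
print(kml)
```

A viewer that surfaced these fields on click would go a long way toward the visual-literacy goal described above.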
We need better maps of orbital space, of satellite traffic, of the dynamic activity of earth and orbit.
[Slide showing satellites being used for TV transmission during Yugoslavia war]
Multiple other representations. Showing a photo of USA-193, the satellite that was shot down by the US. Talking about Trevor Paglen trying to find out about things we’re not meant to know about; he took a photo of this satellite and is doing more investigation.
Earth-Browsing: Satellite Images, Global Events and Visual Literacy
Technorati tags: satellite, imagery, art, where, where2.0, where2008