• Where 2.0: DIY Drones: An Open Source Hardware and Software Approach to Making “Minimum UAVs”

    Chris Anderson, Wired

    Talking about a hobby gone horribly wrong :)

    Attempt to be a geek and a father, doing fun things with the children. Got some model aircraft, played around, crashed a lot. Also had Lego Mindstorms, but we were never going to be really good at that either. Thought about “airplanes, robotics”: what can I do? Ooh, put them together and get a UAV.

    Predator UAV costs $4 million.

    In a quest to do something cool and original, we realised there was a dimension of aerial robotics we could compete on: cost. Couldn’t make the best UAV but could make the cheapest. Want to be the minimal UAV project.

    Why? Low-cost access to the sky, any time, any place

    What if you could have eyes in the sky, low cost ubiquitous access to the sky?

    Started by looking at how to simplify the project. Two functions: stabilization, which is kind of a 3D problem, and navigation, following GPS waypoints, which is kind of 2D. Can use commercial stabilization hardware. How can we do navigation?

    Stuck Lego Mindstorms in Lego planes. Put a camera with a Lego pan/tilt on the bottom. Got the world’s first semi-Lego UAV.

    Not autonomous yet. Brought in a Bluetooth GPS; Mindstorms has Bluetooth. Added accelerometers and gyros. Now have fully functional inertial measurement, and an autonomous UAV. Take off and land manually; flick a switch and it follows waypoints. All driven by Lego, cool.
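    At its core, GPS waypoint following is just repeated bearing math. Here is a minimal sketch of that idea in Python (an illustration only, not the actual Mindstorms code): compute the bearing from the current fix to the next waypoint, steer towards it, and advance when the plane gets close enough.

    ```python
    import math

    def bearing_deg(lat1, lon1, lat2, lon2):
        """Initial great-circle bearing from point 1 to point 2, in degrees."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
        return math.degrees(math.atan2(y, x)) % 360.0

    def distance_m(lat1, lon1, lat2, lon2):
        """Haversine distance in metres."""
        r = 6371000.0
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlon = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def steer(current, heading_deg, waypoint, arrive_radius_m=30.0):
        """Return (rudder_command, reached); rudder_command is in -1..1."""
        lat, lon = current
        wlat, wlon = waypoint
        if distance_m(lat, lon, wlat, wlon) < arrive_radius_m:
            return 0.0, True                   # close enough: move on to the next waypoint
        # heading error, wrapped into [-180, 180)
        error = (bearing_deg(lat, lon, wlat, wlon) - heading_deg + 180) % 360 - 180
        return max(-1.0, min(1.0, error / 45.0)), False   # simple proportional steering
    ```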

    “Turning the military industrial complex into a toy”

    Proof of concept, nice to use lego - easy, non-threatening. Have been accused of weaponising lego :)

    Taken an export-controlled technology and recreated it with Lego.

    How to do it easily? With a cellphone: on-board processing, on-board memory, very good wireless, two-way communications. You can send it text messages with waypoints, and it can send back telemetry and imagery through various methods. Strap it to the plane and the autopilot becomes a software app.
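    To make the messaging idea concrete, here is a hedged sketch of a waypoint-by-SMS round trip. The “WP …” and “TLM …” message formats are invented for illustration; the talk does not say what format the real setup uses.

    ```python
    def parse_waypoint_sms(body: str):
        """Parse a text like 'WP 37.77500,-122.41830;37.78000,-122.41000' into waypoints."""
        if not body.startswith("WP "):
            raise ValueError("not a waypoint message")
        waypoints = []
        for pair in body[3:].split(";"):
            lat_s, lon_s = pair.split(",")
            waypoints.append((float(lat_s), float(lon_s)))
        return waypoints

    def telemetry_sms(lat, lon, alt_m, battery_pct):
        """Build a compact telemetry reply to send back over SMS."""
        return f"TLM {lat:.5f},{lon:.5f},{alt_m:.0f}m,{battery_pct:.0f}%"

    print(parse_waypoint_sms("WP 37.77500,-122.41830;37.78000,-122.41000"))
    print(telemetry_sms(37.779, -122.413, 120, 81))
    ```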

    Can do better…

    Attach the phone to the stabilisers. Set the phone to continuous snapping mode (0.5 Hz) and get 3 cm resolution with toys. When you want it, no waiting for satellites to come around.

    Showing a photo of the Google campus, actually taken by UAVs and helicopters. A very high resolution image.

    You get close to finding out what’s happening right now. Can see that Google doesn’t actually have its logo on its infinity pools.

    Responsibility with UAVs. Lawrence Livermore National Laboratory: something was going to be torn down, so we wanted to record it. Launched an EasyStar with basic “altitude hold”; something went wrong and it went down behind the gates of the secure national laboratory. I could run, call 911, surrender. Or I could go to the gates and explain what happened. I went, and took my child with me. They took mercy, found the plane, blew it out with a hose and knocked it down. They collected the pieces and found GPS sensors, cameras, etc. We promised never to do this again.

    Embedded processors (under $500). Comparing to Steve Jobs and Wozniak putting together their own machines: we’re back there now, with open source hardware. The Arduino chip. Using FlightGear, they’ve created the ArduPilot, a $110 open source autopilot. Uses IR stabilization and GPS, is fully programmable, and can control a camera.

    “Indoors and under $100?”

    Using blimps - BlimpDuino, a $70 autonomous blimp built using $15 toys from Toys ‘R’ Us.

    DIYDrones website/community. Open source in this context.

    Is this legal? We don’t know. There are two regulatory classes for UAVs, military and commercial, and you can get regulatory permission commercially. They never considered that this could be so cheap and easy. Keeping under 400 feet, away from built-up areas, with a pilot in control at all times. Trying to be responsible. You cannot ban anything here; these are toys. This is global too. We don’t know how to create regulatory guidelines.

    How do we export these? Currently a license is required to export, but we’re an open source community. Some of our participants are teenagers from Iran, but some people would think we shouldn’t do that. I’d rather it was done in public in a community than they just do it on the quiet as all the technology is available now anyway. We’re testing the boundaries of how to get robots and machinery into the skies.

    Ending on one picture, “What’s this good for?” “Because we can, because it’s fun, because nobody had done it before”

    • Our job is to make the technology cheap, easy and ubiquitous.
    • Then users will show us what it’s for.

    We don’t know what people will do with it, but we’re hoping people will show us.

    DIY Drones: An Open Source Hardware and Software Approach to Making “Minimum UAVs”

    Technorati tags: drones, diydrones, uav, opensource, openhardware, where, where2.0, where2008

  • Where 2.0: Global Weather Visualization: Utilizing Sensor Networks to Monetize Realtime Data

    Michael Ferrari

    We try to utilise as much data as possible.

    This is not your father’s weather forecast.

    Weather presenters are basically presenting information from basic weather models. After a few days the weather information is basically useless.

    When you’re trying to base financial decisions on short term weather forecast, it’s very [bad/difficult].

    We offer a completely different approach.

    The world is warming, but it’s a trend, so it won’t be constant or uniform everywhere. Using traditional methods you can’t predict this properly.

    We can do weather forecasts up to 10 months out using ~400 sensors around the world.

    We can’t predict everything. Down to the daily level we can produce a granular forecast. We use multiple sensor networks.

    Usually the seasonal forecasts that your tax dollars pay for say “there’s an equal chance of everything happening”. We offer a more granular level showing, e.g., daily temperature changes. A lot of this information is constantly supplemented by the sensor data.

    Last year a landmark paper was published: Craig Venter recreated the Darwin voyage and sampled sea water at regular locations across the globe. We put the information into a geospatial context.

    In evolutionary biology there’s always been a saying that “everything is everywhere”. They took samples from around the globe; the thought before the study was that the genes would be similar everywhere, but the study showed that some genomes were very location-specific, the Caribbean vs the Indian Ocean for instance.

    Paradigm shift for future sampling studies. We now have a great dataset to base this stuff on.

    Some of this will be addressed in further talks. We’re not at the point where we can prevent weather disasters but with realtime monitoring we can plan and react better.

    Global Weather Visualization: Utilizing Sensor Networks to Monetize Realtime Data

    Technorati tags: weather, sensors, where, where2.0, where2008

  • Where 2.0: Bringing Spatial Analysis to your Mashups

    Jeremy Bartley, ESRI

    What is ESRI’s role in the geoweb? GIS is all about the fat tail: they have all that data but it’s not really accessible. We’re really excited about the ArcGIS 9.3 release, which will make this data more available. Our users create maps and datasets. We’ve been open in the past with OGC and SOAP; in 9.3 we’re also creating KML and REST interfaces. “When it’s open, it’s accessible”.

    [demo]

    http://mapapps.esri.com/serverdemos/siteselection/index.html

    Showing some of the applications that we’ve built. We’re developing our own JavaScript APIs and Flex APIs. Based on the Dojo framework. Showing very simple mapping tile application. Standard basic JS mapping API.

    This is actually a layer from a mapping service running on the server. But they’re more than simple maps; we also have geo-processing models. Showing a heatmap population dataset. You can define a polygon, send it to the server using a RESTful interface and find out how many people live inside it.
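    Roughly, that polygon-to-population round trip looks like the sketch below. The endpoint URL and parameter names are hypothetical placeholders rather than the actual ArcGIS 9.3 REST signature; the point is just that a drawn polygon is posted to a geoprocessing service, which returns a JSON summary.

    ```python
    import json
    import urllib.parse
    import urllib.request

    # A polygon drawn by the user, as a single lon/lat ring.
    polygon = {
        "rings": [[[-122.45, 37.78], [-122.40, 37.78],
                   [-122.40, 37.75], [-122.45, 37.75], [-122.45, 37.78]]]
    }

    # Hypothetical geoprocessing task URL on an ArcGIS Server instance.
    url = ("http://example.com/arcgis/rest/services/"
           "PopulationSummary/GPServer/Summarize/execute")

    params = urllib.parse.urlencode({
        "geometry": json.dumps(polygon),
        "f": "json",                     # ask for a JSON response
    }).encode()

    with urllib.request.urlopen(url, data=params) as resp:
        result = json.load(resp)
    print(result)                        # e.g. a population total for the polygon
    ```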

    Showing census information for the US, you can view the attribution information.

    Showing tile maps on virtual earth.

    Bringing Spatial Analysis to your Mashups

    Technorati tags: esri, arcgis, map, where, where2.0, where2008

  • Where 2.0: Navigating the Future: Mapping in The Long Tail

    Pat McDevitt

    Worked in “university cap and gown” as a student, a seasonal business. Would go out and deliver the gowns and collect them later, loading boxes into vans. The dispatcher would give us a list of addresses and directions. His van had no radio and no speedometer. You got very used to spotting the signs of where schools would be, speeds changing. People would generally know where the schools were. People would say “it’s just down the street, you can’t miss it”; “just down the street” means different things to different people.

    More recently worked at a company that would map hazmat sites. We would pull directions out of envelopes. Asked colleagues for the strangest things they’d found. One set included going to the end of a fence, then continuing north for the distance of two cigarettes on a slow horse. Not an uncommon unit of surveying in Texas.

    “All ‘navigation’ is local” “‘Where’ is relative”

    Why in the past 25-30 years has this content been concentrated in a small number of companies?

    The Long Tail. Hit-based: older companies map places that get more “traffic”. Niche-based: newer organisations.

    In the past this data was hand created. Eventually GIS applications became standard, modern, affordable. More people found they could become creators of GEO content. Local councils started having “geo” divisions. They could collect information that wouldn’t be economic for a big company to collect.

    More recently tools were launched that would allow almost anyone to map data. We see that people will still go in and hand digitise this data. Filtering technologies are becoming much more important in where this long tail is going to go.

    Is there a future that contains both paleogeographers and neogeographers? I think the answer is “yes”.

    Graph showing the types of data that people are mapping. Words like “popular”, “scenic”, “clean” don’t describe data that a huge mapping company is likely to collect, but smaller, more local ones might.

    The smaller data could be collected better by local communities so we think we might leave it for them to collect.

    Navigating the Future: Mapping in The Long Tail

    Technorati tags: thelongtail, teleatlas, map, where, where2.0, where2008

  • Where 2.0: What about the inside?

    Mok Oh, EveryScape

    [Video]

    3D imagery of Times Square. Similar to Street View.

    “The Real World. Online”

    • Photorealistic, immersive, interactive
    • Scalable, distributable, maintainable, extensible, self-healing
    • Annotatable, scriptable, searchable, shareable: “my world”

    “What about the inside?”

    • Outside’s getting all the attention
    • Inside is important and valuable
      • Your valuable time: we’re indoors most of the time, at this conference, at home, at work.
      • POI quality: categorise POIs as indoors and outdoors and most of them are inside.
    • WTF?

    EveryScape has a platform that their applications are built on, and it can handle both outside and inside: two lists of “local search, travel, real estate” showing they apply to the outside and the inside.

    “Just the outside is not enough…”

    • We focus on eye-level visual representation - inside & outside
    • Perceptual accuracy, not geometric accuracy - Visual Turing Test - Does it look like Boston, feel like Boston?

    “Ambassador Program” Providing the means to paint the world.

    [Demos]

    www.everyscape.com, go to Boston. Can look around a panorama. Click on an orange disk on the ground and you actually move. You then get to the “Cheers” bar; you can also go down the stairs and into the bar. He leaves a memo for his friend on a “Red Sox” poster on the wall.

    Showing another demo inside a building: an MIT corridor. It’s a 3D model. Very fast, feels native. Showing it as an FPS game with multiple people playing.

    We need to answer the question “What about the inside?”

    What about the inside?

    Technorati tags: inside, everyscape, where, where2.0, where2008

  • Where 2.0: A NeoGeographical Approach to Aerial Image Acquisition and Processing

    Jeffrey Johnson, David Riallant

    We have a background of GIS and remote sensing. We were very excited when we had virtual globes as it gives us new opportunities to visualise our data. Neogeography is not so new, for a long time geographers have been using external technology to get the big picture, which is all neogeography is about.

    When we look at the geoweb today we have two main sources of information: large-scale base maps and site-specific annotations from users. We notice that site-specific base maps are missing. With PictEarth, data becomes information as it’s captured in real time.

    [Showing slide of two planes with N95 phones]

    The N95 has a good camera and reasonable GPS, and we have a Python script and could stream too. Downlinking video and telemetry from the phone. Also have a more advanced solution: software that takes down the video and telemetry and imports it into Google Earth. Also thermal cameras, some Windows software; the platforms are autonomous-capable. General aviation aircraft too, and also ultralights. Using lon/lat/altitude/bearing to work out the ground footprint of the imagery. Can see it live from an aircraft. The height of the plane and the focal length give the area covered by the camera, visualized in Google Earth.
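    The footprint calculation mentioned above is just similar triangles. A sketch for a nadir-pointing camera (the N95-ish numbers below are illustrative assumptions, not measured values):

    ```python
    def ground_footprint(altitude_m, focal_length_mm, sensor_w_mm, sensor_h_mm, image_w_px):
        """Ground area covered by a nadir-pointing camera, plus cm-per-pixel resolution."""
        width_m = altitude_m * sensor_w_mm / focal_length_mm
        height_m = altitude_m * sensor_h_mm / focal_length_mm
        gsd_cm = 100.0 * width_m / image_w_px        # ground sample distance per pixel
        return width_m, height_m, gsd_cm

    # Example: 150 m altitude, 5.6 mm focal length, ~5.7 x 4.3 mm sensor, 2592 px wide image.
    w, h, gsd = ground_footprint(150, 5.6, 5.7, 4.3, 2592)
    print(f"footprint ~{w:.0f} x {h:.0f} m, ~{gsd:.1f} cm/pixel")
    ```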

    Software is for S60 phones, using N95. Showing some mosaics from imagery.

    Usually you have to wait a long time for new imagery and it costs a lot, but you might just want a small amount of imagery. This is what we’re aiming for. Can overlay with other types of data too. Can also shoot images at the time you want them to be shot. Showing an example of pictures taken in the Mediterranean sea, 15 m below the water level. Shooting when you want makes sure you control the data. If you can see the data in real time you can make decisions in real time.

    Last year in San Diego could take photos during the wild fires.

    We’re GIS and remote sensing guys, so we make sure the data can be used in regular GIS systems.

    Example of N95 pictures, pretty good detail.

    Another example of UAV over dense urban area. Great success, interesting to see how it could operate. Aim to create 3D modelling.

    Burning man image.

    The goal is that images are not just a nice background; they become a source of information. Just like any information, geo-aerial information gets its value from its accuracy, freshness and flexibility.

    A NeoGeographical Approach to Aerial Image Acquisition and Processing

    Technorati tags: uav, pictearth, where, where2.0, where2008

  • Where 2.0: Indexing Reality: Creating a Mine of Geospatial Information

    Anthony Fassero, Earthmine

    We all agree that there’s a renaissance happening for mapping, an explosion of online geo-spatial platforms that allow us to visualise dense layers of data. Also allow us to attach information to these frameworks.

    The base layer allows us to enable this connection of data. Satellite imagery is precisely registered to the earth’s surface; we can work out the coordinates for every pixel. We can do the same for bird’s eye imagery, together with some advanced computation.

    Also, more recently, street view. Street view doesn’t really link to the base layer; you don’t really have a lat/lon/altitude for the imagery. Showing a video of a 3D rendering from street imagery: stereo photogrammetry. There’s so much information (curbs, etc.) that it can’t be delivered too easily.

    base layer = resolution + fidelity + accuracy

    There’s a wealth of information but we need to be able to perceive it, e.g. text. Fidelity: we need 1-to-1 pixel-to-coordinate information.

    generative, we’re able to extract road and building information.

    [demo]

    A 3D view on a bridge, which can be dragged around similar to Google Street View. Can also double-click on features in the image and it will jump to the nearest location. Start on the bridge, click on a building in the distance and it jumps to that view; it feels like Spider-Man jumping from point to point. Can also search for features (“gas lamps”) and geocodes. Can also click on the traditional overhead map/imagery and be taken to that point.

    Can also link to new imagery. Can click on an image and add a point, labelling it as a “bakery”. We’re actually tagging this on 3D imagery, so you can rotate the image around and the point stays in place.

    Can also do measurement. Dragging a line can give you a measurement, e.g. lane width. Can also take snapshots of nice views, virtually taking photos. Can measure the height of a (small) building. Tagging a manhole, a street light; these also show up on the traditional inset map. Exporting this: exported as KML and loaded in Google Earth. The information isn’t stuck in the image, it really generates real-world coordinates.
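    A hedged sketch of how a measurement such as lane width can fall out of per-pixel real-world coordinates (the general idea, not Earthmine’s actual code): convert the two clicked lat/lon/alt points to Earth-centred ECEF coordinates and take the straight-line distance.

    ```python
    import math

    WGS84_A = 6378137.0                 # semi-major axis (m)
    WGS84_E2 = 6.69437999014e-3         # first eccentricity squared

    def geodetic_to_ecef(lat_deg, lon_deg, alt_m):
        """Convert lat/lon/alt to Earth-centred, Earth-fixed XYZ (metres)."""
        lat, lon = math.radians(lat_deg), math.radians(lon_deg)
        n = WGS84_A / math.sqrt(1 - WGS84_E2 * math.sin(lat) ** 2)
        x = (n + alt_m) * math.cos(lat) * math.cos(lon)
        y = (n + alt_m) * math.cos(lat) * math.sin(lon)
        z = (n * (1 - WGS84_E2) + alt_m) * math.sin(lat)
        return x, y, z

    def distance_m(p1, p2):
        """Straight-line distance between two (lat, lon, alt) points."""
        return math.dist(geodetic_to_ecef(*p1), geodetic_to_ecef(*p2))

    # e.g. two points clicked on opposite edges of a traffic lane
    print(distance_m((37.7750, -122.41830, 10.0), (37.7750, -122.41834, 10.0)))
    ```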

    We’re also looking to provide data services to distribute this data. Software and tools to allow integrating it.

    Announcing JS and Flex APIs: www.earthmine.com/beta

    Indexing Reality: Creating a Mine of Geospatial Information

    Technorati tags: earthmine, streetview, map, where, where2.0, where2008

  • Where 2.0: Mirror World: Using MMOs for Real World Mapping

    W. James Au

    “What I learned as a Virtual World chronicler”

    My main gig is talking about Second Life on my own blog, “New World Notes”. I was originally contracted by Linden Lab; I was hired to write about it as an emergent society.

    People are using virtual worlds as part of their day-to-day activities; 60-70 million users have an account on a virtual world. Instead of watching TV or browsing the web, they’re going to have virtual worlds as interactivity spaces. They’re going to need maps of worlds that don’t exist.

    Showing a picture of a map from “Lord of the Rings Online”. The company mashed up their world with Google Maps. You can plot your course; showing a route of Frodo’s trip from Rivendell. They have to think in terms of geolocations, in terms of maps, the same as if you’re going to visit Manhattan.

    “A brief tour of map applications from my home MMO, Second Life”

    Second Life - a user-created online world, data is streamed, standalone app. Has 3D building tools and scripting tools. Can take XML data from Second Life to the web and back. This enables interesting mashups.

    Second Life has a Google Maps API view of the world. “Caladon” is a steampunk area that exists in Second Life; the map shows that area. IBM’s campus is also visible. It’s interesting that you have the steampunk empire next to the IBM campus, and they both depend on maps. A great combination of fantasy and reality.

    “Sculpty Earth, Topographic Globe Streaming Real Time Weather Data”

    A globe that’s visible in second life with real live cloud data streaming onto the globe. You can walk over the globe and can see the weather patterns below you.

    “David Rumsey’s Historical Map Archive, Translated Into 3D Immersive Topography”

    One picture shows two maps overlaid on spheres. Other one shows mountain topography with map overlaid.

    “Daden UK: Google Earth/SL Mash-Up Dynamically Tracks LAX Traffic”

    An avatar standing on Los Angeles. A plane is visible too: data from LAX air traffic is imported into SL and rendered. An almost-live representation of air traffic going into LAX.

    “Digital Urban UK: London Building/Street Data Uploaded, ‘3D-ized’ In Real Time”

    “Daden UK: Google Maps Mashup With Satellite and RSS Overlay” - you can interact with the mashup from within SL.

    “But why immersive maps of real worlds?”

    Not quite sure yet. “The ‘Memory Palace’ Argument: Rapid and profound data acquisition and retention through immersion… but we’re not sure, yet.”

    The intuition is that this is going to have a transformative effect. May not be real world applications quite yet but it’s very cool.

    “Then again, maybe the metaverse will just become pocket-sized:” showing a mobile version of SL. A lot of the applications might be more useful on a mobile. If you have a representation of the real world you can interact with it from within SL on your mobile.

    nwn.blogs.com and the book “The Making of Second Life” by Wagner James Au.

    Mirror World: Using MMOs for Real World Mapping

    Technorati tags: secondlife, map, where, where2.0, where2008

  • Where 2.0: Where is the "Where?"

    Dr. Vincent Tao

    This is based on my experience of moving from an academic research company to a big company like Microsoft.

    What is the direction of VE?

    What is Where 1.0? “The death of distance”

    • Social networks are independent of distance.
    • E-commerce is locationless
    • Search is universal: finding information anywhere

    Where 2.0 “Location matters”

    • Concentration of human activities continues to grow
    • 72% of survey participants prefer to stay within a 20-minute drive of their homes to reach a business
    • 1/3 of search queries have local intent

    Independent thinking is important.

    “like keyword, location is basically an index to information and data organization”

    “from organizing spatial information to organizing information spatially”

    “moving from W3 to W4”

    W4: what, where, when and who

    “How do people look for local information?”

    “How do people look for information ‘indexed’ by location?”

    Looking at entry points, we have local search and mapping. Looking at the larger entry points:

    • Search
    • Portal
    • Community - social networks
    • Gaming
    • Entertainment
    • Commerce
    • Communication

    In this context LBS gives a value-add, not a primary driver to the site.

    Where do people look for local info?

    • PC - 71%
    • Phone (Voice) - 5%
    • Phone (Data) - 2%
    • In-vehicle devices - 1%

    When we look at your life, we feel the boundaries of home life are getting blurred. Where is location? Location is an enabling piece for these services.

    Virtual Earth is an enabling platform - millions of earths!

    A few weeks ago we launched mobile search with “image mapping, real-time traffic, voice search”. We have voice recognition for location search too.

    VE embedded in Messenger. Can share locations with people: a realtor, a travel agent.

    VE add-in for Outlook. There’s a massive user base for Office and Outlook. Book meetings with customers and partners, totally free. It can alert you to leave the office when there’s traffic.

    VE with SQL Spatial Server - powerful combination of two products for professional users.

    VE: the largest mapping project ever in industry

    • imagery coverage of 80% of the population
    • 500 3D city models worldwide
    • rich and in-depth local content

    Must plan very carefully and not make mistakes.

    Investment in high resolution images (6cm). 220M pixels. Showing picture of traffic accident. These images could have a lot of uses.

    Automated image processing pipeline. Automated elimination of moving cars by comparing multiple overlapping pictures.
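    The principle behind the moving-car removal is presumably along these lines (a sketch of the general technique, not Microsoft’s actual pipeline): given several overlapping photos of the same spot that are already co-registered, a per-pixel median keeps the static scene and drops anything that appears in only a minority of frames.

    ```python
    import numpy as np

    def remove_transient_objects(aligned_images):
        """aligned_images: list of HxWx3 uint8 arrays of the same, co-registered scene."""
        stack = np.stack(aligned_images, axis=0).astype(np.float32)
        background = np.median(stack, axis=0)    # moving cars fall out here
        return background.astype(np.uint8)
    ```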

    Automated 3D city model generation. Reduced polygons and real templates. Building models for 20-30 cities per month. This is an automated pipeline.

    Virtual Earth v2 cities. Much improved texture and quality; a realistic environment.

    Comparing a real photo with a screenshot from VE. Automatic placement of trees and models. They did miss the lamp post though.

    Comparing a photo of a golf course “Bali Hai” to 3D model, it’s very similar.

    Also crowd sourcing, 3D model of stadium.

    Map of China showing the effects of the recent earthquake.

    Showing vegas 3D model. For the v2 cities, we have even the smaller buildings which we didn’t have before.

    Showing bird’s eye imagery on vegas. 3D looks very similar to the bird’s eye.

    “Virtual Earth : your own photo experiences”

    Geo-registration software to embed photos into Virtual Earth, overlaying photos onto Virtual Earth to recreate the views. Not sure how the geo-registration is done. Coming soon: 3D simulation of sunsets and shadows. Embedding street images in the 3D world.

    Where is the “Where?”

    Technorati tags: virtualearth, microsoft, where, where2.0, where2008

  • Where 2.0: Augmented Reality Lets the DPD Know Where You Are

    Tom Churchill

    Today’s mashups are going to get a lot better because the tools to view them will become a lot better.

    [Showing a video of a dashboard device]

    Earthscape Augmented Reality System, built on a geobrowser, a Google Earth-style application. Infra-red cameras are suspended from police helicopters. Imagine you’re a cop in a helicopter looking at video from the camera; you’re responsible for managing the chase. They were using a moving map to show where the camera was looking: a 2D map vaguely linked to the gimbal, video on one side, map on the other. The location could be wrong because the orientation was wrong and the perspective was wrong.

    If you can pick them up on the camera you don’t want to lose them. Picture of hardware, IMU, motherboard, battery. Drove it around in a van then got time with the police.

    Actually not as fun as you expect. The helicopter flies very aggressively; you’re probably going to get sick, and the hardware might not act as you expect. They were writing code in the air while pulling 2.5 Gs and flying around. If you know which way the helicopter and camera are pointing, then you know how to render the view in a geobrowser: you get a computer-generated image that looks like what you’d see on the monitor. Because it’s all computer generated we can add useful information. The big win comes from simple things: if you turn off the aerial imagery and replace it with the live video, you can overlay the streets on top of the video. We could give the officers the precise address of a property and they could find a gun.
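    The rendering step he describes - knowing where the helicopter is and which way the camera points, then drawing geo-data into the video frame - boils down to a camera projection. A minimal sketch under simplifying assumptions (flat local east/north/up frame, simple pinhole camera, made-up focal length and image size; not Earthscape’s actual pipeline):

    ```python
    import math

    def enu_offset(cam_lat, cam_lon, cam_alt, lat, lon, alt=0.0):
        """Approximate east/north/up offset (metres) of a target from the camera."""
        east = math.radians(lon - cam_lon) * 6371000.0 * math.cos(math.radians(cam_lat))
        north = math.radians(lat - cam_lat) * 6371000.0
        return east, north, alt - cam_alt

    def project(cam_pose, target, f_px=1000.0, cx=640.0, cy=360.0):
        """Project a lat/lon target into pixel coordinates, or None if behind the camera.

        cam_pose = (lat, lon, alt_m, pan_deg, tilt_deg): pan is heading clockwise
        from north, tilt is how far the camera is pitched down from horizontal.
        """
        cam_lat, cam_lon, cam_alt, pan_deg, tilt_deg = cam_pose
        e, n, u = enu_offset(cam_lat, cam_lon, cam_alt, *target)
        pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
        # Rotate the ENU offset into the camera frame: x right, y down, z forward.
        x_right = e * math.cos(pan) - n * math.sin(pan)
        z_fwd_h = e * math.sin(pan) + n * math.cos(pan)     # horizontal forward component
        y_down = -(z_fwd_h * math.sin(tilt) + u * math.cos(tilt))
        z_fwd = z_fwd_h * math.cos(tilt) - u * math.sin(tilt)
        if z_fwd <= 0:
            return None                                     # target is behind the camera
        return cx + f_px * x_right / z_fwd, cy + f_px * y_down / z_fwd

    # Helicopter at 300 m, heading east, camera tilted 40 degrees down (illustrative numbers).
    print(project((32.7767, -96.7970, 300.0, 90.0, 40.0), (32.7767, -96.7930, 0.0)))
    ```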

    Next demo showing polygon data being overlaid. Showing individual properties and addresses being overlaid on video.

    This can sometimes be fun if it’s a quiet night.

    Want to go back to where we started: the early days of geobrowsers, which you can verify by the amount of change we see. Google adding Street View, MS with their 3D buildings. It takes you back to the early 90s when web browsers were changing rapidly.

    Geobrowsers are fantastic on the desktop, but they can save lives out in the field. Augmented reality goes much further, aiding fire departments and rescue workers.

    Earthscape is a way for us to test our geobrowser engine, solving the problems that people in the field actually have. The process of using our own API improved the product. To do anything more interesting than displaying dots on the map you need something that’s programmable. You could do this using KML, but it’s like the pre-JavaScript days; the J in AJAX is what makes everything so interesting.

    First responders have little tolerance for their tech not working.

    Lastly, you don’t see a lot of long development cycles. Develop for the very high end with the expectation that things will catch up. They had to build for older rugged machines, but this allowed them to quickly develop for the iPhone.

    Augmented Reality Lets the DPD Know Where You Are

    Technorati tags: augmentedreality, earthscape, where, where2.0, where2008
