Monday, December 14, 2009

MashupAustralia Winners Announced

After a month of deliberations, the winners of the contest for the best Australian mashup have finally been announced. The overall winners, receiving $10,000 in prize money, are Suburban Trends and Know Where You Live (both were featured on the all-things-spatial blog in the last few weeks: mashups pt 1 and winners of GovHack contest).

Judging panel citations:

Suburban Trends - “A mashup of different types of crime and census data that allows users to compare and contrast suburbs by a range of economic, education, safety and socio-economic indicators. The judges thought the ability to compare suburbs visually, combined with the selective choice of statistics, was excellent, especially in a field dominated by many entries using similar datasets.”

Know Where You Live - “This entry bills itself as a prototype of a mashup of a range of open access government data based on postcodes so that you can truly know where you live. The judges loved the very citizen-centric ‘common questions’ user experience of this app and the groovy, and again, selective repackaging of what could otherwise be considered (we’ll be honest here) slightly boring data. The integration of publicly-held historical photographs and rental price data was a nice touch, as was the use of Google’s satellite images in the header. Judges were disappointed that some of the data for states other than NSW wasn’t available for inclusion. The focus on compliance only with the most modern standards-compliant browsers was not seen as detrimental to this mashup.”

Highly Commendable Mashup awards and $5,000 in prizes went to geo2gov (an online service that accepts location information in a variety of formats - address, postcode, lat/lon or IP address - and returns data on that location as a JSON feed) and Firemash (a mashup of relevant tweets and the New South Wales Rural Fire Service RSS feeds; it sends Twitter alerts to registered users if fires are reported in their vicinity).

Notable Mashing Achievement awards ($2,500 prize) were also handed out. The Best Student Entry award and $2,000 in prize money went to Suburban Trends (yes, again) and Suburb Matchmaker.

People’s Choice Award and $2,000 in prize money went to In Their Honour, with the following citation: “The clear winner of the People’s Choice Award was In Their Honour — which is consistent with the judges’ thoughts on its usability. As commenter Nerida Deane said, ‘I just looked up my Great Uncle Al and found the site easy to use and I liked the information it gave me. Maybe one day I’ll have a chance to visit his memorial.’”

A Student Entry — Commendable Effort prize ($1,000) was also awarded, as were Transformation Challenge bonus prizes ($1,000 each, for mashups which enhanced the provided data and/or made it available for reuse programmatically).

Congratulations to all the winners, and a pat on the back to the tens of other participants who submitted great entries for this competition! Full story at and blog.

Friday, December 11, 2009

Maps in Viral Marketing

A quote from the Australia Coastal Watch website reads:

“Unbelievable Satellite Images! Recent images of Bondi Beach show a massive school of unidentified sharks cruising just metres from swimmers. Click the plus symbol to zoom in…”

This is a fake, of course, but a very interesting example of a low-cost viral marketing campaign for a TV program that plays on people’s fascination with “unusual things” captured in Google Maps satellite imagery. I have spoiled the fun by describing what it is all about, but the key point I want to make is that maps have a huge range of potential applications - limited only by one’s imagination!

Related post: Adding value to free online maps…
First spotted on: Google Maps Mania

Thursday, December 10, 2009

Favorite Places on Google

It is hard to keep up with all the new services that Google is rolling out! This latest initiative aims to make Google a significant player in the location-based services (LBS) market by bringing together businesses and consumers. In particular, Google has just started adding prominent businesses to its Favorite Places service and, at the same time, has sent out QR codes to over 100,000 businesses in the US, encouraging them to display those codes in store windows. Customers can scan the QR codes with their mobile phones and be taken to the store's Favorite Places page, where they can read reviews. They can write a review and/or star the business as a favourite if they like the place. Alternatively, people browsing Favorite Places on the Internet can view the location of a featured business on a map.

Keir Clarke from Google Maps Mania wrote a very comprehensive post on this latest service, outlining the new approach adopted by Google to ranking businesses. And Kevin Benedict from Mobile Strategies for Businesses has just published a related story on barcode scanners on mobile phones. Very interesting reading.

Wednesday, December 9, 2009

Google introduces Aerial View

Just recently I presented a new Australian entrant into the online mapping game: NearMap, with a very impressive application for viewing high resolution, multi-directional aerial imagery. Today Google announced the release of an Aerial View option for the Google Maps API that allows viewing objects on the ground from four different directions (actually five, if you count the top-down satellite image as well). Current coverage includes only San Jose and San Diego in California, USA, but more locations will be added in the coming months.

Google ensured that the new functionality also works with existing overlays, like the hybrid streets layer, and with other Google Maps services like driving directions. As Google engineers put it, “…this is a result of a lot of code and computing power that reprojects the imagery to make it easy to overlay data on the map given lat/lon locations like in any other Google Map type. The result is a great user experience together with easy display of data on the map.”

Microsoft was the first to introduce the concept of multi-directional aerial imagery in online mapping applications with its Bing Maps Bird’s Eye view option. And Bing Maps already has quite good coverage of Australian capital cities.

First spotted on: Google Maps Mania

Tuesday, December 8, 2009

Map of the future

Ordnance Survey, the UK national mapping agency, has been trialling the use of lasers to create a detailed three-dimensional map of Bournemouth town centre. The map was generated from a point cloud of 700 million individual points as part of a trial spanning three years, with every square metre of Bournemouth captured using a combination of land-based and aerial surveys with high-accuracy lasers.

Glen Hart, Head of Research at Ordnance Survey, comments: “Three-dimensional maps in themselves aren't new, but what we've achieved in Bournemouth is a level of accuracy and detail that's never been done before. This technology could change the way we map the country, but also have a massive impact on things like personal navigation.”

Ordnance Survey says that it will be continuing with trials to help perfect the technology, but expects detailed mapping in three dimensions to be a reality within the next five years. On a small scale, similar maps can easily be created by anyone with digital photography and some help from Photosynth and Python scripts.

After: Ordnance Survey News Release

Monday, November 30, 2009

Mapping crime statistics UK style

Just to complement my latest post on Australian mashups with crime statistics, here is a quick look at how the topic is dealt with in the UK – on the official map of crimes and antisocial behaviour from the National Policing Improvement Agency. The map was launched with great fanfare by the Home Office Minister on 20th October, 2009.

The application shows monthly and quarterly crime statistics for the last couple of years (by type) for policing regions in England and Wales. Crime levels in neighbourhoods within the policing regions are shown on a Bing Map as shaded thematic overlays. Users can compare statistics for a selected area with other locations, as well as download the data as CSV files.

Due to high demand the site went down frequently within the first 24 hours after launch, but performance is now quite reasonable. However, the developers did not put much effort into generating the neighbourhood boundaries, as they do not align well with each other. There may also be some underlying issues with the data collection methodology – a UK-based colleague of mine pointed out that an area in his neighbourhood with no population at all (ie. a local park) is shown on the map as having the highest crime rate. This is of some concern, since it may potentially influence purchase decisions and the demand for properties in the area. What is traditionally seen as a very desirable feature (ie. parkland) is now flagged as a hive of criminal activity. It highlights the perils of using statistics indiscriminately.

I believe that the Australian developers participating in MashupAustralia tackled the technical and presentation challenges for an equivalent set of crime statistics pretty well, and to a very high standard.

Australia’s love for private space

A news item making the rounds in the media this morning is that Australians have the largest houses in the entire world (statistically speaking). Research commissioned by CommSec and conducted by the Australian Bureau of Statistics concluded that the average floor area of new dwellings in Australia hit a record 214.6 square metres in the last financial year. In the US the average size of a residential dwelling is only 201.5 square metres, which still compares favourably to Denmark, which has Europe’s biggest dwellings with an average floor area of 137 sq m, Greece (126 sq m) and the Netherlands (115.5 sq m). Homes in the UK are the smallest in Europe at 76 sq m.

Australia's largest dwellings were built in Victoria, with an average 224.5 sq m floor area, followed by Western Australia, Queensland, the Northern Territory, NSW, Tasmania, South Australia and the ACT. However, the biggest houses are in NSW, where the average new house built in 2008-09 was 262.9 sq m, and in Queensland (253 sq m). We love the space, but does it make us happier? Just think about the cleaning…

Sunday, November 29, 2009

MashupAustralia highlights Pt 4

One dataset that generated quite a lot of interest amongst developers was New South Wales Crimes by offence type, month and Local Government Area (1995-2008). It is quite a comprehensive and complex set of information, and developers took a range of different approaches to presenting it in their applications. All in all, there were 7 applications built with crime information as the main theme, and several more where crime statistics were used as complementary information. Here is a review of the most interesting submissions.

NSW Crime Explorer presents statistics in tables, graphs and on a Google Map as thematic overlays (2008 data only). The tables are very well laid out and very easy to read, despite containing a very comprehensive set of information. I particularly like the little graphs in each table row depicting trends over the 10-year period. There are also links to line graphs with monthly stats for each crime type (generated with Google Visualisation tools). A drop-down list of all Local Government Areas enables easy navigation between locations of interest. The map is very basic, but a selection of overlays for each crime type clearly shows where offences were committed in 2008. The author has chosen 15 categories for data classification, which slightly blurs the picture of which areas are crime hotspots. Overall a very comprehensive presentation.

CrimeFinder is a mashup created with Silverlight and Bing Maps. A thematic map shades Local Government Areas according to crime rate or absolute numbers of crimes committed. Different crime types can be selected from a drop-down list. A mouse-click on a particular area brings up a line graph showing comparative crime trends for NSW and the selected LGA. A mouse-over highlights a particular boundary and brings up summary information for the area. Slider filters allow setting start and end dates for the data presented on the map, as well as adjusting the transparency of the overlays. Redraw of LGA boundaries on zooming and panning is very smooth, which is quite an achievement considering it is all vector data.

NSW Crime Map demonstrates yet another approach to presenting crime statistics with Google Map and Google Visualisation tools. The number of crimes committed in major NSW regions in any given month is represented as dots on the map, with dot size proportional to the number of offences committed. Different crime types are selectable from a list next to the map, and date selection is done with a slider tool. A dynamic graph under the map displays monthly counts of incidents between 1995 and 2008.

How Safe Is Your Suburb is an application created with commercial software integrated with Google Map and Flash graphics. Crime data is presented in four different ways: based on geographic distribution (thematic map with data table), as a cross tabulation of offence by type and year (pie chart and data table), as a cross tabulation of year of the offence and type by Local Government Area (with line graph and data tables) and as summary statistics highlighting the most dangerous regions in NSW.

NSW Crime is a very simple mashup presenting crime data in two columns and on a Google Map, with location points depicting user-selected Local Government Areas. The first column contains a list of LGAs and counts of all crimes committed in those areas. The second column lists various types of offences and counts for a selected LGA. There is also an option to specify a time range for calculating statistics. A simple approach, yet one that shows a wealth of information.

Tuesday, November 24, 2009

Google to index data from mashups

Last week Google announced its plans to log information from mashups that use the JavaScript Maps API v2. The information will then be used within the main Google Maps site. The basic premise is similar to “My Maps”, where user-generated content can be searched and displayed on a map alongside other options (eg. locations, businesses or real estate).

Details are scarce, but it appears Google will index only the location and content of the markers and/or infowindows displayed on hundreds of thousands of mashups. There is an option to opt out of this arrangement, and Google has indicated it will apply the following rules:

  1. We only index data from maps that have been viewed by many unique users. For example, maps only viewed by you and a few friends won't be indexed.
  2. If your page is protected by robots.txt, we will not index your content.
  3. You can opt-out of the logging by specifying "indexing=false" when loading the API.
  4. If you are a Maps API Premier customer, we will not index data from your maps, unless you opt into indexing by passing in "indexing=true".

The exact day the new arrangement takes effect will be announced in due course.
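Going by rule 3 above, the opt-out would presumably be a parameter added when loading the API. A hedged sketch only – the exact placement of the parameter in the v2 loader URL is my assumption from Google’s announcement, not a confirmed recipe:

```html
<!-- Hypothetical opt-out: append indexing=false to the Maps API v2 loader URL -->
<script type="text/javascript"
        src="http://maps.google.com/maps?file=api&amp;v=2&amp;key=YOUR_API_KEY&amp;indexing=false">
</script>
```

Premier customers would, per rule 4, instead pass `indexing=true` to opt in.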

First spotted on Google Maps Mania

Monday, November 23, 2009

Free data - a sign of the times…

It’s official. Starting from April 2010, UK citizens and the rest of the world will be able to openly access maps from Ordnance Survey, as well as a plethora of interpretive geographical data such as crime, health and education statistics by postcode. After many years of earning significant revenue from a successful commercial model of licensing government geographical information to value-added resellers, the UK government has decided to change its approach and make the information available online for free.

Ordnance Survey’s monopoly on GIS data in the UK will end, although some may argue that its position was already heavily undermined by the success of the OpenStreetMap project – a community-driven initiative to provide free, high resolution vector data in competition with OS. Smaller players and website developers will be the winners, as this opens up new opportunities for mashing up the information into myriads of specialised online applications.

Ordnance Survey is a £116 million a year enterprise, and part of this revenue will now be forgone for “a wider good”. The burden of maintaining high quality geographic information will have to shift to the UK Government (ultimately taxpayers), as the activity will no longer be funded by end users. However, the argument is that the overall commercial benefit to the UK economy will be greater than the lost revenue stream.

There is a similar attempt to liberate government data in Australia with the Government 2.0 Taskforce initiative. It is not the first such attempt – the Spatial Data Access and Pricing Policy from 2001 is still in place. It allowed free access to quite a range of geographic information in the past, but it is rather difficult to assess its economic benefits.

It is quite obvious from past experience that just releasing the data will not lead to any tangible benefits. A framework must be put in place at the same time for managing and improving that data (ie. either big money from the government or big crowdsourcing initiatives, as in the case of the OpenStreetMap project). Otherwise there is a danger that we will end up in a big mess… with everyone maintaining their own sets of data (hence multiplying the effort) and creating their own sets of problems (one only needs to look at the issues with postcode data in Australia to understand what this can lead to…).

Tuesday, November 17, 2009

NearMap Goes Live!

When I was writing my recent post about Stuart Nixon and his latest project: NearMap, the site was not yet operational. But today, to my great excitement, I got an anonymous tip that NearMap is up and running! I could not resist and took it immediately for a test drive.

My first impression is that NearMap looks… so familiar! Its creators went to great lengths to ensure NearMap has the "look and feel" of other online maps, like Google or Bing. This is quite appropriate, because potential users will be immediately familiar with how it works… Yet there are also plenty of unique features. One is a sliding bar across the top of the map which allows scrolling through imagery acquired at different points in time; Perth has the best selection. Currently high resolution coverage is limited to major Australian cities (Adelaide, Brisbane, Sydney, Melbourne and Perth), but there is enough to fully demonstrate the quality of imagery and the functionality of NearMap. A few Landsat mosaics and the Blue Marble monthly mosaic series from 2004 cover the rest of the world for now.

Another unique feature is the "Terrain" viewing mode, which depicts a terrain model derived from high resolution imagery (including buildings!). And when you are in PhotoMap viewing mode and zoom in close enough, a "Multiview" option is activated which allows observing objects on the ground from four different directions. This feature is similar to Bing's Bird's-eye view mode.

NearMap’s street overlay comes from OpenStreetMap. It is worth noting that, unlike other suppliers, NearMap actually encourages users to utilise its imagery to improve the quality of this community-generated and maintained vector dataset. There is an "Edit" button on the map that takes users directly to the OpenStreetMap site. NearMap can be embedded in other websites with an iframe HTML element (as in this blog, for example), and specific locations can be bookmarked, shared and/or linked to with a unique URL address.
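For reference, embedding boils down to a snippet along these lines. The URL parameters here are my assumption for illustration only – the real ones come from NearMap’s own share/link feature:

```html
<!-- Hypothetical NearMap embed; the ll/z parameters are assumed, not confirmed -->
<iframe src="http://www.nearmap.com/?ll=-31.95,115.86&amp;z=18"
        width="500" height="400" frameborder="0"></iframe>
```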

I am very impressed with NearMap. The level of detail in the imagery is amazing, and the directional views, although not perfectly stitched, are very realistic indeed. You can literally "peek over the fence" and see what your neighbour is up to! I just hope it won't become such a big issue that it necessitates degrading the resolution of publicly viewable imagery. The application is very responsive, and there is definitely great potential for NearMap to carve out a big niche in the online mapping market, in direct competition with Google and Bing maps.

I will share with you how to view NearMap images in true 3D but for that you will have to come back next time. :-)

Google enters mobile GPS Navigation Market

So far Google has played only on the fringes of the mobile GPS navigation market, with flat Google Maps for mobile integrated with driving directions and live traffic information. But the situation is about to change with the release of Google Map Navigation (beta). This new application comes with everything you'd expect to find in a fully featured GPS navigation system: maps in 3D perspective, turn-by-turn voice guidance and automatic routing. But there is more. Google took advantage of the wide range of products it already supplies to the public and packaged them together into a potentially very attractive product.

In particular, users will always get the latest maps, and there is no need for bulky downloads since maps come directly from Google's servers. The application supports simple plain-English search, voice search and search for POI along the route. There is live traffic data (an indicator light in the corner of the screen alerts you to current traffic conditions along your route), as well as high resolution satellite imagery and Street View options that show exactly what to expect at the destination.

There is an expectation that Google will release an API for Map Navigation to allow developers to build a myriad of customised applications and help forthcoming Android phones better compete with the Apple iPhone. Google Map Navigation is free; users only need to cover Internet connection fees to use it. For now, Google Map Navigation runs only on Android 2.0 enabled phones, which are in very limited supply. In fact, there is only one such handset on the market – the Droid from Verizon – on sale since 6th November, and only in the U.S.

Originally spotted on: Google Maps Mania

Saturday, November 14, 2009

Pixel worth 1000 bits

It took a rather long time for raster imagery (maps, but also aerial and satellite photography) to gain wide acceptance in online GIS applications. Raster imagery used to be just a backdrop to “real data” (ie. vectors), as substantial file sizes (often in obscure proprietary formats) and generally low resolution made it difficult to incorporate imagery into applications and rather impractical to use for more than just the initial “splashback”. The fact that remote sensing, aerial surveys and GIS were traditionally seen as totally separate disciplines did not help drive the amalgamation of various formats into a single data management and online visualisation solution.

It would be fair to say that the release of Google’s online mapping technology was the catalyst that initiated change. These days raster imagery dominates online mapping applications, especially those with high traffic such as Google Maps, Bing Maps and Yahoo Maps. The ability to show ground features in detail increased significantly with advances in image capture technologies, reducing the reliance on vectors to depict such information. The ingenuity of using small image tiles to overcome the issues with raster file sizes and to improve the efficiency of online distribution made it much easier to use raster rather than vector formats for presenting information.

[example of raster map: Topo 250K by Geoscience Australia]

Just to clarify: in traditional online GIS applications, both raster images and vectors are presented on a computer screen as images (eg. gif, jpg, png). The difference is in how those images are created. In the case of vector data (and some raster data with pixel attribute information), images are generated from the data on the fly by a map server (caching techniques are now available for static information to reduce the load), and a “link” is maintained between what is presented on the image and the vector data. So if the user clicks on a line representing, say, a road, attribute information specific to that segment of the line can be returned as a query result (eg. name, width, type). In the case of raster imagery without attribute information for each pixel, the image is pre-generated in advance (eg. a raster topographic map). With this approach, dynamic referencing to attribute information in the source dataset is lost.

[example of map image generated from vector data: Topo 250K by Geoscience Australia]

It is technically possible to use a mix of vector data and raster imagery in a single application to get the best of both approaches (ie. high resolution imagery, or a nice pre-designed cartographic representation of vector data, delivered fast as tiled images yet still referenced to attribute information in the original dataset), but I have not yet seen this implemented as general practice. Here is another idea for Google – add a “get feature” service to your maps! It would work exactly like the “reverse geocoding” service, but rather than returning address information for a point (based on a cadastral vector dataset) it could also return information on other map features (like roads, parks, buildings and other POI). Creating a “link” to the source vector data could also open up opportunities for all sorts of spatial query services: “distance” and “area”, but also more complex ones like “select within” and “adjacent”.
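To make the idea concrete, here is a minimal sketch of what such a hypothetical “get feature” lookup might look like on the client side. Everything here – the feature list, the bounding boxes, the function name – is invented for illustration; a real service would run a proper spatial query against the source vector dataset on the server.

```javascript
// Hypothetical "get feature" lookup: given a clicked lat/lon, return the
// attributes of the vector feature whose bounding box contains that point.
// All feature data below is mock data, invented for illustration only.
var features = [
  { name: "Hyde Park", type: "park",
    bbox: { south: -33.8750, west: 151.2090, north: -33.8690, east: 151.2130 } },
  { name: "Elizabeth Street", type: "road",
    bbox: { south: -33.8790, west: 151.2080, north: -33.8650, east: 151.2100 } }
];

function getFeature(lat, lon) {
  for (var i = 0; i < features.length; i++) {
    var b = features[i].bbox;
    if (lat >= b.south && lat <= b.north && lon >= b.west && lon <= b.east) {
      return features[i]; // a real service would do a point-in-polygon test
    }
  }
  return null; // no feature at this location
}

var hit = getFeature(-33.8720, 151.2110);
// hit.name → "Hyde Park"
```

Wired to a map’s click event, the returned attributes could populate an infowindow, exactly as a reverse geocoder populates an address.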

With the exception of a handful of technologies developed specifically for true vector and imagery data streaming, the overwhelming majority of online mapping applications are not capable of efficiently processing vector data into complex images – on the fly and in the volumes required for today’s Internet – without massive hardware infrastructure behind them.

Currently, reliance on true vector data in browsers is limited to presenting a small number of points, polygons or 3D objects, or to highly specialised applications. There is support for true vector data in a browser environment via Java, Flash or Silverlight, but making it work efficiently requires sophisticated solutions to handle on-the-fly vector generalisation and local data caching (as mentioned above, only a handful of companies have managed to do it, and they are not the industry leaders!). Although I should mention that I am very impressed with an online application I have seen recently, developed in Silverlight and nicely showing quite a volume of vector data – I will have to investigate in more detail!

Applications such as Google Maps make use of the browser’s support for vector data in VML/SVG format, but overall browser processing capabilities are very limited. Therefore, although Google Maps will accept vector data in KML format, if the file is too big (100KB+) Google will convert it to image tiles to speed up rendering of the information in the browser. This is appropriate for presentation of static data, but will not work with dynamic information (eg. thematic maps), because once vectors are converted to images on the initial map load they cannot be changed with a script. If the same amount of data is imported into a Google Maps application and rendered as true vectors (eg. with the GPolygon() function), loading and drawing of the information on the map is rather slow.

There is a new concept emerging for handling spatial information, regardless of whether the source is raster imagery or vector data: the concept of a spatial grid. Traditionally, grids were used with Digital Elevation Model (DEM) data. Later they were also found to be applicable in the field of spatial analysis. Now the grid concept can also be applied to referencing a myriad of attributes to a specific location (cell) on the surface of the Earth – making all the data hierarchical and analysis-ready. If I understand correctly, organisations such as the USGS are now planning to start releasing information as referenced grid cells rather than traditional imagery, although there are still some challenges in defining those grids, indexing the data and, of course, in storage capabilities for high resolution datasets.

The theory and technologies developed for handling imagery will find use in implementing this new approach. After all, image pixels are a form of grid. Graphic formats offer more efficient storage and compression capabilities than traditional spatial databases, and the emergence of graphics processing units (GPUs) offers great hope for very fast analytical capabilities – a new and exciting era in spatial data management, processing and delivery!

Friday, November 13, 2009

MashupAustralia highlights Pt 3

It looks like this will be quite a long series… The competition has just closed with a total of 78 entries (up from 40 just yesterday) – the overwhelming majority of them map-based applications. And in the end only one major GIS company decided to participate, which is quite a disappointment. It's not the little guys and gals who will make a difference. It is the big end of town that has the capacity and resources to manage the data on an ongoing basis, and that can bring the technology and know-how to enable the opening up of government vaults of information. In the end, there was probably not enough "fresh carrot" in this contest to bring them aboard. So, let's review those institutional mashups first. Yesterday I finished with entries relating to bushfire information, so I will continue with this theme.

Firelocator: a map-based application showing the latest fire-related information (satellite hotspots, Country Fire Authority and Rural Fire Service RSS feeds), some basic population stats and photos from Flickr – built with Silverlight and Bing Maps, and surely a full GIS at the back end. The application was submitted by Pitney Bowes (formerly MapInfo Australia). Firelocator has a few nice features (like interactive filters for hotspots, based on "confidence" value, or for fire incidents based on type and status). However, there are also a few limitations, which is rather disappointing considering that Firelocator carries the "TechAmerica Innovator Award" logo. In particular, the administrative boundaries overlay is not projected correctly – the further you zoom out, the greater the discrepancy. Also, I never liked Bing's onmouseover auto pop-up feature, but in the case of this application it is frankly quite annoying, as it makes map panning very difficult when the map is crowded with markers. Firelocator didn't work for me in Firefox at all (unlike the other Silverlight application I reviewed yesterday).

AuScope Portal: another institutional entry, featuring information on Australia's mineral resources, mines and processing centres – built with Google Map and using OGC web services (WMS and WFS). It feels like a fully featured online GIS, but I found it a bit cumbersome to use (I had real difficulties displaying and removing layers, and especially managing the display order of layers). A great collection of geology-related information though!

LocateMe: an application submitted by the Western Australia Landgate crew. It presents demographic profiles for areas of interest and lists government services available in the area. It uses Google Map as a background layer but is built with OpenLayers – it certainly does not behave like a real Google Map. Once again, the application is styled as a fully featured online GIS with traditional navigation tools, which makes it rather awkward to use (ie. no double-click to zoom, no zoom slider tool; it requires selecting buttons to either zoom in or out or pan). It could have been quite an interesting application if it weren't for so many layers being restricted with a password. And I couldn't get the demographic profiles to load at all.

EasyData: new website released by the Department of Trade and Economic Development of South Australia that provides information on sixty social, environmental and economic indicators, and allows regional comparisons. It has a basic Flash map to enable users to navigate to area of interest and lots of graphs of statistical and economic information. It may be a bit too heavy for lower end computers though...

Data Aggregator: a very slick and well-built Google Maps application that allows displaying almost all datasets released for the MashupAustralia competition, and more. Easy to navigate, and certainly much easier to use than any of the above-mentioned mapping mashups - which is quite surprising since the application uses a hierarchical, expandable layer selector very similar to the other applications mentioned today. A unique feature of Data Aggregator is a couple of custom controls for changing display options for markers and vector overlays.

To be continued…

Location specific time

Dynamically updated data, like weather observations or various RSS feeds, is published with a time stamp. It is customary to express that time as Greenwich Mean Time (GMT), a universal reference so anyone, in any part of the world, can determine exactly "when this thing happened". It is a very handy concept but, in order to make the most of the information, some recalculations are often required. There are sophisticated server-side scripts to deal with time but JavaScript also has a few handy functions to help with the task.

Before I move on to examples, just a quick reference to one more important concept: the epoch. It is defined as time kept internally by a computer system, expressed as the number of time units that have elapsed since a specified date. The Unix epoch, used by UNIX, Linux and other UNIX-like systems, Mac OS X, as well as the Java, JavaScript, C and PHP programming languages, started on January 1, 1970 (UTC); Unix time proper counts seconds, but JavaScript works in milliseconds. The epoch is what makes time related calculations so simple…
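
To see the epoch in action, note that millisecond zero maps straight back to the start date (a tiny sketch; output shown in UTC):

```javascript
// Epoch millisecond 0 corresponds to 1 January 1970, 00:00:00 UTC
var epochStart = new Date(0);
// epochStart.toUTCString() → "Thu, 01 Jan 1970 00:00:00 GMT"
```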

You can find many examples of various JavaScript time functions by simply “googling” the term; however, there is not much practical information on how to apply them. Here are a few transformations I found very useful:

Current date and time
The Date() constructor, in its simplest form, will return the current date and time as set on the user's computer.

var localtime= new Date();
// returns eg. Mon Nov 9 18:37:46 UTC+1100 2009

I often use the “combination” shown below to time stamp dynamic data (eg. frequently updated RSS feeds) in order to prevent caching, so the information cannot be recalled from the local browser cache but is always fetched directly from the source:

var o= new Date();
var p= o.getTime();

Then I dynamically append an extra parameter to the source data URL:

var url = "somepage.html?p=" + p;

It is simpler than trying to get all “no-cache” headers and metatags to work consistently in all browsers.
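
Putting the pieces together, the whole cache-busting trick fits in one small helper; a sketch (the page name "somepage.html" is just a placeholder):

```javascript
// Append the current epoch time (in milliseconds) to a URL so that
// each request looks unique to the browser and bypasses its cache.
function cacheBustedUrl(baseUrl) {
  var p = new Date().getTime(); // milliseconds since 1 Jan 1970
  // use "&" if the URL already carries a query string
  var sep = baseUrl.indexOf("?") === -1 ? "?" : "&";
  return baseUrl + sep + "p=" + p;
}

var url = cacheBustedUrl("somepage.html");
// eg. "somepage.html?p=1258352266000"
```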

Date and time in the past
By passing an argument, the same Date() constructor can be used to convert a date string into a proper Date object (ie. any date, expressed in “local time”):

var localtime = new Date("Fri, 06 Nov 2009 04:56:00");
// returns Fri Nov 6 04:56:00 UTC+1100 2009

Past date and time in GMT
Any past date expressed in local time can be easily converted to GMT.

var localtime = new Date("Fri, 06 Nov 2009 04:56:00");

var gmt=localtime.toUTCString();
// returns: Thu, 5 Nov 2009 17:56:00 UTC

GMT to local time
Reverse conversion from GMT to local time is also very easy. If your input time is expressed in GMT and you need the local equivalent, just include the letters “GMT” in the date string to flag that the input is not local time!

var localtime = new Date("Fri, 06 Nov 2009 04:56:00 GMT");
// or equivalently:
var localtime = new Date("2009/11/06 04:56:00 GMT");
// returns Fri Nov 6 15:56:00 UTC+1100 2009

Time in milliseconds (epoch)
Epoch time is a very handy format for storing time information in a database and for all sorts of time-related calculations.

var fulldate= new Date(); //returns local time as set on the user computer
var epoch=fulldate.getTime(); // converted to milliseconds
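
Since both values are plain millisecond counts, time arithmetic reduces to simple subtraction; a small sketch (the two timestamps are made up):

```javascript
// Elapsed time between two dates, computed on epoch milliseconds
var start = new Date("Fri, 06 Nov 2009 04:56:00");
var end = new Date("Fri, 06 Nov 2009 06:26:00");
var diffMs = end.getTime() - start.getTime();
var diffMinutes = diffMs / (1000 * 60); // 90 minutes
```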

Unfortunately, I could not find any information on which date formats are acceptable. For example, the frequently used ISO 8601 format is not recognised by the JavaScript Date() constructor: “2009-11-06T03:42:40Z” has to be converted to a format that JavaScript can recognise before use.
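
One workaround is to split the ISO 8601 string manually and feed the parts to Date.UTC() (note that months count from zero); a sketch, assuming the timestamp is always in UTC with a trailing “Z”:

```javascript
// Convert an ISO 8601 UTC timestamp (eg. "2009-11-06T03:42:40Z")
// into a JavaScript Date without relying on Date() string parsing.
function parseIsoUtc(s) {
  var m = s.match(/^(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})Z$/);
  if (!m) return null; // not in the expected format
  // Date.UTC() returns epoch milliseconds; month is zero-based
  return new Date(Date.UTC(+m[1], +m[2] - 1, +m[3], +m[4], +m[5], +m[6]));
}

var d = parseIsoUtc("2009-11-06T03:42:40Z");
// d.toUTCString() → "Fri, 06 Nov 2009 03:42:40 GMT"
```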

Thursday, November 12, 2009

MashupAustralia highlights Pt 2

These are the last days of the competition so the number of submissions is growing rapidly. Today, just a few more examples of the most interesting entries. Again, applications using maps to present information dominate, so I will focus specifically on those.

The first application I would like to mention is another creation in Silverlight and Bing: GeoDemo (from the author of the previously featured CrimeFinder). I must admit, I am very impressed with the vector handling capabilities of Bing Maps, even in the Firefox browser! And I like the concept of boundaries changing to more detailed ones as the user zooms in closer and closer. Installing Silverlight is really no trouble for individuals but unfortunately not an easy task for institutional users (especially those in the public service), which may explain the low popularity of this application. But there is a JavaScript version available as well and it also performs surprisingly fast (shown below).

There are also two new entries dealing with bushfires. Victoria: Fire Ready depicts locations of Fire Brigades and Police Stations in the vicinity of a chosen address. Distance circles of 10, 25 and 45 km from the address are marked to let users discern how quickly aid could arrive in case of emergency. The application also plots incidents reported by the Victorian Country Fire Authority as well as current weather and wind conditions from the Bureau of Meteorology. The source information is very similar to what I use in my Bushfire Incidents map but the concept and presentation are totally different. And much more popular, judging by the number of votes!

Firemash is an attempt to match official announcements from the NSW Rural Fire Service with incidents reported by the community via Twitter. Users can plot the location of their houses on the map and, if any incidents are reported in the vicinity, the application will retweet the information to alert them. An interesting concept, although its usefulness will be difficult to demonstrate at the moment because there are no fires reported in the whole of NSW!

I will continue with more tomorrow…

Monday, November 9, 2009

New entries in mashup contest

There are a few new entries in the MashupAustralia contest. With only a handful of exceptions they are all built with at least simple mapping capabilities (you guessed it, overwhelmingly Google Maps!). Here is a brief description of the most interesting ones, in no particular order:

Fridgemate: simple concept of creating a "fridge list" and a map of places of interest in a local area. Currently the most popular mashup, with top score and the largest number of votes.

CrimeFinder: a very impressive thematic mapping application built with Silverlight and Bing Maps. Nice user interface. What interests me the most is how the author managed to incorporate Local Government boundaries in vector format. I know, I know, Silverlight is a vector format (like Flash) but there appears to be some support for vector generalisation as well, since boundaries are redrawn with a different level of detail depending on the map zoom extents.

Suburban Trends: also very impressive thematic mapping application with ABS population statistics and various crime related information. Great use of dynamically loaded vector polygons of suburbs and Google Chart API.

NSW Crime Explorer: another application exploring crime statistics in tables, graphs and thematic maps (it appears to use static KMLs though).

ABS In Motion: statistics presented in interactive Flash charts (not for low-end computers).

Demographic Drapes: an OpenLayers application using Google Maps as a backdrop, plus a number of thematic layers with population statistics and various administrative boundaries. More like a traditional online GIS.

Tonight I have also submitted my last application for this contest: revised Bushfire Incidents map. Just in time for the new bushfire season. I added extra controls to show additional information on the map:
- wind conditions (live from Bureau of Meteorology Web Feature Service)
- RSS feeds from Victorian CFA and NSW RFS (geocoded on the fly where possible)
- Locations of Fire Brigades in Victoria (over 1,200 points), and
- Weather widget and YouTube video player.

This is the last week of the competition. Entries close at 4pm, Friday 13 November. I am still hoping to see submissions from the commercial end of town but maybe they are all too busy chasing real projects…

GovHack winners announced

News from the recent mashup event held in Canberra under the GovHack banner has finally started to filter through. The overall winner is Lobbyclue, a Flash-based application attempting to visualise the complex tangle of government suppliers and contracts with individual Departments. There is also a mapping component, built with OpenLayers and the OpenStreetMap API, but it is very slow to load.

Although not specifically mentioned as such, the following creations appear to be the runners-up in the contest:

Know where you live: stylish snapshots of population statistics from ABS and some crime statistics (for postcodes in Sydney area only).

What the Federal Government Does: tag cloud and network diagram of different functions of the government and its various departments (concept only).

Rate A Loo: shows the location of public toilets in the ACT - along with some descriptive data, and allows users to rate the condition of public toilets and comment.

Overall, there are 13 projects listed on GovHack. Unfortunately, I could not get some of them to work, so I reserve further comment, although there were a few potentially interesting submissions. You can find links to most of them on the GovHack wiki page.

I was hoping to give you a first-hand account of what was happening at GovHack but it was not meant to be. When I turned up for the closing ceremony, at the nominated place and the nominated time, there was only me, the caterers, buckets of drinks and plenty of food…. I figured there must have been some last-minute changes to the schedule so I went back to the ANU campus, only to find a locked door and a note stuck on the wall: “See you next year”. I managed to communicate with a person on the other side of the door that “something is still happening upstairs” but for me there was nothing else to do but leave…

The disappointing bit was - no, not that I missed out on the closing ceremony but - the lack of participation from commercial operators and larger companies. The event was dominated by amateurs. Don’t get me wrong, it is good that individuals are able to present fresh ideas on the potential use of government information, but a real breakthrough cannot be achieved without the active involvement of IT/GIS industry leaders. However, to secure that participation the organisers would have to provide some tangible commercial benefits – like publicity, contacts and/or the prospect of big money for specific projects… None of these appears to be available for now.

The organisers are planning a GovHack Canberra Encore on November 11 2009, 4:00pm - 5:00pm, to show several submitted applications to a larger audience, so I will have another chance to see it first hand.

Thursday, October 22, 2009

Free Address Validation Tool

Today I am announcing the release of another freebie – the Address Validation Tool. It is an online application for geocoding and validating address information in a semi-automated fashion. It is built with the Google Maps API and the Google geocoding engine and is suitable for handling small to medium volumes of data.

Geocoded geographic coordinates of points of interest can be adjusted manually by repositioning location marker on the map (latitude and longitude will be updated from corresponding map coordinates). Address and accuracy code details can also be edited manually before saving the record. All saved records can be processed into CSV, KML or GeoRSS output format on completion. Individual records in input data are identified with a sequence number which is maintained throughout the entire process to facilitate easy reconciliation of output file with original information.

Geocoded information is classified according to accuracy, eg. “address level accuracy”, “street level accuracy”, “town/ city level accuracy” etc. Counts of records in each accuracy level are maintained during the process and all saved points can be previewed on the map at any time.

Address validation is a 3 step process:

Step 1. Paste the list of addresses or locations to be geocoded and validated into the text area in the “Input” tab and click the “Press to Start/ Reset!” button to initiate the process.

Step 2. Edit geocoded information in the “Edit” tab and save the results (one record at a time). The “Save” button saves the current record and geocodes the next one from the input list. Any text and accuracy code edits will be saved as well. Use the “Next” button to skip to the next record on the input list without saving (a skipped record will not be included in the final output file).

Step 3. Generate output from the saved records to reconcile with the original information. CSV is the most obvious choice for updating the original dataset. Although the KML and GeoRSS outputs generated by the tool can be used with Google Maps or Google Earth without further edits, it is recommended that you update the content of at least the "title" and "description" elements to improve presentation of the information.

Useful tips:
  • Include “country” field in the input data to improve geocoding accuracy if you are getting too many results from incorrect parts of the globe.
  • You have a chance to preview saved locations and make final positional adjustments by selecting any of the options from the “Show saved records by accuracy:” pull-down menu in the “Edit” tab. Please note, all markers displayed on the map can be moved; however, any changes in latitude and longitude coordinates will be saved automatically and cannot be undone.
  • Composition of address detail will differ depending on geocoding accuracy level. For ease of further processing, avoid mixing various accuracy levels in the final output file if you intend to split address details into components.
  • Geocoded address information is written into CSV file as a single text field but it can be split further using spreadsheet's “Data/Text to Column” function if you require individual address components as separate fields.

The Address Validation Tool is a replacement for my earlier application - Bulk Geocoder - which was also built with the Google geocoding engine. Since Google's terms of use changed earlier this year, it is now prohibited to run fully automated batch geocoding using the free Google service. To comply with those restrictions the new tool geocodes only one point at a time. And if I interpret the wording correctly, the geocoded information itself can only be used with Google applications.

I have submitted this application as my second entry in the MashupAustralia contest (the first one was Postcode Finder). I hope that it will be a handy resource to help improve spatial accuracy of data released for this competition and beyond. Any comments, feedback and suggestions greatly appreciated!

Friday, October 16, 2009

MashupAustralia contest update

The MashupAustralia contest I mentioned in my earlier post has been running for a week and a bit now. Only five entries so far (in descending order, from newest to oldest):

Your Victoria Online: Google Map based application to locate the nearest Emergency Services, Government Services, Schools and Public Internet etc.

Victorian Schools Locator: a Google Maps based application created with Map Maker; it shows the locations of over 2,000 schools in Victoria.

Broadband Locator: the site uses Google Maps and Street View to display address information - visitors can enter an address and the application will show what broadband services are available in their area.

Geocoded List of Medicare Office Locations: a geocoded ATOM feed of Medicare offices.

Postcode Finder: my first entry into the contest – with postcode and suburb boundaries and Victorian Police Stations as POI. I am planning to add more POI eventually. Unfortunately, the data supplied for the contest is all over the place and cannot just be “plugged in” without major rework (I mean, to show consistent information or reasonable spatial accuracy).

Judging by the number of visitors coming to my page from the site and the number of entries to date, the contest is not as widely embraced as many may have hoped, but these are still early days. Hopefully my blog can bring a bit of extra publicity for the contest. It is, after all, a very worthy cause.

The closing time for lodging entries into the contest has been extended to 4PM Friday, 13th November 2009, so there is plenty of time for building complex applications. There will also be a number of mashup events over the next few weeks which should bring plenty of exciting new entries.

I can already claim one “consolation prize” in this contest – being the first entrant into the competition! It does not come with any formal accolades nor a cheque but that will do me just fine. I am not really in contention for any prizes anyway. Just wait till you see what is cooking in garages around the country and what the master chefs - the cream of the Australian GIS industry - will soon start to serve!

Wednesday, October 14, 2009

Mapping Stimulus Projects in Oz

Last month, in my post on Google tools for the public sector, I provided a few examples of how Australian government departments and organisations are using Google Maps to present various information. Today another interesting example: a map showing where, and on what projects, the billions of dollars committed by the government in the economic stimulus package are being spent. Information is available for six expenditure categories: education, community infrastructure, road and rail, housing, insulation and solar. Zoom to your local area to find out what is actually happening in your neighbourhood with the allocated money.

Some of the information available on the Nation Building - Economic Stimulus Plan site has also been released under the Creative Commons - Attribution 2.5 Australia (CC-BY) licence and can be freely used for various mashups and analyses. In particular, you can access information on all current community infrastructure and road and rail projects across Australia. And if you have a great idea on how to use this data you can enter the MashupAustralia contest for great prizes. It is run by the Government 2.0 Taskforce for a limited time.

Tuesday, October 13, 2009

Ed Parsons on Spatial Data Infrastructure

I recently attended Surveying & Spatial Sciences Institute Biennial International Conference in Adelaide and was privileged to see Ed Parsons’ presentation. For those who don’t know Ed, his bio describes him as “… the Geospatial Technologist of Google, with responsibility for evangelising Google's mission to organise the world's information using geography, and tools including Google Earth, Google Maps and Google Maps for Mobile.” He delivered a very enlightening and truly evangelistic presentation outlining his views on the best approach to building Spatial Data Infrastructures. The following paragraphs summarise the key, thought provoking points from the presentation – with some comments from my perspective.

The essence of Ed’s position is that the currently favoured approach of building highly structured, complex-to-the-n-th-degree “digital libraries” to manage spatial information is very inefficient and simply does not work. There is a much better framework to use – the web – which is readily available and can deliver exactly what the community needs, in a gradual and evolutionary fashion rather than as a pre-designed and rigid solution.

I could quote many examples of failed or less-than-optimal SDI implementations in support of Ed’s views. There is no doubt that there are many problems with the current approach. New initiatives are continuously launched to overcome the limitations of previous attempts to catalogue collections of spatial information, and it is more than likely that none of the implementations is compatible with the others. The problem is that metadata standards are too complex and inflexible, and data cataloguing software is not intelligent enough to work with less-than-perfectly categorised information. I recently had first-hand experience of this. I tried to use approved metadata standards for my map catalogue project, hoping it would make the task easier and the application fully interoperable, but in the end I reverted to adding my own “interpretations and extensions” (proving, at least to myself, that a “one-size-fits-all” approach is almost impossible). I will not even mention the software issues…

Ed argued that most SDI initiatives are public sector driven and, since solution providers are primarily interested in “selling the product”, by default it all centres on the data management aspect of the projects. In other words, the focus is on producers rather than users, on Service Oriented Architecture (SOA) rather than on “discoverability” of relevant information. All in all, current SDI solutions are built on the classic concept of a library, where information about the data (metadata) is separated from the actual data. Exactly as in a local library, where you have an electronic or card-based catalogue with book titles and respective index numbers, and rows of shelves with books organised according to those catalogue index numbers. For small, static collections of spatial data this approach may work, but not in the truly digital age, where new datasets are produced in terabytes, with myriad versions (eg. temporal datasets), formats and derivations. And this is why most SDI initiatives do not deliver what is expected of them at the start of the project.

Ed made the point that it is much better to follow an evolutionary approach (similar to how the web developed over time) rather than the strict, “documentation driven” process of most current SDI projects. The simple reason is that you don’t have to understand everything up-front to build your SDI. The capabilities may evolve as needs expand, and you can adjust your “definitions” as you discover more and more about the data you deal with - in an evolutionary rather than prescriptive way. It is a very valid argument, since it is very, very hard to categorise data according to strict rules, especially if you cannot predict how the data will evolve over time.

[source: Ed Parsons, Google Geospatial Technologist]

The above table contrasts the two approaches. On one side you have traditional SDIs with strict OGC/ISO metadata standards and web portals with search functionality - all built on Service Oriented Architecture (SOA) principles, with SOAP (Simple Object Access Protocol) as the main conduit of information. Actually, the whole setup is much more complex as, in order to work properly, it requires a formalised “discovery” module - a registry that follows the Universal Description, Discovery and Integration (UDDI) protocol - and a “common language” for describing available services (that is, Web Service Description Language, or WSDL for short). And IF you can access the data (a big “if”, because most public access SDI projects do not go that far), it will most likely be in the “heavy duty” Geography Markup Language (GML) format (conceived over a decade ago but still mostly misunderstood by software vendors as well as potential users). No wonder that building an SDI on such complex principles poses a major challenge. And even in this day and age the performance of an SDI constructed this way may not be up to scratch, as it involves very inefficient processes (“live” multidimensional queries, multiple round trips of packets of data, etc).

On the other side you have the best of the web, developed in an evolutionary fashion over the last 15 years: unstructured text search capabilities delivered by Google and other search engines (dynamically indexed and heavily optimised for performance), simple yet efficient RESTful services (according to Ed Parsons, not many are choosing SOAP these days) and simpler, lighter data delivery formats like KML, GeoRSS or GeoJSON (which have a major advantage: their content can be indexed by search engines, making the datasets discoverable!). As this is a much simpler setup it is gaining widespread popularity amongst “lesser geeks”. The US government portal is the best example of where this approach is proving its worth.
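
To illustrate how lightweight these formats are, here is a minimal GeoJSON point feature expressed as a plain JavaScript object (the attribute values are made up for illustration):

```javascript
// A single GeoJSON point feature - the whole "record" is
// human-readable text that a search engine can index,
// metadata (properties) sitting right next to the geometry.
var feature = {
  "type": "Feature",
  "geometry": {
    "type": "Point",
    "coordinates": [149.13, -35.28] // lon, lat (roughly Canberra)
  },
  "properties": {
    "name": "Sample dataset location", // hypothetical attributes
    "updated": "2009-11-06T03:42:40Z"
  }
};
```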

The key lesson is: if you want to get it working, keep it simple and do not separate metadata from your data, so the information is easy to discover. And let the community of interest define what is important rather than prescribing a rigid solution upfront. The bottom line is that Google's strength is in making sense of the chaos of cyberspace, so it should be no surprise that Ed is advocating a similar approach to dealing with the chaos of spatial data. But can the solution really be so simple?

The key issue is that most of us, especially scientists, would like a definite answer when we search for the right information. That is: “There are 3 datasets matching your search criteria” rather than: “There are 30,352 datasets found, the first 100 closest matches are listed below…” (ie. the “Google way”). There is always that uncertainty: “Is there something better or more appropriate out there, or should I accept what Google is serving as the top search result?… What if I choose an incomplete or not the latest version of the dataset?”… So the need for a highly structured approach to classifying and managing spatial information is understandable, but it comes at a heavy cost (in both time and money) and in the end it can serve only the needs of a small and well-defined group of users. “The web” approach can certainly bring quick results and open up otherwise inaccessible stores of spatial information to the masses, but I doubt it can easily address the issue of “the most authoritative source” that is so important with spatial information. In the end, the optimal solution will probably be a hybrid of the two approaches, but one thing is certain: we will arrive at that optimal solution by evolution and not by design!

Saturday, October 3, 2009

Mashup Australia Contest

A few days ago the Australian Government 2.0 Taskforce announced an open invitation to any "able and willing body" to create mashups with nominated datasets from various Federal and State jurisdictions in Australia. It is a contest, so there will be prizes for the winning entries:
* $10,000 for Excellence in Mashing category
* $5,000 for Highly Commendable Mashups
* $2,500 for Notable Mashing Achievements
* $2,000 for the People’s Choice Mashup prize
* $2,000 for the Best Student entry
* $1,000 bonuses for the Transformation Challenge

Anyone in the world is eligible to enter but prizes will only be awarded to individuals from Australia or teams where at least one member has Australian credentials. The contest is open from 10am October 7 to 4pm November 6, 2009 (Australian Eastern Standard Time - GMT+11.00).

I will be entering at least two of my applications that have been running on my site for the last couple of years and are already used by quite a few people. These are: the bushfire incidents map (part of my larger Natural Hazards Monitor concept) and the Postcode Finder, with links to Australian demographic information from the Australian Bureau of Statistics. If you have an application that is suitable for presenting information nominated in the contest rules and need an Australian representative on the team, or would like to access some of the data from my site, I invite you to partner with me in this competition.