Monday, November 30, 2009

Mapping crime statistics UK style

Just to complement my latest post on Australian mashups with crime statistics, a quick look at how the topic is dealt with in the UK – on the official map of crimes and antisocial behaviour from the National Policing Improvement Agency. The map was launched with great fanfare by the Home Office Minister on 20 October 2009.

The application shows monthly and quarterly crime statistics for the last couple of years (by type) for policing regions in England and Wales. Crime levels in neighbourhoods within the policing regions are shown on a Bing Map as shaded thematic overlays. Users can compare statistics for a selected area to other locations, as well as download the data as CSV files.

Due to high demand the site went down frequently within the first 24 hours after the launch, but performance is now quite reasonable. However, the developers did not put much effort into generating neighbourhood boundaries, since they do not align well with each other. There may also be some underlying issues with the data collection methodology – my UK based colleague pointed out that an area in his neighbourhood with no population at all (ie. a local park) is shown on the map as having the highest crime rate. This is of some concern, since it may potentially influence purchase decisions and the demand for properties in the area. What is traditionally seen as a very desirable feature (ie. parkland) is now indicated as a hive of criminal activity. It highlights the perils of using statistics indiscriminately.

I believe that Australian developers participating in the MashupAustralia contest tackled the technical and presentation challenges for an equivalent set of crime statistics pretty well, and to a very high standard.

Australia’s love for private space

A news item making the rounds in the media this morning is that Australians have the largest houses in the entire world (statistically speaking). Research commissioned by CommSec and conducted by the Australian Bureau of Statistics concluded that the average floor area of new dwellings in Australia hit a record 214.6 square metres in the last financial year. In the US the average size of a residential dwelling is only 201.5 square metres, which still compares favourably to Denmark, which has Europe's biggest dwellings with an average floor area of 137sq m, Greece (126sq m) or the Netherlands (115.5sq m). Homes in the UK are the smallest in Europe at 76sq m.

Australia's largest dwellings were built in Victoria, with an average 224.5sq m floor area, followed by Western Australia, Queensland, Northern Territory, NSW, Tasmania, South Australia and the ACT. However, the biggest houses are in NSW, where the average new house built in 2008-09 was 262.9sq m, followed by Queensland (253sq m). We love the space, but does it make us happier? Just think about the cleaning…

Sunday, November 29, 2009

MashupAustralia highlights Pt 4

One dataset that generated quite a lot of interest amongst developers was New South Wales Crimes by offence type, month and Local Government Area (1995-2008). It is quite a comprehensive and complex set of information, and developers took a range of different approaches to presenting it in their applications. All in all, there were 7 applications built with crime information as the main theme, and several more where crime statistics were used as complementary information. Here is a review of the most interesting submissions.

NSW Crime Explorer presents statistics in tables, graphs and on a Google Map as thematic overlays (2008 data only). The tables are very well laid out and easy to read despite containing a very comprehensive set of information. I particularly like the little graphs in each table row depicting trends over the 10 year period. There are also links to line graphs with monthly stats for each crime type (generated with Google Visualisation tools). A drop-down list of all Local Government Areas enables easy navigation between locations of interest. The map is very basic, but a selection of overlays for each crime type clearly shows where offences were committed in 2008. The author has chosen 15 categories for data classification, which slightly blurs the picture of which areas are crime hotspots. Overall a very comprehensive presentation.

CrimeFinder is a mashup created with Silverlight and Bing Maps. A thematic map shades Local Government Areas according to crime rate or the absolute number of crimes committed. Different crime types can be selected from a drop-down list. A mouse-click on a particular area brings up a line graph showing comparative crime trends for NSW and the selected LGA. A mouse-over highlights a particular boundary and brings up summary information for the area. Slider filters allow setting start and end dates for the data presented on the map, as well as adjusting the transparency level of the overlays. Redrawing of LGA boundaries on the map is very smooth on zooming and panning, which is quite an achievement considering it is all vector data.

NSW Crime Map demonstrates yet another approach to presenting crime statistics, with Google Maps and Google Visualisation tools. The number of crimes committed in major NSW regions, in any given month, is represented as dots on the map, with the size of each dot proportional to the number of offences committed. Different crime types are selectable from a list next to the map, and dates are selected with a slider tool. A dynamic graph under the map displays monthly counts of incidents between 1995 and 2008.

How Safe Is Your Suburb is an application created with commercial software, integrated with Google Maps and Flash graphics. Crime data is presented in four different ways: by geographic distribution (thematic map with data table), as a cross tabulation of offence by type and year (pie chart and data table), as a cross tabulation of offence year and type by Local Government Area (with line graph and data tables), and as summary statistics highlighting the most dangerous regions in NSW.

NSW Crime is a very simple mashup presenting crime data in two columns and on a Google Map, with location points depicting user-selected Local Government Areas. The first data column contains a list of LGAs and counts of all crimes committed in those areas. The second column lists the various types of offences and counts for a selected LGA. There is also an option to specify a time range for calculating statistics. A simple approach, yet it shows a wealth of information.

Tuesday, November 24, 2009

Google to index data from mashups

Last week Google announced its plans to log information from mashups that use the JavaScript Maps API v2. The information will then be used within the main Google Maps site. The basic premise is similar to "My Maps", where user generated content can be searched and displayed on a map alongside other options (eg. locations, businesses or real estate).

Details are scarce, but it appears Google will index only the location and content of the markers and/or infowindows that are displayed on hundreds of thousands of mashups. There is an option to opt out of this arrangement, and Google has indicated it will apply the following rules:

  1. We only index data from maps that have been viewed by many unique users. For example, maps only viewed by you and a few friends won't be indexed.
  2. If your page is protected by robots.txt, we will not index your content.
  3. You can opt-out of the logging by specifying "indexing=false" when loading the API.
  4. If you are a Maps API Premier customer, we will not index data from your maps, unless you opt into indexing by passing in "indexing=true".
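Judging from rule 3, opting out presumably comes down to appending the flag to the URL that loads the API. A minimal sketch only - the key value is a placeholder, and only the "indexing=false" parameter name comes from Google's announcement:

```javascript
// Build the Maps API v2 loader URL with the indexing opt-out flag appended.
// "YOUR_API_KEY" is a placeholder - substitute your own key.
var base = "http://maps.google.com/maps?file=api&v=2";
var src = base + "&key=YOUR_API_KEY" + "&indexing=false";
// The resulting URL would then be used as the src of the <script> tag
// that loads the Maps API on the page.
```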

The exact day the new arrangement will take effect will be announced in due course.

First spotted on Google Maps Mania

Monday, November 23, 2009

Free data - a sign of the times…

It’s official. Starting from April 2010, UK citizens and the rest of the world will be able to openly access maps from Ordnance Survey, as well as a plethora of interpretive geographical data such as crime, health and education statistics by postcode. After many years of significant revenue from a successful commercial model of licensing government geographical information to value added resellers, the UK government has decided to change its approach and make the information available online for free.

Ordnance Survey's monopoly on GIS data in the UK will end, although some may argue that its position was already heavily undermined by the success of the OpenStreetMap project – a community driven initiative to provide free high resolution vector data in competition with OS. Smaller players and website developers will be the winners, as this opens up new opportunities for mashing up the information into myriads of specialised online applications.

Ordnance Survey is a £116 million a year enterprise, and now part of this revenue will be forgone for "a wider good". The burden of maintaining high quality geographic information will have to be shifted to the UK Government (ultimately taxpayers), as the activity will no longer be funded by end users. However, the argument is that the overall commercial benefit to the UK economy will be greater than the lost revenue stream.

There is a similar attempt to liberate government data in Australia with the Government 2.0 Taskforce initiative. It is not the first such attempt – the Spatial Data Access and Pricing Policy from 2001 is still in place. It allowed free access to quite a range of geographic information in the past, but it is rather difficult to assess its economic benefits.

It is quite obvious from past experience that just releasing the data will not lead to any tangible benefits. A framework must be put in place at the same time for managing and improving that data (ie. either big money from the government or big crowdsourcing initiatives, as in the case of the OpenStreetMap project). Otherwise there is a danger that we will end up in a big mess… with everyone maintaining their own sets of data (hence multiplying the effort) and creating their own set of problems (one only needs to look at the issues with postcode data in Australia to understand what it can lead to…).

Tuesday, November 17, 2009

NearMap Goes Live!

When I was writing my recent post about Stuart Nixon and his latest project: NearMap, the site was not yet operational. But today, to my great excitement, I got an anonymous tip that NearMap is up and running! I could not resist and took it immediately for a test drive.

My first impression is that NearMap looks… so familiar! The creators went to great lengths to ensure NearMap has the "look and feel" of other online maps, like Google or Bing. That is quite appropriate, because potential users will be immediately familiar with how it works… Yet there are also plenty of unique features. One such feature is a sliding bar across the top of the map which allows scrolling through imagery acquired at different points in time. Perth has the best selection. Currently high resolution coverage is limited to major Australian cities (Adelaide, Brisbane, Sydney, Melbourne and Perth), but there is enough there to fully demonstrate the quality of imagery and the functionality of NearMap. There are a few Landsat mosaics and the Blue Marble monthly mosaic series from 2004 to cover the rest of the world for now.

Another unique feature is the "Terrain" viewing mode, which depicts a terrain model derived from the high resolution imagery (including buildings!). And when you are in PhotoMap viewing mode and zoom close enough, a "Multiview" option is activated which allows observing objects on the ground from 4 different directions. This feature is similar to Bing's Birds-eye view mode.

NearMap's street overlay comes from OpenStreetMap. It is worth noting that, unlike other suppliers, NearMap actually encourages users to utilise its imagery to improve the quality of this community generated and maintained vector dataset. There is an "Edit" button on the map that takes the user directly to the OpenStreetMap site. NearMap can be embedded in other websites with an iframe HTML element (as in this blog, for example), and specific locations can be bookmarked, shared and/or linked to with a unique URL address.

I am very impressed with NearMap. The level of detail in the imagery is amazing, and the directional views, although not perfectly stitched, are very realistic indeed. You can literally "peek over the fence" and see what your neighbour is up to! I just hope it won't become a big issue that will necessitate degrading the resolution of publicly viewable imagery. The application is very responsive, and there is definitely great potential for NearMap to carve a big niche in the online mapping market, in direct competition with Google and Bing maps.

I will share with you how to view NearMap images in true 3D but for that you will have to come back next time. :-)

Google enters mobile GPS Navigation Market

So far Google has played only on the fringes of the mobile GPS navigation market, with flat Google Maps for mobile integrated with driving directions and live traffic information. But the situation is about to change with the release of Google Maps Navigation (beta). This new application comes with everything you'd expect to find in a fully featured GPS navigation system: maps in 3D perspective, turn-by-turn voice guidance and automatic routing. But there is more. Google took advantage of the wide range of products it already supplies to the public and packaged them together into a potentially very attractive product.

In particular, users will always get the latest maps, and there is no need for bulky downloads since maps come directly from Google servers. The application supports simple plain English search, voice search, and search for POIs along the route. There is live traffic data (an indicator light in the corner of the screen alerts you to current traffic conditions along your route) as well as high resolution satellite imagery and Street View options that show exactly what to expect at the destination.

There is an expectation that Google will release an API for Maps Navigation to allow developers to build a myriad of customised applications and help forthcoming Android phones better compete with the Apple iPhone. Google Maps Navigation is free, and users only need to cover Internet connection fees to use it. For now, Google Maps Navigation can only run on Android 2.0 enabled phones, which are very limited in supply. In fact, there is only one handset on the market - the Droid from Verizon - on sale since 6th November, and only in the U.S.

Originally spotted on: Google Maps Mania

Saturday, November 14, 2009

Pixel worth 1000 bits

It took a rather long time for raster imagery (maps, but also aerial and satellite photography) to gain wide acceptance for use in online GIS applications. Raster imagery used to be just a backdrop to "real data" (ie. vectors), as substantial file sizes (often in obscure proprietary formats) and generally low resolution made it difficult to incorporate imagery into applications and rather impractical to use for more than just the initial "splashback". The fact that remote sensing, aerial surveys and GIS were traditionally seen as totally separate disciplines did not help in driving the amalgamation of various formats into a single data management and online visualisation solution.

It would be fair to say that the release of Google's online mapping technology was the catalyst that initiated the change. These days raster imagery dominates online mapping applications, especially those with high traffic such as Google Maps, Bing Maps and Yahoo Maps. The ability to show ground features in detail has increased significantly with advances in image capture technologies, reducing the reliance on vectors to depict such information. The ingenuity of using small image tiles, to overcome the issues with raster data file size and to improve the efficiency of online distribution, made it much easier to use raster rather than vector data formats for presenting information.

[example of raster map: Topo 250K by Geoscience Australia]

Just to clarify: in traditional online GIS applications both raster images and vectors are presented on a computer screen as images (eg. gif, jpg, png). The difference is in how those images are created. In the case of vector data (and some raster data with pixel attribute information), images are generated from the data on the fly by a map server (caching techniques are now available for static information to reduce the load), and a "link" is maintained between what is presented on the image and the vector data. So if the user clicks on a line representing, eg. a road, attribute information specific to that segment of the line can be returned as a query result (eg. name, width, type). In the case of raster imagery without attribute information for each pixel, the image is pre-generated in advance (eg. a raster topographic map). With this approach, dynamic referencing to attribute information in the source dataset is lost.

[example of map image generated from vector data: Topo 250K by Geoscience Australia]
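As an aside, the "link" back to the server described above is exactly what a WMS GetFeatureInfo request provides in OGC-style viewers: the click position and current map extent are sent to the server, which looks up the vector feature under that pixel and returns its attributes. A hedged sketch only - the server URL and layer name below are placeholders, not a real service:

```javascript
// Sketch of a WMS 1.1.1 GetFeatureInfo request URL. When the user clicks at
// pixel (x, y) on the rendered map image, the viewer sends the current map
// extent (bbox) and the click position back to the server, which identifies
// the underlying feature and returns its attributes.
function getFeatureInfoUrl(server, layer, bbox, width, height, x, y) {
  return server +
    "?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetFeatureInfo" +
    "&LAYERS=" + layer + "&QUERY_LAYERS=" + layer +
    "&STYLES=&SRS=EPSG:4326" +
    "&BBOX=" + bbox.join(",") +
    "&WIDTH=" + width + "&HEIGHT=" + height +
    "&X=" + x + "&Y=" + y +
    "&INFO_FORMAT=text/plain";
}

// Placeholder server and layer - purely illustrative.
var url = getFeatureInfoUrl("http://example.com/wms", "roads",
  [150.5, -34.1, 151.5, -33.1], 512, 512, 256, 128);
```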

It is technically possible to use a mix of vector data and raster imagery in a single application to get the best of both approaches (ie. high resolution imagery or a nice pre-designed cartographic representation of vector data, delivered fast as tiled images yet still referenced to attribute information in the original dataset), but I have not yet seen this implemented as a general practice. Here is another idea for Google – add a "get feature" service to your maps! It would work exactly like the "reverse geocoding" service, but rather than returning address information for the point (based on the cadastre vector dataset) it could also return information on other map features (like roads, parks, buildings, other POIs, etc). Creating a "link" to the source vector data could also open up opportunities for all sorts of spatial query service options: "distance" and "area", but also more complex ones like "select within", "adjacent" etc.

With the exception of a handful of technologies specifically developed for true vector and imagery data streaming, the overwhelming majority of online mapping applications are not capable of efficiently processing vector data into complex images - on the fly, and in the volumes required for today's Internet - without a massive hardware infrastructure behind them.

Currently, reliance on true vector data in browsers is limited to presenting a small number of points, polygons or 3D objects, or to highly specialised applications. There is support for true vector data in a browser environment via Java, Flash or Silverlight, but making it work efficiently requires sophisticated solutions to handle on-the-fly vector generalisation and local data caching (as mentioned above, only a handful of companies have managed to do it, and they are not the industry leaders!). Although I should mention that I am very impressed with an online application I have seen recently, developed in Silverlight and nicely showing quite a volume of vector data - I will have to investigate in more detail!

Applications such as Google Maps make use of the browser's support for vector data in VML/SVG format, but overall browser processing capabilities are very limited. Therefore, although Google Maps accepts vector data in KML format, if the file is too big (100KB+) Google will convert it to image tiles to speed up rendering of the information in the browser. This is appropriate for presenting static data, but it will not work with dynamic information (eg. thematic maps), because once vectors are converted to images on the initial map load they cannot be changed with a script. If the same amount of data is imported into a Google Maps application and rendered as true vectors (eg. with the GPolygon() function), loading and drawing of the information on the map is rather slow.

There is a new concept emerging for handling spatial information, regardless of whether the source is raster imagery or vector data: the concept of a spatial grid. Traditionally, grids were used with Digital Elevation Model (DEM) data. Later they were also found to be applicable in the field of spatial analysis. Now the grid concept can also be applied to referencing a myriad of attributes to a specific location (cell) on the surface of the Earth – making all the data hierarchical and analysis-ready. If I understand correctly, organisations such as the USGS are now planning to start releasing information as referenced grid cells rather than traditional imagery, although there are still some challenges in defining those grids, indexing the data, and of course in storage capabilities for high resolution datasets.

The theory and technologies developed for handling imagery will find use in implementing this new approach. After all, image pixels are a form of grid. Graphic formats offer more efficient storage and compression capabilities than traditional spatial databases, and the emergence of graphical processing units (GPUs) offers great hope for very fast analytical capabilities – a new and exciting era in spatial data management, processing and delivery!

Friday, November 13, 2009

MashupAustralia highlights Pt 3

It looks like this will be quite a long series… The competition has just closed with a total of 78 entries (jumping from 40 just yesterday) - the overwhelming majority map based applications. And in the end only one major GIS company decided to participate, which is quite a disappointment. It's not the little guys and gals that will make a difference. It is the big end of town that has the capacity and resources to manage the data on an ongoing basis, and that can bring the technology and know-how to enable opening up government vaults of information. In the end, there was probably not enough "fresh carrot" in this contest to bring them aboard. So, let's review those institutional mashups first. Yesterday I finished with entries relating to bushfire information, so I will continue with this theme.

Firelocator: a map based application showing the latest fire related information (satellite hotspots, Country Fire Authority and Rural Fire Service RSS feeds), some basic population stats and photos from Flickr - built with Silverlight and Bing Maps, and surely a full GIS at the back end. The application was submitted by Pitney Bowes (formerly MapInfo Australia). Firelocator has a few nice features (like interactive filters for hotspots - based on "confidence" value - or for fire incidents based on type and status). However, there are also a few limitations, which is rather disappointing considering that Firelocator carries the "TechAmerica Innovator Award" logo. In particular, the administrative boundaries overlay is not projected correctly - the further you zoom out, the greater the discrepancy. Also, I never liked Bing's onmouseover auto pop-up feature, but in the case of this application it is frankly quite annoying, as it makes map panning very difficult if the map is crowded with markers. Firelocator didn't work for me in Firefox at all (unlike the other Silverlight application I reviewed yesterday).

AuScope Portal: another institutional entry, featuring information on Australia's mineral resources, mines and processing centres - built with Google Maps and using OGC web services (WMS and WFS). It feels like a fully featured online GIS, but I found it a bit cumbersome to use (I had real difficulties displaying and removing layers, and especially managing the display order of layers). A great collection of geology related information though!

LocateMe: an application submitted by the Western Australia Landgate crew. It presents demographic profiles for areas of interest and lists government services available in the area. It uses Google Maps as a background layer but is built with OpenLayers - it certainly does not behave like a real Google Map. Once again, the application is styled as a fully featured online GIS with traditional navigation tools - which makes it rather awkward to use (ie. no double click to zoom, no zoom slider tool, and it requires selecting buttons to either zoom in or out or pan). It could have been quite an interesting application if it wasn't for so many layers being restricted with a password. And I couldn't get the demographic profiles to load at all.

EasyData: a new website released by the Department of Trade and Economic Development of South Australia that provides information on sixty social, environmental and economic indicators, and allows regional comparisons. It has a basic Flash map to let users navigate to an area of interest, and lots of graphs of statistical and economic information. It may be a bit too heavy for lower end computers though…

Data Aggregator: a very slick and well built Google Maps application that allows displaying almost all the datasets released for the MashupAustralia competition, and more. Easy to navigate and certainly much easier to use than any of the above mentioned mapping mashups - which is quite surprising, since the application uses a hierarchical and expandable layer selector very similar to the other applications mentioned today. A unique feature of Data Aggregator is a couple of custom controls for changing display options for markers and vector overlays.

To be continued…

Location specific time

Dynamically updated data, such as weather information or various RSS feeds, is published with a time stamp. It is customary that time is expressed as Greenwich Mean Time (GMT), a universal reference, so anyone, in any part of the world, can determine exactly "when this thing happened". A very handy concept, but in order to make the most of the information some recalculations are often required. There are sophisticated server side scripts to deal with time, but JavaScript also has a few handy functions to help with the task.

Before I move on to examples, just a quick reference to one more important concept: the epoch. It is defined as time kept internally by a computer system, expressed as the number of time units that have elapsed since a specified date. The Unix epoch, used by Unix, Linux and other Unix-like systems, Mac OS X, as well as the Java, JavaScript, C and PHP programming languages, started on January 1, 1970; in JavaScript it is expressed in milliseconds (Unix systems traditionally count seconds). The epoch is what makes time related calculations so simple…
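A quick illustration of why: once two dates are reduced to millisecond counts since the epoch, working out the gap between them is plain arithmetic.

```javascript
// Two dates expressed as milliseconds since the epoch. Date.UTC avoids any
// local timezone effects; note the month argument is zero-based (10 = November).
var start = Date.UTC(2009, 10, 1); // 1 Nov 2009, 00:00 UTC
var end = Date.UTC(2009, 10, 6);   // 6 Nov 2009, 00:00 UTC

// Difference in whole days: divide the millisecond gap by ms-per-day.
var days = (end - start) / (1000 * 60 * 60 * 24);
// days is 5
```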

You can find many examples of various JavaScript time functions by simply "googling" the term; however, there is not much practical information on how to apply them. Here are a few transformations I found very useful:

Current date and time
The Date() function, in its simplest form, will return the current date and time as set on the user's computer.

var localtime= new Date();
// returns eg. Mon Nov 9 18:37:46 UTC+1100 2009

I often use the "combination" shown below to time stamp dynamic data (eg. frequently updated RSS feeds) in order to prevent caching, so the information cannot be recalled from the local browser cache but is always called directly from the source:

var o= new Date();
var p= o.getTime();

Then I dynamically add an extra parameter to the source data URL:

var url= "somepage.html?p="+p;

It is simpler than trying to get all the "no-cache" headers and metatags to work consistently in all browsers.

Date and time in the past
By adding a parameter, the same Date() function can be used to convert a string of letters into a proper date object (ie. any date, expressed in "local time"):

var localtime= new Date("Fri, 06 Nov 2009 04:56:00");
// returns Fri Nov 6 04:56:00 UTC+1100 2009

Past date and time in GMT
Any past date expressed in local time can be easily converted to GMT.

var localtime= new Date("Fri, 06 Nov 2009 04:56:00");

var gmt=localtime.toUTCString();
// returns: Thu, 5 Nov 2009 17:56:00 UTC

GMT to local time
Reverse conversion from GMT to local time is also very easy. If your input time is expressed in GMT and you need the local equivalent, just include the letters "GMT" in the string to flag that the input is not local time!

var localtime= new Date("Fri, 06 Nov 2009 04:56:00 GMT"); // or
var localtime= new Date("2009/11/06 04:56:00 GMT");
// returns Fri Nov 6 15:56:00 UTC+1100 2009

Time in milliseconds (epoch)
The epoch is a very handy format for storing time information in a database, and for all sorts of time related calculations.

var fulldate= new Date(); // returns local time as set on the user's computer
var epoch=fulldate.getTime(); // converted to milliseconds since the epoch

Unfortunately, I could not find any definitive information on acceptable date formats. For example, the very frequently used ISO 8601 style format "2009-11-06T03:42:40Z" does not work in the JavaScript Date() function - it has to be converted to a format that JavaScript can recognise before use.
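One workaround is to pick such a string apart and rebuild the date via Date.UTC(). A sketch only - it assumes the input always ends with "Z" (ie. it is expressed in GMT), and the function name is mine, not a standard one:

```javascript
// Convert an ISO 8601 style timestamp (eg. "2009-11-06T03:42:40Z"; slashes
// as date separators are also accepted) into a JavaScript Date object.
// Returns null if the string does not match the expected pattern.
function parseIsoDate(s) {
  var m = s.match(/^(\d{4})[-\/](\d{2})[-\/](\d{2})T(\d{2}):(\d{2}):(\d{2})Z$/);
  if (!m) return null;
  // Date.UTC takes a zero-based month, hence the "- 1".
  return new Date(Date.UTC(+m[1], +m[2] - 1, +m[3], +m[4], +m[5], +m[6]));
}

var d = parseIsoDate("2009-11-06T03:42:40Z");
// d.toUTCString() gives "Fri, 06 Nov 2009 03:42:40 GMT"
```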

Thursday, November 12, 2009

MashupAustralia highlights Pt 2

These are the last days of the competition, so the number of submissions is growing rapidly. Today, just a few more examples of the most interesting entries. Again, applications using maps to present information dominate, so I will focus specifically on those.

The first application I would like to mention is another creation in Silverlight and Bing: GeoDemo (from the author of the previously featured CrimeFinder). I must admit, I am very impressed with the vector handling capabilities of Bing Maps, even in the Firefox browser! And I like the concept of boundaries changing to more detailed ones as the user zooms in closer and closer. Installing Silverlight is really no trouble, but unfortunately not an easy task for institutional users (especially those in the public service), so that may explain the low popularity of this application. But there is a JavaScript version available as well, and it also performs surprisingly fast (shown below).

There are also two new entries dealing with bushfires. Victoria: Fire Ready depicts the locations of fire brigades and police stations in the vicinity of a chosen address. Distance circles of 10, 25 and 45km from the address are marked to allow users to discern how quickly aid can arrive in case of emergency. The application also plots incidents reported by the Victorian Country Fire Authority, as well as current weather and wind conditions from the Bureau of Meteorology. The source information is very similar to what I use in my Bushfire Incidents map, but the concept of the application and the presentation are totally different. And much more popular, judging by the number of votes!

Firemash is an attempt to match official announcements from the NSW Rural Fire Service with incidents reported by the community via Twitter. Users can plot the location of their houses on the map, and if any incidents are reported in the vicinity, the application will retweet the information to alert the user. An interesting concept, although its usefulness will be difficult to demonstrate at the moment because there are no fires reported in the whole of NSW!

I will continue with more tomorrow…

Monday, November 9, 2009

New entries in mashup contest

There are a few new entries in the MashupAustralia contest. With only a handful of exceptions, they are all built with at least simple mapping capabilities (you guessed it, overwhelmingly Google Maps!). Here is a brief description of the most interesting ones, in no particular order:

Fridgemate: simple concept of creating a "fridge list" and a map of places of interest in a local area. Currently the most popular mashup, with top score and the largest number of votes.

CrimeFinder: a very impressive thematic mapping application built with Silverlight and Bing Maps. Nice user interface. What interests me the most is how the author managed to incorporate Local Government Area boundaries in vector format. I know, I know, Silverlight is a vector format (like Flash), but there appears to be some support for vector generalisation as well, since boundaries are redrawn with different levels of detail depending on the map zoom extents.

Suburban Trends: also very impressive thematic mapping application with ABS population statistics and various crime related information. Great use of dynamically loaded vector polygons of suburbs and Google Chart API.

NSW Crime Explorer: another application exploring crime statistics in tables, graphs and thematic maps (it appears to use static KMLs though).

ABS In Motion: statistics presented in interactive Flash charts (not for low end computers).

Demographic Drapes: an OpenLayers application using Google Maps as a backdrop, with a number of thematic layers showing population statistics and various administrative boundaries. More like a traditional online GIS.

Tonight I have also submitted my last application for this contest: revised Bushfire Incidents map. Just in time for the new bushfire season. I added extra controls to show additional information on the map:
- wind conditions (live from Bureau of Meteorology Web Feature Service)
- RSS feeds from Victorian CFA and NSW RFS (geocoded on the fly where possible)
- Locations of Fire Brigades in Victoria (over 1,200 points), and
- Weather widget and YouTube video player.

This is the last week of the competition. Entries close at 4pm, Friday 13 November. I am still hoping to see submissions from the commercial end of town but maybe they are all too busy chasing real projects…

GovHack winners announced

News from the recent mashup event held in Canberra under the GovHack banner has finally started to filter through. The overall winner is Lobbyclue, a Flash based application attempting to visualise the complex tangle of government suppliers and their contracts with individual Departments. There is also a mapping component, built with OpenLayers and the OpenStreetMap API, but it is very slow to load.

Although not specifically mentioned as such, the following creations appear to be runners up in the contest:

Know where you live: stylish snapshots of population statistics from ABS and some crime statistics (for postcodes in Sydney area only).

What the Federal Government Does: tag cloud and network diagram of different functions of the government and its various departments (concept only).

Rate A Loo: shows the location of public toilets in the ACT - along with some descriptive data, and allows users to rate the condition of public toilets and comment.

Overall, there are 13 projects listed on GovHack. Unfortunately, I could not get some of them to work, so I reserve my further comments, although there were a few potentially interesting submissions. You can find links to most of them on the GovHack wiki page.

I was hoping to give you a first hand account of what was happening at GovHack, but it was not meant to be. When I turned up for the closing ceremony, at the nominated place and the nominated time, there was only me, the caterers, buckets of drinks and plenty of food… I figured there must have been some last minute changes to the schedule, so I went back to the ANU campus only to find a locked door and a note stuck on the wall: "See you next year". I managed to communicate with a person on the other side of the door that "something is still happening upstairs", but for me there was nothing else to do but leave…

The disappointing bit was - no, not that I missed out on the closing ceremony, but - the lack of participation from commercial operators and larger companies. The event was dominated by amateurs. Don't get me wrong, it is good that individuals are able to present fresh ideas on the potential use of government information, but a real breakthrough cannot be achieved without the active involvement of IT/GIS industry leaders. However, to secure that participation the organisers would have to provide some tangible commercial benefits – like publicity, contacts and/or the prospect of big money for specific projects… None of these appear to be available for now.

The organisers are planning a GovHack Canberra Encore on November 11 2009, 4:00pm - 5:00pm, to show several of the submitted applications to a larger audience, so I will have another chance to see it all first hand.