Standards underpin the entire discipline of geomatics, yet they are also behind some of the biggest failures when applied without insight into their limitations.
Just imagine trying to create a map by overlaying two data layers with different and unspecified datums and projections. It is simply not possible. Datums and projections are examples of practical standards that work because they perform a vital function in the capture, management and use of spatial information.
Yet there are also many cases where standards are an obstacle - for example, when people take them "too literally". Many standards published by the Open Geospatial Consortium (OGC) for data interchange fall into this category (I should declare here that I have long been an avid critic of the inappropriate use of spatial standards). Take GML, which defines geographic data structures - conceptually it is great, but when was the last time you downloaded or exchanged spatial data in this format? Never? I thought so…
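To see why GML never caught on for everyday exchange, consider how much markup it takes to encode a single point. Here is a minimal sketch in Python; the coordinates are made up for illustration:

```python
import xml.etree.ElementTree as ET

# A single point encoded as GML 3 (hypothetical data, illustrative only).
# Note how much XML is needed for one coordinate pair.
gml = """
<gml:Point xmlns:gml="http://www.opengis.net/gml"
           srsName="urn:ogc:def:crs:EPSG::4326">
  <gml:pos>-33.87 151.21</gml:pos>
</gml:Point>
"""

point = ET.fromstring(gml)
lat, lon = map(float, point.find("{http://www.opengis.net/gml}pos").text.split())
print(lat, lon)  # -33.87 151.21
```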
Web Feature Service (WFS) is another standard that, in my opinion, failed to deliver. It was designed for accessing and updating data stored in dispersed databases via a common exchange protocol. True, you can quote "hundreds of millions of dollars" in projects implementing systems that use WFS, but beyond closely gated communities with strict internal implementation rules, these are not part of a "global spatial data exchange" - which simply does not exist, despite more than a decade of concerted effort by many, many individuals and organisations. At the end of the day, people still prefer to get their data in "good old" CSV or SHP format… I am trivialising the issue, but you get the point.
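For context, this is roughly what a WFS request looks like on the wire - a sketch only, with a hypothetical server URL and layer name, and parameter names as per WFS 2.0:

```python
from urllib.parse import urlencode

# Build a typical WFS 2.0 GetFeature request (hypothetical endpoint and layer).
base = "https://example.com/geoserver/wfs"   # hypothetical server
params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "topo:roads",               # hypothetical layer name
    "outputFormat": "application/json",      # many servers can return GeoJSON
    "count": 100,                            # limit the number of features
}
print(f"{base}?{urlencode(params)}")
```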
The spatial metadata standard is another example of a great concept (in theory) that failed to deliver in practice. The standard is too complex to implement, and the categorisations it uses are open to interpretation, so even if a particular record is "compliant" you cannot assume its information will be compatible with a record compiled for a similar dataset on another system. I am sorry to say it, but that is why any Spatial Data Infrastructure project based on the ISO 19115 Metadata Standard is doomed to fail from the start…
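To illustrate the complexity, this is roughly the element nesting that ISO 19139 (the XML encoding of ISO 19115) requires just to record a dataset title - reproduced from memory and trimmed, so treat it as indicative rather than authoritative, and the title itself is made up:

```python
# Indicative ISO 19139 fragment: the nesting needed to state one dataset title.
iso_title_fragment = """
<gmd:MD_Metadata xmlns:gmd="http://www.isotc211.org/2005/gmd"
                 xmlns:gco="http://www.isotc211.org/2005/gco">
  <gmd:identificationInfo>
    <gmd:MD_DataIdentification>
      <gmd:citation>
        <gmd:CI_Citation>
          <gmd:title>
            <gco:CharacterString>Roads of New South Wales</gco:CharacterString>
          </gmd:title>
        </gmd:CI_Citation>
      </gmd:citation>
    </gmd:MD_DataIdentification>
  </gmd:identificationInfo>
</gmd:MD_Metadata>
"""
print(len(iso_title_fragment.strip().splitlines()), "lines of XML for one title")
```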
It is human nature to be drawn to simple things for their practicality, yet when we design things by committee they tend to become bloated with complexity as the list of requirements grows ever longer. This is the case with many spatial standards…
I still remember the hype about WFS and GML at conferences and presentations about interoperability a decade or so ago (funny how quickly that word fell out of the vocabulary), and how the "dumb image" Web Map Service (WMS) was downplayed to the point of being seen as an inferior solution not worth implementing. Yet an even "dumber" solution from Google (i.e. static, tile-based representation of spatial data) succeeded as the dominant online mapping approach - every respectable GIS software vendor now offers tiled maps as part of the standard package. The promoters totally ignored the much simpler format for data transfer offered by the Simple Features Specification (SFS), which would have had a much better chance of wide acceptance, opting instead for "the biggest thing in town" - WFS.
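What made the "dumb" tile approach work is its trivially simple addressing scheme: the world is cut into a fixed pyramid of 256x256 images, and a tile is identified by zoom/x/y alone. The widely documented slippy-map conversion looks like this (a sketch; the tile URL template is a placeholder):

```python
import math

def deg2tile(lat_deg: float, lon_deg: float, zoom: int) -> tuple[int, int]:
    """Convert WGS84 lat/lon to slippy-map tile indices (Web Mercator scheme)."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

x, y = deg2tile(-33.87, 151.21, 12)                # Sydney at zoom 12
print(f"https://tile.example.com/12/{x}/{y}.png")  # placeholder URL template
```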
However, despite all the effort and good intentions, the non-GIS-centric rest of the world didn't buy the arguments and invented alternatives such as RESTful services and the GeoRSS and GeoJSON formats for transferring spatial data - in order to address specific requirements and, most importantly, to keep things simple! The OpenStreetMap project (which has grown to the point where it contains roads and topographic features for almost the entire world) invented its own spatial data structure instead of following the officially sanctioned GML format. Meanwhile, Google pushed its own spatial data format, KML, created specifically for its 3D mapping application (KML was subsequently handed to the OGC for ongoing management). All of these became de facto standards - by acceptance, not by design.
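The contrast with the GML fragment above is stark: the same kind of point in GeoJSON is a few lines of JSON that any language can emit with its standard library. A minimal sketch, again with made-up data:

```python
import json

# The same kind of point feature as the GML example, as GeoJSON (RFC 7946).
# GeoJSON coordinates are ordered [longitude, latitude]; data is made up.
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [151.21, -33.87]},
    "properties": {"name": "Sydney"},
}
print(json.dumps(feature, indent=2))
```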
So, should standards be followed or ignored? The key message I am trying to convey is that standards should be used for guidance and implemented when you gain an advantage by following them - but ignored when they become an obstacle. For example, persisting with solutions built on OGC standards such as WFS, WCS and CSW for backend processes, just for the sake of "compliance", totally misses the intent of those standards' original creators. Pick the best bits and create your own "standards" if that gets you to the finish line faster and/or delivers extra benefits. Sure, if there is demand for your data in WFS or similar formats, you should cater for that need, but it should not stop you from implementing a backend architecture that is optimal for your specific requirements, "compliant" or not.
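By way of illustration, a "roll your own" endpoint that serves GeoJSON over plain HTTP can be this small - a sketch using only the Python standard library, with hypothetical in-memory data; a real service would add querying, paging and caching:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory dataset; a real service would query a spatial database.
FEATURES = {
    "type": "FeatureCollection",
    "features": [
        {"type": "Feature",
         "geometry": {"type": "Point", "coordinates": [151.21, -33.87]},
         "properties": {"name": "Sydney"}},
    ],
}

class GeoJSONHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the whole collection as GeoJSON on every GET request.
        body = json.dumps(FEATURES).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/geo+json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), GeoJSONHandler).serve_forever()
```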
All in all, forget the standards if they prevent you from creating better solutions. Create your own instead. But use existing standards when they are fit for purpose. Because there is no need to reinvent the wheel… Common sense, just common sense.