Will the CityGML Standard Be Successful?

Yesterday, a group of us drove to Cambridge, MA to attend the OGC 3D Fusion Summit held at the Stata Center on the MIT campus. It was a very interesting event with presentations by:

  • Tim Case, Parsons Brinckerhoff

  • Mike Horhammer, Oracle

  • Gene Roe, Spar Point Research

  • Alain Lapierre, Bentley Systems

  • Neal Neimiec, Autodesk

  • Patrick Gahagan, ESRI

  • Paul Cote, Harvard

  • Claus Nagel, Technical University of Berlin

  • Dr. Kevin Wiebe, Safe Software

  • Niels LaCour, University of Massachusetts Amherst

  • Carsten Roensdorf, Ordnance Survey


Certainly there were a lot of very thought-provoking presentations and discussions, but at the end of the day I was left with some real questions about whether CityGML will become a successful standard.

For my money, the most thoughtful presentation of the day was given by Paul Cote of Harvard. He clearly described the logical structure of the CityGML standard and made the business case for why CityGML data would be valuable to a wide variety of organizations and business processes.

After the lunch break, there was a panel discussion with Matt Davis, ESRI; Tom Gay, FM Global; Javier Lopez, Oracle; and representatives from Google, OGC, and Autodesk whose names I neglected to write down (sincere apologies). There was quite a contrast between the perspectives of the panelists (largely industry) and those of many of the day's other speakers, who voiced a more government and academic view of the emerging CityGML standard.

One of the more enjoyable talks of the day was delivered by Dr. Kevin Wiebe from Safe Software. He has a great sense of humor and an ability to explain very complex subject matter in a way that is easily understandable. He also has about as much experience as anyone on the planet translating data from one format to another, so his perspectives on what makes for effective data translation were interesting to hear.

At the end of the day, I was left wondering what it takes for a standard to become successful (widely adopted by a global community of users who rely on it in everyday life, with the software industry actively engaged in enhancing it). It seems to me that the standards that have become very successful are those that address an area of significant commercial appeal. The W3C's HTML standard is one obvious example; KML is another. In many respects, the ESRI shapefile has become an industry standard for data exchange even though no standards body has formally adopted or endorsed it. These are standards that have been picked up by the major software vendors and deliver significant value to a wide range of users, and they are rigorously enforced by the software community: if you do not format your HTML correctly, it will not render properly in the browser.

Standards promulgated primarily by government or academic interests have often been less successful. Anyone with experience with floor plans can tell you that the "CAD Layering Standard" is anything but universally applied. The Department of Defense SDSFIE standard and the various metadata standards have likewise struggled to gain widespread, consistent implementation.

Given the intense competition in the software industry to build creative and powerful architecture and engineering design software, is it any surprise that Bentley, Autodesk, ESRI, and others are not leaping to embrace an emerging BIM standard that would force them all toward a lowest-common-denominator approach to software design? These companies compete hard on differentiating their products from one another so that they can offer unique capabilities to their user bases. No matter how many large government agencies are enlisted to beat the software industry into submission, without a compelling commercial motivation for the industry to embrace and enhance a standard, enthusiastic industry participation is unlikely. It is hard to see what that motivation might be in the case of BIM.

So where does this leave CityGML? There is no question that the emerging CityGML standard is sensible and well designed. Unlike the BIM community, which has taken a text-file-based approach, OGC has created a thoughtful hierarchical set of abstractions in XML that would be tremendously useful to many different workflows in planning, facilities management, public safety, and beyond. Whether these structures can be made efficient enough to deliver content through a web service model for web consumption remains to be seen. Unfortunately, the planning and local government communities are unlikely to have the funds to invest in developing significant CityGML datasets, and without that kind of investment the standard is unlikely to take off quickly.
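To make that "hierarchical set of abstractions" concrete, here is a minimal sketch of what a CityGML document looks like and how easily its hierarchy can be traversed. The fragment is hypothetical and heavily simplified (real CityGML files carry schema locations, coordinate reference systems, and actual geometry, and attribute values such as the building function come from code lists); the short Python script below just walks the CityModel / cityObjectMember / Building hierarchy using only the standard library.

    # A minimal, hypothetical CityGML 1.0 fragment: a CityModel contains
    # cityObjectMember elements, each wrapping a thematic feature such as
    # a bldg:Building. All values are illustrative, not from a real dataset.
    import xml.etree.ElementTree as ET

    CITYGML = """
    <CityModel xmlns="http://www.opengis.net/citygml/1.0"
               xmlns:bldg="http://www.opengis.net/citygml/building/1.0"
               xmlns:gml="http://www.opengis.net/gml">
      <cityObjectMember>
        <bldg:Building gml:id="Bldg_0001">
          <gml:name>Stata Center</gml:name>
          <bldg:function>university</bldg:function>
          <bldg:measuredHeight uom="m">33.0</bldg:measuredHeight>
        </bldg:Building>
      </cityObjectMember>
    </CityModel>
    """

    NS = {
        "core": "http://www.opengis.net/citygml/1.0",
        "bldg": "http://www.opengis.net/citygml/building/1.0",
        "gml": "http://www.opengis.net/gml",
    }

    root = ET.fromstring(CITYGML)
    # Walk CityModel -> cityObjectMember -> Building and read a few properties.
    for building in root.findall("core:cityObjectMember/bldg:Building", NS):
        gml_id = building.get("{http://www.opengis.net/gml}id")
        name = building.findtext("gml:name", default="(unnamed)", namespaces=NS)
        height = building.findtext("bldg:measuredHeight", namespaces=NS)
        print(f"{gml_id}: {name}, measured height {height} m")

The point is less the Python than the shape of the data: thematic feature classes (buildings, transportation, vegetation, and so on) nest cleanly under a single city model, which is exactly what makes the format attractive for planning and facilities management workflows.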

There is a possibility that some of the big data providers like NAVTEQ or Tele Atlas might see value in developing CityGML datasets for large urban areas to serve the consumer navigation market. If Google or some of the large mobile network providers were to recognize the value of these datasets for delivery through their web and mobile clients, then the CityGML standard could very well generate a tremendous amount of interest. My guess is that we are a few years away from this kind of large-scale investment in data development. Do you agree?