Getting Serious About GIS Information Security

The geospatial industry needs to get serious about information security. In recent weeks and months there have been countless news stories about the "security risk" posed by the ubiquitous availability of geospatial data.  We must take a proactive role in educating the public and policy makers about the issues surrounding Geospatial Information Security (GeoInfoSec).

One of the most recent and ridiculous examples of GeoInfoSec paranoia is a bill introduced in the California legislature by some crackpot insisting that publicly available imagery providers blur government buildings, schools and churches.  There was also a great deal of angst in the press over the fact that the Mumbai terrorists had better geospatial intelligence about their targets than the security forces protecting those targets did.  There was even some movement after the attacks to restrict the use of GPS devices and to make the act of mapping itself a criminal activity.  In China recently, several scientists were temporarily jailed for mapping geology.

This knee-jerk impulse to limit the availability of any geospatial data is amazing to me.  It is tantamount to closing all optometrists for fear that someone might inadvertently give eyeglasses to a terrorist.  When most of us in the industry read these articles, we just roll our eyes, chuckle to ourselves and get back to work.  We must do more.  We must do much more.

Unfortunately, there are few in our industry who are willing or able to have a serious discussion about GeoInfoSec.  Information security is an issue we leave to the IT guys, and we generally continue to share our data as freely and openly as a cooler of soda at a summer picnic.  And in the vast majority of cases, this free-love approach to sharing geospatial data results in a tremendous benefit to society.

So, how do we rationally decide when the free-love approach to sharing geospatial data is not appropriate?

This is not rocket science.  A lot of smart people have been thinking about these issues for a long time, and their work is freely available to you and anyone else in the industry.  The RAND Corporation has done some of the best work in this area.  Its "Mapping the Risks" report, released in 2004, is one of the best early works on the subject and a must-read for anyone interested in this issue.  A search for the word "Geospatial" on the RAND web site will return plenty more interesting reading. In summary, the RAND publications suggest three general criteria for evaluating the relative security risk of a particular data set:

  • Usefulness - how useful would this data set be to someone planning an attack?

  • Uniqueness - is this same information widely available somewhere else?

  • Societal benefits and costs - what would be the relative cost to society if this data set were removed from circulation?


Using these three main criteria, we can establish a rational process for evaluating the relative security risk associated with the release of a given geospatial data set.
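To make this concrete, here is a minimal sketch, in Python, of how an organization might turn those three criteria into a repeatable screening step.  The scores and the crude composite formula are my own assumptions for illustration, not anything RAND prescribes; the point is simply that the evaluation can be made explicit and consistent rather than ad hoc.

```python
from dataclasses import dataclass

@dataclass
class DatasetAssessment:
    """Rough screening scores for a geospatial data set, each on a 0-to-1 scale."""
    name: str
    usefulness: float        # how useful would this be to someone planning an attack?
    uniqueness: float        # 1.0 = available nowhere else; 0.0 = widely available
    societal_benefit: float  # relative cost to society if the data were withheld

def screening_score(a: DatasetAssessment) -> float:
    """Crude composite: perceived risk rises with usefulness and uniqueness,
    and falls as the societal cost of withholding the data rises."""
    return (a.usefulness * a.uniqueness) - a.societal_benefit

# Hypothetical examples: orthoimagery is useful to an attacker but widely
# available, so withholding it costs society far more than it denies anyone.
ortho = DatasetAssessment("statewide orthoimagery",
                          usefulness=0.6, uniqueness=0.1, societal_benefit=0.9)
plans = DatasetAssessment("fully attributed floor plans",
                          usefulness=0.8, uniqueness=0.9, societal_benefit=0.4)

for ds in (ortho, plans):
    print(f"{ds.name}: {screening_score(ds):+.2f}")
```

Anything that scores high on the first two criteria and low on the third deserves a closer look before release; everything else should default to open sharing.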

I would add one more consideration to the discussion: what is it about the data that makes it unique, or uniquely useful, to an attacker?  The answer is almost always the attribution, not the geometry.

For example, floor plans are often listed among the most sensitive data sets.  Floor plans are fairly unique data (Rand McNally doesn't generally publish floor plans, and you won't find them on Google Earth either), and a given floor plan might indicate that the corner office on the fourth floor of building X belongs to the chairman of the New York Stock Exchange.  That information would clearly be of value to a potential attacker.  If we think about it critically, however, we realize that what is uniquely sensitive is who occupies that particular office.  Strip out the occupant attribute and the rest of the floor plan still has tremendous value to public safety planners and inspectors, space planners, facility managers and many others.  Simply by removing a single attribute, we dramatically reduce the sensitivity of the data set and unlock tremendous value for many other potential users of the information.  So it is with many other geospatial data sets: it is the attribution that makes them particularly sensitive, and once a few attributes are removed, the data suddenly becomes much more appropriate to share.
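As an illustration, here is a small Python sketch of that kind of attribute scrubbing applied to a GeoJSON-style floor plan layer.  The field names ("occupant", "occupant_title") and the sample feature are hypothetical, and a real release process would involve far more review than a one-line filter; the point is how little has to be removed for the geometry to remain broadly shareable.

```python
import json

# Hypothetical list of sensitive property names agreed on during data review.
SENSITIVE_ATTRIBUTES = {"occupant", "occupant_title"}

def scrub_features(collection: dict, sensitive=SENSITIVE_ATTRIBUTES) -> dict:
    """Return a copy of a GeoJSON FeatureCollection with sensitive
    properties removed; geometries are left untouched."""
    scrubbed = {"type": "FeatureCollection", "features": []}
    for feature in collection.get("features", []):
        props = {k: v for k, v in feature.get("properties", {}).items()
                 if k not in sensitive}
        scrubbed["features"].append({"type": "Feature",
                                     "geometry": feature.get("geometry"),
                                     "properties": props})
    return scrubbed

# Sample feature: the room polygon survives for space planners and inspectors,
# but the occupant attribute that made it sensitive does not.
floor_plan = {
    "type": "FeatureCollection",
    "features": [{
        "type": "Feature",
        "geometry": {"type": "Polygon",
                     "coordinates": [[[0, 0], [10, 0], [10, 8], [0, 8], [0, 0]]]},
        "properties": {"room": "415", "use": "office", "occupant": "J. Doe"},
    }],
}

print(json.dumps(scrub_features(floor_plan), indent=2))
```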

As a community, we need to be much more proactive about the systematic evaluation and documentation of the relative sensitivity of our geospatial data and systems.  There are some who clearly understand this and are taking appropriate action.  Greg Gault at the Bureau of Reclamation has completed a Certification and Accreditation of the Bureau of Reclamation GIS systems (BORGIS) as a major application within the Federal Enterprise Architecture.  Through this process he has analyzed and documented the sensitivity of the data managed by the BORGIS systems and put plans in place to ensure the security and availability of those systems and data across the enterprise.  Greg is unusual in my experience, however, in his understanding of the security and systems issues involved and in his patience in dealing with federal IT standards groups.  We need more people like Greg in our community who will be proactive about GeoInfoSec within their own organizations and willing to educate the public and our policy makers about the issues.

So…  What are YOU doing to manage GeoInfoSec within your organization?  How have these issues affected your day-to-day work?