
What Does Integrated Geospatial Data Look Like?

Mark Alonzo
It’s no secret that we are living in a flood of data today, and it just keeps growing. Twenty years ago, the hard problem was getting the data. Ten years ago, the problem was building systems that could handle the data volume. Now that we’ve generally cleared those hurdles, the main problem is getting information out of the data. Again, this is no secret: the core value proposition of dozens of geospatial companies is turning data into information. I see the first step in deriving more information from sources like optical imagery, SAR, and LiDAR as data integration, from which a cohesive picture emerges in time and space.

System integrators put enormous effort into solving this problem, but their solutions are often very complex and suffer from sometimes necessary, but painful and inefficient, bureaucratic requirements from their customers. Large government customers like NGA are searching for a more flexible system, with discrete applications and holistically integrated data sources. While this is the dream of everyone consuming geospatial data, finding a platform that delivers it remains elusive. NGA itself has said that it is still very early in the process of defining the system and requirements it wants for such a platform, and that is information many solution providers (like us) are eager to get. Large entities like Esri and Google are well positioned to anchor such a system, but all suffer from certain limitations.

Open-source advocates argue that open source is the best approach because it’s free, malleable, and essentially crowd-sourced. That can be great for non-critical applications, but for those of us whose infrastructure integrity matters, the costs of open-source software simply shift to the consulting or customization services needed to tailor it to our needs. And at the end of the day, you have little guarantee that an open-source system will be stable or supported long term. COTS solutions (or at least commercial, if not strictly off the shelf) may seem more expensive up front, but they typically come with guaranteed support and a dedicated company with a planned roadmap and strategy. That removes much, though not all, of the doubt surrounding open source, which can turn on a whim and be replaced by the next shiny thing.

So, why my diatribe? Really, I’m just thinking out loud about the evolution of the geospatial data and application world. As more data moves to the cloud, how do I easily access, process, and integrate that data to get information? If I have a specific application or process I like to use with my data, how does it live in a diffuse cloud system without moving gigabytes and gigabytes of data between repositories? How do I tie all the pieces together, and how do I deploy my application in the cloud? Open standards are a good start, bridging the conceptual gap between open-source and commercial providers, but we still have challenges to tackle before everything ties together into a holistic picture. This is certainly a big area of focus for us here at VIS.
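To make the integration idea above a little more concrete: at the metadata level, pulling optical, SAR, and LiDAR collects into "a cohesive picture in time and space" starts with filtering a catalog by area of interest and time window. The sketch below is a toy, STAC-inspired illustration in Python; the `Scene` record, the coordinates, and the dates are all hypothetical, not any particular catalog's API.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Scene:
    """Minimal metadata record for one collect (optical, SAR, or LiDAR)."""
    source: str
    bbox: tuple     # (min_lon, min_lat, max_lon, max_lat)
    acquired: datetime

def intersects(a, b):
    """True when two lon/lat bounding boxes overlap."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def integrate(catalog, aoi, start, end):
    """Return scenes from any sensor that cover the AOI within the time
    window, ordered by acquisition time."""
    hits = [s for s in catalog
            if intersects(s.bbox, aoi) and start <= s.acquired <= end]
    return sorted(hits, key=lambda s: s.acquired)

# Hypothetical catalog entries and area of interest.
catalog = [
    Scene("optical", (-105.1, 39.6, -104.6, 40.0), datetime(2021, 6, 1)),
    Scene("SAR",     (-105.0, 39.7, -104.7, 39.9), datetime(2021, 6, 3)),
    Scene("LiDAR",   (-110.0, 30.0, -109.0, 31.0), datetime(2021, 6, 2)),
]
aoi = (-105.0, 39.7, -104.8, 39.9)
stack = integrate(catalog, aoi, datetime(2021, 5, 30), datetime(2021, 6, 30))
print([s.source for s in stack])  # ['optical', 'SAR'] -- LiDAR is outside the AOI
```

Real platforms layer reprojection, resampling, and streaming access on top of this kind of spatio-temporal query, but the query itself is the seam where "holistically integrated data sources" either works or doesn't.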
