Spatial Data Analytics

Mining companies strive to explore new sites and monitor existing ones, insurance companies need to evaluate claims, and financial service companies have to check on potential investment locations and monitor their assets. These use cases are only the tip of the iceberg, and the full potential of spatial data is only just being unlocked. But the analysis of spatial data depends on high-quality, homogeneous and up-to-date geodatasets – ranging from satellite, airborne or drone imagery to information about foot or road traffic. Compiling, updating and analyzing these datasets is difficult and often inefficient. In this context, big data challenges need to be overcome by exploiting machine learning algorithms and utilizing cloud computing facilities to carry out advanced data processing and analysis on local to global scales. The Focus Topic “Spatial Data Analytics” will promote, support, and accompany innovative projects in this field – from idea generation to implementation.

First meeting – topics: Introduction, Mission Statement, Geo-spotlights (2–3 lightning talks about spatial analytics, each with a short discussion)

Academic leader: Stefan Keller, OST

Industrial leader: Reik Leiterer, Exolabs



Webinar “Open Data for Blue Light Organizations”, organized by S. Keller, OST & GEOSummit

80 participants joined the webinar “Open Data for Blue Light Organizations” on the 11th of May – and in case you missed it, you can find the presentations here. The webinar started with practical reports on the operations control system of Hexagon Schweiz AG and on the Berlin Fire Department by ESRI Germany/Switzerland. Among other things, both showed how blue-light routing works exclusively with OpenStreetMap data. One finding was that there is still a need for education about Open Data and OpenStreetMap (OSM). Indeed, OpenStreetMap could be more homogeneous in terms of its data and services, and it still lacks (open) support for data integration processes. But the know-how for this exists and is continuously being worked on – just as the authorities are working on improving Open Government Data (OGD).

In the discussion, the familiar question of “uncontrolled changes” in OpenStreetMap came up: what happens if, for example, a jewelry store is deleted on purpose? The answer is: i) volunteer “mappers” and professional data curators monitor and correct their respective areas; ii) every change is visible in the data history, similar to Wikipedia; iii) depending on the case, different amounts of time pass before such an edit shows up in downstream systems; and finally iv) quality assurance tools can prevent such unwanted edits in the first place. For OpenStreetMap data, quality assurance takes place partly beforehand and additionally afterwards during data preparation – as is state of the art in data warehouses. Another demand raised in the discussion was: “We want the highest quality!” It is better to speak of a defined quality as the basis for “fit for use”. Data users with higher defined quality requirements should manage OpenStreetMap data according to the data warehouse principle, which has always included data monitoring and data cleaning – including vandalism detection. The “Daylight Map” shows that “clean data” from OpenStreetMap is possible.
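The data-history point above can be illustrated with a small sketch. Real element histories are available from the OSM API (e.g. `GET /api/0.6/node/<id>/history`), but the XML snippet, element ID, and user names below are invented for illustration; this is a minimal sketch of how a monitoring tool might flag a suspicious deletion in an element's history, not a real vandalism detector.

```python
# Minimal sketch: flag deletions in OpenStreetMap-style element history.
# The sample history is hypothetical; real histories come from the OSM API.
import xml.etree.ElementTree as ET

SAMPLE_HISTORY = """<osm>
  <node id="42" version="1" visible="true" user="mapper_a"
        timestamp="2023-01-10T09:00:00Z">
    <tag k="shop" v="jewelry"/>
    <tag k="name" v="Example Jeweller"/>
  </node>
  <node id="42" version="2" visible="false" user="anon_123"
        timestamp="2023-05-01T23:59:00Z"/>
</osm>"""

def flag_deletions(history_xml: str):
    """Return (version, user, lost_tags) for each version that deleted the element."""
    root = ET.fromstring(history_xml)
    flags = []
    previous_tags = {}
    for elem in root:
        visible = elem.get("visible", "true") == "true"
        if not visible and previous_tags:
            # Element disappeared; record who deleted it and which tags were lost.
            flags.append((elem.get("version"), elem.get("user"), dict(previous_tags)))
        previous_tags = {t.get("k"): t.get("v") for t in elem.findall("tag")}
    return flags

flags = flag_deletions(SAMPLE_HISTORY)
for version, user, lost_tags in flags:
    print(f"v{version} by {user} deleted element with tags {lost_tags}")
# → v2 by anon_123 deleted element with tags {'shop': 'jewelry', 'name': 'Example Jeweller'}
```

In practice, such a flag would feed into review queues like those used by the volunteer mappers and professional data curators mentioned above, rather than reverting edits automatically.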

To sum up: authorities are welcome to integrate their OGD into OSM if they follow some recommendations, as shown by the University of Bern guide [] (2021). And vice versa, OSM data can be used by public authorities – as this event pointed out. See also the recent expert opinion by Schlauri & Marti (TechLawNews 21, May 2023, by Ronzani Schlauri Attorneys at Law) and the “POP Study” [] (2021).