Map Change Prediction for Quality Assurance: Using a Low-Quality Data Source to Predict Changes in Official Governmental Data
Governmental geospatial data are usually considered a de jure gold standard by official authorities and by many companies working with geospatial data. Yet official geospatial data are far from perfect: such datasets are updated at long intervals (e.g. yearly), only selectively according to the judgements and regulations of the local governmental organizations, and then propagated in a migratory process at the state and federal level. Volunteered geographic information projects such as OpenStreetMap can provide an alternative, both in data freshness and potentially in broader coverage of semantic attributes in the respective areas. This talk is about an attempt to use this often perceived as low-quality volunteered geographic information to predict changes that will occur in a future revision of the authoritative geospatial data, in order to hint to authorities which geometries and attributes need updating in their respective areas. The feature sets used for classification consist of data quality metrics established in the geospatial community for measuring the quality of geospatial data. The talk will highlight first results of this classification for two similarly sized city areas of Germany and discuss the findings for selected machine learning algorithms.
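The approach described above can be sketched as a supervised classification over per-geometry quality metrics. The following is a minimal illustrative sketch, not the talk's actual pipeline: all feature names, the synthetic data, and the labeling rule are assumptions made up for this example.

```python
# Hypothetical sketch: predict which geometries in an authoritative
# dataset will change in the next revision, using OSM-derived quality
# metrics as features. Data and feature names are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(42)
n = 1000

# One row per geometry; columns mimic established quality metrics
# computed against OpenStreetMap (all values are synthetic).
X = np.column_stack([
    rng.uniform(0, 1, n),      # completeness agreement (OSM vs. official)
    rng.exponential(2.0, n),   # positional offset to nearest OSM match (m)
    rng.uniform(0, 1, n),      # attribute agreement ratio
    rng.integers(0, 3650, n),  # days since last OSM edit in the area
])

# Synthetic label: geometries with low completeness agreement and recent
# OSM activity are marked as likely to change in the next revision.
y = ((X[:, 0] < 0.5) & (X[:, 3] < 365)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
pred = clf.predict(X_test)
print(f"F1 on held-out geometries: {f1_score(y_test, pred):.2f}")
```

In a real setting the labels would come from diffing two consecutive revisions of the authoritative dataset, and the feature columns would be actual quality metrics computed per geometry rather than random values.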
28.02.19 - 10:15