  1. Semantic Assessment and Monitoring of Crowdsourced Geographic Information Hamish McNair, Paul Goodhue University of Canterbury Christchurch, New Zealand

  2. Outline • Our research • Project outline • FOSS framework for the project • Crowdsourcing information • Determining Trust • Ontologies • Linked Data • Future direction & Conclusion

  3. Our Research • Trusting Crowdsourced Geographic Information – Improving the trust of crowdsourced geographic information • Crowdsourcing Spatial Data Supply Chains – Implications of trust beyond the capture of crowdsourced geographic information.

  4. Project – Fruit Trees

  5. Project – Fruit Trees

  6. Project – Fruit Trees

  7. Processing pipeline: INPUT → TRUST RATING → ONTOLOGY → LINKED DATA (SPARQL, RDFLib) → OUTPUT (Folium, RDFLib)

  8. INPUT

  9. Crowdsourcing architecture: User Interface → WFS-T → Data Server → Database
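As a rough illustration of this INPUT step, the sketch below pushes one crowdsourced fruit tree into the database through a WFS-T Transaction. The endpoint URL, namespace, layer name (foss4tree:fruit_tree) and attribute names are placeholders, not the project's actual configuration.

# Minimal sketch: insert a crowdsourced fruit tree via a WFS-T Transaction.
# Endpoint, layer and attribute names below are illustrative assumptions.
import requests

WFS_URL = "http://example.org/geoserver/wfs"  # hypothetical transactional WFS endpoint

insert_xml = """<?xml version="1.0" encoding="UTF-8"?>
<wfs:Transaction service="WFS" version="1.1.0"
    xmlns:wfs="http://www.opengis.net/wfs"
    xmlns:gml="http://www.opengis.net/gml"
    xmlns:foss4tree="http://somethingGoesHere.org/foss4tree">
  <wfs:Insert>
    <foss4tree:fruit_tree>
      <foss4tree:fruit_tree_species>Lemon</foss4tree:fruit_tree_species>
      <foss4tree:fruit_tree_height>2</foss4tree:fruit_tree_height>
      <foss4tree:geom>
        <gml:Point srsName="EPSG:4326">
          <gml:pos>-43.5321 172.6362</gml:pos>
        </gml:Point>
      </foss4tree:geom>
    </foss4tree:fruit_tree>
  </wfs:Insert>
</wfs:Transaction>"""

resp = requests.post(WFS_URL, data=insert_xml, headers={"Content-Type": "text/xml"})
print(resp.status_code)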

  10. Data Model

  11. TRUST RATING

  12. Conceptual Trust Model – components of CGI assessed for both intrinsic and extrinsic trust:
      • Spatial (assessment of the feature) – Intrinsic: shape metrics of the geometry, based on the geometry type. Extrinsic: spatial comparison to neighbours, based on rules about the CGI.
      • Temporal (assessment of the feature) – Intrinsic: assessment of the feature's changelog or age. Extrinsic: temporal comparison to neighbours, based on rules about the CGI.
      • Semantic (assessment of the information) – Intrinsic: assessment of the internal consistency of the CGI with ontologies known to the CGI. Extrinsic: assessment of the CGI against external data and ontologies describing influences on the CGI.
      • Social (assessment of the information source) – Intrinsic: assessment of the author's trust and its likely influence on the trust of the CGI, e.g. through previous trust ratings or assessments of local knowledge. Extrinsic: assessment of the author's trust as reviewed by the crowd, e.g. through Linus's Law, peer reviews and consensus crowdsourcing.

  13. Trust Model – implementation: feature type rules are queried from OWL and features are queried from PostgreSQL/PostGIS; the comparisons between features and the ontology are made in Python, and the resulting trust rating is written back to the database.
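A minimal sketch of that query–compare–write-back loop is below. The table, column, species and file names are illustrative assumptions; only the overall flow (PostGIS features, OWL rules via RDFLib, comparison in Python, rating written back) follows the slide.

# Sketch of the trust-model loop: features from PostgreSQL/PostGIS, the
# feature-type rule (maximum height) from the OWL ontology via RDFLib,
# comparison in Python, rating written back. Names are illustrative only.
import psycopg2
from rdflib import Graph, Namespace

FOSS4TREE = Namespace("http://somethingGoesHere.org/foss4tree#")

onto = Graph()
onto.parse("foss4tree.owl", format="xml")  # assumed export of the Protégé ontology

conn = psycopg2.connect("dbname=fruit_trees user=postgres")
cur = conn.cursor()
cur.execute("SELECT id, species, height FROM fruit_tree;")

for feature_id, species, height in cur.fetchall():
    # Look up the rule for this species, e.g. foss4tree:appleTree hasMaxHeight 10
    max_height = onto.value(FOSS4TREE[species], FOSS4TREE.hasMaxHeight)
    rating = 100 if max_height is not None and float(height) <= float(max_height) else 0
    cur.execute("UPDATE fruit_tree SET trust_rating_metrics = %s WHERE id = %s",
                (rating, feature_id))

conn.commit()
conn.close()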

  14. Feature Trust Rating
      fruit_tree_species = Lemon
      fruit_tree_height = 2m
      fruit_tree_crown_diameter = 1m
      fruit_tree_dbh = 0.12m
      fruiting_observation = Fruiting
      fruit_tree_trust_rating_overall = 100
      fruit_tree_trust_rating_metrics = 100
      fruit_tree_trust_rating_fruiting = 100
      fruit_tree_trust_rating_location = 100

  15. Feature Trust Rating
      fruit_tree_species = Coconut
      fruit_tree_height = 5m
      fruit_tree_crown_diameter = 2m
      fruit_tree_dbh = 0.3m
      fruiting_observation = Fruiting
      fruit_tree_trust_rating_overall = 66.67
      fruit_tree_trust_rating_metrics = 100
      fruit_tree_trust_rating_fruiting = 100
      fruit_tree_trust_rating_location = 0
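The overall ratings on slides 14 and 15 are consistent with a simple mean of the three component ratings; a tiny sketch under that assumption:

# Assuming the overall rating is the mean of the component ratings
# (an inference from slides 14-15, not stated in the presentation):
component_ratings = {"metrics": 100, "fruiting": 100, "location": 0}
overall = sum(component_ratings.values()) / len(component_ratings)
print(round(overall, 2))  # 66.67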

  16. ONTOLOGY

  17. Ontologies • Ontologies in crowdsourcing? – accessibility – adjustability – versatility • Implementation – Protégé – OWL/RDFS/XML

  18. Ontology

  19. Ontology hasMaxHeight

  20. Ontology: hasMaxHeight → 10 metres
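A minimal RDFLib sketch of the statement pictured on slides 18–20, with the namespace taken from the later SPARQL slides and the class name appleTree from slide 26. In the project this relationship lives in the Protégé-authored OWL file rather than being built in code.

# Sketch: foss4tree:appleTree --hasMaxHeight--> 10 (metres), built with RDFLib.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import XSD

FOSS4TREE = Namespace("http://somethingGoesHere.org/foss4tree#")

g = Graph()
g.bind("foss4tree", FOSS4TREE)
g.add((FOSS4TREE.appleTree, FOSS4TREE.hasMaxHeight, Literal(10, datatype=XSD.integer)))

print(g.serialize(format="turtle"))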

  21. Protégé

  22. Protégé

  23. Protégé

  24. Protégé

  25. LINKED DATA SPARQL RDFLib

  26.–31. SPARQL Query in RDFLib • Return reference attributes (via URIs):
      SELECT ?O WHERE { <http://somethingGoesHere.org/foss4tree#appleTree> foss4tree:hasMaxHeight ?O }
      • The returned value is passed on to the trust model.
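A sketch of running that query with RDFLib; the ontology filename is assumed, and the prefix binding supplies the foss4tree: namespace used in the query text.

# Run the slide's SPARQL query with RDFLib and hand the result to the trust model.
from rdflib import Graph, Namespace

FOSS4TREE = Namespace("http://somethingGoesHere.org/foss4tree#")

g = Graph()
g.parse("foss4tree.owl", format="xml")  # assumed export of the Protégé ontology

query = """
SELECT ?O WHERE {
  <http://somethingGoesHere.org/foss4tree#appleTree> foss4tree:hasMaxHeight ?O
}
"""

for row in g.query(query, initNs={"foss4tree": FOSS4TREE}):
    max_height = row.O  # reference attribute handed on to the trust model
    print(max_height)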

  32. Linked Data • Structure of RDF – Triples (Subject, Predicate, Object) <http://somethingGoesHere.org/foss4tree#t44> <foss4tree:hasHeight> <2.5> – Familiar (URIs), accessible, mashups
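In strict RDF the numeric object would be a typed literal rather than a URI; the slide's example triple would look roughly like this in RDFLib:

# The example triple from slide 32, with the height as a typed literal:
# foss4tree:t44 foss4tree:hasHeight "2.5"^^xsd:decimal
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import XSD

FOSS4TREE = Namespace("http://somethingGoesHere.org/foss4tree#")
g = Graph()
g.add((FOSS4TREE.t44, FOSS4TREE.hasHeight, Literal("2.5", datatype=XSD.decimal)))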

  33. OUTPUT Folium RDFLib

  34. Output pipeline: LINKED DATA → WUNDERGROUND → PYTHON MODEL → FOLIUM → OUTPUT

  35. Output pipeline: LINKED DATA (trust rating > 70) → WUNDERGROUND (windspeed) → PYTHON MODEL → FOLIUM ("map this") → OUTPUT

  36. LINKED DATA – SPARQL graph pattern used to select trusted trees for the Python model:
      ?id <http://somethingGoesHere.org/foss4tree#hasTR> ?tr . FILTER (?tr > 70)
      ?id <http://somethingGoesHere.org/foss4tree#hasSpecies> ?species .
      ?id <http://somethingGoesHere.org/foss4tree#hasFruiting> ?fruiting .
      ?id <http://somethingGoesHere.org/foss4tree#hasLat> ?lat .
      ?id <http://somethingGoesHere.org/foss4tree#hasLong> ?long .
      ?id <http://somethingGoesHere.org/foss4tree#hasHeight> ?height

  37. PYTHON MODEL – trees with a trust rating > 70 are returned as rows of (ID, LAT, LONG, …), e.g. IDs i–iv, and handed on to Folium.

  38. WUNDERGROUND (Weather Underground, www.wunderground.com) – API calls:
      http://api.wunderground.com/api/##/geolookup/q/%f,%f.json
      http://api.wunderground.com/api/##/conditions/q/pws:%s.json
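A hedged sketch of this step using the two URL patterns from the slide: geolookup finds a personal weather station near a tree, conditions reads its current weather. The API key (elided as ## on the slide) and the JSON field names below are assumptions, not taken from the presentation.

# Weather Underground lookup sketch; key and response field names are assumed.
import requests

API_KEY = "your_key_here"   # the slide elides the key as ##
lat, lon = -43.5321, 172.6362

geo_url = "http://api.wunderground.com/api/%s/geolookup/q/%f,%f.json" % (API_KEY, lat, lon)
stations = requests.get(geo_url).json()
station_id = stations["location"]["nearby_weather_stations"]["pws"]["station"][0]["id"]

cond_url = "http://api.wunderground.com/api/%s/conditions/q/pws:%s.json" % (API_KEY, station_id)
conditions = requests.get(cond_url).json()
windspeed = conditions["current_observation"]["wind_kph"]
print(station_id, windspeed)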

  39. Output pipeline: LINKED DATA (trust rating > 70) → WUNDERGROUND (windspeed) → PYTHON MODEL → FOLIUM → OUTPUT

  40. FOLIUM – building the map in Python:
      map1 = folium.Map(location=[Lat, Long], zoom_start=16)
      ...
      for tree in trees:
          map1.simple_marker([treeLat, treeLong], popup='...')
      https://github.com/python-visualization/folium

  41. OUTPUT – the Folium map is written out as HTML.
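A self-contained sketch of this OUTPUT step; the coordinates are illustrative (central Christchurch), and the output filename is assumed.

import folium

# Rebuild a minimal map (as on slide 40) and write it out as the HTML output of slide 41.
map1 = folium.Map(location=[-43.5321, 172.6362], zoom_start=16)
map1.save("fruit_trees.html")  # older Folium releases used map1.create_map(path=...)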

  42. Where to from here…

  43. Where to from here… WHY? Improved credibility of crowdsourced data

  44. Where to from here… WHY? Improved credibility of crowdsourced data HOW? Trust models and implementation

  45. Where to from here… WHY? Improved credibility of crowdsourced data HOW? Trust models and implementation THE HERE AND NOW

  46. Traditional Spatial Datasets • Credibility from legacy • Provenance for tracing errors • Dataset-level consideration

  47. W3C PROV DATASET wasGeneratedBy COLLECTION

  48. W3C PROV DATASET wasGeneratedBy COLLECTION … back to triples!

  49. W3C PROV DATASET wasAttributedTo wasGeneratedBy COLLECTION AGENCY

  50. W3C PROV DATASET wasAttributedTo wasGeneratedBy COLLECTION AGENCY wasAssociatedWith
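A minimal RDFLib sketch of the relations drawn on slides 47–50, using the W3C PROV-O vocabulary ("back to triples"). Which node each arrow attaches to is read off the diagrams, so treat the subject/object choices as assumptions.

# Provenance triples from slides 47-50 expressed with PROV-O.
from rdflib import Graph, Namespace

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://somethingGoesHere.org/prov#")  # hypothetical namespace

g = Graph()
g.bind("prov", PROV)
g.add((EX.dataset, PROV.wasGeneratedBy, EX.collection))    # DATASET wasGeneratedBy COLLECTION
g.add((EX.dataset, PROV.wasAttributedTo, EX.agency))       # DATASET wasAttributedTo AGENCY
g.add((EX.collection, PROV.wasAssociatedWith, EX.agency))  # COLLECTION wasAssociatedWith AGENCY

print(g.serialize(format="turtle"))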

  51. Authoritative Data • Dataset-level reactive provenance

  52. Authoritative Data • Dataset-level reactive provenance

  53. Authoritative Data • Dataset-level reactive provenance

  54. Crowdsourced Data • Feature level

  55. Crowdsourced Data • Feature level

  56. Crowdsourced Data • Feature level

  57. Crowdsourced Data • Feature level

  58. Crowdsourced Data • Feature level

  59. Trust Ratings • Simple indication of the credibility of datasets, features and attributes • Provides proactive provenance • Increases usability of crowdsourced data
