Evaluating commercial probe data on arterials

Evaluating commercial probe data on arterials - PowerPoint PPT Presentation



  1. Adventures in Crowdsourcing: Verifying Crowdsourced Data
  Evaluating commercial probe data on arterial facilities: Insights from the Vehicle Probe Project validation program
  ZACH VANDER LAAN, UMD CENTER FOR ADVANCED TRANSPORTATION TECHNOLOGY

  2. Agenda • VPP Validation Program Background • Data Validation Approach • Arterial Case Studies • Conclusions and Next Steps

  3. Agenda • VPP Validation Program Background • Data Validation Approach • Arterial Case Studies • Conclusions and Next Steps

  4. Vehicle Probe Project (VPP) History
  VPPI (2008)
  • I-95 Corridor Coalition (now Eastern Transportation Coalition) established the first and largest multi-jurisdictional Traffic Monitoring System sourced with industry data
    ○ Established accuracy, latency, and availability standards for probe-based traffic data
    ○ Developed validation methodology and program
  VPPII (2014)
  • Established multi-vendor marketplace
  • Speed/travel-time standards extended to signalized roadways

  5. VPPI and VPPII Validation
  • Have consistently validated VPP speed and travel time data since 2009
  VPPI (2009-2014)
  • 1 vendor (INRIX)
  • 45 validation reports
  • Focused primarily on freeways at first, but started exploring arterials at the end
  VPPII (2014-current)
  • 3 vendors (HERE, INRIX, TomTom)
  • 24 total reports
  • Freeways & arterials

  6. Agenda • VPP Validation Program Background • Data Validation Approach • Arterial Case Studies • Conclusions and Next Steps

  7. Validation Process
  Main idea:
  • Collect ground truth travel time data
  • Compare with speed/travel time data reported by VPP vendors
  • Compute error metrics, visuals, and summarize results in reports
  Evaluate data quality:
  • On various road types (e.g., freeway, arterial) and geographic areas
  • From multiple perspectives:
    ○ "Point in time" vs. repeatable patterns
    ○ Overall performance vs. during aberrations
  [Figure: Wireless Re-identification Technology (WRTM) used to collect ground truth travel time samples]

  8. Traditional Validation
  • Compare vendor & WRTM speeds in 5-minute bins
    ○ Average Absolute Speed Error (AASE): measures deviation from ground truth (10 MPH spec)
    ○ Speed Error Bias (SEB): measures consistent over/underestimation of reported speed (+/- 5 MPH spec)
  • Error metrics are computed for four flow regimes
  • Specs are applied against the Standard Error of the Mean (SEM) band (interval estimate of the mean)
  • Works well on freeways, but doesn't tell the whole story on arterials
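The traditional metrics above boil down to simple comparisons of binned speeds. A minimal sketch follows, using made-up sample values and thresholds taken from the spec language on this slide; it is an illustration, not the VPP program's actual implementation.

```python
# Sketch of the traditional validation metrics: vendor vs. WRTM ground-truth
# speeds aggregated into 5-minute bins, then AASE, SEB, and the SEM band check.
import numpy as np

def aase(vendor_mph, truth_mph):
    """Average Absolute Speed Error: mean |vendor - ground truth| (MPH)."""
    return np.mean(np.abs(np.asarray(vendor_mph) - np.asarray(truth_mph)))

def seb(vendor_mph, truth_mph):
    """Speed Error Bias: mean signed error; > 0 means the vendor reports faster speeds."""
    return np.mean(np.asarray(vendor_mph) - np.asarray(truth_mph))

def sem_band(truth_samples_mph):
    """Standard Error of the Mean band for one 5-minute bin of WRTM samples."""
    s = np.asarray(truth_samples_mph, dtype=float)
    sem = s.std(ddof=1) / np.sqrt(len(s))
    return s.mean() - sem, s.mean() + sem

# Example: one 5-minute bin with several WRTM travel-time-derived speeds (hypothetical)
wrtm_bin = [31.0, 28.5, 35.2, 30.1]   # ground-truth samples (MPH)
vendor_speed = 33.4                    # vendor-reported speed (MPH)
lo, hi = sem_band(wrtm_bin)
# Specs are checked against the SEM band, not the point mean
banded_error = 0.0 if lo <= vendor_speed <= hi else min(abs(vendor_speed - lo), abs(vendor_speed - hi))
print(f"SEM band = ({lo:.1f}, {hi:.1f}) MPH, banded error = {banded_error:.1f} MPH (10 MPH AASE spec)")
```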

  9. Arterials are more complex…
  Arterial characteristics (relative to freeways)
  ● Lower volume
  ● Lower average traffic speeds
  ● Interrupted flow (traffic signals, mid-block friction)
    ○ Bi-modal speed distributions
    ○ Higher variance in speeds
  Implications for traditional validation
  ● Average (i.e., space-mean) speed is used for evaluation – unexpected results when WRTM speeds have high variance / multiple modes
  ● Error measures need to be carefully interpreted
    ○ High variance can mask performance (wide band)
  [Figure: distinct speed modes, but no one travels at the average speed]
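To make the point about averages concrete, here is a toy example with made-up numbers: on a signalized arterial, vehicles tend to either clear the signal or get stopped, so the space-mean speed sits between two modes that few vehicles actually travel, and the SEM band is wide.

```python
# Illustrative bi-modal arterial speed distribution (hypothetical values).
import numpy as np

rng = np.random.default_rng(0)
cleared = rng.normal(35, 3, size=60)   # vehicles that made the green (~35 MPH)
stopped = rng.normal(8, 2, size=40)    # vehicles delayed at the signal (~8 MPH)
speeds = np.concatenate([cleared, stopped])

mean = speeds.mean()
sem = speeds.std(ddof=1) / np.sqrt(len(speeds))
print(f"mean = {mean:.1f} MPH, SEM band = +/- {sem:.1f} MPH")
# The wide band makes it easy for a reported speed to "pass" the spec even if it
# misses both of the actual speed modes.
```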

  10. Slowdown Analysis
  • Major slowdown events identified in reference data
  • For each slowdown, vendor data is graded based on how well it captures the magnitude and duration:
    ○ Fully captured
    ○ Partially captured
    ○ Failed to capture
  • Evaluates data quality specifically during anomalies (the traditional method weights all 5-minute periods the same)
  • This approach turns out to be useful on arterials
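A rough sketch of the grading idea follows. The slowdown threshold and the overlap cutoffs are assumptions for illustration; the program's actual magnitude and duration criteria are defined in its validation reports.

```python
# Hypothetical slowdown-grading sketch: compare vendor speeds to ground-truth
# speeds over the 5-minute bins spanning one identified slowdown event.
def grade_slowdown(truth_bins, vendor_bins, slowdown_mph=30.0):
    """truth_bins / vendor_bins: speeds (MPH) for the same 5-minute bins."""
    truth_slow = [t < slowdown_mph for t in truth_bins]
    hits = sum(1 for is_slow, v in zip(truth_slow, vendor_bins) if is_slow and v < slowdown_mph)
    total = sum(truth_slow)
    if total == 0:
        return "no slowdown in reference data"
    frac = hits / total
    if frac >= 0.8:      # assumed threshold
        return "fully captured"
    elif frac >= 0.3:    # assumed threshold
        return "partially captured"
    return "failed to capture"

# Example: ground truth shows a 30-minute slowdown; the vendor reacts late
truth  = [25, 22, 20, 21, 24, 28]
vendor = [38, 36, 27, 24, 26, 35]
print(grade_slowdown(truth, vendor))   # -> "partially captured"
```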

  11. Agenda • VPP Validation Program Background • Data Validation Approach • Arterial Case Studies • Conclusions and Next Steps

  12. Original Arterial Report (VPPI)
  ● Original arterial report produced in 2015
    ○ 13 separate data collections during 2013-2014
    ○ VPPI data only (1 vendor)
  ● Key findings:
    ○ Data quality depended heavily on road characteristics (signal density and, to a lesser extent, volume)
    ○ Slowdown analysis provided the most insight into data quality
    ○ Fundamental issues across all case studies (erring towards faster speeds; complex flow patterns can't be observed)
  [Figure: recommendations from the ORIGINAL VPPI report]

  13. Arterial Report Update – VPPII Follow-up
  ● ETC Coalition commissioned an update based on VPPII data
    ○ Has data quality improved over time?
    ○ Are there major differences in data quality across vendors?
    ○ Is data quality still linked to road characteristics (signal density)?
  ● Updated report was produced in November 2019
    ○ 14 separate arterial data collections between 2014-2018
    ○ 3 vendors (called VPPII Vendors 1, 2, 3)

  14. Traditional Analysis Results
  All vendors in VPPII:
  ● Are highly compliant with contract specs
  ● Have improved error measures (AASE and SEB) across speed bins relative to VPPI levels
  → Encouraging results, but recall that traditional validation does not tell the whole story on arterials

  15. Slowdown Analysis Results
  ● Drastic improvement for all vendors
  ● Fully captured: 33% → 59-66%
  ● Failed to capture: 25% → 6-10%

  16. Impact of Road Characteristics
  ● VPPI data quality was closely linked to road characteristics (especially signal density)
  ● This is NOT the case with VPPII data (all vendors)
  ● But AADT and signal density are still worth considering
    ○ Lower AADT = fewer observations (harder to characterize ground truth conditions)
    ○ Higher signal density = more complex traffic flow

  17. Agenda • VPP Validation Program Background • Data Validation Approach • Arterial Case Studies • Conclusions and Next Steps

  18. Conclusions
  ● Performance has improved dramatically over time
    ○ All 3 vendors much more accurate now than in VPPI
  ● Within the observed range of road conditions (0-3 signals/mile, >20k AADT):
    ○ All 3 vendors' data is suitable for planning and many operational use cases
    ○ Data quality is no longer tied to road characteristics (encouraging, but harder to provide "rules of thumb")
  ● Existing challenges:
    ○ Data errs towards faster speeds during congested periods
    ○ Complex flow patterns can't be captured in a single value
    ○ Low volume roads are difficult to validate

  19. Next Steps
  ● Evaluate arterial probe data under conditions that fall outside the current case studies
    ○ Two 2020 low-volume deployments (<20k AADT)
  ● Refine analysis techniques
    ○ Focus on developing methods to quantify repeatable patterns, rather than just "point-in-time" performance (e.g., comparing time-of-day travel time distributions)
  ● Prepare for VPPIII launch in 2021
    ○ Speed and travel time data remain a core product
    ○ Traffic volume & other products added (e.g., trajectory, O-D)
    ○ Currently developing validation strategies to streamline the process and accommodate new data
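As one way to picture the "repeatable patterns" direction mentioned above, the sketch below compares time-of-day travel time distributions between ground truth and vendor data. The two-sample Kolmogorov-Smirnov statistic and all sample values are assumptions for illustration, not the program's adopted method.

```python
# Hypothetical comparison of time-of-day travel time distributions, rather than
# grading each 5-minute bin in isolation.
import numpy as np
from scipy.stats import ks_2samp

def compare_tod_distributions(truth_by_tod, vendor_by_tod):
    """Both args: dicts mapping a time-of-day label (e.g., 'AM peak') to arrays
    of travel times (seconds) collected over many days."""
    results = {}
    for tod, truth in truth_by_tod.items():
        stat, pvalue = ks_2samp(truth, vendor_by_tod[tod])
        results[tod] = {"ks_stat": round(stat, 3), "p_value": round(pvalue, 3)}
    return results

# Example with made-up travel times over several weekdays
rng = np.random.default_rng(1)
truth = {"AM peak": rng.normal(180, 40, 100), "Midday": rng.normal(120, 15, 100)}
vendor = {"AM peak": rng.normal(160, 25, 100), "Midday": rng.normal(118, 14, 100)}
print(compare_tod_distributions(truth, vendor))
```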
