Module 1.1 Calculating Metrics for Spatial Data Quality

Here we are in week 1, module 1. The topics covered this week are precision vs. accuracy, RMSE, and the CDF. The learning outcomes were being able to explain the difference between precision and accuracy, calculate vertical and horizontal positional accuracy and precision, and calculate the root-mean-square error (RMSE) and cumulative distribution function (CDF).
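The two summary statistics named above are straightforward to compute by hand. As a minimal sketch (the error values here are made up for illustration, not my actual waypoint data), RMSE and an empirical CDF over a set of positional errors look like this:

```python
import math

def rmse(errors):
    """Root-mean-square error of a list of positional errors (meters)."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def empirical_cdf(errors, threshold):
    """Fraction of errors at or below the threshold (the CDF evaluated there)."""
    return sum(1 for e in errors if e <= threshold) / len(errors)

# Hypothetical horizontal errors in meters
errors = [1.2, 2.5, 0.8, 3.1, 1.9, 2.2, 4.0, 1.5]

print(round(rmse(errors), 2))        # → 2.36
print(empirical_cdf(errors, 2.5))    # → 0.75 (75% of errors are within 2.5 m)
```

The CDF is what lets you read off statements like "68% of points fall within X meters," which is exactly the form the precision estimates below take.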

Below is the map I created showing the precision of the collected waypoints. I started by averaging the 50 collected waypoints to create an average waypoint feature. Then I measured the distance from the average waypoint feature to each collected waypoint; the shorter the distance between the two, the greater the precision. From there, I created three buffers representing the 50%, 68%, and 95% precision estimates, which made it easy to visualize the variability of the precision. Finally, I determined the 68% precision estimate of the elevation for the vertical accuracy.
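The buffer radii come from percentiles of the waypoint-to-mean distances. A minimal sketch of that workflow, using a few hypothetical projected coordinates in place of my 50 waypoints and a simple nearest-rank percentile:

```python
import math

def percentile(sorted_vals, p):
    """Nearest-rank percentile of an already-sorted list of values."""
    k = max(0, math.ceil(p / 100 * len(sorted_vals)) - 1)
    return sorted_vals[k]

# Hypothetical waypoint coordinates (meters, projected CRS)
waypoints = [(10.2, 5.1), (11.0, 4.8), (9.7, 5.5), (10.5, 5.0), (10.1, 4.6)]

# Average waypoint feature
mx = sum(x for x, _ in waypoints) / len(waypoints)
my = sum(y for _, y in waypoints) / len(waypoints)

# Distance from the average waypoint to each collected waypoint
dists = sorted(math.hypot(x - mx, y - my) for x, y in waypoints)

# Buffer radii for the 50%, 68%, and 95% precision estimates
radii = {p: percentile(dists, p) for p in (50, 68, 95)}
```

Each radius in `radii` is the buffer distance within which that percentage of waypoints falls around the mean location.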




Horizontal accuracy measures how close a GPS-derived location is to a known, true position—typically a surveyed benchmark.

Horizontal precision measures how tightly clustered repeated GPS readings are, regardless of their proximity to the true location.
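To make the distinction concrete, here is a small sketch with invented readings that are tightly clustered (high precision) yet offset from the true point (low accuracy); none of these numbers are from my actual survey:

```python
import math

# Hypothetical GPS readings (meters, projected CRS) and a surveyed true point
readings = [(3.0, 1.0), (3.2, 0.8), (2.9, 1.1), (3.1, 1.0)]
truth = (0.0, 0.0)

# Mean of the readings
mx = sum(x for x, _ in readings) / len(readings)
my = sum(y for _, y in readings) / len(readings)

# Accuracy: distance from the mean reading to the known true position
accuracy = math.hypot(mx - truth[0], my - truth[1])

# Precision: spread of readings around their own mean (ignores the true point)
precision = max(math.hypot(x - mx, y - my) for x, y in readings)

print(round(accuracy, 2))   # large: the cluster sits ~3 m from the truth
print(round(precision, 2))  # small: readings agree with each other
```

A unit can therefore be precise without being accurate; accuracy needs the reference point, precision does not.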

The average GPS location was 2.8 meters from the surveyed reference point, so the horizontal accuracy, measured as the distance between the mean GPS location and the reference point, was 2.8 meters. The GPS unit demonstrated a horizontal precision of approximately 3.2 meters, meaning that 68% of the recorded waypoints fell within a 3.2-meter radius of the mean location. So while the GPS readings were moderately dispersed, they remained relatively close to the true location. Because the precision radius (3.2 m) is only slightly larger than the accuracy offset (2.8 m), the random error and the systematic bias are of comparable magnitude; neither clearly dominates.


 
