How to State The Confidence of Accuracy/Inaccuracy?
Consider that I have a dataset automatically acquired by a machine that returns the following measurements:
[111, 121, 114, 154, 149, 150]
I then go and manually check how these values received by the machine compare to the true values, and I get the following measurements when checking manually:
[112, 121, 114, 154, 149, 149]
As you can see, the datasets differ in two places (I measured 112 where the machine saw 111 and I measured 149 where the machine saw 150), meaning the machine is inaccurate.
With that, what's the proper way to state and calculate how confident I am in the machine's accuracy or inaccuracy? I could obviously say it was wrong 2/6 times (roughly 33% inaccurate / 67% accurate), but I wasn't sure if there's a better way to represent this, especially for a larger dataset than the one I've listed in the example.
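To make the question concrete, here is a minimal sketch of what I'm doing now: computing the exact-match accuracy from the two lists above, plus a normal-approximation confidence interval for that proportion (my assumption that a binomial confidence interval is the right way to "state confidence" here is part of what I'm asking about):

```python
import math

machine = [111, 121, 114, 154, 149, 150]
manual = [112, 121, 114, 154, 149, 149]

# Exact-match accuracy: fraction of readings where the machine
# agrees with the manual check.
matches = sum(m == t for m, t in zip(machine, manual))
n = len(machine)
accuracy = matches / n  # 4/6, roughly 0.667

# Normal-approximation (Wald) 95% confidence interval for that proportion.
# With only 6 samples the interval is very wide, which itself conveys
# how little the point estimate alone tells you.
se = math.sqrt(accuracy * (1 - accuracy) / n)
low, high = accuracy - 1.96 * se, accuracy + 1.96 * se
print(f"accuracy = {accuracy:.3f}, 95% CI ≈ ({low:.3f}, {high:.3f})")
```

With a larger dataset the interval tightens, so reporting "accuracy with a confidence interval" scales naturally, but I don't know if that's the standard way to phrase it.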
As a follow-up question, how could I compare these confidence or accuracy levels? For example, if this machine is usually 94% accurate but was recently improved to be 98% accurate, how would I state that beyond simply saying it's 4% more accurate?
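My current guess at a comparison is a two-proportion z-test, sketched below. The sample sizes (500 readings before and after) are hypothetical, since the raw counts matter for this test, not just the percentages:

```python
import math

# Hypothetical counts: 94% accurate over 500 readings before the
# improvement, 98% accurate over 500 readings after. These n values
# are illustrative assumptions, not real data.
correct_old, n_old = 470, 500  # 94%
correct_new, n_new = 490, 500  # 98%

p_old = correct_old / n_old
p_new = correct_new / n_new

# Two-proportion z-test: is the 4-point improvement more than chance?
p_pool = (correct_old + correct_new) / (n_old + n_new)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_old + 1 / n_new))
z = (p_new - p_old) / se
print(f"old = {p_old:.2%}, new = {p_new:.2%}, z = {z:.2f}")
```

With these assumed counts the z-statistic exceeds 1.96, so the improvement would be significant at the 95% level, but I'd like to know whether this is the accepted way to state such a comparison.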
Tags: confidence, accuracy
Category: Data Science