Math question

I really suck at math. At my work we perform a number of tasks, and we're allowed a certain number of errors per 1000 tasks. If someone performs 1673 tasks and commits 2 errors, what is the error rate per 1000?

Divide the number of errors by the total number of tasks to get the error rate: 2 / 1673 = 0.001195457 (roughly).
Then multiply that by 1000 to get the number of errors per thousand tasks. So it works out to about 1.195 errors per 1000.
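If you'd rather let a few lines of Python do the arithmetic, here's a quick sketch of that same calculation (the variable names are just for illustration):

```python
# Error rate per 1000 tasks: divide errors by total tasks, then scale up.
errors = 2
tasks = 1673

rate_per_1000 = errors / tasks * 1000
print(round(rate_per_1000, 3))  # 1.195
```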

Another way to look at this problem is **error rate** per thousand.

*Per* means “divide by”.

How many thousand tasks do you have? 1673 / 1000 = 1.673 thousand tasks

How many errors? 2 errors.

error rate = errors per thousand tasks = 2 / 1.673 ≈ 1.195
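Same answer either way. Here's that second version as a tiny Python sketch, again with illustrative variable names:

```python
# Convert the task count to "thousands of tasks" first, then divide.
errors = 2
thousands_of_tasks = 1673 / 1000  # 1.673

rate_per_1000 = errors / thousands_of_tasks
print(round(rate_per_1000, 3))  # 1.195
```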