Without getting into the debate about whether it’s a good proposal or a bad proposal, I am curious to know why that particular value seems to be the current battle-cry. I would think that $10.00 or $10.25 would just be easier to keep track of.
Something like the minimum wage is going to be subject to regular adjustment.
Usually, when you foresee this, you adopt a regular method of calculating adjustments, which will include rounding conventions - we round to the nearest 10c, we round to the nearest 25c, we round to the nearest $1. Sometimes it can be a little more sophisticated than that - within the range of $0-$10, we round to the nearest 1c; within the range of $10-$100, we round to the nearest 10c; in excess of $100, we round to the nearest $1 (or whatever).
Up to now, the hourly rate of the federal minimum wage in the US has always been expressed in multiples of 5c, and I'm guessing that points to a certain convention. If the hourly rate is to exceed $10, you might argue that for convenience a slightly more granular rounding convention is appropriate. I don't see, though, any compelling case for making it as granular as the nearest 25c or the nearest $1.
It wasn’t always. My first minimum-wage job paid $1.46/hr, IIRC (although it’s possible that was a sub-minimum wage – I was only 15 at the time). But later minimum-wage jobs in that era (early 70s) also paid rates that were not multiples of 5 cents.
As for the OP, I don’t know, but I would suspect that someone started at some previous minimum wage rate and applied COLAs (cost of living adjustments) for all the years since. If so, they went back to before any of the states started to apply COLAs, since the highest minimum wage of any state is less than $10.
That would have been your state’s minimum wage, then (or your poor memory, grandpa! ;)). You can see all the federal minimum wages here, and they’re all multiples of 5 cents.
Or he worked at a restaurant, as I did as a teen (current Federal minimum cash wage for tipped employees is $2.13/hr).
Yeah, but that’s not the minimum wage; if he didn’t make enough in tips, the employer was obliged to make it up.
True enough - but that may have been the rate he remembers the employer actually paying.
In any case, I’m not sure I understand the premise of the original question - how is $10.00 or $10.25 (or any other wage) “easier to keep track of” than $10.10 (or any other wage)? Easier for whom, and in what way? Ten cents an hour isn’t much, but it adds up to $200 over the course of a year for someone working 40 hours × 50 weeks, which I’m sure someone making minimum wage would appreciate.
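That back-of-the-envelope figure checks out; here is the arithmetic spelled out (the 40-hour week and 50-week year are the poster's own illustrative assumptions):

```python
# Annual value of an extra 10 cents per hour,
# assuming 40 hours/week and 50 working weeks/year.
extra_per_hour = 0.10
hours_per_week = 40
weeks_per_year = 50

annual_extra = extra_per_hour * hours_per_week * weeks_per_year
print(f"${annual_extra:.2f}")  # → $200.00
```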
$10.10 was arrived at by taking the last federal minimum wage and adjusting it to the current cost of living.
It’s probably my poor memory.
Don’t think so. Or rather, something like this was probably done, but not starting from the most recent raise.
The current rate is $7.25 and was put in place about 5 years ago. $10.10 is about a 39% raise over that figure. The rate of inflation has been fairly low over that period - way too low to compound to 39%.
In order to get $10.10 by this method, you’d have to include an era with very high inflation. The last such period was the 70s, which means they’d have had to go back a long way.
The minimum wage jumped around between $9 and $10 in 2013 dollars through most of the period 1955-1980. So I’d say that’s the reason the proposed raise is around ten dollars.
But as to why $10.10 instead of $10.00 even, I have no idea. Maybe just as an inflation buffer, to keep it above the 1955-1980 average for longer.
Why should having a number that’s easy to keep track of be important? If you were earning $7,300 a year and it increased to $7,310, would you complain that the extra ten dollars made it hard to keep track?
He said cost of living, not inflation. They’re different things.
I assume that these things are decided by some government committee. There would have been extreme opinions on either side - “Keep it as it is - how can business recover if they have to pay any more?” “How can anyone live on that - should be at least double.” The result is a compromise.
Yes, but the US cost of living has actually grown slower than inflation over the past 5 years because of the housing crash.
Took me forever, but I figured it out:
So they got the round figure by doing the math for a one-income family of three, and the specific by… well, marketing.
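One plausible reconstruction of that "one-income family of three" math, with loudly flagged assumptions: the poverty-guideline figure below is roughly the 2014 HHS guideline for a household of three, and full-time hours are taken as 40 hours for 52 weeks; neither figure comes from the thread itself.

```python
# Sketch: what hourly wage lifts a full-time, one-income family
# of three to the poverty line? (Assumed figures, see lead-in.)
poverty_line = 19790        # assumed ~2014 HHS guideline, family of 3
annual_hours = 40 * 52      # full-time, year-round = 2080 hours

wage_needed = poverty_line / annual_hours
print(f"${wage_needed:.2f}/hr")        # → $9.51/hr

# At the proposed $10.10/hr, full-time annual earnings:
print(f"${10.10 * annual_hours:,.2f}") # → $21,008.00, above the guideline
```

So a round figure near $10 drops out of the poverty math; the extra dime, as the post says, is presumably marketing.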