It depends entirely on two things. One most people will agree with me on; the other most people will not.
First, it depends on the type of music you are ripping: the dynamic range, the busyness of the music, the types of instruments used, and whether or not there is a vocal component. I have done many experiments at home with this, as music quality is important to me.
There are only a couple of 112 kbit MP3s I have for which I cannot tell a quality difference compared to the CD original. But at that bitrate, this is to be expected.
I have found that I can tell a quality difference on roughly 40-50% of 128 kbit MP3s when compared to the CD original. Not bad overall, since I am a pretty discriminating listener with excellent hearing, esp. in the high ranges (I can hear 17+ kHz).
I have found that I can tell a quality difference on roughly 1-3% of 160 kbit MP3s when compared to the CD original. The only ones I hear a difference on are those with extreme dynamic range or pitch changes, or very busy and fast music, esp. crashing symphonic music.
I have found that I cannot tell a quality difference on any 192 kbit MP3s, when compared to the CD original.
The second factor, which many will not agree with me on: I have, can, and do notice a small quality difference between MP3s ripped by different rippers from the same source. This is easy for me to test, too - I can go out to Napster, download 5-6 different copies of the same song ripped at the same rate, and often, in any blind test you arrange, I can tell even at 160 kbit that at least one of the rips differs in quality from the others. People do not believe me, but I am and will be happy to demonstrate IRL if that is ever possible. I've also noticed that the baseline volume level of the MP3s varies a lot between rippers, which is also disturbing to me.
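That last point - the baseline volume varying between rippers - is the one part you can check objectively rather than by ear. Here is a minimal Python sketch (my own illustration, not anyone's official tool; it assumes you have already decoded each MP3 to a 16-bit mono WAV with whatever decoder you like) that measures the RMS level of a decode in dBFS, so you can compare loudness between two rips of the same song:

```python
import math
import struct
import wave

def rms_dbfs(path):
    """Return the RMS level of a 16-bit mono WAV file in dBFS.

    0 dBFS is a full-scale signal; a quieter rip of the same song
    will come back with a more negative number. Decode each MP3 to
    WAV first, then compare the values.
    """
    with wave.open(path, "rb") as w:
        assert w.getsampwidth() == 2, "expects 16-bit samples"
        assert w.getnchannels() == 1, "expects mono (mix down first)"
        frames = w.readframes(w.getnframes())
    # Unpack the raw bytes into signed 16-bit samples.
    samples = struct.unpack("<%dh" % (len(frames) // 2), frames)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Express relative to full scale (32768 for 16-bit audio).
    return 20 * math.log10(rms / 32768.0)
```

If two rips of the same track differ by a dB or more here, the rippers are not even normalizing the same way, let alone encoding identically.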
Anyhow. Hope this helps.