(Brutus, I think you meant to say “LCD”, not “CRT” a couple of times)
According to most of the literature, they don’t continue to go bad over time, but it’s common for pixels on LCDs to fail shortly after initial use. Due to current manufacturing limitations, it’s hard to avoid, and most LCD manufacturers have a stated number of “dead pixels” they consider acceptable. After you’ve used the panel for a while, all the pixels that are going to die should be dead.
Most of the drawbacks for gamers have been covered - there are latency problems with quickly changing images, and LCDs have a native resolution. When you run at a lower resolution, the driver has to interpolate, and in most cases you get “jaggies”. I even had a driver that took the tack of handling lower resolutions by restricting the image to a smaller area in the center of the screen.
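If it helps to picture where the jaggies come from, here’s a rough back-of-the-envelope sketch in Python - purely illustrative nearest-neighbor math, not what any particular scaler chip actually does:

def scaled_column_widths(source_width, panel_width):
    """For each source pixel column, count how many physical
    panel columns it gets stretched across."""
    widths = [0] * source_width
    for panel_x in range(panel_width):
        # Map each physical panel column back to a source column.
        src_x = panel_x * source_width // panel_width
        widths[src_x] += 1
    return widths

# An 800-wide image on a 1280-wide panel: the ratio is 1.6, so some
# source pixels end up 1 panel pixel wide and others 2 - that uneven
# stretching is what shows up as jaggies.
print(sorted(set(scaled_column_widths(800, 1280))))   # -> [1, 2]

A CRT doesn’t have this problem, because the beam simply repaints at whatever resolution you feed it - there’s no fixed pixel grid to map onto.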
I’m not a gamer, so I’ll never run my monitor at anything below its native resolution, and neither that nor the latency problems bother me. They may seriously bother you, though, if you try to play games on an LCD.
The other major drawback with LCDs (besides price) is viewing angle - though they are getting better.
LCDs produce much crisper text and are usually good for digital photography, but the images they produce aren’t to everyone’s taste, and they aren’t suited to every purpose.
One feature of the industry I consider very ironic is the “impedance mismatch” between display vendors and video card vendors. The former are pushing LCD screens like mad, including the DVI digital interface. The video card makers, locked in a 3D frame-rate war with each other for the “hearts and minds” of the gamers, have dragged their feet on DVI support, and the selection of cards compatible with your LCD has always been limited (this is also getting better, but is still a factor).
That may be another reason a gamer might not want an LCD, at least a DVI/DFP-based one. The high-end video card you’ve got your eye on might not support it.