Monitors

I need to buy some 24 (or so) inch monitors for medical imaging. Many people seem to use these: NEC 24" Professional Wide Gamut Graphics Desktop Monitor, 1920x1200, 340cd/m2 Brightness, 1000:1 Contrast Ratio, 8ms Response Time, DVI-D, HDMI, VGA

These cost about $800.00 each.

OTOH: I can get, for example, these: ASUS VS248H-P 24" Full HD 1920x1080 2ms HDMI DVI VGA Back-lit LED Monitor

for $149 each.
I know that in the past medical monitors were significantly different from commercially available ones. Is that still the case? Why would the expensive one do better than the cheap ones?

The biggest difference between the two is the “wide gamut” on the NEC monitor. It will be able to display more colors, particularly on the saturated green end of the spectrum. That can be very important for photographers and print designers in some but not all circumstances. It’ll be totally irrelevant if your medical images and display software aren’t designed for wide gamut.

How important is accurate, saturated color for this particular application? If it’s just for looking at single-color images like digital x-rays, my WAG is that the wide gamut monitor is complete overkill.
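To put a rough number on the “more colors” point above, you can compare the area of the sRGB and Adobe RGB triangles in CIE xy chromaticity space (Adobe RGB being the usual wide-gamut target). This is just an illustrative back-of-the-envelope sketch in Python using the published primary coordinates; it isn’t tied to either of the specific monitors in question.

```python
# Rough comparison of how much of the CIE xy chromaticity diagram a
# standard-gamut (sRGB) vs. wide-gamut (Adobe RGB) display can cover.
# Primaries are the published xy coordinates for each color space.

def triangle_area(pts):
    """Area of a triangle given three (x, y) vertices (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

# (red, green, blue) primaries in CIE xy
srgb      = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
adobe_rgb = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]

a_srgb, a_adobe = triangle_area(srgb), triangle_area(adobe_rgb)
print(f"sRGB area:      {a_srgb:.4f}")
print(f"Adobe RGB area: {a_adobe:.4f}")
print(f"Adobe RGB covers roughly {100 * (a_adobe / a_srgb - 1):.0f}% more area, "
      "almost entirely in the greens (only the green primary differs).")
```

Of course, as noted above, none of that extra area matters if the imaging software and image data never ask for anything outside sRGB.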

As a computer guy who has bid on gigs for medical and dental offices I agree with this sentiment.

The challenge from an IT standpoint is that such facilities are often very picky and have a high cost tolerance. The way medical IT providers present such items is, “this is what we recommend for best performance,” and they will force the client to sign off on any request for lower-performance items, much like a medical facility might force you to sign a form stating you are proceeding “against medical advice” if you decline treatment.

If the facility comes back later unhappy with the cheaper displays, the IT provider will happily say… no problem, $800 each plus labor to remove the old ones and install the new ones, and no, we are not crediting you for the cheaper monitors you insisted on initially.

I used to work for the NHS in the UK. It was my experience that anything sold for “medical use” would be several times the price of a commercial equivalent.

One example: The Orthopods wanted a pair of bolt croppers to cut the titanium pins that they use. They did not need to be sterile, as the pins were sterilised after cutting. They ordered a pair from a theatre equipment catalogue that were exactly the same as a pair from the same manufacturer sold in an engineers’ equipment catalogue, but at three times the price.

What sort of medical imaging are you viewing? What are the requirements?

Because these NEC monitors sold for medical imaging use are a lot more expensive than the ones you’re talking about. They start at $1,500 and go up to ten times that. The products also mention being cleared by the FDA or having DICOM calibration. Is that necessary for what you’re viewing?

I wouldn’t want to have my radiologist miss a tumor or broken bone because someone went cheapskate on the imaging monitor.

Many off-the-shelf monitors and video cards already exceed the color gamut perceptible to the human eye. I would be willing to bet the imaging resolution of many an MRI is far less granular than any decent monitor.

That may be true, but it doesn’t matter what the color gamut of your monitor is if there is a requirement that it be on an FDA-approved list for medical imaging (and I don’t know whether that is a requirement). And even if there is no such requirement, is this something you’re prepared to defend in a malpractice suit?

Edited to add: this PDF document talks about the requirements for medical-grade monitors.

I just read through the link; basically, 4K monitors meet or exceed the specs there, so a $350 monitor can handle it. I can see how a few years back that might have required a $1,500 monitor.

It is almost all black and white; any color is false color, so a wide gamut is absolutely not necessary.

I dunno, some of the specs seem more involved. I looked up a bit about the DICOM standard, and it involves a very specific luminance curve. Elsewhere, there’s mention of overall luminance levels being adjusted for room brightness.

OTOH, you might be able to achieve most of that just by setting the gamma and brightness by eyeball, and nearly all of it by getting a high-quality consumer display and calibration tools. But you won’t have any paperwork proving your monitors meet any medical standards…
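To make that “very specific luminance curve” concrete: DICOM Part 14 defines the Grayscale Standard Display Function (GSDF), which maps a just-noticeable-difference (JND) index to luminance, and a calibrated display spaces its gray levels evenly in JND index between its black and white points. Here’s a minimal Python sketch of that idea; the coefficients are the ones published in PS 3.14 (worth double-checking against the standard before relying on them), and the panel luminance numbers and the gamma-2.2 comparison are just illustrative.

```python
import math

# DICOM PS 3.14 Grayscale Standard Display Function (GSDF):
# luminance in cd/m^2 as a function of the Just-Noticeable-Difference
# (JND) index j, valid for 1 <= j <= 1023.
A, B, C = -1.3011877, -2.5840191e-2, 8.0242636e-2
D, E, F = -1.0320229e-1, 1.3646699e-1, 2.8745620e-2
G, H, K = -2.5468404e-2, -3.1978977e-3, 1.2992634e-4
M = 1.3635334e-3


def gsdf_luminance(j):
    """Luminance (cd/m^2) for JND index j."""
    x = math.log(j)
    num = A + C * x + E * x ** 2 + G * x ** 3 + M * x ** 4
    den = 1 + B * x + D * x ** 2 + F * x ** 3 + H * x ** 4 + K * x ** 5
    return 10.0 ** (num / den)


def jnd_index(luminance):
    """Invert the GSDF by bisection: JND index for a given luminance."""
    lo, hi = 1.0, 1023.0
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if gsdf_luminance(mid) < luminance:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0


def dicom_target_luminance(v, l_min, l_max):
    """Target luminance for pixel value v in [0, 1] on a display spanning
    [l_min, l_max]: pixel values are spaced evenly in JND index, which is
    the point of DICOM grayscale calibration."""
    j_min, j_max = jnd_index(l_min), jnd_index(l_max)
    return gsdf_luminance(j_min + v * (j_max - j_min))


def gamma_luminance(v, l_min, l_max, gamma=2.2):
    """What an uncalibrated gamma-2.2 panel would show instead."""
    return l_min + (l_max - l_min) * v ** gamma


if __name__ == "__main__":
    # A consumer-ish panel: ~0.5 cd/m^2 black, ~350 cd/m^2 white
    # (illustrative numbers, not a spec; ambient light raises l_min).
    l_min, l_max = 0.5, 350.0
    for v in (0.0, 0.1, 0.25, 0.5, 0.75, 1.0):
        print(f"v={v:.2f}  DICOM {dicom_target_luminance(v, l_min, l_max):8.2f}"
              f"  gamma2.2 {gamma_luminance(v, l_min, l_max):8.2f}")
```

The point of the comparison is that a plain gamma curve spaces gray levels differently from the perceptually even GSDF spacing, especially in the shadows, which is what a DICOM calibration (a lookup table in the monitor or the graphics pipeline) is meant to correct.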

I’m on the veterinary side, not human, but I can tell you all the monitors I’ve seen for xray, MRI, CT, ultrasound, etc., are standard HDMI desktop monitors, 24" to 30" usually. The only eyes that miss anything are the untrained ones.

I agree with SeaDragon. It is the eyes and not the medium. When I look at studies on my iPhone (not for diagnosis) I can see 95% of what is there. I have always thought that many regulations are self-serving for those who help develop them.

I don’t work in the medical field, but I can easily believe the additional cost for the NEC monitors, which include extra calibration functions and other features (such as sensors to monitor ambient brightness and a “human sensor” to control power-down).

I can confirm that any product that requires additional manufacturing steps to guarantee compliance to medical, military or aeronautical standards will incur additional costs regardless of additional features. Something as simple as a bolt for an aircraft becomes expensive when documentation guaranteeing that it’s not counterfeit must be obtained and tracked.

My dad had the same experience in Spain. This included, for example, regular TVs to be hung on room walls at the new hospital; after recovering from the prices quoted by the approved vendors, he called up his previous employer (an electronics manufacturer) and was able to get all the monitors and TVs for much less than the ones with “medical” on the invoice. The savings were enough to quell the people grumbling about unapproved vendors.

There were no additional features; in fact, the monitors were normal TVs with one part removed.

Is this even theoretically possible without an integrated colorimeter and full-stack color calibration from the medical imagers to the image formats to the operating system to the display driver to the viewing software to the monitor itself??

Consumer electronics are sold in bins of “good enough” quality with semi-predictable, variable degradation curves. Over time the components (LEDs, etc.) will dim, yellow, and shift color in subtly different ways, and unless constantly corrected for, the colors will just drift further and further off through regular wear and tear. Do medical monitors really adhere to some sort of time-sensitive, equipment-calibrated set of colors… and do they keep this consistent across even the same clinic/hospital, much less across different locations?

Just seems so unlikely, but I don’t know anything about this field!

Step 1: Declare that your monitors are approved for medical use.
Step 2: Establish that only your proprietary certification system is acceptable.
Step 3: Jack up the price obscenely.
Step 4: Profit.

↑↑↑ This
Done in a lot of industries.

I’m a cheapskate and also a guy that does a lot of professional photo retouching. My work rarely requires a huge color gamut – but what is important is a wide viewing angle. That usually means an IPS (in-plane switching) monitor with a 178 degree viewing angle. That’s the tech used on Apple 27" screens, and most Dell Ultrasharps.

Without the wide viewing angle, it’s very tricky to tell whether you’re looking at a solid, even color or a graduated image that fades from one color into another. On a typical non-IPS monitor, the middle of the screen can look bright red while the edges of the screen look warm gray. If you move your face so that you’re looking directly at an edge, then the middle of the screen looks warm gray instead.

With an IPS (or comparable higher cost technology) panel, the whole screen looks even at normal viewing distance.