What does "computer validation" mean?
04-23-2002, 05:44 PM
I need to know for a job I am looking at. Google finds many hits, mostly for pharmaceutical companies. This also happens to be the type of company I was looking at when I saw the phrase in the first place.
This website (http://www.computervalidation.com/basics/basics%20compval1-0.txt) gives the following definition:
"COMPUTER VALIDATION DEFINITION
Documenting that the computerized system does what it's supposed to do, and
ALWAYS does what it supposed to."
If the answer is this simple, why have I never heard of it before? (I have a CIS degree and have worked at several IT-related jobs.) Also, why does this "computer validation" concept seem to be tied to the pharmaceutical industry?
04-23-2002, 06:32 PM
Just to take a wild guess here, but I would imagine the pharmaceutical company is using computer modelling for new drugs and chemicals, and wants to make sure its computers are accurate (so their models and simulations don't give the wrong results).
Remember the Pentium math bug a few years back, where Intel got egg on their face because the original Pentiums couldn't divide properly? That's the kind of thing computer validation tries to prevent.
04-23-2002, 09:57 PM
I don't think this term is industry-specific. IIRC this is an older, somewhat generalized term from circa-1960s-and-1970s computing: "certifying" (presumably via various diagnostic procedures) that the computer is operating correctly, which would have been a very big deal in the tube and early-transistor eras.
Modern CPUs are so reliable, and have such powerful multi-layered built-in diagnostic and hardware-verification routines, that this verification process external to the computer is pretty much unnecessary for the vast majority of tasks.
I would imagine that if a computer is extremely complex and the task is mission-critical, as in some academic, defense, or medical uses, some type of external verification is probably still performed.
04-23-2002, 10:15 PM
Unless formal methods are used, however, it is impossible to validate a complex piece of software.
04-23-2002, 10:56 PM
You may find this thread (http://boards.straightdope.com/sdmb/showthread.php?s=&threadid=85054&highlight=cgmp) helpful.
04-23-2002, 11:35 PM
Machinery used to count ballots has to be validated with a lot of test runs. Years ago I was marginally involved in counting punch out ballots. After each run, the machine was jammed with chad and had to be vacuumed out.
Those problems must have been corrected by now.
04-24-2002, 11:15 AM
Thanks for your thoughts everyone. I guess it is just as simple a concept as it seems. Computer Validation means just that: you validate computers.
It is odd that I haven't heard of it. Of course the job listing mentions it as if everyone should know exactly what it means.
04-24-2002, 12:11 PM
I sent this link to hedra, who had to go to a meeting but sent this reply:
Especially important in a regulated industry (like the FDA-regulated US pharmaceutical industry, etc.).
RE: Pharmas, here's what you need to know.
The FDA audits the pharma, checking a variety of things to see if all is going the way it should. All software needs to be validated, because if ANY software is not validated, an entire product can be stalled, if not outright killed by the auditor (granted, that would take some serious problems, but even with just a delay a competitor may beat you to market). If that happens, you've got millions of dollars down the drain. This includes everything from molecular modeling software, to high-throughput screening, to clinical trials management software, to document management software for publishing the drug application for FDA review (and hopefully approval).
Auditors can literally flip through pages of code and spot a portion that is not properly commented. They can also spot tests that were not properly run or were 'fudged' on signoff. You need huge binders of information to verify that software was validated: loads of proof that it does, as you said, exactly what it is supposed to do. Nothing more, nothing less, and in such a way that one could figure out what it was SUPPOSED to do by looking at the documentation, the comments, etc.
Worth a lot of money, validation experience! At the very least, you need to have a clue why they want it, if they ask in an interview. And be prepared to live with it, if you get hired. You have to be prepared to comment the H*LL out of your code, and to scrupulously update your specs, database design, etc. Testing has to be performed in very specific ways, and everything from installing the hardware to installing the software to instructions for running the program has to be documented, verified, and signed off on. Frustrating? Yep. But I wouldn't want to be the one using a drug whose clinical trials were recorded using software that incorrectly called database information about adverse events or efficacy or toxicity.
Validation also can check beyond the usual testing-to-see-if-it-works, to include testing-to-see-if-it-is-easy-to-screw-up. My sister does medical systems validation for life support systems. Makes major bucks doing so, because she's GOOD. Gotta be. Lives are literally at stake. (I work for a company that produces software for pharmas - and boy, do we hear about validation on a daily basis!)
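The testing discipline hedra describes (scripted tests, documented expected results, named tester, signoff) can be sketched in miniature. This is a hypothetical illustration in Python; every name in it is made up, and real validation protocols are far more formal than a dictionary of results:

```python
import datetime

def run_validation_test(test_id, description, func, inputs, expected, tester):
    """Run one scripted test case and return an audit record.

    Hypothetical sketch: the point is that every execution leaves a
    timestamped record of who ran what, with expected vs. actual results,
    suitable for filing in the validation binder.
    """
    actual = func(*inputs)
    return {
        "test_id": test_id,
        "description": description,
        "inputs": inputs,
        "expected": expected,
        "actual": actual,
        "result": "PASS" if actual == expected else "FAIL",
        "tester": tester,
        "executed_at": datetime.datetime.now().isoformat(),
    }

# Example: validating a trivial (made-up) unit-conversion helper
def mg_to_g(mg):
    return mg / 1000.0

record = run_validation_test(
    "TC-042", "mg-to-g conversion", mg_to_g, (250,), 0.25, "J. Tester")
print(record["result"])  # PASS
```

The record itself, not just the green light, is the deliverable: an auditor wants to see the inputs, the expected outputs, and who signed off.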
04-24-2002, 02:02 PM
More on what hedra has to say (she and I work for the same company, in different roles).
Validated software, hardware, and processes allow an auditing agency (the FDA for pharma, but the same applies in chemical manufacturing, law, finance, etc.) to verify that:
- Things are done in a standard fashion,
- That the 'standard fashion' is trackable (who did what, when),
- That the standard fashion is appropriate to the task at hand,
- That the 'standard fashion' does what it claims, and nothing else,
- That the 'standard fashion' has no undocumented work-arounds,
- That the 'standard fashion' would not 'break' under any foreseeable strains or pressures,
- That 'standard software' is stable and well understood,
- That the 'standard software' does not alter information in any unplanned or uncontrolled fashion,
- That if the 'standard software' should fail, that event is captured and reported,
- That emergency recovery schemes are robust, well-planned, feasible, and appropriate,
- That the hardware will work with all the software, and that various software packages work well with each other and the hardware in any validated configuration,
- That the hardware does not change, in any unexpected fashion, the information it processes,
- That the 'standard hardware' is reliable, robust, and fault-tolerant...
The list goes on for volumes.
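Several of those bullets (failure capture and reporting, no uncontrolled alteration of data) boil down to one rule: nothing fails silently. A minimal sketch of that idea, assuming a hypothetical audit logger; the names and setup are illustrative, not from any validation standard:

```python
import logging

# In a validated system the audit log would be a tamper-evident,
# access-controlled store; a stderr logger stands in for it here.
logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("audit")

def audited(operation_name, func, *args, **kwargs):
    """Run an operation; capture and report any failure rather than
    letting it pass unrecorded (the 'captured and reported' bullet)."""
    try:
        result = func(*args, **kwargs)
        audit_log.info("%s: OK", operation_name)
        return result
    except Exception as exc:
        audit_log.error("%s: FAILED (%s)", operation_name, exc)
        raise  # never swallow the error -- the failure must surface

value = audited("parse-dose", float, "2.5")
print(value)  # 2.5
```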
One small example: I helped put together a nifty little hack that filled a technology gap as a client was shifting from one process to another. It took a week and a bit to put the app together, and it was a simple process (although the code wasn't quite so simple). For a tool that was planned to last in production for less than six months, filled exactly one role, and did exactly one thing, my team spent 5 additional weeks validating it. We killed a small forest's worth of trees on documentation, too. Now that it's going into full-scale production, we'll kill a few more trees updating the validation, but it'll only take a week or two to complete the process.
One of the nice things about validated systems, software, and processes: Once it's validated, it's easier to maintain, and changes are easy to track and document.
04-25-2002, 07:34 AM
Or: Say you use SAS to analyze your data, as is very common in the pharmaceutical industry. Say SAS comes out with its latest version, V9; you've been using V8. The FDA does not require you to validate SAS itself, which would be ridiculous. But if your institution upgrades to V9, you are required to validate all of the SAS programs you've written, to be sure that they produce the same results in V9 as they did in V8. If not, any discrepancies must be scrupulously tracked down, diagnosed, and documented.
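That version-to-version regression check can be sketched generically. This is a hedged illustration in Python rather than SAS, with made-up file names, columns, and data; the point is a row-by-row comparison where every discrepancy is surfaced for the validation report rather than waved away:

```python
import csv
import os
import tempfile

def compare_outputs(old_path, new_path, tolerance=1e-9):
    """Compare row-by-row output from the old and new program versions;
    return a list of discrepancies for the validation report."""
    discrepancies = []
    with open(old_path) as f_old, open(new_path) as f_new:
        old_rows = list(csv.DictReader(f_old))
        new_rows = list(csv.DictReader(f_new))
    if len(old_rows) != len(new_rows):
        discrepancies.append(("row count", len(old_rows), len(new_rows)))
    for i, (old, new) in enumerate(zip(old_rows, new_rows)):
        for col in old:
            try:
                if abs(float(old[col]) - float(new[col])) > tolerance:
                    discrepancies.append((i, col, old[col], new[col]))
            except ValueError:  # non-numeric column: exact match required
                if old[col] != new[col]:
                    discrepancies.append((i, col, old[col], new[col]))
    return discrepancies

# Tiny demonstration with entirely fabricated analysis output
tmp = tempfile.mkdtemp()
old_path = os.path.join(tmp, "v8_out.csv")
new_path = os.path.join(tmp, "v9_out.csv")
with open(old_path, "w") as f:
    f.write("subject,mean_effect\n001,1.25\n002,0.80\n")
with open(new_path, "w") as f:
    f.write("subject,mean_effect\n001,1.25\n002,0.81\n")

diffs = compare_outputs(old_path, new_path)
print(len(diffs))  # 1 -- subject 002 differs and must be investigated
```

A real validation effort would then document each discrepancy, diagnose its cause, and get the resolution signed off, exactly as described above.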