What is the current state of Cell simulators?

I envision a day when we use massively parallel processing systems (imagine the number of threads a program like this would use!) to simulate a cell exactly, based on principles of physics (incidentally, I've heard that quantum mechanics implies that if you rolled the universe backwards, it'd play out differently; can anyone clarify?), P-chem, and biology. I understand that systems exist today which simulate entire human bodies and such, but does any system exist today that is extremely reliable in its predictions?

Basically, I want someone to justify my getting a computer science minor along with the biotechnology degree! =)

Well, the PS3, when it comes out, will have some sort of parallel processing system called the "Cell": http://news.com.com/2100-1001-948493.html?tag=fd_top

Yeah, I read about that. It should make the PS3 an interesting console to buy, and it'd also give some major competition to the Intel/AMD designs.

But I digress.

When I said cell simulators, I meant a single cell such as a skin cell or an egg cell, an E. coli bacterium, etc.

To model even a single cell would take phenomenal processing power (assuming that we were to try to model it right down to the atomic level)…

With biotech expanding like it is, you don’t need this for justification. It’s a hot field, and is likely to remain so for quite some time.

Take an English class or two while you’re at it. Engineers fresh out of college who can’t write are all too common, and it’ll give you a big boost ahead of them.

You wanna see some seriously deficient engineers, go over to workcircuit.com. Those cats need some serious help.

Even folding proteins is giving supercomputers headaches, and it's taking millions of distributed PCs to simulate it. I doubt full cell simulation is anything short-term.

I'm not 100% sure about the calculations, but I believe that although it would take a phenomenal amount of processing power for a single-threaded CPU, a massively parallel processing system of a million crosslinked CPUs would be able to produce results in a reasonable amount of time.
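One rough way to sanity-check the "million CPUs" intuition is Amdahl's law: the speedup you get from N processors is capped by whatever fraction of the work has to stay serial. A minimal sketch (the serial fractions below are made-up illustrative numbers, not measurements of any real cell simulator):

```python
def amdahl_speedup(n_procs: int, serial_fraction: float) -> float:
    """Amdahl's law: overall speedup from n_procs processors when a
    fixed fraction of the total work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

# Even a tiny serial fraction caps the benefit of a million CPUs.
for s in (0.1, 0.01, 0.0001):
    print(f"serial fraction {s:>7}: speedup = {amdahl_speedup(1_000_000, s):,.0f}x")
```

So a million CPUs only deliver anything near a million-fold speedup if the simulation is almost perfectly parallelizable; with even 1% unavoidably serial work, the speedup tops out around a hundredfold.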

Does anyone know much about the problems of distributed computing? It seems to me that while it has massive processing power, it is *slow* due to lag time between nodes. How much faster would a million crosslinked P4s be compared to a million distributed P4s? Assuming, of course, properly written software.

Believe it or not, I'm a decent writer when I'm not being an ass-backward idiot :smack:

It should be a little faster, just because the data doesn't have to travel as far.
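The travel-distance point can be put in back-of-envelope numbers. Assume a simulation step does some local compute and then exchanges boundary data with a few neighbor nodes; the latency figures here are rough assumptions (~10 microseconds for a tightly coupled cluster interconnect, ~50 milliseconds for PCs scattered across the internet), not benchmarks:

```python
def time_per_step(compute_s: float, latency_s: float, exchanges: int) -> float:
    """Wall-clock time for one simulation step: local compute time plus
    the cost of each data exchange with a neighboring node."""
    return compute_s + exchanges * latency_s

# Assumed, illustrative numbers: 1 ms of compute per step,
# 4 neighbor exchanges per step.
cluster = time_per_step(1e-3, 10e-6, 4)   # ~10 us cluster interconnect
internet = time_per_step(1e-3, 50e-3, 4)  # ~50 ms internet round trip
print(f"cluster step:  {cluster * 1e3:.2f} ms")
print(f"internet step: {internet * 1e3:.2f} ms ({internet / cluster:.0f}x slower)")
```

Under these assumptions the crosslinked machine is not "a little" faster but orders of magnitude faster for a chatty, tightly coupled simulation; distributed projects like protein folding work because each node can crunch for hours between exchanges.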

That wasn’t meant to be a comment on your writing style. It’s advice I’d give to anyone considering an engineering degree.

Well, I can comment on my own writing, can't I? =)