Using a GPU for non-graphical processing... possible?

Inspired by a comment made by slaphead in post #11 of this thread…

Now, I don’t want to take issue with slaphead on a matter in which he is absolutely right; however, whenever I see an expression of the form “device X cannot be made to perform function Y”, I can’t help trying to imagine a way in which it could be done.

OK, so GPUs are not conventional processors, that much is clear, but I can’t help wondering if they could actually be made to perform logical operations as an emergent artifact of their graphical rendering capabilities; for a (very simple) example, a visual AND gate could be created by drawing two overlapping, semi-transparent lines and testing the value of the pixels where they intersect.
Note: I’m not talking about merely simulating the processing in RAM, then drawing a graphical representation of it upon the display, but rather, utilising the graphical capabilities of the GPU to perform entirely incidental logical processing.
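
To make the idea concrete, here is a rough sketch of the sort of thing I’m imagining (assuming an OpenGL/GLUT setup; the window size, bar positions, grey level and readback threshold are invented purely for illustration): two half-bright bars drawn with additive blending, with the gate’s output read from the pixel where they cross.

/* Rough sketch of the "AND gate by overlapping translucent bars" idea.
 * Assumes an OpenGL + GLUT environment; window size, bar positions,
 * grey level and threshold are invented purely for illustration. */
#include <GL/glut.h>
#include <stdio.h>

static int in_a = 1, in_b = 1;          /* the two logical inputs */

static void draw_bar(int on, float x0, float y0, float x1, float y1) {
    if (!on) return;                    /* input 0: draw nothing */
    glColor3f(0.5f, 0.5f, 0.5f);        /* each input contributes half brightness */
    glBegin(GL_QUADS);
      glVertex2f(x0, y0); glVertex2f(x1, y0);
      glVertex2f(x1, y1); glVertex2f(x0, y1);
    glEnd();
}

static void display(void) {
    unsigned char px[3];

    glClearColor(0, 0, 0, 1);
    glClear(GL_COLOR_BUFFER_BIT);
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE);        /* additive: only the overlap reaches full brightness */

    draw_bar(in_a, -1.0f, -0.1f, 1.0f, 0.1f);   /* horizontal bar for input A */
    draw_bar(in_b, -0.1f, -1.0f, 0.1f, 1.0f);   /* vertical bar for input B   */
    glFinish();

    /* Read back the pixel where the bars cross and threshold it:
     * it only reaches ~255 when both bars were drawn, i.e. A AND B. */
    glReadPixels(128, 128, 1, 1, GL_RGB, GL_UNSIGNED_BYTE, px);
    printf("A AND B = %d\n", px[0] > 200);
}

int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_SINGLE);
    glutInitWindowSize(256, 256);
    glutCreateWindow("AND gate via blending");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}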

I’m aware that this would be hugely inefficient, as it’s essentially a form of emulation, but am I making any kind of sense at all?

Yes.

In fact, there’s software that uses a GeForce video card and essentially turns it into a sound card.

http://www.bionicfx.com/

…seems like I’ve seen a few mentions of this exact type of (non-video-related) use within the last couple of years, but they might have been referencing the same U of W paper that the above does.

As others have said, basically yes.

GPUs are just processors that happen to be very good at doing a particular type of operation. I’m not up on current stuff, but they do complex matrix manipulations, amongst other things.

Now, most programs are translated into a string of instructions suitable for your standard processor, which operates on distinct integer values, etc.

There’s no reason you couldn’t recompile a program such that it uses GPU instructions to achieve the same results. Chuck in some complex maths and you find that you can reduce some algorithms to matrix manipulation; then you get to run them much faster using the GPU.
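
As a small, made-up illustration of what “reducing an algorithm to matrix manipulation” can look like, here’s the Fibonacci recurrence recast as repeated 2x2 matrix multiplication. It’s plain CPU code, but matrix multiplies are exactly the kind of operation a GPU is tuned for.

/* Made-up illustration of "reducing an algorithm to matrix manipulation":
 * the Fibonacci recurrence becomes repeated 2x2 matrix multiplication. */
#include <stdio.h>

/* r = a * b for 2x2 matrices (safe even when r aliases a or b). */
static void mat_mul(unsigned long r[2][2],
                    unsigned long a[2][2],
                    unsigned long b[2][2]) {
    unsigned long t[2][2];
    int i, j;
    for (i = 0; i < 2; i++)
        for (j = 0; j < 2; j++)
            t[i][j] = a[i][0] * b[0][j] + a[i][1] * b[1][j];
    r[0][0] = t[0][0]; r[0][1] = t[0][1];
    r[1][0] = t[1][0]; r[1][1] = t[1][1];
}

/* [[1,1],[1,0]]^n contains F(n), so the whole recurrence is just matrix powers. */
static unsigned long fib(int n) {
    unsigned long result[2][2] = { {1, 0}, {0, 1} };   /* identity matrix */
    unsigned long base[2][2]   = { {1, 1}, {1, 0} };
    int i;
    for (i = 0; i < n; i++)
        mat_mul(result, result, base);
    return result[0][1];
}

int main(void) {
    printf("F(10) = %lu\n", fib(10));   /* prints 55 */
    return 0;
}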

If the program you recompile for the GPU is an emulator for a standard CPU then you could run other programs directly on your emulated CPU running on your GPU.
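
For what it’s worth, “an emulator for a standard CPU” just means a fetch-decode-execute loop like the toy one below (the instruction set is invented for illustration); on the GPU, each of those steps would have to be expressed as rendering or texture operations, which is where the slowness comes in.

/* Toy fetch-decode-execute loop, to show what "an emulator for a standard CPU"
 * boils down to.  The instruction set is invented for illustration. */
#include <stdio.h>

enum { OP_LOAD, OP_ADD, OP_PRINT, OP_HALT };   /* hypothetical opcodes */

int main(void) {
    /* Program: r0 = 2; r1 = 3; r0 += r1; print r0; halt. */
    int program[] = { OP_LOAD, 0, 2,   OP_LOAD, 1, 3,
                      OP_ADD, 0, 1,    OP_PRINT, 0,   OP_HALT };
    int reg[2] = { 0, 0 };
    int pc = 0;

    for (;;) {
        switch (program[pc]) {
        case OP_LOAD:  reg[program[pc + 1]]  = program[pc + 2];      pc += 3; break;
        case OP_ADD:   reg[program[pc + 1]] += reg[program[pc + 2]]; pc += 3; break;
        case OP_PRINT: printf("%d\n", reg[program[pc + 1]]);         pc += 2; break;
        case OP_HALT:  return 0;                                     /* stop */
        }
    }
}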

Of course, that’d be slow and pointless, but there are certain algorithms that are expressible in terms of matrices and the other things GPUs are good at. Most of these algorithms are graphical (that’s why GPUs are good at those things!), but I believe there’s also some audio and cryptographic work that can benefit.

Quick answer: yes, it’s possible, and for a certain set of problems, desirable.

SD

There is a fair amount of work being done to run Folding@home on GPUs - the PDF file on that page is quite interesting.