I’d love to hear more details, because the equipment you need depends on the approach you’re taking.
a. Are you using deep learning networks built from some variant of existing algorithms? If so, you want servers with multiple NVidia cards in them. The details depend on how you construct these networks and how much data must pass between them (as an example, a classifier that breaks an image into labeled, classified boxes, like YOLO, takes a large input image but produces a tiny output).
That determines what hardware is most cost-effective. For example, if you have a lot of loosely coupled networks, where the data traffic between them is low, you can use separate computers, each with several off-the-shelf NVidia graphics cards, talking to each other through an off-the-shelf gigabit switch. $250k, even Canadian, would buy a ton of these - they’d only cost $2-5k each, depending on configuration.
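To put rough numbers on the bandwidth point, here is a hedged back-of-envelope sketch in Python; the frame size, box count, and frame rate are assumptions for illustration, not measurements:

```python
# Back-of-envelope for the "gigabit switch is enough" claim, assuming the
# stages exchange detections rather than raw frames. All sizes are
# illustrative assumptions.
frame_bytes = 1920 * 1080 * 3        # one raw 1080p RGB frame (~6 MB)
box_bytes = 4 * 4 + 4 + 4            # x, y, w, h as floats + class id + score
detections_per_frame = 50
fps = 30

raw_gbps = frame_bytes * fps * 8 / 1e9                       # shipping pixels
box_mbps = box_bytes * detections_per_frame * fps * 8 / 1e6  # shipping boxes

print(f"raw frames: {raw_gbps:.2f} Gbit/s, boxes only: {box_mbps:.3f} Mbit/s")
# Shipping raw frames would saturate a gigabit link; shipping YOLO-style
# boxes barely registers, which is why loosely coupled stages can live on
# separate cheap machines.
```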
b. If you’re not using existing algorithms, you may not be able to make your algorithms work on graphics cards at all. CUDA is very limited in what kinds of algorithms it can run efficiently, and at an early stage you may not have the resources (massive amounts of labor) to port your prototype algorithms to CUDA anyway. This is what the Xeon Phi is for. You still have to be a skilled programmer who understands parallelism to use them, but you can get Xeon Phi-based servers for between $5k and $20k Canadian, depending on how many accelerator cards they carry, how many CPU sockets and cores the main CPUs have, and so on.
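As a rough illustration of the kind of parallelism that matters here (nothing Xeon Phi specific, just a minimal Python sketch under the assumption that your prototype splits into independent chunks of work):

```python
# Minimal sketch: if a prototype algorithm decomposes into "run the same
# function over many independent chunks", ordinary multi-core parallelism
# already captures it, and a many-core machine rewards it without a CUDA
# rewrite. The work function is a stand-in for part of your algorithm.
from multiprocessing import Pool

def score_chunk(chunk):
    # Stand-in for one independent unit of the prototype algorithm.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    chunks = [list(range(i, i + 1000)) for i in range(0, 100_000, 1000)]
    with Pool() as pool:            # one worker process per available core
        partial_scores = pool.map(score_chunk, chunks)
    print(sum(partial_scores))
```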
c. Do you have to buy equipment before some deadline, or can you bank this money for future expenses? Because the most cost-effective way to go is to rent equipment through AWS or other cloud services until you know what you need.
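For a sense of where the break-even sits, here is a hedged sketch; the hourly rate and box price are assumptions in line with the numbers mentioned further down, not current quotes:

```python
# Rent vs. buy, back of the envelope. Prices are illustrative assumptions
# (roughly a $1/hour single-GPU cloud instance vs. a ~$1,700 self-built box).
hourly_rate = 1.00      # USD per hour for a rented GPU instance
box_cost = 1700.00      # cost of owning a comparable machine

break_even_hours = box_cost / hourly_rate
print(f"Renting beats buying until roughly {break_even_hours:.0f} GPU-hours "
      f"(about {break_even_hours / 24:.0f} days of round-the-clock use).")
```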
What I’m hearing is that you’re not an expert computer engineer. You need one, unfortunately, or you’re going nowhere (and I’m not volunteering - my skills are in different areas and I haven’t finished my Masters). I’m guessing you’re a mathematician? Unfortunately, it takes very specific skills to go from a paper description or a Python/Matlab prototype to something that runs fast and reliably across multiple computers, or across the thousands of separate compute units in a graphics card, in parallel. And a heck of a lot of work - a fully custom solution involves many person-years of development.
If you’re doing something that just involves cascading neural networks together, where those networks use a standard implementation supported by TensorFlow or Theano or something similar, then you can do it yourself. In that case, I’d rent a machine through AWS (about $1 an hour) until you know for certain that these off-the-shelf tools are going to work. Then I’d read a guide like “The $1700 great Deep Learning box: Assembly, setup and benchmarks” by Slav Ivanov and either build one or get a professionally made server with similar specs.
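If that’s your situation, the cascading itself can be as simple as something like this (a minimal sketch assuming a recent TensorFlow with the Keras API; the backbone choice, head sizes, and class count are made up for illustration):

```python
# Minimal sketch of cascading two off-the-shelf networks with TensorFlow/
# Keras: a frozen pretrained backbone feeding a small trainable head.
# Shapes and class count are illustrative assumptions.
import tensorflow as tf

# Stage 1: pretrained feature extractor (weights download on first use).
backbone = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, pooling="avg")
backbone.trainable = False  # freeze stage 1; only the head gets trained

# Stage 2: a small classifier on top of stage 1's features.
head = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),  # 10 classes, made up
])

# The cascade: image -> backbone features -> head predictions.
inputs = tf.keras.Input(shape=(224, 224, 3))
outputs = head(backbone(inputs))
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```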
Some major figures in AI have concluded that the best way to build a real one is to build agents that manipulate the real world - our world - starting with rudimentary manipulations (picking up blocks the way a small child does) and gradually increasing in sophistication. The various evolutionary algorithms you might use are going to be far more stable if the problems they must solve come from a real robot, where you cannot cheat your way to the goal, unlike a crude simulated environment where you sometimes can.
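To make the shape of that concrete, here is a minimal sketch of a generic evolutionary loop in Python; the fitness function is a placeholder, where a real setup would score a candidate controller running on the physical robot:

```python
# Minimal generic evolutionary loop. evaluate() is a placeholder: in the
# setup described above it would run a candidate controller on the real
# robot and score the task, which is precisely what removes the option
# of gaming a simulator.
import random

def evaluate(params):
    # Placeholder fitness: closeness to an arbitrary target vector.
    target = [0.5] * len(params)
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def evolve(dim=8, pop_size=20, generations=200, sigma=0.1):
    pop = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(pop, key=evaluate, reverse=True)
        parents = ranked[: pop_size // 4]            # keep the top quarter
        children = [
            [gene + random.gauss(0, sigma) for gene in random.choice(parents)]
            for _ in range(pop_size - len(parents))  # mutate parents to refill
        ]
        pop = parents + children
    return max(pop, key=evaluate)

best = evolve()
print(evaluate(best))
```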
You can easily spend all $250k on real robotics hardware. I’d look for something off the shelf, ready to go, and budget for a bunch of ancillary equipment: a test cell to put it in, multiple cameras and data-acquisition computers, protective Plexiglas safety shields, space. $250k is probably not even enough, actually, at least not for a fancy setup.
As a side note, politics-wise and in terms of your attractiveness to grad students: if you can make a real robot do something useful with your fancy ideas, you’ll probably be able to squeeze more money out of the administration, and more grad students will want to help. Robots that actually do stuff are a lot more interesting than some numbers on a screen.