Just for fun, I want to make a cat-tracking laser turret.
I’ve never done anything like that before, and back in my day Lego Mindstorms was all the rage. Now I’ve heard of the Arduino and the Kinect, and I’m thinking of using those along with some servos, a laser pointer, and fancy software. Would that be a good way to track a cat and shoot a laser in front of it in a semi-random pattern?
Or would the cool kids today use some other set of tools?
Strangely enough, my upstairs neighbor is a roboticist who’s building a cat-tracking laser pointer turret himself. His problem is that his cat blends into the floor so well that it’s hard to track (both cat and floor are black).
Anyway, the “difficult” part is training the turret to recognize where the cat is and to shoot accordingly. This isn’t my field, so I can’t really help you, but here are some resources which may be useful to you.
Also check www.coursera.com . Coursera hosts Stanford’s online machine learning course, which is not as mathy as Caltech’s and far more practical, as well as a computer vision course which I haven’t taken.
Of course, if you happen to be my upstairs neighbor, you already know all of this.
That’s where the Kinect comes in. It seems like “see which projected dots are changing in a room with only one moving object” should be a simpler computational problem than “identify black pixels in image, determine catness”.
I don’t think any machine learning is required. Just code to say “oh yeah, these dots are moving… shoot the laser 10 ft in front of their averaged direction”
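To make that “average the moving dots and lead the shot” idea concrete, here’s a minimal sketch in Python. Everything here is hypothetical (the function name, the 10 ft lead, working in flat x/y coordinates already converted to feet); it just shows the arithmetic: take the centroid of the moving dots in two consecutive frames, and aim a fixed distance ahead along the direction the centroid moved.

```python
import numpy as np

def aim_point(prev_pts, curr_pts, lead_ft=10.0):
    """Pick where to shoot the laser: ahead of the moving blob.

    prev_pts, curr_pts: (N, 2) arrays of x/y positions (in feet) of
    the dots flagged as moving in two consecutive frames.
    Returns the point lead_ft feet in front of the current centroid,
    along the averaged direction of motion.
    """
    prev_c = prev_pts.mean(axis=0)   # centroid last frame
    curr_c = curr_pts.mean(axis=0)   # centroid this frame
    direction = curr_c - prev_c
    norm = np.linalg.norm(direction)
    if norm == 0:                    # cat is sitting still: shoot at the cat
        return curr_c
    return curr_c + lead_ft * (direction / norm)
```

So a cat that moved one foot in +x between frames gets the laser dropped ten feet further along +x, which is roughly the “semi-random pattern in front of it” minus the randomness (you’d jitter the result for that).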
I am bitterly disappointed. I was sure I would learn how to install frickin’ laser beams on my cat’s head. Sorry, guys, you’ll have to do your hunting the old-fashioned way.
That’s what a Kinect is. A cheap, 3D infrared camera. It doesn’t track body heat (so not thermal), but it projects a grid of infrared dots and uses their sizes and distortions to calculate a real-time 3D map of the environment and moving objects within it.
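The “see which dots changed” step really is just frame differencing on the depth map. Here’s a hedged sketch, assuming you already have two depth frames as millimetre-valued arrays (which is roughly what a driver like libfreenect hands back); the function name and the 30 mm threshold are my inventions.

```python
import numpy as np

def moving_mask(prev_depth, curr_depth, thresh_mm=30):
    """Flag pixels whose depth reading changed between two frames.

    prev_depth, curr_depth: 2D uint16 arrays of depth in millimetres.
    Pixels reading 0 (no infrared-dot return) are ignored.
    Returns a boolean mask of "moving" pixels.
    """
    valid = (prev_depth > 0) & (curr_depth > 0)
    # cast up before subtracting so uint16 doesn't wrap around
    diff = np.abs(curr_depth.astype(np.int32) - prev_depth.astype(np.int32))
    return valid & (diff > thresh_mm)
```

With only one moving object in the room, the True pixels in that mask *are* the cat, no catness classifier required, which is the whole appeal over doing it with a regular camera.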
Unless there’s another one, I think it uses a light-based system. It sees the shadow of an insect, shoots an identification laser at it, watches it with a high-framerate sensor, and counts its wingbeats. If it flaps like a female mosquito, it fires a higher-power UV laser that burns off its wings in about 1/10 of a second.
ETA: At 4:40 they play the visual information back as sound. Maybe that’s what you were thinking of? The device counts the wingbeats optically, then synthesizes a tone at the same frequency for humans to hear. It sounds like a mosquito buzzing because that really is the rate its wings are beating at, just abstracted and reproduced by a chip.
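That playback trick is just tone synthesis: measure the wingbeat rate, then generate a sine wave at that same frequency. A toy sketch (function name, sample rate, and the idea of returning raw samples are all my assumptions, not anything from the actual device):

```python
import math

def wingbeat_tone(freq_hz, seconds=1.0, rate=44100):
    """Synthesize a sine tone at the measured wingbeat frequency,
    so a human can hear what the optical sensor counted.
    Returns a list of float samples in [-1, 1]."""
    n = int(seconds * rate)
    return [math.sin(2 * math.pi * freq_hz * i / rate) for i in range(n)]
```

Feed a few hundred Hz into that and it buzzes like a mosquito, because that’s genuinely how fast the wings are moving.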
From that project, I learned that an x-y mirror galvanometer might be better than servos for steering the laser pointer (it seems much faster, at least).