littlebitsofspider t1_jebr74v wrote
Reply to comment by BanzoClaymore in Humanoid robots using cameras for eyes will likely experience issues and accidents around spinning objects such as propellers, due to frame rates by scarronline
If you like flicker artifacts and jittery gaze vergence, sure. If you want stereo fusion, good edge detection and depth estimation, and smooth saccades, you'll probably want an event camera. Ideally, you'd have a robot eyeball with a beamsplitter inside: an event-camera sensor grid on one side, and a traditional color CCD sensor on the other. That way, an event at one of the event-camera pixels can trigger a pixel dump on the corresponding CCD color subpixels. Better yet would be a Foveon-style stacked-RGB CCD, which could match the event camera's resolution 1:1 in color. You could do the sensor fusion on a dedicated ASIC hooked up to both cameras and let each do what it does best.
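The event-triggered pixel-dump idea boils down to "refresh color only where motion happened." A toy sketch (hypothetical names, assuming the two sensors are already registered to the same pixel grid, which the beamsplitter would handle optically):

```python
import numpy as np

def fuse(events, ccd_frame, last_known):
    """Event-driven readout: where the event sensor fired, pull fresh
    RGB from the CCD; everywhere else, keep the last-known value."""
    out = last_known.copy()
    out[events] = ccd_frame[events]  # events: HxW boolean mask
    return out

# toy demo: 4x4 sensor grid, events fired at two pixels
events = np.zeros((4, 4), dtype=bool)
events[1, 2] = events[3, 0] = True
ccd = np.full((4, 4, 3), 200, dtype=np.uint8)   # fresh CCD color frame
stale = np.zeros((4, 4, 3), dtype=np.uint8)     # last-known image state
fused = fuse(events, ccd, stale)
```

The win is bandwidth: only the event-flagged subpixels get read out at full rate, while static regions coast on cached values — which is roughly what the dedicated ASIC would do in hardware.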
BanzoClaymore t1_jebtlch wrote
Did you really think I would know what you were talking about when you wrote this?
maxcresswellturner t1_jedj5a2 wrote
Dude, you really do not know your Foveon-style stacked RGB CCDs at all.
BanzoClaymore t1_jedkify wrote
I know everything except that
littlebitsofspider t1_jebtvh9 wrote
¯\_(ツ)_/¯