Prediction of Manipulation Actions

Abstract

This work presents an FPGA implementation of a highly parallel architecture for motion and disparity estimation of color images. Our system implements the well-known Lucas & Kanade algorithm with a multi-scale extension for the computation of large displacements using color cues. The system fulfills real-time requirements, computing up to 32 and 36 frames per second for optical flow and disparity, respectively, at 640 × 480 resolution. In this paper, we present our design technique based on fine-grained pipelines, our architecture, and benchmarks of the different color-based alternatives, analyzing the trade-off between accuracy and resource utilization. Finally, we include qualitative results and the resource utilization for our platform, concluding that the system achieves a good trade-off between the increase in resources and the improvement in the accuracy and density of the results compared with other approaches described in the literature.
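For readers unfamiliar with the underlying method, the sketch below illustrates a minimal single-scale Lucas & Kanade estimator in Python/NumPy: for each pixel it solves the 2×2 least-squares system built from spatial and temporal derivatives over a local window. This is only an illustrative software sketch; the paper's FPGA pipeline, multi-scale extension, and color cues are not reproduced here, and the function and parameter names (`lucas_kanade`, `window`, `eps`) are hypothetical.

```python
# Illustrative single-scale Lucas & Kanade sketch (not the paper's FPGA design).
import numpy as np

def lucas_kanade(frame0, frame1, window=5, eps=1e-6):
    """Estimate per-pixel flow (u, v) between two grayscale float frames."""
    Iy, Ix = np.gradient(frame0)          # spatial derivatives (rows, cols)
    It = frame1 - frame0                  # temporal derivative
    half = window // 2
    u = np.zeros_like(frame0)
    v = np.zeros_like(frame0)
    for y in range(half, frame0.shape[0] - half):
        for x in range(half, frame0.shape[1] - half):
            ix = Ix[y-half:y+half+1, x-half:x+half+1].ravel()
            iy = Iy[y-half:y+half+1, x-half:x+half+1].ravel()
            it = It[y-half:y+half+1, x-half:x+half+1].ravel()
            # Normal equations A^T A [u v]^T = -A^T b, with A = [ix iy], b = it
            a11, a12, a22 = ix @ ix, ix @ iy, iy @ iy
            b1, b2 = ix @ it, iy @ it
            det = a11 * a22 - a12 * a12
            if det > eps:                 # skip ill-conditioned (textureless) windows
                u[y, x] = (-a22 * b1 + a12 * b2) / det
                v[y, x] = ( a12 * b1 - a11 * b2) / det
    return u, v
```

A multi-scale (coarse-to-fine) extension, as used in the paper, would run this estimator on an image pyramid and warp the finer levels with the upsampled coarse flow so that large displacements are handled.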

Publication
Journal of Real-Time Image Processing
Francisco Barranco
Associate Professor

Associate Professor at the Department of Computer Engineering, Automation and Robotics, and Principal Investigator at the Applied Computational Neuroscience Group and the Computer Vision and Robotics Lab of the University of Granada.

Eduardo Ros
Full Professor

Full Professor in computer architecture, Principal Investigator at the Computational Neuroscience and Neurorobotics Lab, and Principal Investigator of the VALERIA lab of the University of Granada.