TY - JOUR
T1 - A neuron-inspired computational architecture for spatiotemporal visual processing
T2 - Real-time visual sensory integration for humanoid robots
AU - Holzbach, Andreas
AU - Cheng, Gordon
N1 - Funding Information:
This work was supported (in part) by the DFG cluster of excellence Cognition for Technical Systems (CoTeSys) of Germany and (in part) by BMBF through the Bernstein Center for Computational Neuroscience Munich (BCCN-Munich).
PY - 2014/6
Y1 - 2014/6
N2 - In this article, we present a neurologically motivated computational architecture for visual information processing. The architecture rests on multiple strategies: hierarchical processing, parallel and concurrent processing, and modularity. It is modular and expandable in both hardware and software, so that it can also cope with multisensory integration, making it an ideal tool for validating and applying computational neuroscience models in real time under real-world conditions. We apply our architecture in real time to validate a long-standing biologically inspired visual object recognition model, HMAX. In this context, the overall aim is to supply a humanoid robot with the ability to perceive and understand its environment, with a focus on the active aspect of real-time spatiotemporal visual processing. We show that our approach is capable of simulating information processing in the visual cortex in real time and that our entropy-adaptive modification of HMAX achieves higher efficiency and classification performance than the standard model (up to ~+6%).
AB - In this article, we present a neurologically motivated computational architecture for visual information processing. The architecture rests on multiple strategies: hierarchical processing, parallel and concurrent processing, and modularity. It is modular and expandable in both hardware and software, so that it can also cope with multisensory integration, making it an ideal tool for validating and applying computational neuroscience models in real time under real-world conditions. We apply our architecture in real time to validate a long-standing biologically inspired visual object recognition model, HMAX. In this context, the overall aim is to supply a humanoid robot with the ability to perceive and understand its environment, with a focus on the active aspect of real-time spatiotemporal visual processing. We show that our approach is capable of simulating information processing in the visual cortex in real time and that our entropy-adaptive modification of HMAX achieves higher efficiency and classification performance than the standard model (up to ~+6%).
KW - Biologically inspired computational architecture
KW - Concurrent information processing
KW - Online image processing
KW - Visual object recognition
UR - http://www.scopus.com/inward/record.url?scp=84901837983&partnerID=8YFLogxK
U2 - 10.1007/s00422-014-0597-3
DO - 10.1007/s00422-014-0597-3
M3 - Article
C2 - 24687170
AN - SCOPUS:84901837983
SN - 0340-1200
VL - 108
SP - 249
EP - 259
JO - Biological Cybernetics
JF - Biological Cybernetics
IS - 3
ER -