A neuron-inspired computational architecture for spatiotemporal visual processing: Real-time visual sensory integration for humanoid robots

Andreas Holzbach, Gordon Cheng

Research output: Contribution to journal › Article › peer-review


Abstract

In this article, we present a neurologically motivated computational architecture for visual information processing. The architecture centers on three strategies: hierarchical processing, parallel and concurrent processing, and modularity. It is modular and expandable in both hardware and software, so it can also cope with multisensory integration, making it an ideal tool for validating and applying computational neuroscience models in real time under real-world conditions. We apply our architecture in real time to validate a long-standing biologically inspired visual object recognition model, HMAX. In this context, the overall aim is to supply a humanoid robot with the ability to perceive and understand its environment, with a focus on the active aspect of real-time spatiotemporal visual processing. We show that our approach is capable of simulating information processing in the visual cortex in real time, and that our entropy-adaptive modification of HMAX achieves higher efficiency and classification performance than the standard model (up to ~+6%).
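The entropy-adaptive idea mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the general approach of ranking candidate features (e.g. HMAX C2-like activations) by the Shannon entropy of their responses and keeping only the most informative ones; the function names and the `keep_fraction` parameter are hypothetical.

```python
import numpy as np

def response_entropy(responses, bins=16):
    """Shannon entropy (bits) of one feature's response distribution."""
    hist, _ = np.histogram(responses, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

def select_informative_features(feature_matrix, keep_fraction=0.5):
    """Keep the highest-entropy features.

    feature_matrix: (n_samples, n_features) array of feature responses.
    Returns the column indices of the retained features.
    """
    entropies = np.array([response_entropy(feature_matrix[:, j])
                          for j in range(feature_matrix.shape[1])])
    k = max(1, int(keep_fraction * feature_matrix.shape[1]))
    return np.argsort(entropies)[::-1][:k]

# Toy data: 200 samples, 8 features; the first 4 vary across samples
# (informative), the last 4 are constant (uninformative, zero entropy).
rng = np.random.default_rng(0)
informative = rng.uniform(0.0, 1.0, size=(200, 4))
uninformative = np.full((200, 4), 0.5)
X = np.hstack([informative, uninformative])

kept = select_informative_features(X, keep_fraction=0.5)
print(sorted(kept.tolist()))  # indices of the high-entropy features
```

Pruning low-entropy (nearly constant) features before classification reduces the feature vector's size, which is one plausible route to the efficiency gain the abstract reports.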

Original language: English
Pages (from-to): 249-259
Number of pages: 11
Journal: Biological Cybernetics
Volume: 108
Issue number: 3
DOIs
State: Published - Jun 2014

Keywords

  • Biologically-inspired computational architecture
  • Concurrent information processing
  • Online image processing
  • Visual object recognition
