Selective Visual Attention: Computational Models and Applications

Author(s): Liming Zhang, Weisi Lin

Published Online: 20 MAR 2013 04:51AM EST

Print ISBN: 9780470828120

Online ISBN: 9780470828144

DOI: 10.1002/9780470828144

About this Book

Visual attention is a relatively new area of study that combines several disciplines: artificial neural networks, artificial intelligence, vision science and psychology. The aim is to build computational models similar to human vision in order to solve hard problems in many potential applications, including object recognition, unmanned vehicle navigation, and image and video coding and processing. In this book, the authors provide an up-to-date and highly applied introduction to visual attention, helping researchers build powerful computer vision systems. Areas covered include the significance of vision research for psychology and computer vision, existing computational visual attention models, the authors' own contributions to visual attention modelling, and applications in a variety of image and video processing tasks.
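The computational attention models discussed in the book typically reduce to producing a saliency map: a per-pixel score of how strongly a location attracts attention. As a rough, illustrative sketch only (not the book's own algorithms), the classic center-surround idea can be approximated by subtracting a coarse blur from a fine blur of the image intensity; the box-blur radii and normalization below are arbitrary choices for the demo:

```python
import numpy as np

def box_blur(img, r):
    """Separable box filter of radius r (window 2r+1), edge-padded, via cumulative sums."""
    pad = np.pad(img.astype(float), r, mode="edge")
    k = 2 * r + 1
    c = np.vstack([np.zeros((1, pad.shape[1])), np.cumsum(pad, axis=0)])
    rows = (c[k:] - c[:-k]) / k                      # vertical box sum -> mean
    c2 = np.hstack([np.zeros((rows.shape[0], 1)), np.cumsum(rows, axis=1)])
    return (c2[:, k:] - c2[:, :-k]) / k              # horizontal pass

def center_surround_saliency(img, center_r=1, surround_r=7):
    """Crude intensity saliency: |fine blur - coarse blur|, normalized to [0, 1]."""
    sal = np.abs(box_blur(img, center_r) - box_blur(img, surround_r))
    rng = sal.max() - sal.min()
    return (sal - sal.min()) / rng if rng > 0 else sal

# Usage: a single bright blob on a dark field should dominate the saliency map.
img = np.zeros((32, 32))
img[14:18, 14:18] = 1.0
sal = center_surround_saliency(img)
peak = np.unravel_index(np.argmax(sal), sal.shape)   # lands inside the blob
```

Real models (e.g., the Itti-Koch architecture surveyed in the book) repeat this across color, intensity and orientation channels at multiple scales before combining the maps.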

This book is geared toward graduate students and researchers in neural networks, image processing, machine learning, computer vision, and other areas of biologically inspired model building and applications. It can also be used by practicing engineers looking for techniques that apply image coding, video processing, machine vision and brain-like robots to real-world systems. Other students and researchers with interdisciplinary interests will also find this book appealing.

  • Provides a key knowledge boost to developers of image processing applications
  • Is unique in emphasizing the practical utility of attention mechanisms
  • Includes a number of real-world examples that readers can implement in their own work:
      • robot navigation and object selection
      • image and video quality assessment
      • image and video coding
  • Provides code for users to apply in practical attentional models and mechanisms
