[Academic Seminar] A cortex-like memory: the probabilistic associative memory

Speaker: JAMES TING-HO LO

University of Maryland, Baltimore County, USA

Time: June 17, 2008, 10:00 AM

Venue: FIT Building, Room 1-312

Host: Department of Electronic Engineering

Contact: Prof. Xiaoqing Ding

Abstract:

In this talk, a new neural network (or learning machine) paradigm, called the probabilistic associative memory (PAM), will be introduced. PAM is a recurrent multilayer network of novel computing elements, each of which is an associative memory that learns by the simple Hebbian rule (i.e., accumulating the outer products of the input and output vectors). It has most of the properties characteristic of memories in the brain and hence has a wide range of applications, including recognition and/or understanding of speech, handwriting, faces, images, video, etc. PAM is also expected to be an effective building block for constructing cognitive architectures that mechanize cognitive functions.
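To make the learning rule mentioned above concrete, here is a minimal sketch of Hebbian outer-product learning in a generic linear associative memory. This is only an illustration of the rule the abstract describes, not Lo's actual PAM architecture; the class and its bipolar (+1/-1) coding are assumptions for the example.

```python
import numpy as np

class HebbianAssociativeMemory:
    """Generic linear associative memory trained by the Hebbian rule,
    i.e., by accumulating outer products of output and input vectors."""

    def __init__(self, in_dim, out_dim):
        self.W = np.zeros((out_dim, in_dim))  # synaptic weight matrix

    def learn(self, x, y):
        # Hebbian rule: add the outer product of the output and input.
        self.W += np.outer(y, x)

    def recall(self, x):
        # Linear read-out followed by a sign threshold (bipolar codes).
        return np.sign(self.W @ x)

# Usage: store one bipolar pattern pair and recall the output from the input.
mem = HebbianAssociativeMemory(in_dim=4, out_dim=3)
x = np.array([1, -1, 1, -1])
y = np.array([1, 1, -1])
mem.learn(x, y)
assert np.array_equal(mem.recall(x), y)
```

With a single stored pair, recall is exact because `W @ x = y * (x . x)`, and the sign threshold recovers `y`; with many stored pairs, crosstalk between patterns makes recall approximate, which is where probabilistic treatments such as PAM's become relevant.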

Some properties of the PAM follow:

Fast offline and online learning and responding to large temporal and spatial patterns (images, videos, speech/language, text/knowledge, etc.)

No or minimal preprocessing (e.g., edge detection, segmentation, single object detection/recognition)

Detecting and recognizing multiple causes jointly or associatively

Modeling hierarchical temporal and spatial worlds (e.g., letters, words and sentences)

Having maximal generalization capabilities to deal with noise, distortion, occlusion, translation, scaling, and rotation

Representing and resolving ambiguity and uncertainty with conditional probabilities

BIOGRAPHY OF JAMES TING-HO LO

James Ting-Ho Lo is a Professor in the Department of Mathematics and Statistics at the University of Maryland, Baltimore County, in Baltimore, Maryland, USA. He received the B.A. and Ph.D. degrees in Mechanical Engineering and Aerospace Engineering from National Taiwan University and the University of Southern California, respectively. He was a Postdoctoral Research Associate at Stanford University and then at Harvard University.

His research interests have included optimal filtering, signal processing, and neural computing. In 1992, he published a synthetic approach to optimal filtering using recurrent neural networks and received a best paper award for it. The approach was the first practical solution of the long-standing optimal nonlinear filtering problem in a very general setting; it has been proven to approximate the optimal nonlinear filter in performance to any desired accuracy.

He has also developed adaptive feedforward and recurrent neural networks with long- and short-term memories, accommodative neural networks, and robust neural networks with any prescribed degree of robustness; these form the foundation of a general, systematic approach to designing adaptive, accommodative, and/or robust system identifiers, controllers, and filters. His method of convexifying the mean-squared-error criterion effectively overcomes the local-minimum problem in training neural networks and estimating nonlinear regression models, a problem that has severely plagued the development of neurocomputing and nonlinear regression.

In recent years, he has been developing a functional model of the neocortex, called probabilistic associative memory (PAM), which is capable of detecting and recognizing multiple causes in temporal and spatial patterns. PAM is expected to have a wide range of applications and to be an effective building block for cognitive architectures.