Application of Statistical Signal Processing Techniques to Neural Data Analysis

Aniket Kaloti 03/07/2006

Introduction

Levels of Analysis in Systems and Cognitive Neuroscience

- Spikes: the primary neural signals
- Single cells and receptive fields
- Multiple electrode recordings
- fMRI
- EEG and ERPs

Figures: receptive field of a retinal ganglion cell; receptive field of a visual cortical (V1) cell.

Receptive Field Estimation: A New Information Theoretic Method (Sharpee et al, 2004)

- V1 cells are of primary concern
- Linear-Nonlinear (LN) model: estimate the Wiener filter, then estimate the nonlinearity graphically (a sketch of the classical estimator follows below)
- Classically, white-noise stimuli were used
- This works best for Gaussian stimulus ensembles
- Natural stimuli are non-Gaussian

Figure from Simoncelli et al, 2003.
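For context on the classical approach, here is a minimal sketch of white-noise receptive field estimation via the spike-triggered average, the standard estimator for the linear stage of the LN model; the array shapes and the toy neuron are illustrative assumptions, not from the talk.

```python
import numpy as np

def spike_triggered_average(stimulus, spikes, window):
    """Estimate the linear (Wiener) filter of an LN model from white-noise
    stimuli by averaging the stimulus segments that precede each spike."""
    T, D = stimulus.shape
    sta = np.zeros((window, D))
    n_spikes = 0
    for t in range(window, T):
        if spikes[t] > 0:
            sta += spikes[t] * stimulus[t - window:t]
            n_spikes += spikes[t]
    return sta / max(n_spikes, 1)

# Illustrative use on a toy LN neuron driven by Gaussian white noise.
rng = np.random.default_rng(0)
stim = rng.standard_normal((10_000, 16))                  # white-noise frames
rate = np.clip(stim @ rng.standard_normal(16), 0, None)   # hypothetical LN cell
spikes = rng.poisson(0.05 * rate)
filt = spike_triggered_average(stim, spikes, window=10)
```

For Gaussian white noise this average converges to the underlying filter; the point of Sharpee et al. (2004) is that the same recipe becomes biased for non-Gaussian natural stimuli.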

The Model

- Treat the receptive field as a special direction in the high-dimensional stimulus space
- Hence, reduce the dimensionality of the stimulus space conditioned on the neural response
- To formulate this, define the spike-conditional stimulus density; I_spike is then the mutual information between the entire stimulus ensemble and the spike
- In practice, use the time-average equation (both quantities are sketched below)

Sharpee et al, 2004
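A sketch of the two quantities the slide names, in the notation of Sharpee et al, 2004: P(s | spike) is the spike-conditional stimulus density, P(s) the stimulus density, r(t) the spike rate, and r-bar its mean.

```latex
% Mutual information between the full stimulus ensemble and the spike:
I_{\mathrm{spike}}
  = \int d\mathbf{s}\; P(\mathbf{s}\mid\mathrm{spike})
    \log_2 \frac{P(\mathbf{s}\mid\mathrm{spike})}{P(\mathbf{s})}

% In practice, estimated as a time average over the recording:
I_{\mathrm{spike}}
  = \left\langle \frac{r(t)}{\bar{r}}
    \log_2 \frac{r(t)}{\bar{r}} \right\rangle_t
```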

 

Optimization Algorithm and Results

Finding the “most informative” dimensions:

- I_spike: the total mutual information between the stimulus ensemble and the spike
- If only a few dimensions of the stimulus space are relevant, then I_spike should equal the mutual information between the spike and the stimulus projected onto the relevant subspace in the direction of the vector v
- Find the pdfs of the projections onto the relevant subspace spanned by v
- Maximize I_v with respect to v to obtain the relevant dimension, i.e., the receptive field (a sketch of the I_v estimate follows below)

Figure: comparison of the standard method with the present method, applied to the model on the previous slide.
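A minimal sketch of the inner step of this optimization: estimating I_v for a candidate direction v from histograms of the stimulus projections. The function name and bin count are my assumptions; the paper's own estimator is more careful about binning and sampling bias.

```python
import numpy as np

def information_along_v(stimulus, spikes, v, bins=32):
    """Estimate I_v, the mutual information between the spike and the
    projection of the stimulus onto the candidate direction v.
    stimulus: (T, D) array; spikes: (T,) spike counts; v: (D,) direction."""
    x = stimulus @ (v / np.linalg.norm(v))          # projections onto v
    edges = np.histogram_bin_edges(x, bins=bins)
    p_x, _ = np.histogram(x, bins=edges)            # -> P_v(x)
    p_x = p_x / p_x.sum()
    p_xs, _ = np.histogram(x, bins=edges,           # -> P_v(x | spike)
                           weights=spikes.astype(float))
    p_xs = p_xs / p_xs.sum()
    mask = (p_xs > 0) & (p_x > 0)
    return np.sum(p_xs[mask] * np.log2(p_xs[mask] / p_x[mask]))
```

The outer loop then maximizes this quantity over v; if the cell is truly sensitive to a single direction, I_v approaches I_spike at the optimum.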

Independent Component Analysis (ICA)

- Blind source separation: “blind” because both the sources and the transfer function are unknown
- Very ill-posed without further assumptions
- Model: x = f(s) + n, where x are the observed signals, s the unknown sources, n the additive/observational noise, and f the unknown function
- Most commonly, f is linear: x = A s + n, with the mixing matrix A usually assumed square and invertible
- The sources s are assumed independent (hence ICA); most commonly, the noise n is taken to be zero
- Independence: the joint density factorizes; equivalently, the mutual information is zero
- The problem: estimate the independent sources by inverting the matrix A (a sketch of the generative model follows below)
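A minimal sketch of the noiseless linear generative model above; the sources, the 2x2 mixing matrix, and all values are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5_000
# Two independent, non-Gaussian sources (one uniform, one Laplacian).
s = np.vstack([rng.uniform(-1, 1, T), rng.laplace(0.0, 1.0, T)])
A = np.array([[1.0, 0.6],          # mixing matrix, unknown in practice
              [0.4, 1.0]])
x = A @ s                          # observed mixtures (noise n taken as zero)
# ICA must recover s from x alone; if A were known, s_hat = inv(A) @ x.
```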

ICA Estimation Techniques

- Basic idea: minimize the mutual information between the components of the estimated sources s.
- Maximum likelihood (ML) method:
  - Write down the likelihood of a batch of T samples and take the log-likelihood (sketched below).
  - Use the unmixing matrix W = A^-1.
  - Maximizing L is equivalent to minimizing the mutual information.
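The log-likelihood referred to above, in standard ICA notation; w_i denotes the i-th row of the unmixing matrix W and p_i the assumed density of source i.

```latex
% Log-likelihood of a batch of T samples x(1), ..., x(T) under the
% noiseless linear model, written via the unmixing matrix W = A^{-1}:
L(W) = \sum_{t=1}^{T} \sum_{i=1}^{n}
       \log p_i\!\left(\mathbf{w}_i^{\top}\mathbf{x}(t)\right)
       \;+\; T \log\lvert\det W\rvert
```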

ICA estimation (contd.)

- Cumulant-based methods: kurtosis, the fourth-order cumulant (for a zero-mean variable, the fourth central moment minus three times the squared variance); mutual information approximations built from kurtosis.
- Negentropy: the entropy difference between a Gaussian vector and the vector of interest; a measure of non-Gaussianity (a fixed-point sketch follows below).
- Infomax ICA: maximize information transmission through a neural network.
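To make the negentropy idea concrete, here is a minimal sketch of a one-unit fixed-point iteration of the FastICA type, which maximizes an approximation of negentropy. It assumes the observed data have already been centered and whitened; tanh is one common contrast choice.

```python
import numpy as np

def one_unit_ica(x_white, n_iter=200, seed=0):
    """Extract one independent component from whitened data x_white (n, T)
    by fixed-point maximization of a negentropy approximation (g = tanh)."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(x_white.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        u = w @ x_white                        # projections w^T x
        g, g_prime = np.tanh(u), 1.0 - np.tanh(u) ** 2
        w_new = (x_white * g).mean(axis=1) - g_prime.mean() * w
        w_new /= np.linalg.norm(w_new)
        if abs(abs(w_new @ w) - 1.0) < 1e-8:   # converged (up to sign)
            return w_new
        w = w_new
    return w                                   # one row of the unmixing matrix
```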

Applications of ICA

- EEG and ERP analysis: Infomax ICA is the most commonly applied technique; it yields temporally independent EEG components. Can the independent components tell us anything about brain activity?
- fMRI: spatially independent processes (?)
- Speech separation
- Natural images: the independent components give V1-like receptive fields

Source: www.bnl.gov/neuropsychology/ERPs_al.asp

Other techniques applicable to neuroscience

- Point process analysis of neural coding
- Information theoretic analysis of coding in the neural system
- Principal components analysis applied to neural recordings and spike sorting (a sketch follows below)
- Recently developed nonlinear dimensionality reduction techniques, such as Isomap, Hessian eigenmaps, and Laplacian eigenmaps, in face and object recognition
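As an illustration of the PCA item, a minimal sketch of the usual first step of spike sorting: projecting detected spike waveform snippets onto their leading principal components. The snippet layout and names are assumptions.

```python
import numpy as np

def pca_features(waveforms, n_components=2):
    """Project spike waveform snippets onto their leading principal
    components, the standard feature space for spike sorting.
    waveforms: (n_spikes, n_samples), one snippet per detected spike."""
    centered = waveforms - waveforms.mean(axis=0)
    # SVD of the centered data matrix; rows of vt are the principal axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T      # (n_spikes, n_components) scores

# The low-dimensional scores are then clustered (e.g., with k-means) so
# that each cluster corresponds to a putative single unit.
```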