Sloan-Swartz Centers for Theoretical Neurobiology


 

Annual Meeting 2008

Princeton University, July 19-22, 2008

 

Ten Bernstein Network Fellows Attend Annual Swartz Foundation Meeting

  • The Swartz Foundation awarded fellowships to ten doctoral students and postdocs of the Bernstein Network Computational Neuroscience to support their participation in the annual meeting of the Sloan-Swartz Centers in Princeton, New Jersey, July 19-22, 2008.

Program (as of 19 July 2008)


 

Saturday, July 19

 

7:00 PM Reception  

 

8:00 PM Keynote Speaker: Leslie Vosshall (Rockefeller)

Circuit Computation Underlying Concentration-Invariant Odor Perception

 

Most odors are perceived to have the same quality over a large concentration range, but the neural mechanisms that permit concentration-invariant olfactory perception are unknown. In higher animals, odors are sensed by an array of olfactory sensory neurons (OSNs) expressing odorant receptors (ORs) with overlapping but distinct response properties. How combinations of ORs with different affinities for the same odor ligand cooperate to mediate perception of a given odorant across a range of odor concentrations is not understood. Ethyl butyrate, perceived by humans as a fruity odor, robustly activates 3 of 21 OSNs in the Drosophila larva with response thresholds that differ 100-fold in concentration. Local inhibitory interneurons (LNs) only respond to high ethyl butyrate concentrations upon summed activation of at least two OSNs. Behavioral data suggest that LNs constitute a sensory filtering mechanism expanding the dynamic range of the larval olfactory system and enabling detection of food odor over a wide range of ethologically relevant concentrations.

 

 

Sunday, July 20

 

9:00 AM Columbia Posters Overview: Larry Abbott

 

9:15 Invited Speaker: Eve Marder (Brandeis)    

 

Variability, Compensation, Homeostasis and Neuromodulation in Rhythmic Motor Circuits

I will summarize recent theoretical and experimental work showing that similar circuit outputs can be produced with highly variable circuit parameters. This work raises the question of the extent to which the effects of neuromodulation can remain consistent despite underlying variation in circuit parameters. To address this question, we construct two-cell reciprocally inhibitory circuits using the dynamic clamp, built from biological GM neurons of the crab stomatogastric ganglion. We then characterize the output of these circuits while sweeping through a range of synaptic and intrinsic conductances, first in control saline and then in the presence of serotonin. We find that serotonin extends the ranges of parameters that produce alternating bursting. Moreover, although serotonin's effects are highly robust and significant across the population as a whole, individual networks can respond anomalously. These data demonstrate that while neuromodulation may have robust actions on a population, not all individuals respond as the majority does. These findings have important implications for evolution.

 

10:00 Stretch-your-legs break  

 

10:10 Talk Sessions: Learning and Memory

 

10:10 Vladimir Itskov (Columbia)

Structured memory spaces arising from recurrent network activity

Neurons belonging to recurrent networks in many brain areas, including neocortex and hippocampus, often fire according to simple receptive fields; these same cells, however, are in many cases controlled by the dynamics of the recurrent network. Investigating the compatibility of these two pictures promises to yield insight into the structure of recurrent networks. We study a simple recurrent network model, where "memories" correspond to dynamically stable groups of co-active neurons given a constant external drive. In this framework, the set of allowed memories for a given network respects a natural structure. Remarkably, we find that these networks can learn any set of memories respecting this structure without incurring additional, unintended (spurious) memory states. If the neurons each fire according to a simple receptive field model, then each "memory" in the recurrent network corresponds to a group of cells with overlapping receptive fields. The set of all memories allowed by the recurrent network encodes the topology and geometry of the underlying stimulus space covered by the receptive fields. In general, we find that random network connectivity implies a complicated, high-dimensional stimulus-space topology. Therefore, if neurons fire according to low-dimensional receptive fields, such as hippocampal place fields, this imposes constraints on the matrix of recurrent synaptic weights.

 

10:30 Paul Miller (Brandeis)

Learning associations while retaining specificity: competing demands on network plasticity rules

Many tasks require us to make responses that are determined by specific associations or pairings of stimuli. Analysis suggests that neurons tuned to appropriate pairings of stimuli must form in order to enable task completion, in particular when successful behavior requires logic equivalent to the exclusive-or (XOR) problem. Here we address how the trade-off between associativity (whereby cells responsive to one stimulus gain a response to a temporally correlated stimulus) and specificity (whereby different stimuli or pairs of stimuli are separately encoded) depends on synaptic plasticity and network architecture. In preliminary results we note the importance of plasticity in inhibitory connections (long-term potentiation of inhibition) for maintenance of specificity. We also show that in order for a network to produce persistent activity to specific stimuli through synaptic potentiation, connectivity of cells must be predominantly local, as is seen in the cortex. In contrast, the more random architecture ideal for the formation of arbitrary associations, as is seen in CA3 of the hippocampus, is unable to generate persistent activity through synaptic feedback without producing runaway, epileptic activity. Thus our studies suggest why we might need different types of structure in our brains, and how the differences in connectivity between cortex and hippocampus could lead to their different roles in tasks that require learning of associations and testing of inference.

 

10:50 Yale Posters Overview: XJ Wang

 

11:05 Break  

 

11:25 Talk Sessions: Decision-Making

 

11:25 Adam Kepecs (CSHL)  

Decision confidence: computational models and neural correlates

 

Decision-making is a fundamental process for adaptive behavior throughout the animal kingdom. In addition to making a choice, estimating the degree of uncertainty or confidence in a decision confers significant benefits for a broad range of activities. Yet, despite its importance, the neural processes for uncertainty estimation and the computational algorithms used during behavior are poorly understood. My talk will describe a combined computational and experimental approach to investigate the algorithms and neural correlates of decision confidence. In particular, I will focus on our efforts to extend simple models of decision making to compute confidence estimates.

 

11:40 Moran Furman (Yale)

A line-attractor network model for multiple-choice decision making

 

The ability to make a goal-directed choice among multiple alternatives is an important aspect of adaptive behavior. We present here a line attractor spiking neuron model for multiple-choice decision making in a motion discrimination task. The network is dominated by recurrent connections that generate competitive interactions and formation of a categorical choice. The model embodies a continuous representation of the sensory input (i.e. direction of motion), and thus can operate independently of the number of choice alternatives or their angular locations. In addition, the continuous model opens up the possibility of studying the implications of overlaps in the neural representation of the choice alternatives and varying degrees of similarity between them. We applied the model to a recent monkey study that has revealed neural correlates of multiple-choice decisions in the lateral intraparietal area (LIP). Simulation results show that the model accounts for a wide range of behavioral and physiological data, including the dynamics of single unit activity in LIP, response times and psychometric functions. Interestingly, the model shows a higher probability of erroneous choices for targets adjacent to the correct one, a prediction that is specific to the continuous network and testable experimentally. Finally, we propose a novel mechanism for flexible control of speed and accuracy during decision making, based on a biologically plausible top-down projection to the network. Taken together, our findings provide further support for the role of attractor neural dynamics as a general mechanism for slow accumulation of sensory information, decision making and storage of a choice in working memory.

 

12:00 Nate Smith (Yale)

Simultaneous versus alternating sampling in multiple-choice decisions

We commonly have to decide between multiple alternatives; previous work has examined optimality in multiple-choice decisions using either a drift diffusion model or a leaky competing accumulator model. However, the neural circuit mechanism underlying multiple-choice decision making remains unclear. Here, we propose a generalized n-variable version of a biophysically realistic cortical model of decision making. In the model, sensory inputs about potential alternatives drive neural pools selective for different choices; there is strong recurrent excitation within each neural pool, and competition between neural pools through shared inhibition. Whichever population rate crosses a prescribed threshold first determines the choice. The model was applied to a recent human study of decision making in a three-alternative forced choice task with a random dot motion stimulus. Previous uses of the RDM task had dots moving coherently in only one direction; in this study, each of the three choice directions had a certain percentage of dots moving coherently in that direction. Subjects were instructed to saccade in the direction they thought had the highest coherent motion strength. The model was able to account for the psychometric function and mean reaction times measured in this human experiment. The model's multiple-choice behavior is described by the softmax decision criterion: the probability of choosing an option is a sigmoid function of the difference between the strength of evidence for that option and the strengths of evidence for the competing alternatives. Furthermore, we investigated the dynamics of decision making in a sequential sampling task, where subjects were given free rein to saccade between different targets before making their choice decisions. The timing of saccades was modeled from distributions of saccade length in previous studies.
Rather than relying on simultaneous integration of sensory information, information from each choice had to be held in working memory during a short period when an opposing stimulus was observed. This simple model provides a framework for the neural underpinnings of decision making in general.
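The softmax criterion described above can be written out in a few lines. This is a generic sketch of the rule, not the fitted model from the study; the evidence values and the inverse-temperature parameter `beta` below are purely illustrative:

```python
import math

def softmax_choice_probs(evidence, beta=1.0):
    # Softmax rule: P(i) = exp(beta * E_i) / sum_j exp(beta * E_j).
    # beta (inverse temperature) sets how deterministically the strongest
    # option wins; both it and the evidence values are illustrative.
    exps = [math.exp(beta * e) for e in evidence]
    z = sum(exps)
    return [x / z for x in exps]

# Three-alternative motion task: one evidence value per choice direction.
probs = softmax_choice_probs([2.0, 1.0, 0.5], beta=1.5)
```

For two alternatives this reduces to the familiar logistic (sigmoid) function of the evidence difference, which is why the criterion is described as sigmoidal above.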

 

12:10 KongFatt Wong-Lin (Princeton)

Optimality, robustness, and dynamics of decision-making under norepinephrine modulation: a spiking neuronal network model  

The locus coeruleus (LC), a nucleus in the brainstem, releases norepinephrine (NE) throughout the cortex. NE modulates cortical neuronal responses at the cellular level, while LC activity and NE levels affect wakefulness and task performance in animals. We attempt to bridge from experimental findings at the single-cell level to behavior via a network model of spiking neurons. Specifically, we assess performance in a two-alternative forced-choice decision-making task using the metric of reward rate -- the fraction of rewarded trials per unit time -- which implies a speed-accuracy tradeoff. The model shows that tonic NE modulation results in an inverted-U performance pattern: low NE levels cause slow and unmotivated behavior, intermediate levels allow optimal reward rates, while high NE produces impulsive and error-prone behavior. The form and balance of NE modulation changes the shape and width of this inverted-U relation: co-modulation of excitatory and inhibitory synaptic conductances improves task performance over a wider range of NE concentrations, increasing the system's robustness. Robustness is greatest when pyramidal cells are modulated somewhat less than inhibitory interneurons: a condition under which we also find inverted-U single-cell evoked responses, possibly due to a network effect. LC neurons also exhibit a phasic firing mode, and the model predicts that, if this mode is to further enhance performance as has been observed, then modulation of glutamatergic synapses should dominate that of GABAergic ones. Finally, we point out some key experiments that are currently lacking. This is joint work with Philip Eckhoff and Phil Holmes.
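The reward-rate metric and its speed-accuracy tradeoff can be illustrated with the standard closed-form drift-diffusion expressions for accuracy and mean decision time. This is a much simpler stand-in for the spiking network in the talk, and all parameter values below are illustrative, not fitted:

```python
import math

def reward_rate(a, mu=1.0, sigma=1.0, t_fixed=1.0):
    # Standard closed-form drift-diffusion results for symmetric thresholds
    # at +/-a with drift mu and noise sigma (illustrative parameters).
    p_correct = 1.0 / (1.0 + math.exp(-2.0 * a * mu / sigma**2))
    mean_dt = (a / mu) * math.tanh(a * mu / sigma**2)
    # Reward rate = rewarded (correct) trials per unit time; t_fixed lumps
    # together non-decision time and the inter-trial interval.
    return p_correct / (mean_dt + t_fixed)

# Sweeping the decision threshold traces out an inverted-U reward rate:
# too low a threshold gives fast but error-prone choices, too high a
# threshold gives accurate but slow ones.
rates = {a: reward_rate(a) for a in (0.05, 0.4, 8.0)}
```

Here the inverted-U arises from the threshold sweep; in the talk the analogous sweep is over tonic NE level, which moves the network between cautious and impulsive regimes.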

 

12:30 Lunch

 

2:00 CSHL Posters Overview: Tony Zador

 

2:15 Invited Speaker: Peter Robinson (Sydney)  

Neural activity in the brain has been observed for over a century and is widely used to probe brain function and disorders, through the electroencephalogram (EEG), electrocorticogram, depth electrodes, functional MRI, and many other measures. However, the connections between stimuli, physiology, processing, and measurements have been chiefly qualitative until recently, and most links between stimuli, activity, function, and measurements have been based on phenomenological correlations. Here a quantitative multiscale model of brain stimulus-activity-measurement dynamics is described that includes key features of physiology and anatomy from synapses to the whole brain and from milliseconds up in timescale. Its parameters measure quantities such as synaptic numbers and strengths, signal delays, cellular time constants, and neural ranges, and are all constrained by independent physiological measurements. With the inclusion of measurement effects, the model successfully predicts a wide range of linear and nonlinear phenomena at many scales. These include time series, spectra, evoked responses to stimuli, seizure dynamics, visual gamma (40 Hz) correlations during perception, arousal dynamics, and influences of pharmacology and aging. Fitting to experimental data enables physiological parameters to be inferred in normal and abnormal conditions. A selection of these results will be discussed to demonstrate how multiscale modeling provides a framework within which to interrelate, predict, and interpret diverse phenomena and measurements.

 

3:00 Break

 

3:10 Talk Sessions: Local Field Potential

 

3:10 Josh Milstein (Caltech)

Noise, power laws, and the local field potential

Human electrophysiological recordings of the local field potential (LFP), acquired from a variety of cerebral regions, show a ubiquitous 1/f^2 scaling within the power spectrum. We develop a quantitative model that treats the generation of these fields analogously to electronic shot noise, and use this model to specifically address the cause of this 1/f^2 Brownian noise. The model admits two analytically tractable solutions, both displaying Brownian noise: 1) uncorrelated cells that display sharp initial activity, whose extracellular fields slowly decay, and 2) rapidly firing, temporally correlated cells that generate UP-DOWN states.
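The 1/f^2 signature is easy to reproduce numerically: integrated white noise (Brownian noise) has a power spectrum that falls as 1/f^2. The sketch below is a generic demonstration of that scaling, not the authors' biophysical shot-noise model:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 4096, 200
psd = np.zeros(n // 2 + 1)
for _ in range(trials):
    walk = np.cumsum(rng.standard_normal(n))   # integrated white noise
    psd += np.abs(np.fft.rfft(walk)) ** 2      # accumulate periodograms
psd /= trials

# Fit the log-log spectral slope over low-to-mid frequencies;
# Brownian noise gives a slope close to -2, i.e. 1/f^2 scaling.
k = np.arange(1, n // 8)
slope = np.polyfit(np.log(k), np.log(psd[k]), 1)[0]
```

Averaging periodograms over many realizations stabilizes the spectrum enough that the fitted slope sits near -2.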

 

 

3:35 Chess Stetson (Caltech)

Relationship of spike-field and field-field coherence across parietal and frontal lobes

Coherence, the synchronization of two signals at a certain frequency, may be evidence of communication between cortical sites. Recent data from our laboratory have shown that when a monkey plans a reach, cells in the dorsal premotor cortex (PMd) become coherent with local field potentials (LFPs) in the parietal reach region (PRR), and vice versa. Does this increased spike-field coherence represent selective communication between spikes in one lobe of the brain and LFPs in another? One way to address this question is to determine whether larger-scale signals between lobes also show increased coherence. Partial coherence analysis has previously shown that the spike-field coherence cannot be explained exclusively by the field-field coherence. In fact, we show here that in many cases the field-field coherence between PMd and PRR actually decreases at about the same time as the spike-field coherence increases. We describe the relationship of the cross-cortical spike-LFP and LFP-LFP coherence in two types of decision-making tasks, a target-choice and an effector-choice task. These data suggest rich cortical dynamics during decision-making, whereby spikes can selectively synchronize with distant cortical areas, even as their surrounding fields become desynchronized between the same cortical areas. We discuss these results within the context of a computational framework for cortical communication.

 

4:00 Flavio Frohlich (Yale)

Does the field potential influence normal network activity?

 

Is the field potential generated by normal synchronized cortical activity merely an epiphenomenon, or does it actively influence and control cortical network dynamics? We addressed this important question through measurements of the influence of weak electric fields on active cortical networks. Slow oscillations represent a fundamental activity pattern of cortical circuits and are characterized by alternating epochs of pronounced activity (up states) and relative quiescence (down states). This structured network activity gives rise to an endogenous macroscopic electric field of up to several mV/mm. Here, we studied the modulation of cortical slow oscillations in vitro in the presence of exogenous fields comparable in amplitude to the endogenous fields occurring in vivo during slow oscillations. We found that while the applied fields have only a small effect on the membrane voltage of individual cells, such fields can strongly shape network activity. Importantly, activity modulation by weak exogenous electric fields occurred only in slices that intrinsically exhibited slow oscillations. These findings show that active cortical networks are susceptible to modulation by weak electric fields. Therefore, we conclude that the endogenous electric fields generated by normal synchronized activity in the cortex actually play an important role in directing cortical network behavior.

 

4:15 Break

 

4:45 Talk Sessions: Network Dynamics

 

4:45 Yoram Burak (Harvard) Path integration and grid cell dynamics in the medial entorhinal cortex

 

When we fixate our gaze, the eyes are not entirely stable. These drift movements are small, but they create a problem of blurring for the visual system: During the response time of a cone photoreceptor in the fovea, the image drifts across several cone diameters. Yet, we are able to detect visual features on the scale of a single cone, e.g. when reading the bottom line of an eye chart. How does our visual system achieve high acuity in the presence of fixational eye movements? Our working hypothesis is that a visual area downstream from the retina converts the moving and blurred retinal image to a static high resolution image. Furthermore, we assume this computation is based solely on the visual inputs, rather than knowledge of the eye movement. We propose a network model that performs this computation. The model is based on an approximation of the ideal Bayesian estimator of the image, known as the mean field approximation. Receptive fields of neurons in the network track the moving image accurately and rapidly. This allows neurons to integrate visual signals over time from the same location on the image, and reconstruct a non-blurred image. We characterize the performance of our model and propose experimental predictions.

 

5:05 Ken Miller (Columbia) Strong, balanced excitation and inhibition yield selective amplification of activity patterns without slowing of dynamics

 

How do neural networks selectively amplify certain activity patterns? A well-known answer is via "Hebbian assemblies": a group of neurons mutually excite one another, and so slow the decay of their activity. Given persistent input, patterns that decay most slowly will accumulate to the largest amplitude and thus be amplified. We show that, given networks with strong excitation (E) balanced or dominated by strong inhibition (I), another form of amplification arises, in which small fluctuations in a pattern of the difference between E and I activity lead to large responses in the sum of E and I activity. This allows large amplification of the sum pattern without dynamical slowing. Mathematically, the effect arises from the non-normal nature of the synaptic connectivity matrix, which is induced by the fact that there are separate E and I cells.
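A two-unit linear caricature of this balanced-amplification effect (a sketch for intuition only, not the talk's model): with equal-strength excitation and inhibition the connectivity matrix is non-normal with both eigenvalues at zero, so there is no slowed Hebbian mode, yet a small E-minus-I difference input produces a large response along the E-plus-I sum pattern:

```python
import numpy as np

w = 5.0                                  # shared strength of E and I weights
W = w * np.array([[1.0, -1.0],           # E unit: excited by E, inhibited by I
                  [1.0, -1.0]])          # I unit: receives the same balanced input
# W is non-normal and both of its eigenvalues are zero, so the linear
# dynamics dx/dt = -x + W x + h have no slowed (Hebbian-assembly) mode.

def steady_state(h):
    # Fixed point of dx/dt = -x + W x + h.
    return np.linalg.solve(np.eye(2) - W, h)

diff_in = np.array([1.0, -1.0])          # small E-minus-I difference pattern
sum_in = np.array([1.0, 1.0])            # E-plus-I sum pattern
r_diff = steady_state(diff_in)           # large response, mostly along the sum mode
r_sum = steady_state(sum_in)             # returned unamplified
```

With `w = 5` the difference input is amplified roughly tenfold while a pure sum input passes through unchanged, and the amplification grows with `w` even though the eigenvalues, and hence the decay rate, never change.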

 

5:25 Gergo Orban (Brandeis) Relating evoked and spontaneous cortical activities in a generative modeling framework  

Recently we proposed a computational framework in which we assumed that the visual cortex implicitly implements a generative model of the natural visual environment and performs its functions, such as recognition and discrimination, by inferring the underlying external causes of the visual input. In the present work, we test this framework by relating synthetic and measured neural data to the predictions of the underlying generative model. Two key elements of the proposal are that the firing activity of individual neurons constitutes samples from the underlying probability density function (pdf) that those cells represent, and that the spontaneous activity of the cortex represents the prior knowledge of the system about the external world. In order to test these ideas, a reliable method was developed to estimate the difference between the pdfs of the spontaneous and visually evoked activities based on a limited number of samples. Our method exploits the full statistical structure of the data to estimate the Kullback-Leibler divergence between pdfs of neural activities recorded under different conditions. First, we tested the method on synthetic data to demonstrate its feasibility; we then applied it to analyze neural recordings from the primary visual cortex of awake, behaving ferrets. Our results confirm the predictions of the generative framework, show how this framework can successfully describe the link between spontaneous and visually evoked activity, and offer a novel interpretation of the variability of cortical responses.
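As a toy illustration of comparing activity distributions via a Kullback-Leibler divergence estimated from limited samples: the sketch below fits a 1-D Gaussian to each sample set and evaluates the closed-form KL between the fits. This is a deliberately crude plug-in estimator, far simpler than the method described in the abstract, and the "spontaneous"/"evoked" samples are synthetic stand-ins:

```python
import numpy as np

def gaussian_kl_from_samples(x, y):
    # Plug-in estimate: fit a 1-D Gaussian to each sample set and evaluate
    # the closed-form KL(N(m1, s1^2) || N(m2, s2^2)).
    m1, s1 = x.mean(), x.std()
    m2, s2 = y.mean(), y.std()
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

rng = np.random.default_rng(1)
spont = rng.normal(0.0, 1.0, 2000)    # stand-in "spontaneous" activity samples
evoked = rng.normal(1.0, 1.0, 2000)   # stand-in "evoked" activity samples
kl = gaussian_kl_from_samples(evoked, spont)   # analytic value would be 0.5
```

With 2000 samples per condition the estimate lands close to the analytic value; the estimator in the abstract tackles the much harder case of high-dimensional, structured neural data.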

 

5:45 Break

 

5:55 Princeton Posters Overview: Bill Bialek

 

6:10 Yi Sun (NYU)

Numerical I&F reduction of Hodgkin-Huxley neurons and their network dynamics.

 

We present efficient numerical methods for simulating Hodgkin-Huxley (HH) neuronal networks. The key components of our numerical methods are (i) a pre-computed high-resolution data library containing typical neuronal trajectories (i.e., the time courses of the membrane potential and gating variables) during the interval of an action potential (spike), which allows us to avoid resolving the spikes in detail and to use large numerical time steps when evolving the HH neuronal equations; and (ii) an algorithm of spike-spike corrections within groups of strongly coupled neurons to account for spike-spike interactions within a single large time step. By using the library method, we can evolve the HH networks with time steps an order of magnitude larger than those needed to resolve the whole trajectories without the library, while achieving comparable resolution in statistical quantifications such as the average firing rate, interspike interval distribution, and power spectra of voltage traces. We compare our library-based method with standard methods, such as Runge-Kutta methods, and find that our method captures the asynchronous, synchronous, and chaotic dynamics of HH neuronal networks very well. It is important to point out that, in essence, our library-based HH solver can be viewed as a numerical reduction of HH dynamics to an integrate-and-fire (IF) neuronal representation, but without sacrificing the gating dynamics (as is normally done in the analytical reduction to an IF neuron).
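The library idea can be caricatured in one dimension: integrate the subthreshold dynamics with a coarse step, and when threshold is crossed, splice in a precomputed spike waveform instead of resolving the action potential with fine steps. The sketch below uses a simple leaky neuron and a three-sample "library"; all values are illustrative and bear no relation to the actual HH library:

```python
import numpy as np

# "Library": a precomputed spike waveform (peak, repolarization, reset), so
# the coarse-step integrator never has to resolve the action potential itself.
spike_shape = np.array([30.0, 10.0, -75.0])

dt = 0.5                                   # coarse time step (ms)
v_rest, v_thresh, v_reset = -65.0, -50.0, -75.0
tau, drive = 10.0, 30.0                    # membrane time constant, input drive

v, trace, spikes = v_rest, [], 0
i, n_steps = 0, 400
while i < n_steps:
    v += dt * (-(v - v_rest) + drive) / tau    # leaky integration, large step
    if v >= v_thresh:
        trace.extend(spike_shape)              # splice in the stored trajectory
        v = v_reset
        spikes += 1
        i += len(spike_shape)
    else:
        trace.append(v)
        i += 1
trace = np.array(trace)
```

The real method stores full HH trajectories of voltage and gating variables and adds spike-spike corrections for strongly coupled neurons; this sketch only conveys the replay-instead-of-resolve principle.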

 

7:30 - 10:30 Banquet

 


 


 

 

 
