Computational Neuroscience

From 2003 until 2007, I worked on my PhD in the research group of Andreas Herz. My research topic was information processing of temporal signals in sensory systems. One particular example is time-scale invariance: the ability to recognize signals when they are stretched or compressed in time. The grasshopper Chorthippus biguttulus produces syllable-pause sequences whose duration can vary by up to 300% without impairing stimulus recognition. We found a putative neuronal mechanism that can explain this phenomenon.
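
To illustrate the core idea (detailed in the Journal of Neuroscience abstract below): if each syllable onset triggers a burst whose spike count is proportional to the preceding pause, then counting spikes over a fixed time window yields a quantity that depends only on the pause-to-period ratio. The sketch below is a back-of-the-envelope check; all numbers (syllable and pause durations, spikes per millisecond of pause, window length) are illustrative assumptions, not measured values.

    def spike_count(syllable_ms, pause_ms, window_ms=500.0, spikes_per_ms_pause=0.5):
        """Total spikes of a hypothetical onset burster counted in a fixed window."""
        period_ms = syllable_ms + pause_ms          # one syllable-pause cycle
        bursts_in_window = window_ms / period_ms    # one burst per syllable onset
        spikes_per_burst = spikes_per_ms_pause * pause_ms
        return bursts_in_window * spikes_per_burst

    # Stretching the whole call (same syllable-to-pause ratio) leaves the count
    # unchanged, whereas changing the ratio itself changes it.
    for warp in (0.7, 1.0, 1.5, 3.0):
        print(f"warp {warp}: {spike_count(80.0 * warp, 20.0 * warp):.1f} spikes")
    print(f"different ratio: {spike_count(80.0, 40.0):.1f} spikes")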

In 2005, I visited Tali Tishby at the Hebrew University in Jerusalem to study the relation between linear dynamical systems and information theory. We found that the state space of a linear system can be regarded as an information bottleneck between its past and future. This may be a step towards a statistical theory of optimal adaptive behavior.
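
The Physical Review E abstract below relates this bottleneck to canonical correlation analysis between past and future observations. Here is a minimal numerical sketch under assumed settings (a scalar AR(1) process and windows of five past and five future samples, both illustrative choices): the canonical correlations show that a single state dimension carries essentially all predictive information.

    import numpy as np
    from scipy.linalg import eigh

    rng = np.random.default_rng(0)

    # Assumed toy system: scalar AR(1) process, so one hidden state suffices.
    T, a = 20000, 0.9
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = a * x[t - 1] + rng.standard_normal()

    # Windows of L past samples (x[t-1..t-L]) and L future samples (x[t..t+L-1]).
    L = 5
    past = np.column_stack([x[L - 1 - i: T - L - 1 - i] for i in range(L)])
    future = np.column_stack([x[L + i: T - L + i] for i in range(L)])
    past -= past.mean(axis=0)
    future -= future.mean(axis=0)

    Spp = past.T @ past / len(past)          # past covariance
    Sff = future.T @ future / len(future)    # future covariance
    Spf = past.T @ future / len(past)        # past-future cross-covariance

    # Canonical correlations from the generalized eigenvalue problem
    #   Spf Sff^{-1} Spf^T v = rho^2 Spp v
    M = Spf @ np.linalg.solve(Sff, Spf.T)
    rho2 = eigh(M, Spp, eigvals_only=True)
    print("squared canonical correlations:", np.sort(rho2)[::-1].round(3))
    # Only one value is large: a one-dimensional state captures (almost) all
    # information the past carries about the future of this process.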

Together with Henning Sprekeler, I was able to show that temporally local predictive coding is, under certain conditions, equivalent to slow feature analysis, an established method that can model receptive field properties in the cortex.
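
As background for the Neural Computation 2008 abstract below, here is a minimal sketch of linear slow feature analysis; the toy signal and variable names are illustrative assumptions, not taken from the paper. The linear case reduces to a generalized eigenvalue problem: minimize the variance of the output's temporal derivative subject to unit output variance.

    import numpy as np
    from scipy.linalg import eigh

    def linear_sfa(x, n_components=1):
        """Weights w minimizing Var(d/dt w.x) subject to Var(w.x) = 1."""
        x = x - x.mean(axis=0)          # center; shape (time, dims)
        dx = np.diff(x, axis=0)         # discrete temporal derivative
        cov = x.T @ x / len(x)          # signal covariance
        dcov = dx.T @ dx / len(dx)      # derivative covariance
        # Generalized eigenproblem dcov w = lambda cov w; the smallest
        # eigenvalues give the slowest (most predictable) output directions.
        _, vecs = eigh(dcov, cov)
        return vecs[:, :n_components]

    # Toy data: a slow and a fast sine, linearly mixed into two channels.
    t = np.linspace(0, 100, 5000)
    slow, fast = np.sin(0.2 * t), np.sin(10.0 * t)
    x = np.column_stack([slow + 0.5 * fast, 0.5 * slow - fast])
    w = linear_sfa(x)
    print("slowest direction (recovers the slow source up to scale):", w.ravel().round(2))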


I deciphered (some of) the neural code behind the discrimination of "good" and "bad" mating songs in the grasshopper Chorthippus biguttulus.

Journal articles

  • F. Creutzig, H. Sprekeler (2008)
    Predictive Coding and the Slowness Principle: an Information-Theoretic Approach
    Neural Computation 20(4): 1026-1041. Abstract:


    Understanding the guiding principles of sensory coding strategies is a main goal in computational neuroscience. Among others, the principles of predictive coding and slowness appear to capture aspects of sensory processing. Predictive coding postulates that sensory systems are adapted to the structure of their input signals such that information about future inputs is encoded. Slow feature analysis (SFA) is a method for extracting slowly varying components from quickly varying input signals, thereby learning temporally invariant features. Here, we use the information bottleneck method to state an information-theoretic objective function for temporally local predictive coding. We then show that the linear case of SFA can be interpreted as a variant of predictive coding that maximizes the mutual information between the current output of the system and the input signal in the next time step. This demonstrates that the slowness principle and predictive coding are intimately related.

  • F. Creutzig, A. Globerson, N. Tishby (2009)
    The Past-Future Information Bottleneck of Dynamical Systems
    Physical Review E 79, 041925. Featured in the Virtual Journal of Biological Physics Research. Abstract:


    Biological systems need to process information in real time and must trade off accuracy of representation and coding costs. Here we operationalize this trade-off and develop an information-theoretic framework that selectively extracts information of the input past that is predictive about the output future, obtaining a generalized eigenvalue problem. Thereby, we unravel the input history in terms of structural phase transitions corresponding to additional dimensions of a state space. We elucidate the relation to canonical correlation analysis and give a numerical example. Altogether, this work relates information-theoretic optimization to the joint problem of system identification and model reduction.

  • F. Creutzig, S. Wohlgemuth, A. Stumpner, J. Benda, B. Ronacher, A. V. M. Herz (2009)
    Time-Scale Invariant Representation of Acoustic Communication Signals by a Bursting Neuron
    Journal of Neuroscience 29(8): 2575-2580. Abstract:


    Acoustic communication signals often involve temporal sequences in which the relative durations of individual elements, such as sound pulses and brief pauses, but not their absolute durations, convey meaning. Decoding such signals requires an explicit or implicit calculation of the ratios between time intervals. Using grasshopper communication as a model, we demonstrate how this seemingly difficult computation can be solved in real time by a small set of auditory neurons. One of these cells, an ascending interneuron, generates bursts of action potentials in response to the rhythmic syllable-pause structure of grasshopper calls. Our data show that these bursts are preferentially triggered at syllable onset; the number of spikes within the burst is linearly correlated with the duration of the preceding pause. Integrating the number of spikes over a fixed time window therefore leads to a total spike count that reflects the species-characteristic syllable-to-pause ratio while being invariant to playing back the call faster or slower. Such a time-scale invariant recognition is essential under natural conditions, as grasshoppers do not thermoregulate - the call of a sender sitting in the shade will be slower than that of a grasshopper in the sun. Our results show that time-scale invariant stimulus recognition can be implemented at the single-cell level without directly calculating the ratio between pulse and interpulse durations.
  • F. Creutzig, J. Benda, S. Wohlgemuth, A. Stumpner, B. Ronacher, A. V. M. Herz (2010)
    Timescale-invariant pattern recognition by feed-forward inhibition and parallel signal processing
    Neural Computation 22(6): 1493-1510. Abstract:


    The timescale-invariant recognition of temporal stimulus sequences is vital for many species and poses a challenge for their sensory systems. Here we present a simple mechanistic model to address this computational task, based on recent observations in insects that use rhythmic acoustic communication signals for mate finding. In the model framework, feedforward inhibition leads to burst-like response patterns in one neuron of the circuit. Integrating these responses over a fixed time window by a readout neuron creates a timescale-invariant stimulus representation. Only two additional processing channels, each with a feature detector and a readout neuron, plus one final coincidence detector for all three parallel signal streams, are needed to account for the behavioral data. In contrast to previous solutions to the general time-warp problem, no time delay lines or sophisticated neural architectures are required. Our results suggest a new computational role for feedforward inhibition and underscore the power of parallel signal processing.
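
As an illustration of the mechanism described in the last two abstracts, here is a minimal rate-model sketch; it is not the published model, and the time constants and amplitudes are assumptions chosen for the example. A unit receives the stimulus envelope as excitation and a slow, low-pass-filtered copy of the same envelope as feedforward inhibition, so syllable onsets following longer pauses evoke larger transient responses.

    import numpy as np

    DT_MS, TAU_INH_MS = 1.0, 150.0       # time step and inhibitory time constant (assumed)

    def integrated_onset_response(syllable_ms, pause_ms, n_cycles=5):
        """Integrated rectified 'excitation minus slow inhibition' response."""
        cycle = np.r_[np.ones(int(syllable_ms / DT_MS)),   # syllable: envelope on
                      np.zeros(int(pause_ms / DT_MS))]     # pause: envelope off
        stim = np.tile(cycle, n_cycles)
        inh = np.zeros_like(stim)
        for t in range(1, len(stim)):                      # leaky integration (low-pass filter)
            inh[t] = inh[t - 1] + DT_MS / TAU_INH_MS * (stim[t] - inh[t - 1])
        rate = np.clip(stim - inh, 0.0, None)              # feedforward inhibition, rectified
        return rate.sum() * DT_MS

    # Longer pauses let the inhibition decay further, so the onset response grows
    # roughly with the duration of the preceding pause.
    for pause_ms in (20.0, 60.0, 120.0):
        print(pause_ms, round(integrated_onset_response(80.0, pause_ms), 1))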


Book contributions

  • F. Creutzig (2009)
    From predictive coding to percept - a computational approach
    In: "Geist und Wissenschaft. Interdisziplinäre Perspektiven" (Hrsg. Ansgar Lyssy), Peter-Lang-Verlag.

    Abstract:

    Consciousness remains a difficult problem for philosophers and neuroscientists alike and, what is more, there is no agreement on the definition of the problem. It is most difficult to imagine why the physical world generates qualia, the subjectiveness of personal experience. Theoretical neuroscientists argue that answering more tractable problems first may also help to approach such a hard question. Relatively easy problems relate to the substance of consciousness: What are the neural correlates of consciousness and where are they located? What could be the function of consciousness? Analyzing these problems, it is reasonable to investigate the neuronal basis of brain function and see how far we can get. Hence, the hope is that we unravel hidden concepts that give us the language to tackle even the question of why phenomenological states feel like anything at all. Indeed, substantial progress has been made on these questions in the preceding decades (Koch, 2004). Here, we take a basic approach and try to comment on the relation between substance and function of consciousness. From an evolutionary point of view, the nervous system has evolved to relate environmental stimuli to the organism's action. Or, in other words, it has evolved to give an interpretation of sensory input such that this interpretation can guide motor output. There is no reason to believe that consciousness is not part of this framework. But what should be the precise function of consciousness? Could a zombie do the same job? In this text, we try to transfer the concept of information into a wider functional view of consciousness and emphasize the dimension of time. First, we review some evidence that sensory processing may follow the guiding principle of filtering temporal regularities. Second, we rephrase experimental results showing that neural correlates of consciousness are localized in cortical brain regions. Third, we try to distinguish between unconscious and conscious sensory-motor interactions. Fourth, we integrate some speculations on the evolutionary function of having perceptions and take a viewpoint that regards perceptions as a bottleneck between sensory processing and complex motor response functions, emphasizing the role of time.


Posters


Conference talks

  • Time-Warp Invariant Stimulus Decoding in an Insect Auditory System?
    Featured talk, CNS 2006, Edinburgh

  • Projecting the Past into the Future: the State Space of Dynamical Systems as an Information Bottleneck
    Workshop: Revealing Hidden Elements of Dynamical Systems, NIPS 2006, Vancouver


PhD Thesis


Nerdy: Erdős number 4