# Arbeitsgruppen-Seminar

Tuesday, 21.07.2015, 17:15 c.t.

## Maximum Pattern Entropy

*by Dr. Rob Shaw*

Contact person: Theo Geisel

### Location

Ludwig Prandtl lecture hall

### Abstract

Can we define the entropy of a pattern of extended objects, for example, coins randomly placed on a table top?
At first the number of possible configurations will grow as objects are added to a fixed domain. But as the situation
becomes more crowded, the positions of objects become constrained, and the number of possible patterns
decreases. At a special density, the entropy is a maximum. For objects on a fixed lattice, the number of configurations
is countable, and the entropy is well-defined. An example is the monomer-dimer gas, random arrangements
of dominos on a checkerboard. We show how to extend the definition of this "pattern entropy" to the continuous case.
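The one-dimensional analogue of the checkerboard case can be enumerated exactly. A minimal sketch (in Python, with a hypothetical strip length N = 100): the number of ways to place k non-overlapping dimers on a 1 × N strip is C(N − k, k), and the resulting pattern entropy per site peaks at an intermediate dimer density, neither empty nor fully packed.

```python
from math import comb, log

def dimer_configs(N, k):
    """Ways to place k non-overlapping dominoes (dimers) on a 1 x N strip,
    with the remaining cells occupied by monomers: C(N - k, k)."""
    return comb(N - k, k) if 2 * k <= N else 0

N = 100  # hypothetical strip length
counts = [(k, dimer_configs(N, k)) for k in range(N // 2 + 1)]
k_star, c_star = max(counts, key=lambda t: t[1])  # dimer count with the most patterns
s_star = log(c_star) / N  # pattern entropy per site at the peak
```

Summing the counts over k recovers the Fibonacci numbers, the standard result for the 1 × N monomer-dimer strip.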
This number is proportional to the average amount of information required to specify, to some resolution,
a particular pattern out of the ensemble of possibilities.
An example application would be to use this measure to estimate information propagation in a simple neuronal spike model.
If we fix the refractory period of a neuron, and action potential spikes otherwise occur at random on average, what is the optimal
density of spikes for maximum information transfer down the axon? Too low a firing rate transfers little information, while at too high
a firing rate the spike train becomes nearly periodic, again with low information transfer. This situation is analogous
to the classical Tonks gas, a collection of hard rods moving in one dimension. The optimum spike rate for the neuron model
corresponds to the density giving the peak pattern entropy in the Tonks gas.
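The lattice version of this trade-off is easy to check numerically. A minimal sketch, with hypothetical discretized parameters (N = 200 time bins, a refractory period of r = 5 bins): the number of spike trains with k spikes whose consecutive spikes are at least r bins apart is C(N − (k − 1)(r − 1), k), and the entropy per bin peaks at an intermediate firing rate.

```python
from math import comb, log

def log_pattern_count(N, k, r):
    """Log of the number of spike trains with k spikes in N time bins,
    consecutive spikes at least r bins apart: log C(N - (k-1)(r-1), k)."""
    if k == 0:
        return 0.0
    slots = N - (k - 1) * (r - 1)
    return log(comb(slots, k)) if slots >= k else float("-inf")

N, r = 200, 5  # hypothetical: 200 time bins, refractory period of 5 bins
entropies = [(k / N, log_pattern_count(N, k, r) / N) for k in range(N + 1)]
best_density, best_entropy = max(entropies, key=lambda t: t[1])
# the peak lies strictly between zero firing and the densest, nearly periodic train
```

In the continuum limit this discrete count goes over to the Tonks-gas pattern entropy, with the optimum firing rate at the density of peak entropy.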