ASSIS, J. M.; http://lattes.cnpq.br/9028062830015530; ASSIS, Juliana Martins de.
Abstract:
The aim of this dissertation is to clarify, by means of information theory, how information is processed in neurons, cells fundamental to animal survival and evolution. To address this problem, we used the Hodgkin-Huxley model, one of the most accurate neuronal models, together with two information measures: the Kullback-Leibler distance and mutual information. We also modeled the neuron as a communication channel whose input is a pair of parameters, the mean and standard deviation of a synaptic current. The output of this channel may be the interspike interval or the number of spikes in a fixed time window (the firing rate). The Kullback-Leibler distance quantifies how well the neuron discriminates between different inputs. Estimates of this distance revealed that the neuron better discriminates high-intensity inputs when the interspike interval is used as output, and low-intensity inputs when the firing rate is used as output. Moreover, mutual information measured the dependency between the neuronal input and the neuronal activity output, revealing a complementary action between the interspike interval and the spike rate as outputs.
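The two measures named above can be sketched for discrete distributions. The sketch below is illustrative only: the ISI histograms and the joint distribution are hypothetical placeholders, not data or results from the dissertation.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler distance D(p||q), in bits, between discrete distributions."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_information(joint):
    """Mutual information, in bits, from a joint distribution given as a 2-D list."""
    px = [sum(row) for row in joint]        # marginal distribution of the input
    py = [sum(col) for col in zip(*joint)]  # marginal distribution of the output
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# Hypothetical interspike-interval histograms for two synaptic-current intensities
p = [0.1, 0.4, 0.3, 0.2]
q = [0.25, 0.25, 0.25, 0.25]
print(f"D(p||q) = {kl_divergence(p, q):.3f} bits")

# Hypothetical joint distribution of (input intensity, output symbol)
joint = [[0.3, 0.1],
         [0.1, 0.5]]
print(f"I(X;Y) = {mutual_information(joint):.3f} bits")
```

In this framing, a larger D(p||q) between the output distributions produced by two inputs means the neuron discriminates those inputs more easily, and I(X;Y) summarizes the overall input-output dependency of the channel.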