ASSIS, J. M.; http://lattes.cnpq.br/9028062830015530; ASSIS, Juliana Martins de.
Abstract:
Many modern systems depend on their components, which may relate to each
other through dependency or causality relations. The systems considered in this work
are those whose components can be evaluated or measured. For example: financial
systems, whose components may be stock markets; medical/biological systems, whose
components may be respiration, blood pressure, and heart rate; neurophysiological systems,
whose components may be electroencephalogram or functional magnetic resonance
imaging from different parts of the brain; among many other systems that allow mathematical
treatment. Information theory has been presented as an efficient means of quantifying
the relations in these systems, providing useful concepts and measures such as
mutual information, directed information, and transfer entropy. A fundamental question
when dealing with real systems concerns the difficulty of modeling the true underlying
probability densities of the involved variables. The definitions of mutual information,
directed information, and transfer entropy rely on these densities. In this context, the
present work investigates and contributes estimation methods for measuring the
relations among variables when studying such systems, giving special attention
to neuronal systems.
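Since the abstract centers on estimating information-theoretic measures from data rather than from known densities, a minimal illustrative sketch may help: a plug-in (histogram) estimator of mutual information between two continuous variables. This is a generic textbook estimator, not the thesis's own method; the bin count, sample size, and the Gaussian test data below are illustrative assumptions.

```python
import numpy as np

def mutual_information_hist(x, y, bins=16):
    """Plug-in estimate of I(X; Y) in nats from a 2-D histogram.

    The true joint density is unknown, so it is replaced by the
    empirical histogram -- the core difficulty the abstract mentions.
    """
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                      # empirical joint probability table
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0                          # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Synthetic check on correlated Gaussians, where I(X; Y) has a closed form.
rng = np.random.default_rng(0)
n = 50_000
x = rng.standard_normal(n)
y = 0.8 * x + 0.6 * rng.standard_normal(n)   # correlation rho = 0.8
mi_hat = mutual_information_hist(x, y)
mi_true = -0.5 * np.log(1 - 0.8 ** 2)        # Gaussian closed form, ~0.51 nats
```

The plug-in estimator is biased (binning discards information, while finite samples inflate the estimate), which is one reason more refined estimators, such as k-nearest-neighbor methods, are studied in this setting.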