Integration Measures
Let’s define precise measures of integration that will play a central role in the phenomenological analysis.
The first is transfer entropy, which captures directed causal influence between components. The transfer entropy from process $X$ to process $Y$ measures the information that $X$’s past provides about the future of $Y$ beyond what $Y$’s own past provides:

$$\mathrm{TE}_{X \to Y} = I\big(Y_t \,;\, X_{<t} \,\big|\, Y_{<t}\big)$$
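For intuition, transfer entropy can be estimated directly from data. Below is a minimal sketch for binary time series with history length 1, using plug-in probability estimates; the function name, the history length, and the driven-process example are illustrative assumptions, not details from the text.

```python
import numpy as np

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{X->Y} for binary series, history length 1.

    TE = I(Y_t ; X_{t-1} | Y_{t-1}): information X's past provides about
    Y's next state beyond Y's own past. Returned in bits.
    """
    # Estimate the joint distribution of (y_prev, x_prev, y_next) triples.
    joint = np.zeros((2, 2, 2))
    for yp, xp, yn in np.stack([y[:-1], x[:-1], y[1:]], axis=1):
        joint[yp, xp, yn] += 1
    joint /= joint.sum()

    p_yp_xp = joint.sum(axis=2)    # p(y_prev, x_prev)
    p_yp = joint.sum(axis=(1, 2))  # p(y_prev)
    p_yp_yn = joint.sum(axis=1)    # p(y_prev, y_next)

    te = 0.0
    for yp in range(2):
        for xp in range(2):
            for yn in range(2):
                p = joint[yp, xp, yn]
                if p > 0:
                    # p * log[ p(y_next | y_prev, x_prev) / p(y_next | y_prev) ]
                    te += p * np.log2(p * p_yp[yp]
                                      / (p_yp_xp[yp, xp] * p_yp_yn[yp, yn]))
    return te

# X drives Y with one step of lag: TE_{X->Y} is large, TE_{Y->X} near zero.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10000)
y = np.roll(x, 1)  # y_t = x_{t-1}
```

With this toy coupling, `transfer_entropy(x, y)` comes out near 1 bit (Y’s next state is fully determined by X’s past) while `transfer_entropy(y, x)` stays near zero, matching the directedness of the measure.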
The deepest measure is integrated information ($\Phi$). Following IIT, the integrated information of a system $S$ in state $s$ is the extent to which the system’s causal structure exceeds the sum of its parts:

$$\Phi(s) = \min_{\{M^1, M^2\}} D\Big( p(S_{t+1} \mid S_t = s) \,\Big\|\, p(M^1_{t+1} \mid M^1_t = s^1)\, p(M^2_{t+1} \mid M^2_t = s^2) \Big)$$

where the minimum is over all bipartitions $\{M^1, M^2\}$ of the system, and $D$ is an appropriate divergence (typically the Earth Mover’s distance in IIT 4.0).
In practice, computing $\Phi$ exactly is intractable. Three proxies make it operational:
- Transfer entropy density—the average transfer entropy across all directed pairs:
$$\overline{\mathrm{TE}} = \frac{1}{N(N-1)} \sum_{i \neq j} \mathrm{TE}_{X_i \to X_j}$$
- Partition prediction loss—the cost of factoring the model, i.e. the increase in prediction loss when each part may condition only on its own past:
$$\Delta L = L_{\text{partitioned}} - L_{\text{full}}$$
- Synergy—the information that components provide jointly about a target (e.g. the system’s future state $S_{t+1}$) beyond their individual contributions:
$$\mathrm{Syn} = I(X_1, \dots, X_N \,;\, S_{t+1}) - \sum_{i=1}^{N} I(X_i \,;\, S_{t+1})$$
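The partition prediction loss, in particular, can be illustrated on a toy coupled linear system. In this sketch the dynamics matrix, the two-block bipartition, and the use of least-squares one-step predictors are all illustrative assumptions, not details from the text.

```python
import numpy as np

# Toy linear dynamical system: a chain of 4 coupled variables.
rng = np.random.default_rng(1)
T, n = 5000, 4
A = np.array([[0.5, 0.3, 0.0, 0.0],
              [0.3, 0.5, 0.3, 0.0],
              [0.0, 0.3, 0.5, 0.3],
              [0.0, 0.0, 0.3, 0.5]])
X = np.zeros((T, n))
for t in range(1, T):
    X[t] = A @ X[t - 1] + 0.1 * rng.standard_normal(n)

def prediction_mse(past, future):
    # Least-squares linear one-step predictor: future ~ past @ W.
    W, *_ = np.linalg.lstsq(past, future, rcond=None)
    return np.mean((future - past @ W) ** 2)

past, future = X[:-1], X[1:]
full_loss = prediction_mse(past, future)
# Factor across the bipartition {0,1} | {2,3}: each half predicts its own
# future from its own past only, losing the cross-partition coupling.
partitioned_loss = (prediction_mse(past[:, :2], future[:, :2])
                    + prediction_mse(past[:, 2:], future[:, 2:])) / 2
delta_L = partitioned_loss - full_loss  # > 0 iff the halves interact
```

Because the chain couples the two halves (through the 0.3 terms linking variables 1 and 2), the factored predictors cannot match the full model, and `delta_L` comes out strictly positive; for two genuinely independent halves it would be near zero.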
A complementary measure captures the system’s representational breadth rather than its causal coupling. The effective rank of a system with state covariance matrix $\Sigma$ measures how many dimensions it actually uses:

$$\mathrm{erank}(\Sigma) = \exp\!\Big(-\sum_{i=1}^{n} p_i \log p_i\Big), \qquad p_i = \frac{\lambda_i}{\sum_{j=1}^{n} \lambda_j}$$

where $\lambda_i$ are the eigenvalues of $\Sigma$. This is bounded by $1 \le \mathrm{erank}(\Sigma) \le n$, with $\mathrm{erank} = 1$ when all variance is in one dimension (maximally concentrated) and $\mathrm{erank} = n$ when variance is uniformly distributed across all active dimensions.
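Unlike $\Phi$, the effective rank is cheap to compute: it is the exponential of the Shannon entropy of the normalized eigenvalue spectrum. A minimal numpy sketch (the function name is illustrative):

```python
import numpy as np

def effective_rank(cov):
    """exp of the Shannon entropy of the normalized eigenvalue spectrum."""
    eig = np.linalg.eigvalsh(cov)   # eigenvalues of a symmetric matrix
    p = eig / eig.sum()             # normalize to a probability distribution
    p = p[p > 0]                    # drop zero modes (0 * log 0 := 0)
    return np.exp(-np.sum(p * np.log(p)))

# All variance in one dimension -> effective rank 1 (maximally concentrated).
assert np.isclose(effective_rank(np.diag([1.0, 0.0, 0.0])), 1.0)
# Variance uniform across n dimensions -> effective rank n.
assert np.isclose(effective_rank(np.eye(3)), 3.0)
```

Intermediate spectra interpolate smoothly between these extremes, which is what makes the measure a graded notion of "dimensions actually used" rather than a hard rank count.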