
This dissipation depends explicitly on the logical architecture of the information ratchet's memory, and it leads to general principles of computational design that help us navigate the vast array of possible implementations of the same computation. Notably, when a ratchet absorbs a patterned process as input and transforms it into random, uncorrelated outputs, the modularity dissipation is minimized, and the work production maximized, if and only if the ratchet's memory contains the predictive causal states of the input process. To harvest energy from a process efficiently, then, the ratchet must store at least as much information as the process's predictive statistical complexity. By contrast, to efficiently generate a structured pattern from random inputs, the ratchet's memory must contain the pattern's retrodictive causal states, so the information it stores must match or exceed the retrodictive statistical complexity. Thus, natural definitions of complexity emerge from thermodynamic principles of entropy minimization.
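To make the memory requirement concrete, the sketch below computes the predictive statistical complexity (commonly written C_mu, the Shannon entropy of the stationary distribution over predictive causal states) for one illustrative example not drawn from the text above: the Golden Mean Process, whose binary outputs never contain two consecutive 1s. Its epsilon-machine has two causal states, and any ratchet that efficiently harvests work from this input would need at least C_mu bits of memory; the variable names here are incidental.

```python
import numpy as np

# Epsilon-machine of the Golden Mean Process (illustrative assumption):
# two causal states A, B with labeled transition matrices
# T[x][i, j] = Pr(emit x and go to state j | in state i).
# From A: emit 0 w.p. 1/2 and stay in A, or emit 1 w.p. 1/2 and go to B.
# From B: emit 0 w.p. 1 and return to A (so 1s never repeat).
T = {
    0: np.array([[0.5, 0.0],
                 [1.0, 0.0]]),
    1: np.array([[0.0, 0.5],
                 [0.0, 0.0]]),
}

# Stationary distribution over causal states: the left eigenvector of the
# state-to-state transition matrix T0 + T1 with eigenvalue 1.
M = T[0] + T[1]
eigvals, eigvecs = np.linalg.eig(M.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi /= pi.sum()

# Predictive statistical complexity: Shannon entropy (in bits) of the
# stationary causal-state distribution, C_mu = H[pi].
C_mu = -np.sum(pi * np.log2(pi))
print(f"stationary causal-state distribution: {pi}")  # [2/3, 1/3]
print(f"C_mu = {C_mu:.4f} bits")                      # ~0.9183 bits
```

For this process the stationary distribution is (2/3, 1/3), giving C_mu of roughly 0.918 bits, so a ratchet with a single bit of memory suffices here, whereas a memoryless ratchet cannot extract the full work available in the input's correlations.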