[Visualization panels: Signal View / Raw Signal + Noise, Entropy Distribution, Probability Distribution & Information Content, Entropy Over Time]

[Live readouts: Raw Entropy (bits/symbol), Constrained Entropy (bits/symbol), Entropy Reduction (bits/symbol), Information Gain (%)]

[Entropy Meter: scale from 0 bits (certain) through 4 bits to 8 bits (uncertain)]

[Simulation Controls: signal mode (Pure Signal / Noisy / Random); constraint level (No Constraints / Moderate / Strong)]

Information Entropy Explained

What is Entropy?

Entropy measures the uncertainty, or randomness, in a signal: H(X) = -Σ p(x) log2 p(x), the average number of bits needed to encode each symbol (hence the bits/symbol readouts above). Higher entropy means more uncertainty and more information needed to describe the signal.
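A minimal sketch of that calculation in TypeScript (the shannonEntropy helper is illustrative, not taken from this demo's source):

```typescript
// Shannon entropy of a symbol sequence, in bits per symbol:
// H(X) = -sum over distinct symbols x of p(x) * log2(p(x)).
function shannonEntropy<T>(symbols: T[]): number {
  const counts = new Map<T, number>();
  for (const s of symbols) counts.set(s, (counts.get(s) ?? 0) + 1);

  let h = 0;
  for (const count of counts.values()) {
    const p = count / symbols.length;
    h -= p * Math.log2(p); // p > 0 for every observed symbol
  }
  return h;
}

// A fair coin carries 1 bit/symbol; a constant stream carries 0.
console.log(shannonEntropy(["H", "T", "H", "T"])); // 1
console.log(shannonEntropy(["A", "A", "A", "A"])); // 0
```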

Signal vs Noise

Signal: the meaningful information you want to transmit. Noise: random interference that raises measured entropy without carrying any useful information.
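To see the effect numerically, here is a sketch that reuses shannonEntropy from above; the quantization step and noise amplitude are assumptions chosen for illustration, not values from the demo:

```typescript
// Quantize samples to a coarse symbol alphabet so entropy is well defined.
const quantize = (x: number): number => Math.round(x * 4);

const N = 1024;
const pure = Array.from({ length: N }, (_, i) => quantize(Math.sin(i / 10)));
const noisy = Array.from({ length: N }, (_, i) =>
  quantize(Math.sin(i / 10) + (Math.random() - 0.5)) // additive uniform noise
);

// The sinusoid concentrates probability on a few symbols; noise flattens
// the distribution, so the noisy signal reads higher on the entropy meter.
console.log(shannonEntropy(pure));  // lower
console.log(shannonEntropy(noisy)); // higher
```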

Constraints Reduce Entropy

Constraint theory reduces uncertainty by enforcing geometric rules (such as Pythagorean snapping): continuous signal values are forced into a small set of discrete states, and a smaller effective alphabet means fewer possible outcomes, hence lower entropy.
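The demo's exact snapping rule isn't shown here, so the sketch below assumes a simple stand-in: snap each sample to the nearest value in a small allowed set (loosely echoing a 3-4-5 Pythagorean triple) and compare entropy before and after, mirroring the Raw Entropy, Constrained Entropy, and Entropy Reduction readouts:

```typescript
// Hypothetical constraint: snap each sample to the nearest allowed state.
// The allowed set (sides of a 3-4-5 triple, plus zero) is illustrative only.
const allowed = [0, 3, 4, 5];
const snap = (x: number): number =>
  allowed.reduce((best, a) => (Math.abs(a - x) < Math.abs(best - x) ? a : best));

const raw = Array.from({ length: 1024 }, () => Math.random() * 6);

const rawEntropy = shannonEntropy(raw.map((x) => Math.round(x * 10))); // fine alphabet
const constrainedEntropy = shannonEntropy(raw.map(snap));              // 4 states at most

// Fewer reachable states => smaller alphabet => lower entropy.
console.log(rawEntropy - constrainedEntropy); // "Entropy Reduction" in bits/symbol
```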

FPS Paradigm Connection

Each agent holds only partial information (low local entropy) rather than global knowledge; spatial constraints naturally limit what each agent perceives.
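As a rough, entirely assumed illustration of that local-versus-global gap (the toy world below is not from the demo): an agent sampling only a small spatial window of a correlated world sees far fewer distinct states than exist globally.

```typescript
// Assumed toy world: a 1-D field of terrain symbols with spatial correlation,
// so nearby cells tend to look alike.
const WORLD_SIZE = 4096;
const terrain: number[] = [];
let current = 0;
for (let i = 0; i < WORLD_SIZE; i++) {
  if (Math.random() < 0.01) current = Math.floor(Math.random() * 16); // rare change
  terrain.push(current);
}

// Globally many terrain types occur; a 64-cell local view usually spans
// only one or two of them, so the agent's local entropy is much lower.
const agentPos = 2048;
const localView = terrain.slice(agentPos - 32, agentPos + 32);

console.log(shannonEntropy(terrain));   // several bits: many types appear
console.log(shannonEntropy(localView)); // typically close to 0
```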