Entropy (noun, “EN-troh-pee”)
Entropy is a measure of the randomness of particles and energy in a system. High entropy means high randomness. Low entropy means low randomness.
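Physicists make "randomness" precise by counting arrangements. The standard statement is Boltzmann's entropy formula (the symbols follow the usual physics convention; they are not used elsewhere in this entry):

    S = k_B ln W

Here S is the entropy, k_B is Boltzmann's constant, and W is the number of microscopic arrangements (microstates) that look the same from the outside. The more ways a system can be arranged, the larger W, and the higher the entropy.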
Imagine building a castle from blocks. The blocks start out in a disorganized pile. The pile has high randomness because there are so many ways you could arrange the blocks and still have just a pile. This is a high-entropy state.
Over time, you pick blocks from the pile and arrange them into turrets and walls. A castle requires a very specific setup: you can arrange the blocks in only a few ways to make one. So the castle is a less random, lower-entropy state.
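To see the counting in action, here is a minimal Python sketch under toy assumptions (20 distinct blocks, and a guess that only 6 orderings count as "the castle"; both numbers are made up for illustration). It compares the two states by counting arrangements and taking the logarithm, as in Boltzmann's formula above with k_B set to 1:

    import math

    # Toy model: any ordering of 20 distinct blocks still counts as "a pile",
    # but only a handful of orderings count as "the castle".
    # Both numbers are assumptions chosen for illustration.
    pile_arrangements = math.factorial(20)  # 20! possible orderings
    castle_arrangements = 6                 # assumed number of valid castle layouts

    # Boltzmann-style entropy with k_B = 1: S = ln(W),
    # where W is the number of arrangements.
    pile_entropy = math.log(pile_arrangements)
    castle_entropy = math.log(castle_arrangements)

    print(f"pile:   W = {pile_arrangements:e}  S = {pile_entropy:.1f}")
    print(f"castle: W = {castle_arrangements}  S = {castle_entropy:.1f}")

Running this prints an entropy of roughly 42.3 for the pile versus about 1.8 for the castle, matching the intuition above: many possible arrangements means high entropy, few arrangements means low entropy.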
Gradually, the messy, high-entropy pile becomes an organized, low-entropy structure. You have to spend time and energy to do this. Maintaining that order takes energy, too; left alone, the castle tends to drift back toward a messier, higher-entropy state.

