> Entropy

Right, without the 'h', of course.

> Entropy of an information stream relates not to "unnecessary slime", but on the contrary to the information density (if the stream is intelligible).

Yeah, got my lines crossed there - I think I get you. The amount of randomness determines the amount of information, right? So, paradoxically, the 'messier' a signal looks, the more information it can carry. Entropy measures this randomness, i.e. the signal's capacity to carry information - which seems really odd, because in thermodynamics entropy makes you think of things spreading out into a uniform, featureless state.
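
That 'messier means more information' idea can be made concrete with Shannon entropy over a string's character frequencies - here's a quick sketch in Python (the function name and example strings are just my own):

```python
from collections import Counter
from math import log2

def shannon_entropy(s: str) -> float:
    """Shannon entropy, in bits per symbol, of a string's character distribution."""
    counts = Counter(s)
    n = len(s)
    return sum(-(c / n) * log2(c / n) for c in counts.values())

# A perfectly repetitive ("unmessy") signal carries no information per symbol:
print(shannon_entropy("aaaaaaaa"))  # 0.0

# A maximally varied signal over 8 distinct symbols hits the cap of 3 bits/symbol:
print(shannon_entropy("abcdefgh"))  # 3.0
```

So the repetitive stream is totally predictable - each new symbol tells you nothing - while the varied stream keeps surprising you, which is exactly what "carrying information" means here.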