Entropy measures this randomness, or propensity to carry information, which seems odd at first because in thermodynamics one tends to picture an increasingly dispersed situation. Still not quite there: the first part is fine, but in thermodynamics as well, the higher the number of distinct states a system can be in, the higher the entropy (at a given temperature). So there is no real contradiction between the two fields of application.
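To make the parallel concrete, here is a minimal sketch in Python (the helper name and the state counts are illustrative choices, not from the discussion above): for a uniform distribution over N states, the Shannon entropy works out to log2(N) bits, so it grows with the number of available states, just as the thermodynamic (Boltzmann) entropy S = k_B ln W grows with the number of microstates W.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# More distinct, equally likely states -> higher entropy,
# mirroring Boltzmann's S = k_B * ln(W) in thermodynamics.
for n_states in (2, 4, 8, 16):
    uniform = [1.0 / n_states] * n_states
    print(f"{n_states:2d} states: H = {shannon_entropy(uniform):.2f} bits")
```

Running this prints 1, 2, 3, and 4 bits respectively: doubling the number of accessible states adds one bit of entropy in both pictures.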