Yeah, I've been thinking about this topic a little lately with regard to semantic redundancy. Claude Shannon, the founder of information theory, was one of the first to point out the extraordinary redundancy in ordinary communication. He estimated that ordinary English is more than 50% redundant - which is apparently necessary in a world of mumbling people and typos. People use their shared knowledge to fill in the blanks, and depending on how much shared knowledge you have (hard to quantify), you can theoretically leave out many words, like the one at the end of this. So each new portion of language is partly constrained by what came before it and by its context. I guess our brains slowly come to expect these rules to be kept, which is why we get such a jolt when we see something that doesn't fit the pattern. If you read a poem with odd word order, you'd probably not think much of it, but in an e-mail you'd get really confused and reread it over and over to work out what the person was trying to say.
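
You can actually get a rough number on this. Here's a minimal Python sketch that estimates redundancy from single-letter frequencies alone - note this is my own toy example, not Shannon's method (he used much richer models and human guessing experiments), and a letter-frequency model ignores context entirely, so on real text it understates the true redundancy he was talking about:

```python
from collections import Counter
from math import log2

def redundancy(text: str) -> float:
    """Rough redundancy estimate from single-letter frequencies.

    Redundancy = 1 - H / H_max, where H is the entropy of the
    observed letter distribution and H_max is the entropy if all
    26 letters were equally likely. Ignores inter-letter
    dependencies, so it understates English's true redundancy.
    """
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    # Shannon entropy of the observed distribution, in bits per letter
    h = -sum((n / total) * log2(n / total) for n in counts.values())
    h_max = log2(26)
    return 1 - h / h_max

sample = "people use their shared knowledge to fill in the blanks"
print(f"Estimated redundancy: {redundancy(sample):.0%}")
```

Even this crude model shows the distribution is far from uniform; once you account for context (letters predicting the next letter, words predicting the next word), the redundancy climbs past that 50% mark.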

Communications programmers have a deep understanding of this subject at the bits 'n' bytes level, because they design the encodings and compression schemes that save space in transmission or storage. It's interesting to note that one of the least efficient protocols around, TCP/IP, has become the de facto standard in computing - for small packets, most of what goes over the wire is headers, checksums, and other control data rather than the message itself. So too our DNA holds a lot of apparent "junk". There always seems to be a lot of more or less unnecessary cohesive slime hanging around in communication, whether it's digital or not, hey. And this ties straight back to Shannon's notion of 'entropy': the more redundant and predictable a source is, the lower its entropy - the same term physicists use for disorder in the physical world.
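
To put a back-of-the-envelope figure on the TCP/IP point, here's a small Python sketch. The assumptions are mine: minimal IPv4 and TCP headers (20 bytes each, no options) plus roughly 38 bytes of Ethernet framing, and overhead_fraction is just a name I made up:

```python
def overhead_fraction(payload_bytes: int,
                      ip_header: int = 20,    # minimal IPv4 header
                      tcp_header: int = 20,   # minimal TCP header, no options
                      eth_framing: int = 38   # preamble, MACs, type, FCS, gap
                      ) -> float:
    """Fraction of each frame that is protocol overhead rather than
    application payload. Header sizes are the standard minimums;
    real traffic often carries options that add more."""
    overhead = ip_header + tcp_header + eth_framing
    return overhead / (overhead + payload_bytes)

# A tiny packet (say, a single keystroke) vs a full-size one
for payload in (1, 100, 1460):
    print(f"{payload:5d}-byte payload -> "
          f"{overhead_fraction(payload):.0%} overhead")
```

A one-byte payload comes out around 99% overhead, while a full 1460-byte payload is only about 5% - so the "mostly overhead" picture really holds for chatty, small-packet traffic, which is exactly the redundant, error-tolerant style of communication being described.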