In our discussion of coding on the previous slide, we were referring to first-order codes.

In this case, each symbol is encoded one at a time, independently of the rest of the symbols.

Alternatively, we can form block codes.

In this case, capital N symbols of the original source are combined together to generate a single new symbol of a new, combined source, S of N.

Since all combinations of capital N original source symbols are considered, the alphabet of this new source S of N grows: if the original alphabet has K symbols, the new alphabet has K to the power N symbols.

It can be shown that in this case the entropy of this new source, also called the extended source, is N times the entropy of the original source.
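This relationship is easy to verify numerically. The sketch below, using a small hypothetical three-symbol memoryless source (the alphabet and probabilities are made up for illustration), forms all blocks of N symbols and checks that the entropy of the extended source equals N times the entropy of the original.

```python
import itertools
import math

def entropy(probs):
    """Shannon entropy, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical memoryless source with a three-symbol alphabet.
source = {"a": 0.5, "b": 0.3, "c": 0.2}

N = 3  # block length
# Probability of each block of N independent symbols is the product
# of the individual symbol probabilities.
block_probs = [
    math.prod(source[s] for s in block)
    for block in itertools.product(source, repeat=N)
]

H = entropy(source.values())       # entropy of the original source
H_N = entropy(block_probs)         # entropy of the extended source S^N

print(f"alphabet of S^N: {len(block_probs)} symbols")  # K^N = 3^3 = 27
print(f"H(S)   = {H:.4f} bits")
print(f"H(S^N) = {H_N:.4f} bits")
print(f"N*H(S) = {N * H:.4f} bits")
```

For a memoryless source the two printed quantities H(S^N) and N*H(S) coincide; if the source had memory, H(S^N) would be smaller.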

This is actually a mechanism to increase the efficiency of a

code, as we'll discuss later and also demonstrate through specific examples.
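To see the efficiency gain concretely, here is a small sketch using a skewed binary source (probabilities 0.9 and 0.1, chosen arbitrarily for illustration) and a standard binary Huffman construction: any first-order binary code needs 1 bit per symbol, but coding blocks of N symbols brings the rate closer to the entropy.

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code (standard construction)."""
    # Heap entries: (probability, tiebreaker, indices of merged symbols).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tie = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1  # every merge adds one bit of depth
        heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
        tie += 1
    return lengths

# Skewed binary source: entropy is well below 1 bit/symbol.
p = {"a": 0.9, "b": 0.1}
H = -sum(q * math.log2(q) for q in p.values())

avg_bits = []
for N in (1, 2, 3):
    blocks = list(itertools.product(p, repeat=N))
    probs = [math.prod(p[s] for s in b) for b in blocks]
    lengths = huffman_lengths(probs)
    # Average code length per ORIGINAL symbol.
    avg = sum(q * l for q, l in zip(probs, lengths)) / N
    avg_bits.append(avg)
    print(f"N={N}: {avg:.4f} bits/symbol  (entropy {H:.4f})")
```

The printed rate decreases with N while always staying at or above the entropy, which is exactly the efficiency improvement block coding buys.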

Finally, there are also non-block codes.

Arithmetic and Lempel-Ziv codes belong to this class.

We will briefly discuss these codes later in the class.

Coding and entropy of a source are connected through the celebrated source coding theorem, due to Shannon in his seminal 1948 work.
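For reference, the noiseless source coding theorem is usually stated as follows (written here in its standard textbook form, with L-bar denoting average codeword length):

```latex
% Any uniquely decodable binary code for a source S satisfies
\bar{L} \ge H(S),
% and codes (e.g. Huffman) exist achieving
H(S) \le \bar{L} < H(S) + 1.
% Applying this to blocks of N symbols and normalizing per
% original symbol gives
H(S) \le \frac{\bar{L}_N}{N} < H(S) + \frac{1}{N},
% so the per-symbol rate approaches the entropy as N grows.
```

The last inequality is precisely why the block-coding mechanism above increases efficiency: the overhead shrinks like 1/N.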