Learn the fundamentals of digital signal processing theory and discover the myriad ways DSP makes everyday life more productive and fun.

Course from École Polytechnique Fédérale de Lausanne

Digital Signal Processing

402 ratings

From the lesson

Module 6: Digital Communication Systems - Module 7: Image Processing

- Paolo Prandoni, Lecturer, School of Computer and Communication Sciences
- Martin Vetterli, Professor, School of Computer and Communication Sciences

So, we have proven that we can recover the baseband signal even when the original baseband signal is complex. The final design for the transmitter is the following: bitstream, scrambler, mapper, upsampler. We multiply the upsampled signal by e^(jω_c n) to obtain a complex passband signal, we take the real part, and finally we have the sequence of samples that we can send to the D/A converter and then over the channel.
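The transmitter chain just described can be sketched in a few lines of NumPy. This is a minimal illustration, not the lecture's code: the function name, the fact that the pulse-shaping lowpass filter is passed in as a tap array `h`, and the absence of the scrambler/mapper stages are all assumptions.

```python
import numpy as np

def transmit(symbols, K, omega_c, h):
    """Upsample, shape, modulate and take the real part of a symbol stream.

    symbols: complex baseband symbols (output of scrambler + mapper)
    K:       integer upsampling factor
    omega_c: carrier frequency in radians per sample
    h:       taps of the lowpass interpolation (pulse-shaping) filter
    """
    up = np.zeros(len(symbols) * K, dtype=complex)
    up[::K] = symbols                    # insert K-1 zeros between symbols
    b = np.convolve(up, h)               # interpolation filter -> baseband signal
    n = np.arange(len(b))
    c = b * np.exp(1j * omega_c * n)     # complex passband signal
    return np.real(c)                    # real samples for the D/A converter
```

With a trivial one-tap filter and a single symbol, the first output sample is simply the real part of that symbol, which makes the chain easy to sanity-check.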

The receiver will receive the analog signal over the channel, sample it at the proper sampling rate, and then demodulate it: the signal is split into two identical copies. The first copy is multiplied by cos(ω_c n), and the second copy is multiplied by sin(ω_c n). Both go through a lowpass filter, which is really the matched filter of the upsampling filter we used at the transmitter, and this gives us the real part and the imaginary part of the original baseband signal. We multiply the imaginary part by j and sum the two together, and now we have the baseband signal, which goes through a downsampler.

We keep one sample out of K, and this gives us an estimate of the symbol sequence created by the transmitter. The slicer will find the closest alphabet symbol and recover the chunk of bits associated with that symbol. And finally, the descrambler will undo the randomization and recover the original user data stream.
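A matching receiver sketch, under the same assumptions as the transmitter sketch above. One piece of bookkeeping that the spoken description glosses over: after the lowpass filter, the cosine branch carries half the real part of the baseband signal, while the sine branch carries minus half the imaginary part, so the sketch flips the sign of the quadrature branch and scales by two.

```python
import numpy as np

def receive(s, K, omega_c, h):
    """Demodulate a real passband signal back to downsampled baseband symbols.

    s:       real passband samples
    K:       downsampling factor (keep one sample out of K)
    omega_c: carrier frequency in radians per sample
    h:       taps of the lowpass (matched) filter
    """
    n = np.arange(len(s))
    i = np.convolve(s * np.cos(omega_c * n), h)   # ~  Re(b)/2 after lowpass
    q = np.convolve(s * np.sin(omega_c * n), h)   # ~ -Im(b)/2 after lowpass
    b = 2 * (i - 1j * q)                          # reassemble the baseband signal
    return b[::K]                                 # downsampler
```

For a constant symbol modulated at ω_c = π/2, a two-tap averager is enough of a lowpass filter to cancel the double-frequency term exactly, and the interior output samples recover the symbol.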

So let's see how we can put together everything we've learned so far and design a practical system to send data over the telephone channel.

Suppose that the bandwidth constraint for the telephone channel stipulates that we can only transmit data from 450 Hz to 2850 Hz. This gives us a usable bandwidth W of 2400 Hz, with a center frequency F_c of 1650 Hz.

Now, remember that in our all-digital paradigm we have to pick a sampling frequency that is at least twice the highest frequency, F_max; twice 2850 Hz gives us a sampling frequency of at least 5,700 Hz. But we are also going to use an upsampling factor, which is an integer, and the trick is to pick a sampling frequency that is an integer multiple of the bandwidth. So if we pick the multiple equal to three, we get a sampling frequency of 7,200 Hz, which of course satisfies the Nyquist criterion.

When we translate this back into the digital domain, we find that the modulation frequency is ω_c = 2π(1650/7200) ≈ 0.4583π.
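The numbers above are easy to verify with a few lines of arithmetic; the only design assumption is that the sampling frequency is chosen as three times the bandwidth, as in the lecture.

```python
f_lo, f_hi = 450.0, 2850.0     # telephone channel band edges in Hz
W = f_hi - f_lo                # usable bandwidth: 2400 Hz
Fc = (f_lo + f_hi) / 2         # center frequency: 1650 Hz
Fs = 3 * W                     # integer multiple of W: 7200 Hz
assert Fs >= 2 * f_hi          # Nyquist: at least twice Fmax = 5700 Hz
omega_c = 2 * Fc / Fs          # carrier frequency, in units of pi rad/sample
```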

Now let's tackle the power constraint and assume that the telephone line has a maximum SNR of 22 dB. You have to pick a probability of error you can live with; let's say that we pick ten to the minus six.

If we use QAM, we can use the formula that we saw in the previous module to find the size of the alphabet, or alternatively the number of bits per symbol that we can send; the formula is this one. When we plug in the values for the probability of error and the signal-to-noise ratio, we find that we can send at least four bits per symbol, and with four bits per symbol we will have a constellation of 16 points, which would look like this.
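To see how the alphabet size falls out of the constraints, here is a sketch using a standard error-probability approximation for Gray-coded square M-QAM; the exact formula in the lecture may differ in its constants, so treat the function below as an illustration rather than a transcription of the slide.

```python
from math import erfc, sqrt

def qam_symbol_error(M, snr_db):
    # Standard approximation for square M-QAM with Gray mapping:
    #   P_err ~ 4 * (1 - 1/sqrt(M)) * Q( sqrt(3 * SNR / (M - 1)) )
    # where Q(x) = erfc(x / sqrt(2)) / 2 and SNR is in linear units.
    snr = 10 ** (snr_db / 10)
    Q = 0.5 * erfc(sqrt(3 * snr / (M - 1)) / sqrt(2))
    return 4 * (1 - 1 / sqrt(M)) * Q

# Largest number of bits per symbol meeting P_err <= 1e-6 at 22 dB SNR:
for bits in (8, 6, 4, 2):
    if qam_symbol_error(2 ** bits, 22.0) <= 1e-6:
        break   # stops at bits = 4, i.e. a 16-point constellation
```

At 22 dB, 64-QAM misses the 10^-6 target by several orders of magnitude while 16-QAM meets it comfortably, which is why the design settles on four bits per symbol.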

The final data rate, remember, is the baud rate times the number of bits per symbol. The baud rate is equal to the bandwidth, so 2,400 baud, and so we have a total of 9,600 bits per second.

This is actually an operating mode of a modem standard called V.32. It was popular in the 90s, and it is still sometimes used in fax machines.

Now, the question is: are we doing well with respect to the maximum amount of information that we can send over this channel? Remember, we used very specific design choices to derive this figure of 9,600 bits per second: a specific modulation scheme, a specific probability of error, and so on and so forth.

What is the best one can do? Well, this is a complex question, and an exhaustive answer would require several lectures in information theory.

But there is a formula, derived by Claude Shannon in the late 1940s, that gives the capacity of a channel in terms of its bandwidth and its signal-to-noise ratio. This capacity specifies the amount of information that we can send reliably, meaning with a vanishingly low probability of error, over the channel. The formula, unfortunately, is not constructive: it doesn't tell us how to send this data, but it gives an upper bound on the amount of information that can be sent over the channel.

So, for instance, for the parameters that we used before, the maximum capacity of that channel would be about 17,500 bits per second. With our design we are basically hitting half the capacity of the channel.
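Shannon's formula for the additive-white-Gaussian-noise channel, C = W log2(1 + SNR), is simple enough to check the lecture's numbers directly:

```python
from math import log2

def capacity(bandwidth_hz, snr_db):
    # AWGN channel capacity C = W * log2(1 + SNR), in bits per second
    return bandwidth_hz * log2(1 + 10 ** (snr_db / 10))

C = capacity(2400, 22.0)    # ~17,560 b/s for W = 2400 Hz at 22 dB
rate = 2400 * 4             # our design: 2400 baud * 4 bits/symbol = 9600 b/s
ratio = rate / C            # roughly half the channel capacity
```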

The gap can be narrowed if we use more sophisticated modulation and data coding techniques. But, as I said, to explore these topics we would have to start an entirely new class in information theory.