Safwan El Assad - Digital Communications 1


This book provides complete training in digital communications, covering all aspects of such training: courses, tutorials with many typical problems and detailed solutions, and practical work concretely illustrating the various technical aspects of implementation. It breaks down into three parts. The first part covers information theory itself, which concerns both the sources of information and the channels of its transmission, taking into account the errors those channels introduce and the means of protecting against them through appropriate coding methods. The second part addresses the technical aspects of transmission: baseband transmission is presented first, with the important concept and fundamental technique of equalization; performance evaluation in terms of error probability is systematically developed in detail, as are the line codes used. The third part presents transmission with digital modulation of carriers, used in radio transmission but also over electrical cables. A second important aspect of the book concerns building the learner's knowledge and skills through directed work: an ordered set of 33 typical problems with detailed solutions covering the different parts of the course. Finally, the last aspect concerns practice in the proper sense of the term, an essential complement to training leading to genuine know-how: a set of 5 practical exercises.


We denote:

– [X] = [x1, x2, ..., xn]: the set of all the symbols at the input of the channel;

– [Y] = [y1, y2, ..., ym]: the set of all the symbols at the output of the channel;

– [P(X)] = [p(x1), p(x2), ..., p(xn)]: the vector of probabilities of the symbols at the input of the channel;

– [P(Y)] = [p(y1), p(y2), ..., p(ym)]: the vector of probabilities of the symbols at the output of the channel.

Because of the perturbations, the space [ Y ] can be different from the space [ X ], and the probabilities P ( Y ) can be different from the probabilities P ( X ).

We define a product space [ XY ] and we introduce the matrix of the probabilities of the joint symbols, input-output [ P ( X, Y )]:

[2.20] $[P(X,Y)] = \begin{bmatrix} p(x_1,y_1) & p(x_1,y_2) & \cdots & p(x_1,y_m) \\ p(x_2,y_1) & p(x_2,y_2) & \cdots & p(x_2,y_m) \\ \vdots & \vdots & \ddots & \vdots \\ p(x_n,y_1) & p(x_n,y_2) & \cdots & p(x_n,y_m) \end{bmatrix}$

We deduce, from this matrix of probabilities:

[2.21] $p(x_i) = \sum_{j=1}^{m} p(x_i, y_j), \quad i = 1, \ldots, n$

[2.22] $p(y_j) = \sum_{i=1}^{n} p(x_i, y_j), \quad j = 1, \ldots, m$

We then define the following entropies:

– the entropy of the source:
[2.23] $H(X) = -\sum_{i=1}^{n} p(x_i)\log_2 p(x_i)$

– the entropy of variable Y at the output of the transmission channel:
[2.24] $H(Y) = -\sum_{j=1}^{m} p(y_j)\log_2 p(y_j)$

– the entropy of the two joint variables (X, Y), input-output:
[2.25] $H(X,Y) = -\sum_{i=1}^{n}\sum_{j=1}^{m} p(x_i,y_j)\log_2 p(x_i,y_j)$
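As an illustration, the marginals [2.21]-[2.22] and the entropies [2.23]-[2.25] can be computed directly from a joint probability matrix. The following is a minimal Python sketch; the matrix P_XY is a hypothetical binary channel invented for illustration, not an example from the text:

```python
import math

# Hypothetical joint probability matrix [P(X, Y)] (invented for illustration):
# rows index the input symbols xi, columns index the output symbols yj.
P_XY = [[0.4, 0.1],
        [0.1, 0.4]]

def entropy(probs):
    """Shannon entropy in bits/symbol, skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Marginal p(xi) = sum over j of p(xi, yj), as in [2.21]
p_x = [sum(row) for row in P_XY]
# Marginal p(yj) = sum over i of p(xi, yj), as in [2.22]
p_y = [sum(P_XY[i][j] for i in range(len(P_XY))) for j in range(len(P_XY[0]))]

H_X = entropy(p_x)                                # H(X), equation [2.23]
H_Y = entropy(p_y)                                # H(Y), equation [2.24]
H_XY = entropy([p for row in P_XY for p in row])  # H(X,Y), equation [2.25]
```

For this symmetric matrix the marginals are uniform, so H(X) = H(Y) = 1 bit, while H(X,Y) is strictly less than H(X) + H(Y) because the input and output are correlated.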

2.5.1. Conditional entropies

Because of the disturbances in the transmission channel, if the symbol yj appears at the output, there is an uncertainty on the symbol xi, i = 1, ..., n, which has been sent.


Figure 2.3. Ambiguity on the symbol at the input when yj is received

The average value of this uncertainty, or the entropy associated with the receipt of the symbol yj , is:

[2.26] $H(X/y_j) = -\sum_{i=1}^{n} p(x_i/y_j)\log_2 p(x_i/y_j)$

The mean value of this entropy for all the possible symbols yj received is:

[2.27] $H(X/Y) = \sum_{j=1}^{m} p(y_j)\, H(X/y_j)$

Which can be written as:

[2.28] $H(X/Y) = -\sum_{j=1}^{m} p(y_j) \sum_{i=1}^{n} p(x_i/y_j)\log_2 p(x_i/y_j)$

or:

[2.29] $H(X/Y) = -\sum_{i=1}^{n}\sum_{j=1}^{m} p(x_i,y_j)\log_2 p(x_i/y_j)$

The entropy H(X/Y) is called equivocation (ambiguity) and corresponds to the loss of information due to disturbances (since I(X,Y) = H(X) − H(X/Y)). This point will be made precise a little further on.
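The equivocation [2.29] and the transferred information I(X,Y) = H(X) − H(X/Y) can be checked numerically. A minimal sketch, assuming a hypothetical binary joint matrix invented for illustration:

```python
import math

# Hypothetical joint matrix (invented for illustration):
# rows index inputs xi, columns index outputs yj.
P_XY = [[0.4, 0.1],
        [0.1, 0.4]]

p_x = [sum(row) for row in P_XY]
p_y = [sum(P_XY[i][j] for i in range(2)) for j in range(2)]

# Equivocation H(X/Y) = -sum_ij p(xi, yj) log2 p(xi/yj), as in [2.29],
# with the conditional probability p(xi/yj) = p(xi, yj) / p(yj)
H_X_given_Y = -sum(P_XY[i][j] * math.log2(P_XY[i][j] / p_y[j])
                   for i in range(2) for j in range(2))

H_X = -sum(p * math.log2(p) for p in p_x)  # source entropy H(X)

# Information actually transferred across the channel
I_XY = H_X - H_X_given_Y
```

On this example the channel loses about 0.72 bit per symbol to the disturbances, and the transferred information I(X,Y) is positive but well below H(X).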

Because of disturbances, if the symbol xi is emitted, there is uncertainty about the received symbol yj, j = 1, ..., m.


Figure 2.4. Uncertainty on the output when we know the input

The entropy of the random variable Y at the output, knowing the variable X at the input, is:

[2.30] $H(Y/X) = -\sum_{i=1}^{n}\sum_{j=1}^{m} p(x_i,y_j)\log_2 p(y_j/x_i)$

This entropy is a measure of the uncertainty on the output variable when that of the input is known.

The matrix P(Y/X) is called the channel noise matrix:

[2.31] $[P(Y/X)] = \begin{bmatrix} p(y_1/x_1) & p(y_2/x_1) & \cdots & p(y_m/x_1) \\ p(y_1/x_2) & p(y_2/x_2) & \cdots & p(y_m/x_2) \\ \vdots & \vdots & \ddots & \vdots \\ p(y_1/x_n) & p(y_2/x_n) & \cdots & p(y_m/x_n) \end{bmatrix}$

A fundamental property of this matrix is:

[2.32] $\sum_{j=1}^{m} p(y_j/x_i) = 1, \quad i = 1, \ldots, n$

where p(yj/xi) is the probability of receiving the symbol yj when the symbol xi has been emitted.

In addition, one has:

[2.33] $p(x_i, y_j) = p(x_i)\,p(y_j/x_i) = p(y_j)\,p(x_i/y_j)$

with:

[2.34] $p(y_j) = \sum_{i=1}^{n} p(x_i)\,p(y_j/x_i)$

p ( yj ) is the probability of receiving the symbol yj whatever the symbol xi emitted, and:

[2.35] $p(x_i/y_j) = \dfrac{p(x_i)\,p(y_j/x_i)}{p(y_j)}$

p(xi/yj) is the probability that the symbol xi was emitted when the symbol yj is received.
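The row-sum property [2.32], the output law [2.34] and Bayes' relation [2.35] can be illustrated numerically. A sketch, assuming a hypothetical binary symmetric channel with error probability 0.1 and an invented input distribution:

```python
# Hypothetical binary symmetric channel with error probability 0.1
# (both the noise matrix and the input distribution are invented for illustration).
# Noise matrix P(Y/X): rows index inputs xi, columns index outputs yj.
P_Y_given_X = [[0.9, 0.1],
               [0.1, 0.9]]
p_x = [0.7, 0.3]  # assumed input probabilities p(xi)

# Fundamental property [2.32]: each row of the noise matrix sums to 1
row_sums = [sum(row) for row in P_Y_given_X]

# p(yj) = sum_i p(xi) p(yj/xi), as in [2.34]
p_y = [sum(p_x[i] * P_Y_given_X[i][j] for i in range(2)) for j in range(2)]

# Bayes' relation [2.35]: p(xi/yj) = p(xi) p(yj/xi) / p(yj)
p_x_given_y = [[p_x[i] * P_Y_given_X[i][j] / p_y[j] for j in range(2)]
               for i in range(2)]
```

Note that the columns of the posterior matrix p(xi/yj) each sum to 1, since conditioning on a received yj leaves a probability distribution over the possible emitted symbols.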

2.5.2. Relations between the various entropies

We can write:

[2.36] $H(X,Y) = H(X) + H(Y/X)$

In the same way, as one has: H ( Y, X ) = H ( X, Y ), therefore:

[2.37] $H(X,Y) = H(Y) + H(X/Y)$

In addition, one has the following inequalities:

[2.38] $H(X/Y) \leq H(X)$

and similarly:

[2.39] $H(Y/X) \leq H(Y)$
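The chain rules [2.36]-[2.37] and the inequalities [2.38]-[2.39] can be verified numerically on an example. A sketch, again assuming a hypothetical joint matrix invented for illustration:

```python
import math

# Hypothetical joint matrix (invented for illustration).
P_XY = [[0.4, 0.1],
        [0.1, 0.4]]
p_x = [sum(row) for row in P_XY]
p_y = [sum(P_XY[i][j] for i in range(2)) for j in range(2)]

def entropy(probs):
    """Shannon entropy in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

H_X, H_Y = entropy(p_x), entropy(p_y)
H_XY = entropy([p for row in P_XY for p in row])

# Conditional entropies via p(xi/yj) = p(xi,yj)/p(yj) and p(yj/xi) = p(xi,yj)/p(xi)
H_X_given_Y = -sum(P_XY[i][j] * math.log2(P_XY[i][j] / p_y[j])
                   for i in range(2) for j in range(2))
H_Y_given_X = -sum(P_XY[i][j] * math.log2(P_XY[i][j] / p_x[i])
                   for i in range(2) for j in range(2))

# Chain rules [2.36]-[2.37]: H(X,Y) = H(X) + H(Y/X) = H(Y) + H(X/Y)
# Inequalities [2.38]-[2.39]: conditioning cannot increase entropy
```

Both decompositions of H(X,Y) agree, and each conditional entropy is bounded by its unconditional counterpart, with equality only when X and Y are independent.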

SPECIAL CASES.–

Noiseless channel: in this case, on receipt of yj, there is certainty about the symbol actually transmitted, say xi (one-to-one correspondence), therefore:

[2.40] $p(x_i/y_j) = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{otherwise} \end{cases}$

Consequently:

[2.41] $H(X/Y) = 0$

and:

[2.42] $H(Y/X) = 0$

Channel with maximum power noise: in this case, the variable at
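The noiseless case can be checked numerically: with a one-to-one joint matrix, every nonzero conditional probability p(xi/yj) equals 1, so the equivocation vanishes. A sketch with an invented noiseless binary channel:

```python
import math

# Noiseless channel sketch (joint matrix invented for illustration):
# one-to-one correspondence, so p(xi, yj) = 0 whenever i != j.
P_XY = [[0.6, 0.0],
        [0.0, 0.4]]
p_y = [sum(P_XY[i][j] for i in range(2)) for j in range(2)]

# Equivocation [2.29]: every nonzero term has p(xi/yj) = 1, so log2(...) = 0
H_X_given_Y = -sum(P_XY[i][j] * math.log2(P_XY[i][j] / p_y[j])
                   for i in range(2) for j in range(2) if P_XY[i][j] > 0)
```

The equivocation comes out exactly zero: knowing the output leaves no uncertainty about the input, so no information is lost.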
