Simon Haykin - Nonlinear Filters

NONLINEAR FILTERS
Nonlinear Filters: Theory and Applications
Discover the utility of using deep learning and (deep) reinforcement learning in deriving filtering algorithms with this insightful and powerful new resource.

Entropy can also be interpreted as the expected value of the term $\log\frac{1}{p(x)}$:

(2.97) $H(\mathbf{x}) = \mathbb{E}\left[\log\frac{1}{p(x)}\right] = -\mathbb{E}\left[\log p(x)\right]$

where $\mathbb{E}[\cdot]$ is the expectation operator and $p(x)$ is the probability density function (PDF) of $\mathbf{x}$. The definition of Shannon's entropy, $H(\mathbf{x})$, shows that it is a function of the corresponding PDF. It is insightful to examine how this information measure is affected by the shape of the PDF. A relatively broad and flat PDF, which is associated with lack of predictability, has high entropy. On the other hand, if the PDF is relatively narrow with sharp slopes around a specific value of $x$, which is associated with bias toward that particular value, then it has low entropy. A rearrangement of the tuples $(x_i, p(x_i))$ may change the shape of the PDF curve significantly, but it does not affect the value of the summation or integral in (2.95) or (2.96), because summation and integration can be performed in any order. Since $H(\mathbf{x})$ is not affected by local changes in the PDF curve, it can be considered a global measure of the behavior of the corresponding PDF [27].
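These two observations — that broad, flat PDFs have high entropy while narrow, peaked ones have low entropy, and that rearranging the tuples $(x_i, p(x_i))$ leaves the entropy unchanged — can be illustrated with a small discrete sketch (the PMFs below are arbitrary illustrative choices, not examples from the text):

```python
import math

def entropy(p):
    """Shannon entropy (bits), H = -sum_i p_i log2 p_i, of a discrete PMF."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A broad, flat PMF (hard to predict) vs. a narrow, peaked one (biased).
broad  = [0.25, 0.25, 0.25, 0.25]
narrow = [0.94, 0.02, 0.02, 0.02]

print(entropy(broad))   # 2.0 bits: maximal for four outcomes
print(entropy(narrow))  # ~0.42 bits: low uncertainty

# Rearranging the tuples changes the PMF's shape but not its entropy.
print(entropy([0.02, 0.94, 0.02, 0.02]) == entropy(narrow))  # True
```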

Definition 2.4 Joint entropy is defined for a pair of random vectors $\mathbf{x}$ and $\mathbf{y}$ based on their joint distribution as:

(2.98) $H(\mathbf{x}, \mathbf{y}) = -\mathbb{E}\left[\log p(x, y)\right]$

Definition 2.5 Conditional entropy is defined as the entropy of a random variable (state vector) conditional on the knowledge of another random variable (measurement vector):

(2.99) $H(\mathbf{x}|\mathbf{y}) = -\mathbb{E}\left[\log p(x|y)\right]$

It can also be expressed as:

(2.100) $H(\mathbf{x}|\mathbf{y}) = H(\mathbf{x}, \mathbf{y}) - H(\mathbf{y})$

Definition 2.6 Mutual information between two random variables is a measure of the amount of information that one contains about the other. It can also be interpreted as the reduction in the uncertainty about one random variable due to knowledge about the other one. Mathematically it is defined as:

(2.101) $I(\mathbf{x}; \mathbf{y}) = H(\mathbf{x}) - H(\mathbf{x}|\mathbf{y})$

Substituting for $H(\mathbf{x}|\mathbf{y})$ from (2.100) into the aforementioned equation, we have:

(2.102) $I(\mathbf{x}; \mathbf{y}) = H(\mathbf{x}) + H(\mathbf{y}) - H(\mathbf{x}, \mathbf{y})$

Therefore, mutual information is symmetric with respect to $\mathbf{x}$ and $\mathbf{y}$. It can also be viewed as a measure of dependence between the two random vectors. Mutual information is nonnegative; it is equal to zero if and only if $\mathbf{x}$ and $\mathbf{y}$ are independent. The notion of observability for stochastic systems can be defined based on the concept of mutual information.
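The equivalence of (2.101) and (2.102), together with the symmetry and nonnegativity claims, can be checked numerically. The joint PMF below is an arbitrary illustrative choice, not an example from the text:

```python
import math

def H(p):
    """Shannon entropy (bits) of a PMF given as a flat list of probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Assumed joint PMF p(x, y) for two binary variables; rows = x, columns = y.
pxy = [[0.30, 0.10],
       [0.10, 0.50]]

px  = [sum(row) for row in pxy]              # marginal p(x)
py  = [sum(col) for col in zip(*pxy)]        # marginal p(y)
Hxy = H([p for row in pxy for p in row])     # joint entropy (2.98)

I_from_x = H(px) - (Hxy - H(py))             # (2.101), with H(x|y) from (2.100)
I_from_y = H(py) - (Hxy - H(px))             # same, with roles of x and y swapped
I_sym    = H(px) + H(py) - Hxy               # (2.102)

print(I_from_x, I_from_y, I_sym)             # all three agree, and are nonnegative
```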

Definition 2.7 (Stochastic observability) The random vector (state) $\mathbf{x}$ is unobservable from the random vector (measurement) $\mathbf{y}$ if they are independent, or equivalently, $I(\mathbf{x}; \mathbf{y}) = 0$. Otherwise, $\mathbf{x}$ is observable from $\mathbf{y}$.

Since mutual information is nonnegative, (2.101) leads to the following conclusion: if either $H(\mathbf{x}|\mathbf{y}) < H(\mathbf{x})$ or $H(\mathbf{y}|\mathbf{x}) < H(\mathbf{y})$, then $\mathbf{x}$ is observable from $\mathbf{y}$ [28].
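As a small numerical check of this criterion (the joint PMF is an assumed example, not from the text): when the state and measurement are dependent, conditioning on the measurement strictly reduces the state's entropy.

```python
import math

def H(p):
    """Shannon entropy (bits) of a PMF given as a flat list of probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Assumed joint PMF p(x, y); rows index the state x, columns the measurement y.
pxy = [[0.45, 0.05],
       [0.05, 0.45]]

px  = [sum(row) for row in pxy]
py  = [sum(col) for col in zip(*pxy)]
Hx  = H(px)
Hxy = H([p for row in pxy for p in row])
Hx_given_y = Hxy - H(py)          # conditional entropy via (2.100)

# H(x|y) < H(x): the measurement reduces state uncertainty -> x is observable.
print(Hx_given_y, Hx)
print(Hx_given_y < Hx)            # True
```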

2.8 Degree of Observability

Instead of considering the notion of observability as a yes/no question, it will be helpful in practice to pose the question of how observable a system may be [29]. Knowing the answer to this question, we can select the best set of variables, which can be directly measured, as outputs to improve observability [30]. With this in mind and building on Section 2.7, mutual information can be used as a measure for the degree of observability [31].

An alternative approach, aiming at providing insight into the observability of the system of interest in filtering applications, uses the eigenvalues of the estimation error covariance matrix. The largest eigenvalue of the covariance matrix is the variance of a state, or a function of states, that is poorly observable. Hence, its corresponding eigenvector provides the direction of poor observability. On the other hand, states or functions of states that are highly observable are associated with smaller eigenvalues, whose corresponding eigenvectors provide the directions of good observability [30].
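A minimal sketch of this eigenvalue-based view, using a made-up $2\times 2$ error covariance for a hypothetical two-state filter:

```python
import numpy as np

# Hypothetical 2x2 estimation-error covariance matrix (values are assumed).
P = np.array([[4.0, 0.0],
              [0.0, 0.1]])

# eigh returns eigenvalues in ascending order, with matching eigenvector columns.
eigvals, eigvecs = np.linalg.eigh(P)

poor_dir = eigvecs[:, -1]   # largest variance -> direction of poor observability
good_dir = eigvecs[:, 0]    # smallest variance -> direction of good observability

print(eigvals)              # [0.1 4. ]
print(poor_dir, good_dir)   # aligned with the first and second state, respectively
```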

A deterministic system is either observable or unobservable, but for stochastic systems, the degree of observability can be defined as [32]:

(2.103) $\delta = \dfrac{I(\mathbf{x}; \mathbf{y})}{H(\mathbf{x})}$

which is a time‐dependent, non‐decreasing function that varies between 0 and 1. Before starting the measurement process, $H(\mathbf{x}|\mathbf{y}) = H(\mathbf{x})$ and therefore $I(\mathbf{x}; \mathbf{y}) = 0$, which makes $\delta = 0$. As more measurements become available, $H(\mathbf{x}|\mathbf{y})$ may decrease and therefore $I(\mathbf{x}; \mathbf{y})$ may increase, which leads to the growth of $\delta$ up to 1 [33].
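This growth of the degree of observability with accumulating measurements can be sketched for a toy problem: a binary state observed through repeated, independent noisy measurements (the error probability and prior are assumptions of this sketch, not values from the text).

```python
import math
from itertools import product

def H(p):
    """Shannon entropy (bits) of a PMF given as a list of probabilities."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

EPS = 0.2           # assumed probability that a measurement flips the state bit
PX  = [0.5, 0.5]    # binary state with 1 bit of prior uncertainty

def degree_of_observability(k):
    """delta = I(x; y_1..y_k) / H(x) after k independent noisy measurements."""
    joint = []      # p(x, y_1..y_k), enumerated over all outcomes
    for x in (0, 1):
        for ys in product((0, 1), repeat=k):
            p = PX[x]
            for y in ys:
                p *= (1 - EPS) if y == x else EPS
            joint.append((ys, p))
    py = {}         # marginal over measurement sequences
    for ys, p in joint:
        py[ys] = py.get(ys, 0.0) + p
    # Mutual information via the symmetric form (2.102).
    I = H(PX) + H(list(py.values())) - H([p for _, p in joint])
    return I / H(PX)

# delta starts at 0 with no measurements and grows monotonically toward 1.
for k in range(4):
    print(k, round(degree_of_observability(k), 3))
```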
