Alexander Vlaskin - Artificial Intelligence Glossarium - 1000 terms

Artificial Intelligence Glossarium: 1000 terms: summary, description and annotation

Dear reader! Your attention is invited to a unique book! A modern glossary of over 1000 popular terms and definitions for artificial intelligence. This book is also unique in that it was written by practicing experts who worked together on the Program of the Center for Artificial Intelligence of the Bauman Moscow State Technical University. This text was previously published as part of the book «Glossary on artificial intelligence: 2500 terms» (Russian and English versions of the book).

Average precision (Средняя точность) – A metric for summarizing the performance of a ranked sequence of results. Average precision is calculated by taking the average of the precision values at each relevant result (each result in the ranked list where the recall increases relative to the previous result) [71: Average precision // jonathan-hui.medium.com URL: https://jonathan-hui.medium.com/map-mean-average-precision-for-object-detection-45c121a31173 (accessed 28.01.2022)].
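A minimal sketch of this calculation, assuming a ranked list of binary relevance labels (the helper name and example data below are illustrative, not taken from the glossary source):

```python
# Minimal sketch: average precision over a ranked list of binary relevance
# labels (1 = relevant, 0 = not relevant). Illustrative only.
def average_precision(relevance):
    hits = 0
    precisions = []
    for rank, rel in enumerate(relevance, start=1):
        if rel:  # recall increases only at relevant results
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / len(precisions) if precisions else 0.0

print(average_precision([1, 0, 1, 1, 0]))  # (1/1 + 2/3 + 3/4) / 3 ≈ 0.806
```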

Ayasdi (Платформа Ayasdi) – is an enterprise-scale machine intelligence platform that delivers the automation needed to gain competitive advantage from a company’s big and complex data. Ayasdi supports large numbers of business analysts, data scientists, end users, developers and operational systems across the organization, simultaneously creating, validating, using and deploying sophisticated analyses and mathematical models at scale.

“B”

Backpropagation (Обратное распространение ошибки) – Backpropagation, also called “backward propagation of errors,” is the approach commonly used when training deep neural networks: the error measured at the output is propagated backward through the network to compute the gradients used to update the weights and reduce the error.
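An illustrative sketch (not from the book) of a single backpropagation step for a tiny two-layer network with a squared-error loss, using NumPy; the array shapes and learning rate are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))           # 4 examples, 3 features
t = rng.normal(size=(4, 1))           # targets
W1 = rng.normal(size=(3, 5))          # hidden-layer weights
W2 = rng.normal(size=(5, 1))          # output-layer weights
lr = 0.1

h = 1.0 / (1.0 + np.exp(-(x @ W1)))   # forward pass: sigmoid hidden layer
y = h @ W2                            # linear output layer
loss = 0.5 * np.mean((y - t) ** 2)    # squared-error loss

grad_y = (y - t) / len(x)             # backward pass: chain rule, layer by layer
grad_W2 = h.T @ grad_y
grad_h = grad_y @ W2.T
grad_W1 = x.T @ (grad_h * h * (1 - h))

W1 -= lr * grad_W1                    # gradient step that reduces the error
W2 -= lr * grad_W2
```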

Backpropagation through time (BPTT) (Обратное распространение во времени) – A gradient-based technique for training certain types of recurrent neural networks. It can be used to train Elman networks. The algorithm was independently derived by numerous researchers.

Backward Chaining (Обратная цепочка (или обратное рассуждение)) – Backward chaining, also called goal-driven inference, is an inference approach that reasons backward from the goal to the conditions needed to reach that goal. Backward chaining inference is applied in many different fields, including game theory, automated theorem proving, and artificial intelligence [72: Backward Chaining // www.educba.com URL: https://www.educba.com/backward-chaining/ (accessed 11.03.2022)].
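A toy sketch of goal-driven inference (the rule base and predicate names are hypothetical, chosen only for illustration): a goal is proven if it is a known fact, or if all premises of some rule that concludes it can themselves be proven.

```python
# Hypothetical rule base: goal -> list of alternative premise sets.
rules = {
    "mortal(socrates)": [["man(socrates)"]],
    "man(socrates)": [["human(socrates)"]],
}
facts = {"human(socrates)"}

def prove(goal):
    """Backward chaining: reason from the goal back to known facts."""
    if goal in facts:
        return True
    return any(all(prove(p) for p in premises)
               for premises in rules.get(goal, []))

print(prove("mortal(socrates)"))  # True
```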

Bag-of-words model (Модель мешка слов) – A simplifying representation used in natural language processing and information retrieval (IR). In this model, a text (such as a sentence or a document) is represented as the bag (multiset) of its words, disregarding grammar and even word order but keeping multiplicity. The bag-of-words model has also been used for computer vision. It is commonly used in methods of document classification where the (frequency of) occurrence of each word is used as a feature for training a classifier [73: Bag-of-words model // machinelearningmastery.ru URL: https://www.machinelearningmastery.ru/gentle-introduction-bag-words-model/ (accessed 11.03.2022)].
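A minimal sketch of the representation (the example sentences are invented): word order is discarded and only the count of each vocabulary word is kept as a feature.

```python
from collections import Counter

docs = ["the cat sat on the mat", "the dog sat"]
counts = [Counter(d.split()) for d in docs]          # per-document word counts
vocab = sorted(set().union(*counts))                 # shared vocabulary

vectors = [[c[w] for w in vocab] for c in counts]    # bag-of-words vectors
print(vocab)    # ['cat', 'dog', 'mat', 'on', 'sat', 'the']
print(vectors)  # [[1, 0, 1, 1, 1, 2], [0, 1, 0, 0, 1, 1]]
```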

Bag-of-words model in computer vision (Модель мешка слов в компьютерном зрении) – In computer vision, the bag-of-words model (BoW model) can be applied to image classification by treating image features as words. In document classification, a bag of words is a sparse vector of occurrence counts of words; that is, a sparse histogram over the vocabulary. In computer vision, a bag of visual words is a vector of occurrence counts of a vocabulary of local image features.

Baldwin effect (Эффект Балдвина) – The hypothesis that skills acquired by organisms during their lifetime through learning can, after a certain number of generations, become fixed in the genome.

Baseline (Базовый уровень) – A model used as a reference point for comparing how well another model (typically, a more complex one) is performing. For example, a logistic regression model might serve as a good baseline for a deep model. For a particular problem, the baseline helps model developers quantify the minimal expected performance that a new model must exceed in order to be useful.
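A brief sketch of the idea, assuming scikit-learn is available and using synthetic data: the logistic-regression score sets the bar that a more complex model would have to beat.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

baseline = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("baseline accuracy:", baseline.score(X_te, y_te))
# A new (e.g. deep) model is only worth deploying if it clearly beats this score.
```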

Batch (Пакет) – The set of examples used in one gradient update of model training.

Batch Normalization (Пакетная нормализация) – A technique that normalizes the inputs of a network layer over each mini-batch: the values are centered around zero mean and, typically, scaled to unit standard deviation, which stabilizes and speeds up training.
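A minimal sketch of the per-mini-batch transform (the learnable scale and shift parameters of a full batch-normalization layer are omitted here for brevity):

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    """Center each feature of the mini-batch to zero mean and unit std."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mean) / np.sqrt(var + eps)

batch = np.array([[1.0, 2.0], [3.0, 6.0], [5.0, 10.0]])
print(batch_norm(batch))  # each column now has mean ~0 and std ~1
```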

Batch size (Размер партии) – The number of examples in a batch. For example, the batch size of SGD is 1, while the batch size of a mini-batch is usually between 10 and 1000. Batch size is usually fixed during training and inference; however, TensorFlow does permit dynamic batch sizes.

Bayes’s Theorem (Теорема Байеса) – A famous theorem used by statisticians to describe the probability of an event based on prior knowledge of conditions that might be related to that event.
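A small worked illustration with invented numbers: P(A|B) = P(B|A) * P(A) / P(B) applied to a diagnostic test with a 1% prior, 99% sensitivity and a 5% false-positive rate.

```python
p_disease = 0.01                 # prior P(A)
p_pos_given_disease = 0.99       # likelihood P(B|A)
p_pos_given_healthy = 0.05       # false-positive rate

# Total probability of a positive test, P(B)
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

# Posterior P(A|B) by Bayes' theorem
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # ≈ 0.167
```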

Bayesian classifier in machine learning (Байесовский классификатор в машинном обучении) – is a family of simple probabilistic classifiers based on Bayes’ theorem and the “naive” assumption that the features of the objects being classified are independent of one another.
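An illustrative sketch, assuming scikit-learn is available: a Gaussian naive Bayes classifier trained on the Iris data set; the data set and split are chosen only for demonstration.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = GaussianNB().fit(X_tr, y_tr)   # Bayes' theorem + feature-independence assumption
print("test accuracy:", clf.score(X_te, y_te))
```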

Bayesian Filter (Фильтрация по Байесу) – is a program that uses Bayesian logic. It is used to evaluate the header and content of email messages and determine whether or not they constitute spam – unsolicited email, the electronic equivalent of hard-copy bulk mail or junk mail. A Bayesian filter works with the probabilities of specific words appearing in the header or content of an email. Certain words, such as “Viagra” or “refinance”, indicate a high probability that the email is spam [74: Bayesian Filter // certsrv.ru URL: http://certsrv.ru/eset_ss.ru/pages/bayes_filter.htm (accessed 12.02.2022)].

Bayesian Network (Байесовская сеть) – also called a belief network or probabilistic directed acyclic graphical model, is a probabilistic graphical model (a statistical model) that represents a set of variables and their conditional dependencies via a directed acyclic graph [75: Bayesian Network // dic.academic.ru URL: https://dic.academic.ru/dic.nsf/ruwiki/1738444 (accessed 31.01.2022)].

Bayesian optimization (Байесовская оптимизация) – A probabilistic regression model technique for optimizing computationally expensive objective functions by instead optimizing a surrogate that quantifies the uncertainty via a Bayesian learning technique. Since Bayesian optimization is itself very expensive, it is usually used to optimize expensive-to-evaluate tasks that have a small number of parameters, such as selecting hyperparameters.

Bayesian programming (Байесовское программирование) – A formalism and a methodology for specifying probabilistic models and solving problems when less than the necessary information is available.

Bees algorithm (Алгоритм пчелиной колонии) – A population-based search algorithm developed by Pham, Ghanbarzadeh et al. in 2005. It mimics the food-foraging behaviour of honey bee colonies. In its basic version the algorithm performs a kind of neighbourhood search combined with global search, and can be used for both combinatorial optimization and continuous optimization. The only condition for applying the bees algorithm is that some measure of distance between solutions is defined. The effectiveness and specific abilities of the bees algorithm have been demonstrated in a number of studies.
