A. K. Md. Ehsanes Saleh - Rank-Based Methods for Shrinkage and Selection




Rank-Based Methods for Shrinkage and Selection: With Application to Machine Learning

A practical and hands-on guide to the theory and methodology of statistical estimation based on rank.



Foreword

It is my pleasure to write this foreword for Professor Saleh’s latest book, “Rank-based Methods for Shrinkage and Selection with Application to Machine Learning”. I have known Professor Saleh for many decades as a leader in Canadian statistics and looked forward to meeting him regularly at the Annual Meeting of the Statistical Society of Canada.

We are well into the golden age of probability and statistics with the emergence of data science and machine learning. Many decades ago, we could attract many bright students to this field of endeavor but today the interest is overwhelming. The connection between theoretical statistics and applied statistics is an important part of machine learning and data science. In order to engage fully in data science, one needs a solid understanding of both the theoretical and the practical aspects of probability and statistics.

The book is unique in presenting a comprehensive approach to inference in regression models based on ranks. It starts with the basics, which enables rapid understanding of the innovative ideas in the rest of the book. In addition to the more familiar aspects of rank-based methods such as comparisons among groups, linear regression, and time series, the authors show how many machine-learning tools can be made more robust via rank-based methods. Modern approaches to model selection, logistic regression, neural networks, elastic net, and penalized regression are studied through this lens. The work is presented clearly and concisely, and highlights many areas for further investigation.

Professor Saleh’s wealth of experience with ridge regression, Stein’s method, and preliminary test estimation informs the arc of the book, and he and his co-authors have built on his expertise to expand the application of rank-based approaches to modern big-data and high-dimensional settings. Careful attention is paid throughout to both theoretical rigor and engaging applications. The applications are made accessible through detailed discussion of computational methods and their implementation in the R and Python computing environments. Its broad coverage of topics, and careful attention to both theory and methods, ensures that this book will be an invaluable resource for students and researchers in statistics.

The authors have identified many areas of useful future research that could be pursued by graduate students and practitioners alike. In this regard, this book is an important contribution in the ongoing research towards robust data science.

Professor N. Reid

University of Toronto

June 2021

Preface

The objective of this book is to introduce the audience to the theory and application of robust statistical methodologies using rank-based methods. We present a number of new ideas and research directions in machine learning and statistical analysis that the reader can and should pursue in the future. We begin by noting that the well-known least squares and likelihood principles are traditional methods of estimation in machine learning and data science. One of the most widely read books is An Introduction to Statistical Learning (James et al., 2013), which describes these and other methods. However, it also properly identifies many of their shortcomings, especially in terms of robustness in the presence of outliers. Our book describes a number of novel ideas and concepts to resolve these problems, many of which are worthy of further investigation. Our goal is to motivate the interest of more researchers to pursue further activities in this field. We build on this motivation to carry out a rigorous mathematical analysis of rank-based penalty estimators.

From our point of view, outliers are present in almost all real-world data sets. They may be the result of human error, transmission error, measurement error, or simply due to the nature of the data being collected. Whatever the reason, we must first recognize that all data sets have some form of outliers and then build solutions based on this fact. Outliers may greatly affect the estimates and lead to poor prediction accuracy. As a result, operations such as data cleaning, outlier detection, and robust regression methods are extremely important in building models that provide suitably accurate prediction capability. Here, we describe rank-based methods to address many such problems. Indeed, many researchers are now applying these and other methods in pursuit of robust data science. Most of the methods and results presented in this book were derived from our implementations in R and Python, which are languages used routinely by statisticians and by practitioners in machine learning and data science. Some of the problems at the end of each chapter involve the use of R. The reader will be well served to follow the descriptions in the book while implementing the ideas wherever possible in R or Python. This is the best way to get the most out of this book.
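The sensitivity of classical estimates to outliers can be seen in the simplest possible setting. The sketch below uses made-up numbers purely for illustration: a single gross outlier drags the sample mean far from the bulk of the data, while the median is essentially unaffected.

```python
import numpy as np

# Five well-behaved observations plus one gross outlier
x = np.array([1.2, 0.8, 1.1, 0.9, 1.0, 50.0])

print(np.mean(x))    # pulled far above the bulk of the data
print(np.median(x))  # essentially unaffected by the outlier
```

The same phenomenon carries over to regression: least squares minimizes squared residuals (a mean-like criterion), so a single bad point can dominate the fit, whereas rank-based criteria behave like the median.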

Rank regression is based on the linear rank dispersion function described by Jaeckel (1972). The dispersion function replaces the least squares loss function to enable estimates based on the median rather than the mean. This book is intended to guide the reader in this direction starting with basic principles such as the importance of the median vs. the mean, comparisons of rank vs. least squares methods on simple linear problems, and the role of penalty functions in improving the accuracy of prediction. We present new practical methods of data cleaning, subset selection and shrinkage estimation in the context of rank-based methods. We then begin our theoretical journey starting with basic rank statistics for location and simple linear models, and then move on to multiple regression, ANOVA and problems in a high-dimensional setting. We conclude with new ideas not published elsewhere in the literature in the area of rank-based logistic regression and neural networks to address classification problems in machine learning and data science.
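As a minimal sketch of the idea, Jaeckel's (1972) dispersion with Wilcoxon scores can be minimized numerically: residual ranks replace squared residuals, so gross outliers receive bounded influence. The data-generating setup, score choice, and optimizer below are illustrative assumptions, not the book's implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import rankdata

def wilcoxon_dispersion(beta, X, y):
    # Jaeckel's dispersion D(beta) = sum_i a(R(e_i)) * e_i, where e_i are
    # residuals and a(i) = sqrt(12) * (i/(n+1) - 1/2) are Wilcoxon scores.
    e = y - X @ beta
    n = len(e)
    a = np.sqrt(12.0) * (rankdata(e) / (n + 1) - 0.5)
    return np.sum(a * e)

# Illustrative data: heavy-tailed noise plus a few gross outliers
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2))
beta_true = np.array([2.0, -1.0])
y = X @ beta_true + rng.standard_t(df=2, size=n)
y[:5] += 25.0  # contaminate five responses

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
res = minimize(wilcoxon_dispersion, x0=beta_ols, args=(X, y),
               method="Nelder-Mead")
beta_rank = res.x
print("rank:", beta_rank, " OLS:", beta_ols)
```

The dispersion is a convex, piecewise-linear function of the slope coefficients; the intercept is not identified by it and is typically estimated afterward, e.g. as the median of the residuals.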

We believe that most practitioners today are still employing least squares and log-likelihood methods that are not robust in the presence of outliers. This is due to the long history of these estimation methods in statistics and their natural adoption in the machine learning community over the past two decades. However, the history of estimation theory actually changed its course radically many decades prior when Stein (1956) and James and Stein (1961) proved that the sample mean based on a sample from a p-dimensional multivariate normal distribution is inadmissible under a quadratic loss function for p ≥ 3. This result gave birth to a class of shrinkage estimators in various forms and set-ups. Due to the immense impact of Stein’s theory, scores of technical papers appeared in the literature covering many areas of application. Beginning in the 1970s, the pioneering work of Saleh and Sen (1978, 1983, 1984a,b, 1985a–e, 1986, 1987) expanded the scope of this class of shrinkage estimators using the “quasi-empirical Bayes” method to obtain robust (such as R-, L-, and M-estimation) Stein-type estimators. Details are provided in Saleh (2006).
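The James–Stein result is easy to verify by simulation. Under X ~ N_p(θ, σ²I) with p ≥ 3, the estimator (1 − (p − 2)σ²/‖X‖²)X has uniformly smaller quadratic risk than X itself. The sketch below is a Monte Carlo check under one illustrative configuration (θ = 0, where the gain is largest), not a proof.

```python
import numpy as np

rng = np.random.default_rng(1)
p, reps, sigma2 = 10, 5000, 1.0
theta = np.zeros(p)  # shrinkage target; gains are largest here

# reps independent draws of X ~ N_p(theta, sigma2 * I)
X = rng.normal(loc=theta, scale=np.sqrt(sigma2), size=(reps, p))
norm2 = np.sum(X**2, axis=1, keepdims=True)
js = (1.0 - (p - 2) * sigma2 / norm2) * X  # James-Stein shrinkage toward 0

risk_mle = np.mean(np.sum((X - theta) ** 2, axis=1))   # approx. p = 10
risk_js = np.mean(np.sum((js - theta) ** 2, axis=1))   # strictly smaller
print("MLE risk:", risk_mle, " JS risk:", risk_js)
```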

Of particular interest here is the use of penalty estimators in the context of robust R-estimation. Next-generation “shrinkage estimators” known as “ridge regression estimators” for the multiple linear regression model were developed by Hoerl and Kennard (1970) based on “Tikhonov’s regularization” (Tikhonov, 1963). The ridge regression (RR) estimator is the result of minimizing the penalized least squares criterion using an L2 penalty function. Ridge regression laid the foundation of penalty estimation. Later, Tibshirani (1996) proposed the “least absolute shrinkage and selection operator” (LASSO) by minimizing the penalized least squares criterion using an L1 penalty function, which became enormously popular in the area of model selection.
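The contrast between the two penalties can be sketched in a few lines. Ridge (L2) has a closed form and shrinks all coefficients smoothly, while the LASSO (L1), solved here by coordinate descent with soft-thresholding, sets some coefficients exactly to zero and thereby performs selection. The data, penalty levels, and iteration count below are illustrative assumptions.

```python
import numpy as np

def ridge(X, y, lam):
    # Hoerl-Kennard ridge: argmin ||y - Xb||^2 + lam * ||b||_2^2 (closed form)
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def lasso_cd(X, y, lam, n_iter=300):
    # LASSO: argmin 0.5 * ||y - Xb||^2 + lam * ||b||_1 by coordinate descent;
    # soft-thresholding produces exact zeros, i.e. variable selection.
    p = X.shape[1]
    b = np.zeros(p)
    col_ss = np.sum(X**2, axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]          # partial residual
            z = X[:, j] @ r_j
            b[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_ss[j]
    return b

# Illustrative sparse-truth example
rng = np.random.default_rng(42)
n, p = 100, 5
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, 0.0, 0.0, 1.5, 0.0])
y = X @ beta_true + rng.normal(size=n)

b_ridge = ridge(X, y, lam=10.0)
b_lasso = lasso_cd(X, y, lam=30.0)
print("ridge:", np.round(b_ridge, 2))   # all coefficients nonzero
print("lasso:", np.round(b_lasso, 2))   # zero-truth coefficients set to 0
```

Both criteria here use the least squares loss; the book's program is to replace that loss with Jaeckel's rank dispersion while keeping the same penalty structure.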

