Edward O. Pyzer-Knapp - Deep Learning for Physical Scientists

Here you can read online a free introductory excerpt of the e-book «Edward O. Pyzer-Knapp - Deep Learning for Physical Scientists», and buy the full version after reading the excerpt. In some cases the audio can be listened to, the book can be downloaded via torrent in fb2 format, and a brief summary is provided. Genre: unrecognised; language: English. The description of the work, its preface, and visitor reviews are available on the LibCat library portal.

Deep Learning for Physical Scientists: summary, description, and annotation

We offer for reading the annotation, description, summary, or preface (depending on what the author of «Deep Learning for Physical Scientists» wrote). If you have not found the information you need about the book, write in the comments and we will try to find it.

Discover the power of machine learning in the physical sciences with this one-stop resource from a leading voice in the field. Deep Learning for Physical Scientists: Accelerating Research with Machine Learning is designed to teach researchers to think in useful new ways about how to achieve results in their research, providing scientists with new avenues to attack problems and avoid common pitfalls. Practical case studies and problems are presented, giving readers an opportunity to put what they have learned into practice, with exemplar coding approaches provided to assist the reader.

From modelling basics to feed-forward networks, the book offers a broad cross-section of machine learning techniques to improve physical science research. Readers will also enjoy:

• A thorough introduction to basic classification and regression with perceptrons
• An exploration of training algorithms, including back propagation, stochastic gradient descent, and the parallelisation of training
• An examination of multi-layer perceptrons for learning from descriptors and de-noising data
• Discussions of recurrent neural networks for learning from sequences and convolutional neural networks for learning from images
• A treatment of Bayesian optimisation for tuning deep learning architectures

Perfect for academic and industrial research professionals in the physical sciences, Deep Learning for Physical Scientists: Accelerating Research with Machine Learning will also earn a place in the libraries of industrial researchers who have access to large amounts of data but have yet to learn the techniques to fully exploit that access.
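As a flavour of the first technique in the list above, here is a minimal, illustrative sketch (not taken from the book) of binary classification with a single perceptron, written with NumPy; all function and variable names are our own, chosen for this example.

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    """Train a single perceptron with the classic error-driven update rule.

    X : (n_samples, n_features) inputs
    y : (n_samples,) labels in {0, 1}
    Returns the learned weight vector (bias included as the first weight).
    """
    # Prepend a constant 1 to every sample so the bias is learned as a weight.
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for xi, target in zip(Xb, y):
            pred = 1 if xi @ w > 0 else 0
            # Weights change only when the prediction is wrong.
            w += lr * (target - pred) * xi
    return w

def predict(w, X):
    """Apply the learned perceptron to new samples."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return (Xb @ w > 0).astype(int)

# A linearly separable toy problem: logical AND.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w = train_perceptron(X, y)
print(predict(w, X))  # A perceptron can learn AND exactly: [0 0 0 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this update rule finds a separating hyperplane; the book's later chapters address what to do when a single linear unit is not enough.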

Deep Learning for Physical Scientists — read the introductory excerpt online

Below is the text of the book, divided into pages. The system remembers the last page you read, so you can conveniently read «Deep Learning for Physical Scientists» online for free without having to search each time for where you left off. Set a bookmark, and you can return at any moment to the page where you stopped reading.

Table of Contents

1 Cover

2 Title Page

3 Copyright Page

4 About the Authors

5 Acknowledgements

6 1 Prefix – Learning to “Think Deep” 1.1 So What Do I Mean by Changing the Way You Think?

7 2 Setting Up a Python Environment for Deep Learning Projects 2.1 Python Overview 2.2 Why Use Python for Data Science? 2.3 Anaconda Python 2.4 Jupyter Notebooks

8 3 Modelling Basics 3.1 Introduction 3.2 Start Where You Mean to Go On – Input Definition and Creation 3.3 Loss Functions 3.4 Overfitting and Underfitting 3.5 Regularisation 3.6 Evaluating a Model 3.7 The Curse of Dimensionality 3.8 Summary

9 4 Feedforward Networks and Multilayered Perceptrons 4.1 Introduction 4.2 The Single Perceptron 4.3 Moving to a Deep Network 4.4 Vanishing Gradients and Other “Deep” Problems 4.5 Improving the Optimisation 4.6 Parallelisation of Learning 4.7 High and Low‐level Tensorflow APIs 4.8 Architecture Implementations 4.9 Summary 4.10 Papers to Read

10 5 Recurrent Neural Networks 5.1 Introduction 5.2 Basic Recurrent Neural Networks 5.3 Long Short‐Term Memory (LSTM) Networks 5.4 Gated Recurrent Units 5.5 Using Keras for RNNs 5.6 Real World Implementations 5.7 Summary 5.8 Papers to Read

11 6 Convolutional Neural Networks 6.1 Introduction 6.2 Fundamental Principles of Convolutional Neural Networks 6.3 Graph Convolutional Networks 6.4 Real World Implementations 6.5 Summary 6.6 Papers to Read

12 7 Auto‐Encoders 7.1 Introduction 7.2 Getting a Good Start – Stacked Auto‐Encoders, Restricted Boltzmann Machines, and Pretraining 7.3 Denoising Auto‐Encoders 7.4 Variational Auto‐Encoders 7.5 Sequence to Sequence Learning 7.6 The Attention Mechanism 7.7 Application in Chemistry: Building a Molecular Generator 7.8 Summary 7.9 Real World Implementations 7.10 Papers to Read

13 8 Optimising Models Using Bayesian Optimisation 8.1 Introduction 8.2 Defining Our Function 8.3 Grid and Random Search 8.4 Moving Towards an Intelligent Search 8.5 Exploration and Exploitation 8.6 Greedy Search 8.7 Diversity Search 8.8 Bayesian Optimisation 8.9 Summary 8.10 Papers to Read

14 Case Study 1: Solubility Prediction Case Study CS 1.1 Step 1 – Import Packages CS 1.2 Step 2 – Importing the Data CS 1.3 Step 3 – Creating the Inputs CS 1.4 Step 4 – Splitting into Training and Testing CS 1.5 Step 5 – Defining Our Model CS 1.6 Step 6 – Running Our Model CS 1.7 Step 7 – Automatically Finding an Optimised Architecture Using Bayesian Optimisation

15 Case Study 2: Time Series Forecasting with LSTMs CS 2.1 Simple LSTM CS 2.2 Sequence‐to‐Sequence LSTM

16 Case Study 3: Deep Embeddings for Auto‐Encoder‐Based Featurisation

17 Index

18 End User License Agreement

List of Tables

1 Chapter 3 Table 3.1 A rule of thumb guide for understanding AUC‐ROC scores.

List of Illustrations

1 Chapter 3 Figure 3.1 Examples of ROC curves. Figure 3.2 Optimal strategy without knowing the distribution. Figure 3.3 Optimal strategy when you know 50% of galaxies are elliptical and... Figure 3.4 A graphical look at the bias–variance trade‐off. Figure 3.5 A flow chart for dealing with high bias or high‐variance situatio... Figure 3.6 Graphical representation of the holdout‐validation algorithm. Figure 3.7 The effects of different scales on a simple loss function topolog...

2 Chapter 4 Figure 4.1 An overview of a single perceptron learning. Figure 4.2 The logistic function. Figure 4.3 Derivatives of the logistic function. Figure 4.4 How learning rate can affect the training, and therefore performa... Figure 4.5 A schematic of a multilayer perceptron. Figure 4.6 Plot of ReLU activation function. Figure 4.7 Plot of leaky ReLU activation function. Figure 4.8 Plot of ELU activation function. Figure 4.9 Bias allows you to shift the activation function along the X‐axis... Figure 4.10 Training vs. validation error. Figure 4.11 Validation error from training model on the Glass dataset.

3 Chapter 5 Figure 5.1 A schematic of a RNN cell. X and Y are inputs and outputs, respec... Figure 5.2 Connections in a feedforward layer in an MLP (a) destroy the sequ... Figure 5.3 An example of how sequential information is stored in a recurrent... Figure 5.4 A schematic of information flow through an LSTM cell. As througho... Figure 5.5 An LSTM cell with the flow through the forget gate highlighted. Figure 5.6 An LSTM cell with the flow through the input gate highlighted. Figure 5.7 An LSTM cell with the flow through the output gate highlighted. Figure 5.8 An LSTM cell with peephole connections highlighted. Figure 5.9 A schematic of information flow through a GRU cell. Here, X refer...

4 Chapter 6 Figure 6.1 Illustration of convolutional neural network architecture. Figure 6.2 Illustration of average and max pooling algorithms. Figure 6.3 Illustration of average and max pooling on face image. Figure 6.4 Illustration of average and max pooling on handwritten character ... Figure 6.5 Illustration of the effect of stride on change in data volume. Figure 6.6 Illustration of stride. Figure 6.7 Illustration of the impact of sparse connectivity on CNN unit's r... Figure 6.8 Illustration of graph convolutional network. Figure 6.9 Example graph. Figure 6.10 Example adjacency matrix.

5 Chapter 7 Figure 7.1 A schematic of a shallow auto‐encoder. Figure 7.2 Representing a neural network as a stack of RBMs for pretraining... Figure 7.3 Training an auto‐encoder from stacked RBMs. (1) Train a stack of ... Figure 7.4 Comparison of standard auto‐encoder and variational auto‐encoder... Figure 7.5 Illustration of sequence to sequence model.

6 Chapter 8 Figure 8.1 Schematic for greedy search. Figure 8.2 Bayes rule.

Guide

1 Cover Page

2 Title Page

3 Copyright Page

4 About the Authors

5 Acknowledgements

6 Table of Contents

7 Begin Reading

8 Index

9 Wiley End User License Agreement


