Cover

Title Page
Artificial Intelligence and Quantum Computing for Advanced Wireless Networks
Savo G. Glisic, Worcester Polytechnic Institute, Massachusetts, USA
Beatriz Lorenzo, University of Massachusetts, Amherst, USA

Copyright Page
Preface
Part I: Artificial Intelligence

1 Introduction
1.1 Motivation
1.2 Book Structure
References

2 Machine Learning Algorithms
2.1 Fundamentals
2.2 ML Algorithm Analysis
References

3 Artificial Neural Networks
3.1 Multi‐layer Feedforward Neural Networks
3.2 FIR Architecture
3.3 Time Series Prediction
3.4 Recurrent Neural Networks
3.5 Cellular Neural Networks (CeNN)
3.6 Convolutional Neural Network (CoNN)
References

4 Explainable Neural Networks
4.1 Explainability Methods
4.2 Relevance Propagation in ANN
4.3 Rule Extraction from LSTM Networks
4.4 Accuracy and Interpretability
References

5 Graph Neural Networks
5.1 Concept of Graph Neural Network (GNN)
5.2 Categorization and Modeling of GNN
5.3 Complexity of NN
Appendix 5.A Notes on Graph Laplacian
Appendix 5.B Graph Fourier Transform
References

6 Learning Equilibria and Games
6.1 Learning in Games
6.2 Online Learning of Nash Equilibria in Congestion Games
6.3 Minority Games
6.4 Nash Q‐Learning
6.5 Routing Games
6.6 Routing with Edge Priorities
References

7 AI Algorithms in Networks
7.1 Review of AI‐Based Algorithms in Networks
7.2 ML for Caching in Small Cell Networks
7.3 Q‐Learning‐Based Joint Channel and Power Level Selection in Heterogeneous Cellular Networks
7.4 ML for Self‐Organizing Cellular Networks
7.5 RL‐Based Caching
7.6 Big Data Analytics in Wireless Networks
7.7 Graph Neural Networks
7.8 DRL for Multioperator Network Slicing
7.9 Deep Q‐Learning for Latency‐Limited Network Virtualization
7.10 Multi‐Armed Bandit Estimator (MBE)
7.11 Network Representation Learning
References
Part II: Quantum Computing

8 Fundamentals of Quantum Communications
8.1 Introduction
8.2 Quantum Gates and Quantum Computing
8.3 Quantum Fourier Transform (QFT)
References

9 Quantum Channel Information Theory
9.1 Communication Over a Channel
9.2 Quantum Information Theory
9.3 Channel Description
9.4 Channel Classical Capacities
9.5 Channel Quantum Capacity
9.6 Quantum Channel Examples
References

10 Quantum Error Correction
10.1 Stabilizer Codes
10.2 Surface Code
10.3 Fault‐Tolerant Gates
10.4 Theoretical Framework
10.A Binary Fields and Discrete Vector Spaces
10.B Some Noise Physics
References

11 Quantum Search Algorithms
11.1 Quantum Search Algorithms
11.2 Physics of Quantum Algorithms
References

12 Quantum Machine Learning
12.1 QML Algorithms
12.2 QNN Preliminaries
12.3 Quantum Classifiers with ML: Near‐Term Solutions
12.4 Gradients of Parameterized Quantum Gates
12.5 Classification with QNNs
12.6 Quantum Decision Tree Classifier
Appendix 12.7 Matrix Exponential
References

13 QC Optimization
13.1 Hybrid Quantum‐Classical Optimization Algorithms
13.2 Convex Optimization in Quantum Information Theory
13.3 Quantum Algorithms for Combinatorial Optimization Problems
13.4 QC for Linear Systems of Equations
13.5 Quantum Circuit
13.6 Quantum Algorithm for Systems of Nonlinear Differential Equations
References

14 Quantum Decision Theory
14.1 Potential Enablers for QC
14.2 Quantum Game Theory (QGT)
14.3 Quantum Decision Theory (QDT)
14.4 Predictions in QDT
References

15 Quantum Computing in Wireless Networks
15.1 Quantum Satellite Networks
15.2 QC Routing for Social Overlay Networks
15.3 QKD Networks
References

16 Quantum Network on Graph
16.1 Optimal Routing in Quantum Networks
16.2 Quantum Network on Symmetric Graph
16.3 QWs
16.4 Multidimensional QWs
References

17 Quantum Internet
17.1 System Model
17.2 Quantum Network Protocol Stack
References
Index
End User License Agreement
List of Tables

Chapter 3
Table 3.1 Multi‐layer network notation.
Table 3.2 Finite impulse response (FIR) multi‐layer network notation.
Table 3.3 Variables for the derivation of gradient with ϕ ↔ φ...
Chapter 4
Table 4.1 An example of the (discrete) membership functions for both antece...

Chapter 5
Table 5.1 [1] Learning algorithm.
Table 5.2 Time complexity of the most expensive instructions of the learnin...

Chapter 7
Table 7.1 Neural network architecture parameters.
Chapter 8
Table 8.1 Operation of a CU gate.
Table 8.2 Operation of a Toffoli gate.
Chapter 9
Table 9.1 Quantum depolarizing channels.
Table 9.2 Maximum number of computational steps that can be performed witho...

Chapter 10
Table 10.1 The syndrome table for the two‐qubit code.
Table 10.2 The syndrome table for all bit‐flip errors on the three‐qubit co...
Table 10.3 The syndrome table for the [[4, 2, 2]] code for all single‐qubit...
Table 10.4 The syndrome table for single‐qubit X‐ and Z‐errors on the nine‐...
Table 10.5 Stabilizers for the distance 3 planar qubit of Figures 10.9d and...

Chapter 17
Table 17.1 Expected number of entanglement swaps for ring and grid network ...
Table 17.2 Comparison of the number of qubits that have to be stored in a d...
List of Figures

Chapter 2
Figure 2.1 If X and Y are two jointly normally distributed random variables,...
Figure 2.2 The regression line for predicting Y* from X* is not the 45° line...
Figure 2.3 Decision tree.
Figure 2.4 Tree terminology.
Figure 2.5 Data classification.
Figure 2.6 Classification with outliers.
Figure 2.7 Classifiers with nonlinear transformations.
Figure 2.8 Illustration of the nearest neighbor (NN) classification algorith...
Figure 2.9 Illustration of the plane partitioning of a two‐dimensional datas...
Figure 2.10 Illustration of the nearest neighbor (NN) decision boundary.
Figure 2.11 Example of clustering.
Figure 2.12 k‐Means algorithm.
Figure 2.13 k = 3 means clustering on 2D dataset.
Figure 2.14 Concept of data projection.
Figure 2.15 Successive data projections.
Figure 2.16 Decision tree presenting response to direct mailing.
Figure 2.17 Predicting email spam.
Figure 2.18 Top‐down algorithmic framework for decision tree induction. The ...
Figure 2.19 Black circles represent the input data, xₙ; red squares repres...
Chapter 3
Figure 3.1 From biological to mathematical simplified model of a neuron.
Figure 3.2 Block diagram of feedforward network.
Figure 3.3 Schematic representation of supervised learning.
Figure 3.4 Illustration of backpropagation.
Figure 3.5 Finite impulse response (FIR) neuron and neural network.
Figure 3.6 Finite impulse response (FIR) network unfolding.
Figure 3.7 Temporal backpropagation.
Figure 3.8 Oversimplified finite impulse response (FIR) network.
Figure 3.9 Network prediction configuration.
Figure 3.10 Nonlinear AR/ARMA predictors.
Figure 3.11 Recurrent neural network.
Figure 3.12 Canonical form of a recurrent neural network for prediction.
Figure 3.13 Recurrent neural network (RNN) architectures: (a) activation fee...
Figure 3.14 General locally recurrent–globally feedforward (LRGF) architectu...
Figure 3.15 An example of Elman recurrent neural network (RNN).
Figure 3.16 An example of Jordan recurrent neural network (RNN).
Figure 3.17 A fully connected recurrent neural network (RNN; Williams–Zipser...
Figure 3.18 Nonlinear IIR filter structures. (a) A recurrent nonlinear neura...
Figure 3.19 A long short‐term memory (LSTM) memory cell.
Figure 3.20 A bidirectional recurrent neural network (BRNN). (for more detai...
Figure 3.21 (Top) Cellular neural networks (CeNN) architecture, (bottom) cir...
Figure 3.22 Memristor‐based cellular nonlinear/neural network (MCeNN).
Figure 3.23 Illustration of the convolution operation. If we overlap the con...
Figure 3.24 RGB image/three channels and three kernels. (for more details se...
Figure 3.25 Computing ∂z/∂X. (for more details see the color fig...
Figure 3.26 Illustration of pooling layer operation. (for more details see t...
Figure 3.27 Illustration of preprocessing in a cooperative neural network (C...