Deep learning architectures : a mathematical approach / Ovidiu Calin.

By: Calin, Ovidiu [author]
Material type: Text
Publisher: Cham : Springer, 2020
Description: 768 pages
Content type: text
Media type: unmediated
Carrier type: volume
ISBN: 9783030367206
Subject(s): MACHINE LEARNING (Artificial Intelligence)
DDC classification: 006.3
Contents:
Intro -- Foreword -- Overview -- Part I -- Part II -- Part III -- Part IV -- Part V -- Bibliographical Remarks -- Chapters Diagram -- Notations and Symbols -- Calculus -- Linear Algebra -- Probability Theory -- Measure Theory -- Information Theory -- Differential Geometry -- Neural Networks -- Contents
Part I Introduction to Neural Networks
1 Introductory Problems -- 1.1 Water in a Sink -- 1.2 An Electronic Circuit -- 1.3 The Eight Rooks Problem -- 1.4 Biological Neuron -- 1.5 Linear Regression -- 1.6 The Cocktail Factory Network -- 1.7 An Electronic Network -- 1.8 Summary -- 1.9 Exercises
2 Activation Functions -- 2.1 Examples of Activation Functions -- 2.2 Sigmoidal Functions -- 2.3 Squashing Functions -- 2.4 Summary -- 2.5 Exercises
3 Cost Functions -- 3.1 Input, Output, and Target -- 3.2 The Supremum Error Function -- 3.3 The L2-Error Function -- 3.4 Mean Square Error Function -- 3.5 Cross-entropy -- 3.6 Kullback-Leibler Divergence -- 3.7 Jensen-Shannon Divergence -- 3.8 Maximum Mean Discrepancy -- 3.9 Other Cost Functions -- 3.10 Sample Estimation of Cost Functions -- 3.11 Cost Functions and Regularization -- 3.12 Training and Test Errors -- 3.13 Geometric Significance -- 3.14 Summary -- 3.15 Exercises
4 Finding Minima Algorithms -- 4.1 General Properties of Minima -- 4.1.1 Functions of a real variable -- 4.1.2 Functions of several real variables -- 4.2 Gradient Descent Algorithm -- 4.2.1 Level sets -- 4.2.2 Directional derivative -- 4.2.3 Method of Steepest Descent -- 4.2.4 Line Search Method -- 4.3 Kinematic Interpretation -- 4.4 Momentum Method -- 4.4.1 Kinematic Interpretation -- 4.4.2 Convergence conditions -- 4.5 AdaGrad -- 4.6 RMSProp -- 4.7 Adam -- 4.8 AdaMax -- 4.9 Simulated Annealing Method -- 4.9.1 Kinematic Approach for SA -- 4.9.2 Thermodynamic Interpretation for SA -- 4.10 Increasing Resolution Method -- 4.11 Hessian Method -- 4.12 Newton's Method -- 4.13 Stochastic Search -- 4.13.1 Deterministic variant -- 4.13.2 Stochastic variant -- 4.14 Neighborhood Search -- 4.14.1 Left and Right Search -- 4.14.2 Circular Search -- 4.14.3 Stochastic Spherical Search -- 4.14.4 From Local to Global -- 4.15 Continuous Learning -- 4.16 Summary -- 4.17 Exercises
5 Abstract Neurons -- 5.1 Definition and Properties -- 5.2 Perceptron Model -- 5.3 The Sigmoid Neuron -- 5.4 Logistic Regression -- 5.4.1 Default probability of a company -- 5.4.2 Binary Classifier -- 5.4.3 Learning with the square difference cost function -- 5.5 Linear Neuron -- 5.6 Adaline -- 5.7 Madaline -- 5.8 Continuum Input Neuron -- 5.9 Summary -- 5.10 Exercises
6 Neural Networks -- 6.1 An Example of Neural Network -- 6.1.1 Total variation and regularization -- 6.1.2 Backpropagation -- 6.2 General Neural Networks -- 6.2.1 Forward pass through the network -- 6.2.2 Going backwards through the network -- 6.2.3 Backpropagation of deltas -- 6.2.4 Concluding relations -- 6.2.5 Matrix form
Tags from this library: No tags from this library for this title.
Rating
    Average rating: 0.0 (0 votes)
Holdings
Item type: Libro General
Current library: Biblioteca Campus San Joaquín
Collection: Colección General
Call number: 006.3 O96 2020
Copy number: 1
Status: Available
Notes: Donated by Mr. Sebastián Alvarado
Barcode: 3560900271663
