World Scientific

Linear Algebra and Optimization with Applications to Machine Learning

Volume 2 applies the linear algebra concepts presented in Volume 1 to optimization problems that frequently occur throughout machine learning. The book blends theory with practice, not only carefully discussing the mathematical underpinnings of each optimization technique but also applying these techniques to linear programming, support vector machines (SVM), principal component analysis (PCA), and ridge regression. Volume 2 begins by discussing preliminary concepts of optimization theory, such as metric spaces, derivatives, and the Lagrange multiplier technique for finding extrema of real-valued functions. The focus then shifts to the special case of optimizing a linear function over a region determined by affine constraints, namely linear programming. Highlights include careful derivations and applications of the simplex algorithm, the dual-simplex algorithm, and the primal-dual algorithm. The theoretical heart of the book is the mathematically rigorous presentation of various nonlinear optimization methods, including, but not limited to, gradient descent, the Karush-Kuhn-Tucker (KKT) conditions, Lagrangian duality, the alternating direction method of multipliers (ADMM), and the kernel method. These methods are carefully applied to hard margin SVM, soft margin SVM, kernel PCA, ridge regression, lasso regression, and elastic-net regression. Matlab programs implementing these methods are included.
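
As a taste of the gradient descent method mentioned above, the following minimal Matlab sketch (written for this description, not one of the book's listings) minimizes an arbitrarily chosen convex quadratic f(x) = (1/2) x'Ax - b'x. Since the gradient is Ax - b, the exact minimizer solves Ax = b, so the iterates can be checked against A\b.

    % Gradient descent with a fixed step size on the convex quadratic
    % f(x) = (1/2)*x'*A*x - b'*x, whose gradient is A*x - b.
    A   = [3 1; 1 2];    % arbitrary symmetric positive definite matrix
    b   = [1; 1];
    x   = zeros(2, 1);   % starting point
    eta = 0.1;           % fixed step size (learning rate)
    for k = 1:200
        g = A*x - b;     % gradient at the current iterate
        x = x - eta*g;   % descent step
    end
    disp(x)              % approaches the exact solution A\b = [0.2; 0.4]

Here the step size eta = 0.1 is small enough relative to the largest eigenvalue of A to guarantee convergence; the book develops such conditions rigorously.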

 

Sample Chapter(s): Preface and Introduction

Contents:

  • Preface
  • Introduction
  • Preliminaries for Optimization Theory:
    • Topology
    • Differential Calculus
    • Extrema of Real-Valued Functions
    • Newton's Method and Its Generalizations
    • Quadratic Optimization Problems
    • Schur Complements and Applications
  • Linear Optimization:
    • Convex Sets, Cones, H-Polyhedra
    • Linear Programs
    • The Simplex Algorithm
    • Linear Programming and Duality
  • Nonlinear Optimization:
    • Basics of Hilbert Spaces
    • General Results of Optimization Theory
    • Introduction to Nonlinear Optimization
    • Subgradients and Subdifferentials of Convex Functions ⊛
    • Dual Ascent Methods; ADMM
  • Applications to Machine Learning:
    • Positive Definite Kernels
    • Soft Margin Support Vector Machines
    • Ridge Regression, Lasso, Elastic Net
    • ν-SV Regression
  • Appendix A: Total Orthogonal Families in Hilbert Spaces
  • Appendix B: Matlab Programs
  • Bibliography
  • Index

 

Readership: Students in advanced undergraduate applied mathematics courses; master's-level courses in engineering optimization, machine learning, and applied mathematics; and doctoral seminars in machine learning and engineering optimization.