Machine Learning from the Perspective of Linear Algebra

Linear algebra world view

  1. Every 'object' can be represented by a vector, even if it has uncountably many features.

  2. In practice, once we decide on a finite set of features, every object can be represented by a vector in the same space.

  3. Thus, once we decide on a set of features, every object becomes comparable to every other object via a simple dot product.

  4. A property of an object is derived by multiplying the vector representing the object by a matrix corresponding to that property.

  5. A set of features defines a characterization of the object.

Concrete examples:
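A minimal sketch of the world view above, using invented toy data: documents as bag-of-words vectors over a fixed vocabulary, compared by dot product, with a "property" extracted by a matrix. All names and numbers here are illustrative assumptions, not part of the original notes.

```python
vocab = ["cat", "dog", "fish"]   # the chosen set of features

doc_a = [2, 1, 0]   # "cat cat dog"
doc_b = [1, 0, 3]   # "cat fish fish fish"

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# Point 3: any two objects are comparable by a simple dot product.
similarity = dot(doc_a, doc_b)   # 2*1 + 1*0 + 0*3 = 2

# Point 4: a 'property' as a matrix. Here a 1x3 matrix (row vector)
# extracts an invented weighted 'pet-ness' score from an object vector.
petness = [[1.0, 1.0, 0.5]]

def matvec(M, v):
    return [dot(row, v) for row in M]

print(similarity)              # 2
print(matvec(petness, doc_a))  # [3.0]
```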

Learning

Given an unknown function y = f(x) and a 'large enough' sample of pairs (x, y),

it is possible to find g such that y = g(x) holds for 'almost all' x.
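The statement above can be sketched in code. This is a hedged, minimal illustration: the hidden f is assumed linear, and g is recovered by ordinary least squares in closed form; the sample values are invented for the example.

```python
def f(x):                 # the unknown target function (assumed linear here)
    return 3.0 * x + 1.0

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [f(x) for x in xs]   # a 'large enough' sample of (x, y) pairs

# Closed-form least-squares fit for one feature: slope and intercept.
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

def g(x):                 # the learned function
    return slope * x + intercept

# For 'almost all' x in the sampled range, y = g(x) holds.
print(g(2.5))   # ≈ 8.5, matching f(2.5)
```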

Concrete examples:

Introducing non-linearity
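One common way to introduce non-linearity while staying inside linear algebra, sketched with hand-picked illustrative weights: lift the input x into non-linear features phi(x), so a non-linear function of x becomes a plain dot product in the lifted space.

```python
def phi(x):
    # Lift x into the feature vector (1, x, x^2).
    return [1.0, x, x * x]

w = [0.0, 0.0, 1.0]   # weights chosen by hand for illustration

def g(x):
    # Linear in phi(x), non-linear in x: here g(x) = x^2.
    return sum(wi * fi for wi, fi in zip(w, phi(x)))

print(g(3.0))   # 9.0
```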

Audio signals
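A hedged sketch of audio through this lens: a short signal is a vector of samples, and the discrete Fourier transform is just a matrix multiplication, so frequency analysis is linear algebra. The 8-sample pure tone is an invented example.

```python
import cmath
import math

def dft_matrix(n):
    # The n x n DFT matrix: entry (j, k) is exp(-2*pi*i*j*k/n).
    w = cmath.exp(-2j * cmath.pi / n)
    return [[w ** (j * k) for k in range(n)] for j in range(n)]

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

# 8 samples of a pure tone completing exactly one cycle.
signal = [math.cos(2 * math.pi * k / 8) for k in range(8)]

spectrum = matvec(dft_matrix(8), signal)
# Energy concentrates in frequency bins 1 and 7 (the tone and its mirror).
print([round(abs(c), 3) for c in spectrum])
```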

Kernel trick
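A minimal sketch of the kernel trick with invented 2-D inputs: the polynomial kernel k(u, v) = (u · v)^2 equals the dot product of explicit degree-2 feature maps, so we get the lifted-space comparison without ever computing the lifted features.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def k(u, v):
    # Kernel: computed entirely in the original input space.
    return dot(u, v) ** 2

def phi(u):
    # The explicit degree-2 feature map for 2-D input, for comparison.
    x, y = u
    return [x * x, y * y, 2 ** 0.5 * x * y]

u, v = [1.0, 2.0], [3.0, 4.0]
print(k(u, v))              # both print ≈ 121
print(dot(phi(u), phi(v)))  # same value, but via the costlier lifted space
```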

Neural networks
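From this perspective, a neural network is alternating matrix multiplications and element-wise non-linearities. A hedged two-layer sketch with arbitrary illustrative weights:

```python
def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def relu(v):
    # Element-wise non-linearity between the linear layers.
    return [max(0.0, x) for x in v]

W1 = [[1.0, -1.0], [0.5, 0.5]]   # layer 1: maps 2 inputs to 2 hidden units
W2 = [[1.0, 1.0]]                # layer 2: maps 2 hidden units to 1 output

def net(x):
    return matvec(W2, relu(matvec(W1, x)))

print(net([2.0, 1.0]))   # [2.5]
```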