**Linear Algebra for Data Science** and machine learning is essential, because the concepts of linear algebra underpin how many algorithms work. In this post, we discuss the basic concepts of linear algebra.

## Why Linear Algebra?

Large datasets often contain hundreds of thousands, or even millions, of individual data objects.

It is simpler to store this information and operate on it when it is represented as matrices and vectors.

**Linear Algebra** is the branch of mathematics that deals with vectors and operations on vectors. It is essential for artificial intelligence and data-processing algorithms.

Data Scientists must have a basic knowledge of this mathematics to solve complex data problems efficiently.

Linear Algebra is mostly concerned with operations on vectors and matrices, so let’s start learning about matrices and vectors.

## What is a Matrix?

A matrix is a collection of information stored in an arranged manner. Mathematically, it refers to a set of numbers, variables, or functions arranged in rows and columns.

Matrices are generally denoted by capital letters such as A, B, C, etc. For example,

\(\displaystyle A=\left[ {\begin{array}{*{20}{c}} 3 & 4 & 5 \\ 6 & 7 & 8 \\ 9 & 2 & 3 \end{array}} \right] \)

The order of a matrix is defined by the number of rows and columns in a matrix.

**Order of a matrix = number of rows × number of columns**

\(\displaystyle A=\left[ {\begin{array}{*{20}{c}} 1 & 2 & 3 \\ 2 & 5 & 8 \\ 8 & 2 & 3 \end{array}} \right]\)

In the above example, the number of rows is 3 and the number of columns is also 3; therefore, the order of matrix A is 3 × 3.
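As a small illustration (this sketch uses NumPy, which is not part of the original post), the matrix A above can be created as an array and its order read off from its shape:

```python
import numpy as np

# The 3 x 3 matrix A from the example above
A = np.array([[1, 2, 3],
              [2, 5, 8],
              [8, 2, 3]])

# The order (number of rows x number of columns) is given by .shape
rows, cols = A.shape
print(rows, cols)  # 3 3
```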

The common types of matrices are:

- Row Matrix
- Column Matrix
- Unit or Identity Matrix
- Null or Zero matrix
- Square Matrix
- Rectangular Matrix
- Diagonal Matrix
- Scalar Matrix
- Negative of a Matrix
- Transpose of a Matrix
- Symmetric Matrix
- Skew-Symmetric Matrix
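Several of these types can be constructed directly in code. A minimal sketch using NumPy (the variable names here are illustrative, not from the post):

```python
import numpy as np

identity = np.eye(3)            # unit / identity matrix
zero = np.zeros((2, 3))         # null / zero matrix
diagonal = np.diag([1, 2, 3])   # diagonal matrix
scalar_m = 5 * np.eye(3)        # scalar matrix: diagonal with equal entries

A = np.array([[1, 2],
              [3, 4]])
negative = -A                   # negative of a matrix
transpose = A.T                 # transpose of a matrix

S = np.array([[1, 7],
              [7, 2]])
print(np.array_equal(S, S.T))   # True -> S is symmetric
```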

The basic operations on matrices are:

- Addition of Matrices
- Subtraction of Matrices
- Product of Matrices
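These three operations can be sketched with NumPy (assuming two 2 × 2 matrices for illustration; addition and subtraction are element-wise, while the product combines rows of the first matrix with columns of the second):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

addition = A + B        # element-wise addition
subtraction = A - B     # element-wise subtraction
product = A @ B         # matrix product (rows of A times columns of B)

print(product)
# [[19 22]
#  [43 50]]
```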

## What is a Vector?

A vector can be considered an ordered array of numbers with two independent properties, magnitude and direction, where the order of the numbers matters.

A vector is the simplest linear algebraic object. Vectors are usually represented by a lowercase bold letter like x, y. For example,

\(\displaystyle x=\left[ {\begin{array}{*{20}{c}} {{{x}_{1}}} \\ {{{x}_{2}}} \\ . \\ . \\ . \\ . \\ . \\ . \\ . \\ {{{x}_{n}}} \end{array}} \right]\)

Usually, there are two types of vectors, row vectors and column vectors, examples of which are given below:

### Row Vector

\(\displaystyle \left[ {\begin{array}{*{20}{c}} 1 & 2 & 3 \end{array}} \right] \)

The length of this vector is 3.

### Column Vector

\(\displaystyle \left[ {\begin{array}{*{20}{c}} 1 \\ 2 \\ 3 \\ 4 \end{array}} \right] \)

The length of this vector is 4.

The addition of two vectors can be performed by adding the corresponding elements of each vector.

The dot product (or scalar product) of two vectors is the sum of the products of their corresponding components.
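Both operations can be illustrated with a short NumPy sketch (the vectors here are made up for the example):

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])

# Addition: add the corresponding elements of each vector
print(a + b)         # [5 7 9]

# Dot product: sum of the products of corresponding components
print(np.dot(a, b))  # 1*4 + 2*5 + 3*6 = 32
```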

Furthermore, two vectors a and b are called orthogonal if their dot product is zero. If both orthogonal vectors also have unit norm, they are known as orthonormal vectors.
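A quick check of both conditions, sketched in NumPy with the standard basis vectors of the plane as an example:

```python
import numpy as np

a = np.array([1, 0])
b = np.array([0, 1])

# Orthogonal: the dot product is zero
print(np.dot(a, b) == 0)                      # True

# Orthonormal: orthogonal, and each vector has unit norm
print(np.linalg.norm(a), np.linalg.norm(b))   # 1.0 1.0
```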

A set of vectors \(\displaystyle ({{v}_{1}},{{v}_{2}},\ldots ,{{v}_{n}})\) is linearly independent if no vector in the set can be expressed as a linear combination of the others.
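One practical way to test this (an assumption of this sketch, not stated in the post) is to stack the vectors as columns of a matrix and compare its rank with the number of vectors; if they match, the set is linearly independent:

```python
import numpy as np

# Three candidate vectors stacked as the columns of M
M = np.array([[1, 4, 1],
              [2, 5, 3],
              [3, 6, 2]])

# Rank equal to the number of columns -> linearly independent set
rank = np.linalg.matrix_rank(M)
print(rank == M.shape[1])  # True
```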

Simply put, a matrix is a collection of vectors. For example,

\(\displaystyle \left[ {\begin{array}{*{20}{c}} 1 & 4 & 1 \\ 2 & 5 & 3 \\ 3 & 6 & 2 \end{array}} \right]\)

In the above example, there are three column vectors: the first contains 1, 2, 3; the second contains 4, 5, 6; and the third contains 1, 3, 2.

Therefore, a matrix has two dimensions (m × n), a vector has a single dimension (m × 1), and a scalar has no dimension, as shown below:

\(\displaystyle Matrix\,\,{{A}_{{m\times n}}}=\left[ {\begin{array}{*{20}{c}} {{{a}_{{11}}}} & {{{a}_{{12}}}} & {{{a}_{{13}}}} & {…..} & {{{a}_{{1n}}}} \\ {{{a}_{{21}}}} & {{{a}_{{22}}}} & {{{a}_{{23}}}} & {…..} & {{{a}_{{2n}}}} \\ {{{a}_{{31}}}} & {{{a}_{{32}}}} & {{{a}_{{33}}}} & {…..} & {{{a}_{{3n}}}} \\ . & . & . & . & . \\ . & . & . & . & . \\ . & . & . & . & . \\ . & . & . & . & . \\ . & . & . & . & . \\ . & . & . & . & . \\ {{{a}_{{m1}}}} & {{{a}_{{m2}}}} & {{{a}_{{m3}}}} & {…..} & {{{a}_{{mn}}}} \end{array}} \right]\)

\(\displaystyle Vector\,\,\,\,\,x=\left[ {\begin{array}{*{20}{c}} {{{x}_{1}}} \\ {{{x}_{2}}} \\ . \\ . \\ . \\ . \\ . \\ . \\ . \\ {{{x}_{m}}} \end{array}} \right]\)

\(\displaystyle Scalar\,\,=\,\,\,\left[ x \right]\) A scalar has neither dimension nor direction.
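This distinction between matrix, vector, and scalar maps directly onto array dimensions in code. A brief NumPy sketch (the example values are arbitrary):

```python
import numpy as np

matrix = np.array([[1, 2, 3],
                   [4, 5, 6]])   # two dimensions: m x n
vector = np.array([1, 2, 3])     # a single dimension: m elements
scalar = np.array(7)             # zero-dimensional

print(matrix.ndim, vector.ndim, scalar.ndim)  # 2 1 0
```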

Now, let’s explore the connection between linear algebra and geometry.

A two-dimensional space can be defined by two lines, and two lines correspond to two vectors.

As we have already learned, a matrix is a collection of vectors in linear algebra, so any two-dimensional space can easily be represented by a matrix.

The following two vectors can be graphically represented in two dimensions:

\(\displaystyle \,\left[ {\begin{array}{*{20}{c}} 2 \\ 4 \end{array}} \right]\)

\(\displaystyle \left[ {\begin{array}{*{20}{c}} {-2} \\ {-4} \end{array}} \right]\)

You can see that the direction always runs from the origin of the graph to the end point, and the vector (−2, −4) points in exactly the opposite direction to the first. Now, if we take the two vectors

\(\displaystyle \left[ {\begin{array}{*{20}{c}} 1 \\ 0 \end{array}} \right]\) and

\(\displaystyle \,\left[ {\begin{array}{*{20}{c}} 0 \\ 1 \end{array}} \right]\)

then, by taking these two vectors together, we can form the matrix \(\displaystyle \,\left[ {\begin{array}{*{20}{c}} 1 & 0 \\ 0 & 1 \end{array}} \right]\), whose columns are unit segments along the x and y axes.
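A minimal sketch of this construction in NumPy: stacking the two unit vectors as columns produces the 2 × 2 identity matrix.

```python
import numpy as np

x_axis = np.array([1, 0])
y_axis = np.array([0, 1])

# Stack the two basis vectors as columns of a single matrix
I = np.column_stack([x_axis, y_axis])
print(np.array_equal(I, np.eye(2)))  # True
```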

## Linear Algebra Applications in Data Science

- Vectorized Code (also known as Array Programming)
- Image Recognition
- Deep Learning
- CNNs (Convolutional Neural Networks)
- Dimensionality Reduction
- Eigenvalues and Eigenvectors

**Next Post: Scalars, Vector and Matrices in Python**

**Read also: Statistics for Data Science**
