Linear Algebra in TensorFlow (Scalars, Vectors & Matrices)

TensorFlow is open-source dataflow software, released under the Apache license, that is widely used for machine learning applications such as deep neural networks. It powers services like Google Search, as well as image captioning, recommendation, and translation systems.

For example, when a user types a keyword into Google's search bar, it provides suggestions that may be helpful to the user or researcher. TensorFlow was developed by the Google Brain team to improve services such as Gmail and Google Search; its first stable version appeared in 2017.

Its architecture works in three parts: data preprocessing, model building, and model training and estimation.

TensorFlow takes input as multi-dimensional arrays, and its library provides various APIs for building deep learning architectures at scale, such as CNNs or RNNs.

TensorFlow runs on both CPU and GPU. It is based on graph computation, which lets the developer visualize the construction of a neural network with TensorBoard.

These are some of the algorithms supported by TensorFlow:

  • Classification: tf.estimator.LinearClassifier
  • Deep learning classification: tf.estimator.DNNClassifier
  • Deep learning wide and deep: tf.estimator.DNNLinearCombinedClassifier
  • Boosted tree classification: tf.estimator.BoostedTreesClassifier
  • Linear regression: tf.estimator.LinearRegressor
  • Boosted tree regression: tf.estimator.BoostedTreesRegressor

You can find more details in the TensorFlow documentation.

Before starting a practical example of TensorFlow, it is essential to recall the concepts of scalar, vector, and matrix.

A scalar is always 1 x 1, so it has the lowest dimensionality. Each element of a vector is a scalar, and a vector has dimension (m x 1) or (1 x m). A matrix (m x n) is a collection of vectors, or equivalently a collection of scalars.

A few instances of scalar, vector, and matrix are given below.

Examples of 1 x 1 Scalar:

  • [2]
  • [4]

Examples of m x 1 Vector:

  • $latex \displaystyle \left[ {\begin{array}{*{20}{c}} 1 \\ 2 \\ 3 \\ 4 \end{array}} \right]$
  • $latex \displaystyle \left[ {\begin{array}{*{20}{c}} 4 \\ 6 \\ 2 \end{array}} \right]$

Examples of m x n Matrices:

  • $latex \displaystyle \left[ {\begin{array}{*{20}{c}} 1 & 4 & 1 \\ 2 & 5 & 3 \\ 3 & 6 & 2 \end{array}} \right]$
  • $latex \displaystyle \left[ {\begin{array}{*{20}{c}} 3 & 4 & 7 \\ 1 & 3 & 0 \\ 8 & 2 & 5 \end{array}} \right]$
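The shapes above can be checked in numpy (the library this article imports), using one instance of each kind as a sketch:

```python
import numpy as np

# 1 x 1 scalar, m x 1 vector, and m x n matrix from the examples above
s = np.array([[2]])
v = np.array([[1], [2], [3], [4]])
m = np.array([[1, 4, 1], [2, 5, 3], [3, 6, 2]])

print(s.shape)  # (1, 1)
print(v.shape)  # (4, 1)
print(m.shape)  # (3, 3)
```

The `.shape` attribute reports the dimensions, confirming that a scalar is 1 x 1, the vector 4 x 1, and the matrix 3 x 3.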

Let’s start a practical example of TensorFlow…

Practical Example of TensorFlow

Before creating a tensor, it is essential first to import the relevant library in a Jupyter Notebook, as shown below.

Import the relevant library:

import numpy as np

Creating a tensor and checking its shape: Now we are going to create a tensor and check its shape. A tensor can be stored in an array like this:


In this example, first we take two matrices, t1 and t2, and create an array with these two elements; the result is an array that contains the two matrices.
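The original code appears only as a screenshot, so the matrix entries below are illustrative; the structure, two 2 x 3 matrices collected into one array, follows the description above:

```python
import numpy as np

# Two 2 x 3 matrices (entries are illustrative; the originals are in a screenshot)
t1 = np.array([[1, 2, 3], [4, 5, 6]])
t2 = np.array([[7, 8, 9], [10, 11, 12]])

# Collect the two matrices in a single array: a rank-3 tensor
t = np.array([t1, t2])
print(t)
```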

Now we check its shape like this:
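Since the original shape check is a screenshot, here is a sketch of the same step with illustrative entries:

```python
import numpy as np

t1 = np.array([[1, 2, 3], [4, 5, 6]])
t2 = np.array([[7, 8, 9], [10, 11, 12]])
t = np.array([t1, t2])

# The first axis counts the matrices, the remaining two give each matrix's size
print(t.shape)  # (2, 2, 3)
```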


The above result depicts that this array contains two matrices, each of which is 2 by 3.

Manually creating a Tensor:

We can also create a tensor manually, but it is a bit trickier because several levels of nested brackets are involved.
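A sketch of the manual version, writing the same (illustrative) tensor out with nested brackets:

```python
import numpy as np

# Outermost brackets hold the two matrices, the next level holds the rows,
# and the innermost brackets hold the individual entries
t_manual = np.array([[[1, 2, 3], [4, 5, 6]],
                     [[7, 8, 9], [10, 11, 12]]])
print(t_manual.shape)  # (2, 2, 3)
```

Keeping the brackets balanced at each level is what makes this approach error-prone compared with stacking existing matrices.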

This completes our example of linear algebra in TensorFlow.


Read related article: Linear Algebra for Data Science
