Are you stuck on how to use TensorFlow and want to know what a graph and a session are? Well, here we go, from scratch!
Machine learning and neural networks are complicated fields, but putting machine learning models into practice is much easier now than it used to be. This is thanks to machine learning and neural network frameworks such as TensorFlow, sklearn, and Keras, which make it simpler to gather data, train models, deliver predictions, and improve future outcomes. In this article, we are going to discuss graphs and sessions in TensorFlow. We will understand how TensorFlow works and go through its two most important components: the graph and the session.
TensorFlow is a complete open-source framework for building machine learning applications. It is a symbolic math toolkit that carries out operations for deep neural network training and inference using dataflow and differentiable programming. It enables programmers to build machine learning applications utilizing a range of tools, libraries, and community resources. Google's TensorFlow is currently one of the most well-known deep learning libraries in the world, and Google's own products incorporate machine learning to enhance the search engine, translation, image captioning, and recommendations.
Before going into the workings of TensorFlow, make sure that you have installed TensorFlow on your system, as this article also contains some practical examples. You can install TensorFlow with a simple pip command, `pip install tensorflow`. Once the installation is complete, import the TensorFlow library and run the file.
```python
# importing tensorflow
import tensorflow as tf
```
If the file runs without errors, then congratulations! You have successfully installed TensorFlow on your system.
What is Tensor in TensorFlow?
TensorFlow accepts inputs as a multi-dimensional array called Tensor, allowing you to create dataflow graphs and structures to specify how data goes through a graph. It enables you to create a flowchart of the operations that can be carried out on these inputs, with the output appearing at the other end.
A tensor differs from the usual data structures in that it generalizes scalars (rank 0), vectors (rank 1), and matrices (rank 2) to an arbitrary number of dimensions.
All data in TensorFlow is represented as tensors. In other words, a tensor is a multidimensional collection of data. There are four main attributes of a tensor: type, rank, shape, and label.
- Rank is the number of dimensions of the tensor. For example, a single column of a dataset will have a rank of 1, while a table will have a rank of 2, and so on.
- Shape refers to the number of elements along each dimension of the tensor. For example, a table with 2 rows and 3 columns has shape (2, 3).
- A tensor has a single data type (such as `float32` or `int32`), which is fixed when the tensor is created.
- A label refers to the name of the tensor. Each tensor in TensorFlow is an object, and its name identifies it within the graph.
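Rank and shape are easiest to see with a concrete value. Here is a minimal pure-Python sketch (no TensorFlow required; the helper names are our own, not TensorFlow APIs) of what rank and shape mean for a nested list:

```python
# Toy helpers (our own names, not TensorFlow APIs) that mimic
# how TensorFlow reports rank and shape for a tensor.

def shape(value):
    """Return the shape of a (possibly nested) list as a tuple."""
    if not isinstance(value, list):
        return ()                      # a scalar has rank 0, shape ()
    return (len(value),) + shape(value[0])

def rank(value):
    """Rank is just the number of dimensions, i.e. len(shape)."""
    return len(shape(value))

column = [1, 2, 3]                     # a single column -> rank 1
table = [[1, 2, 3], [4, 5, 6]]         # a table -> rank 2

print(rank(column), shape(column))     # 1 (3,)
print(rank(table), shape(table))       # 2 (2, 3)
```

This matches the bullet points above: the column has rank 1, the table has rank 2, and the shape lists the size of each dimension rather than a single total.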
Features of TensorFlow
The following are some of the main features of TensorFlow which make it a unique and popular machine-learning library.
- It is a free and open-source library that has a strong and supportive team.
- It can run on various platforms, including Android, iOS, macOS, and the cloud, and on different hardware such as CPUs and GPUs.
- It has its own purpose-built hardware for training neural models, called Cloud TPUs.
- One of the powerful features of TensorFlow is its fast debugging workflow. It provides TensorBoard, which visualizes the computational graph and helps to debug the code easily.
- As we already discussed, TensorFlow works with tensors, which are multidimensional arrays.
- Using TensorFlow, we can reduce the total lines of code.
- It supports the Keras API.
What is a graph and session in TensorFlow?
The best thing about TensorFlow is that it represents computations without actually performing them until explicitly asked to. It might seem confusing how it can represent a computation without performing it. Well, this is where the graph and the session in TensorFlow come in: they help to represent and execute the computations.
- Every computation in TensorFlow is represented by a Graph.
- To execute the computations which are stored in the form of a graph, we need to initialize the session.
So, basically, TensorFlow needs graphs and sessions to store and execute the computations. There are many advantages of having Graphs and Sessions in TensorFlow.
One of the biggest advantages is that when we create a graph, we are not bound to run the whole computation. In fact, using a session, TensorFlow can run only the part of the graph that is needed at the time, which provides huge flexibility when working with a model.
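This build-first, run-later idea can be sketched without TensorFlow at all. The toy `Node` and `Session` classes below are our own invention, purely to illustrate how a graph can describe an addition without performing it until a session runs it:

```python
# Toy classes (our own, not TensorFlow APIs) illustrating deferred execution:
# building the graph only records the operation; Session.run computes it.

class Node:
    def __init__(self, op, inputs=(), value=None):
        self.op = op          # 'const' or 'add'
        self.inputs = inputs  # upstream nodes
        self.value = value    # only used by constants

def add(x, y):
    # Building the node does NOT add the numbers yet.
    return Node('add', (x, y))

class Session:
    def run(self, node):
        # Computation happens only here, on the nodes actually needed.
        if node.op == 'const':
            return node.value
        if node.op == 'add':
            a, b = node.inputs
            return self.run(a) + self.run(b)

x = Node('const', value=10)
y = Node('const', value=15)
summation = add(x, y)       # just a graph node, not 25

sess = Session()
print(sess.run(summation))  # 25 -- evaluated only when asked
```

Note how `summation` on its own carries no result; only `sess.run(summation)` walks the graph and produces 25, and a session could just as well run only one branch of a larger graph.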
Let us now jump into these processes and understand them in more detail.
Explanation of Graph in TensorFlow
As we discussed in the above section, the graph in TensorFlow is nothing more than a representation of computation without actually performing it. In TensorFlow, anything that we do with the model is represented in a computational graph. This makes it easier to debug and gives you more control over the structure of the model.
A graph in TensorFlow is simply an arrangement of nodes that represent the operations in our model. Let us take a simple example of addition to understand how a computation is represented as a graph in TensorFlow.
Let us assume that we have two variables, x = 10 and y = 15, and we are adding them using TensorFlow. When we add the two numbers with the `tf.add()` method, instead of getting the sum straight away, we get a node in a graph.
The graph records that there are two variables named x and y that need to be added once we ask for the result. Let us now use TensorFlow to add the two variables and see what we get.
```python
# defining the variables
x = 10
y = 15

# addition using tensorflow
Summation = tf.add(x, y, name='Add')

# printing the sum
print(Summation)
# In graph mode (TensorFlow 1.x), this prints the tensor description,
# e.g. Tensor("Add:0", shape=(), dtype=int32), rather than the value 25.
```
As you can see, instead of just the sum of the two numbers, we get information about the operation. In the code above, `tf.add()` takes the two operands and an optional `name` argument for the operation; here we named it "Add" to indicate that we want to add the two variables.
In the output, we get the type and shape of the output tensor rather than its value, because the graph has not been executed yet. We can use TensorBoard to visualize these computational graphs.
Explanation of session in TensorFlow
As we discussed earlier, computations in TensorFlow are not executed unless they are explicitly asked for. The session is the mechanism that tells the TensorFlow runtime to execute the computations when called.
For example, when we added two numbers above, we did not get the result of the addition; instead, we got information about the graph. In order to get the result of the computation, we need to initialize a session.
In TensorFlow, a Session object represents a connection to a TensorFlow runtime, which is responsible for executing TensorFlow operations. The Session object provides methods to drive the computation by feeding tensors and running the operations.
Here is an example of using a Session in TensorFlow:
```python
import tensorflow as tf

# Note: tf.Session is a TensorFlow 1.x API; in TensorFlow 2.x it is
# available as tf.compat.v1.Session (with eager execution disabled).

# Build a computational graph
a = tf.constant(3.0)
b = tf.constant(4.0)
c = tf.add(a, b)

# Launch the graph in a session
sess = tf.Session()

# Evaluate the tensor c
print(sess.run(c))  # Output: 7.0

# Close the session to release resources
sess.close()
```
In the example above, we first define two constant tensors a and b, and add them together to obtain a new tensor c. Then we create a Session object and use its run method to execute the computation and evaluate the value of c. Finally, we close the Session to release the resources it uses.
It is also possible to use a `with` block to manage the session automatically:
```python
with tf.Session() as sess:
    print(sess.run(c))  # Output: 7.0
```
In this case, the Session will be automatically closed at the end of the block.
In TensorFlow, a Graph represents a set of computations to be executed, in the form of a directed acyclic graph (DAG). It consists of a set of nodes, which represent operations, and edges, which represent the input/output relationships between the nodes. The graph defines the flow of data through the model and specifies what computations should be performed.
A Session is an environment for running a graph. It provides methods for executing nodes in the graph and for evaluating tensors. When you build a model in TensorFlow, you first define the computations in the graph and then create a Session to run the graph.
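This split between defining a graph and running it can be sketched in plain Python. The toy classes below are our own invention (not TensorFlow APIs), modeling a minimal linear model y = w * x as a graph with an input node, a weight, and an output node, evaluated by a toy session:

```python
# Toy sketch (our own classes, not TensorFlow APIs) of a linear-model
# graph: one input node, one weight parameter, one output node.

class Node:
    def __init__(self, op, inputs=(), value=None, name=None):
        self.op, self.inputs, self.value, self.name = op, inputs, value, name

class Session:
    def run(self, node, feed=None):
        feed = feed or {}
        if node.op == 'placeholder':
            return feed[node.name]        # input supplied at run time
        if node.op == 'const':
            return node.value
        if node.op == 'mul':
            a, b = node.inputs
            return self.run(a, feed) * self.run(b, feed)

x = Node('placeholder', name='x')         # input node
w = Node('const', value=0.5)              # weight parameter
y = Node('mul', (w, x))                   # output node: y = w * x

sess = Session()
print(sess.run(y, feed={'x': 4.0}))       # 2.0
```

The graph fixes the structure (y is w times x), while the session supplies a concrete input and drives the evaluation, which is exactly the division of labor described above.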
For example, you might define a graph that represents a simple linear model, with a single input node, a single output node, and a single weight parameter. You would then create a Session to run the graph and evaluate the output node given a particular input value and weight.

In this article, we discussed and learned about graphs and sessions in TensorFlow using various code examples.