Computational Graphs in Deep Learning With Python
1. Computational Graphs – Objective
In this Deep Learning With Python tutorial, we will discuss computational graphs in deep learning and show you how to implement them in Python. Moreover, we will look at dynamic computational graphs and forward and backward propagation.
So, let’s begin Computation Graphs in Deep Learning With Python.
2. Deep Learning Computational Graphs
In fields like Cheminformatics and Natural Language Understanding, it is often useful to compute over data-flow graphs. Computational graphs form an integral part of Deep Learning. Not only do they help us simplify working with large datasets; they are also simple to understand. So in this tutorial, we will introduce them to you and then show you how to implement them in Python. For this, we will use the Dask library, available from PyPI.
3. What are Computational Graphs in Deep Learning?
A computational graph is a way to represent a mathematical function in the language of graph theory. Nodes are either input values or functions that combine them; as data flows through the graph, each edge carries an intermediate value from one node to the next.
So we said we have two kinds of nodes: input nodes and function nodes. Outbound edges from input nodes carry the input value, and those from function nodes carry the result of applying the function to the values on the inbound edges. The graph above represents the following expression:
Of the five nodes, the leftmost three are input nodes; the two on the right are function nodes. Now if we had to compute f(1,2,3), we would get the following computation graph-
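As a concrete illustration of evaluating such a graph node by node, here is a minimal sketch. The expression from the original figure is not shown, so the function below is an assumed example chosen only because it matches the structure described above (three input nodes, two function nodes):

```python
# Hypothetical example expression: f(x, y, z) = (x + y) * z
# Three input nodes (x, y, z) feed two function nodes (+ and *).

def f(x, y, z):
    s = x + y   # first function node: addition
    p = s * z   # second function node: multiplication
    return p

print(f(1, 2, 3))  # (1 + 2) * 3 = 9
```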
4. The Need for Computational Graphs
Well, this was a simple computational graph with 5 nodes and 5 edges. But even simple deep neural networks can have hundreds of thousands of nodes and edges, sometimes more than a million. In such a case, it would be practically impossible to write out a single function expression. This is where computational graphs come in handy.
Such graphs also help us describe backpropagation more precisely.
5. Computational Graphs Example – Composite Function
We take the function f(x) = e^(sin(x²)) as an example. Let's decompose it into elementary operations: a = x², b = sin(a), f = e^b.
We have the following computational graph for this-
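The decomposition above can be written directly as code, with one line per node of the graph:

```python
import math

def f(x):
    a = x * x        # node 1: a = x^2
    b = math.sin(a)  # node 2: b = sin(a)
    c = math.exp(b)  # node 3: f = e^b
    return c

print(f(2.0))  # e^(sin(4))
```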
6. Visualizing a Computation Graph in Python
Now let’s use the Dask library to produce such a graph.
from dask import delayed

@delayed
def square(num):
    print("Square function:", num)
    print()
    return num * num

@delayed
def sum_list(args):
    print("Sum_list function:", args)
    return sum(args)

items = [1, 2, 3]
computation_graph = sum_list([square(i) for i in items])
computation_graph.visualize()
Note that for this code to work, you will need to perform three tasks-
- Install the dask library with pip:
pip install dask
- Download and unzip the Graphviz package for Windows (the visualize() method relies on Graphviz to render the graph).
- Add the Graphviz bin directory to the Path variable in your environment variables:
C:\Users\Ayushi\Desktop\graphviz-2.38\release\bin (Use your own path for bin)
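Beyond visualizing it, the same delayed graph can be executed with .compute(), which runs the tasks and returns the final value, here the sum of the squares of [1, 2, 3]:

```python
from dask import delayed

@delayed
def square(num):
    return num * num

@delayed
def sum_list(args):
    return sum(args)

items = [1, 2, 3]
computation_graph = sum_list([square(i) for i in items])

# Nothing has run yet; compute() triggers execution of the whole graph.
print(computation_graph.compute())  # 1 + 4 + 9 = 14
```

Until compute() is called, Dask only records the tasks and their dependencies; this laziness is what lets it build and display the graph first.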
7. Dynamic Computational Graphs in Deep Learning With Python
A Dynamic Computational Graph (DCG) is a mutable directed graph with operations as vertices and data as edges. In effect, it is a system of libraries, interfaces, and components that delivers a flexible, programmatic, runtime interface, letting us construct and modify systems by connecting operations as the program runs.

When each item in a dataset has its own type or shape, the graph differs from input to input, so DCGs suffer from inefficient batching and poor tooling: it becomes a problem for the neural network to batch such data the way a static graph can. As a workaround, we use an algorithm called Dynamic Batching.
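To make the idea of a graph built at runtime concrete, here is a minimal, hypothetical sketch (not any particular framework's API). The graph's structure depends on the input, which a fixed static graph could not express:

```python
class Node:
    """One vertex of a dynamic graph: an operation, its value, and its inputs."""
    def __init__(self, op, value, inputs=()):
        self.op = op
        self.value = value
        self.inputs = inputs

def const(v):
    return Node("const", v)

def add(a, b):
    return Node("add", a.value + b.value, (a, b))

def sum_inputs(values):
    # The graph built here depends on len(values): each call may
    # construct a graph of a different size and shape.
    acc = const(0)
    for v in values:
        acc = add(acc, const(v))
    return acc

out = sum_inputs([1, 2, 3])
print(out.value)  # 6
```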
8. Forward and Backward Propagation in Computational Graphs
First, let's talk about forward propagation. Here, we loop over the nodes in topological order. In other words, we pass the values of the variables in the forward direction (left to right): given a node's inputs, we compute its value.

In backward propagation, however, we start at the final goal node and loop over the nodes in reverse topological order. Here, we compute the derivative of the final goal node's value with respect to each edge's tail node.
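The two passes can be sketched for the earlier composite function f(x) = e^(sin(x²)). The forward pass evaluates each node left to right; the backward pass then applies the chain rule node by node in reverse (the variable names here are illustrative):

```python
import math

def forward_backward(x):
    # Forward pass: evaluate nodes in topological order.
    a = x * x        # a = x^2
    b = math.sin(a)  # b = sin(a)
    f = math.exp(b)  # f = e^b

    # Backward pass: reverse topological order, chain rule at each node.
    df_db = math.exp(b)          # d(e^b)/db = e^b
    df_da = df_db * math.cos(a)  # multiply by d(sin a)/da = cos(a)
    df_dx = df_da * 2 * x        # multiply by d(x^2)/dx = 2x
    return f, df_dx
```

Each backward step reuses a value (a, b) computed in the forward pass, which is why frameworks store these intermediates during the forward sweep.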
So, this was all about Computational Graphs in Deep Learning With Python. We hope you liked our explanation.
9. Conclusion: Computational Graphs
Hence, this concludes our look at computational graphs for deep learning with Python. We discussed forward and backward propagation in computational graphs and implemented such graphs in Python. Furthermore, if you have any query, feel free to ask in the comment box.
See also –
Deep Learning With Python Applications