When to use While, For, and Map for iterations in Python?

Python has a really sophisticated way of handling iterations. The only thing it does not have is “GOTO” labels, which I think is a good thing.

Let us compare the three common ways of iterating in Python: while, for, and map, by way of an example. Imagine that you have a list of numbers and you would like to find the square of each number.

nums = [1, 2, 3, 5, 10]
result = []
for num in nums:
    result.append(num * num)
print(result)

It would print [1, 4, 9, 25, 100].
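
For comparison, here is a minimal sketch of how the same squaring can be written with a while loop and with map; the variable names simply mirror the snippet above, and the full post presumably discusses when each form is appropriate:

nums = [1, 2, 3, 5, 10]

# while loop: you manage the index yourself
result = []
i = 0
while i < len(nums):
    result.append(nums[i] * nums[i])
    i += 1
print(result)  # [1, 4, 9, 25, 100]

# map: apply a function to every element (wrap in list() to materialize the values)
result = list(map(lambda num: num * num, nums))
print(result)  # [1, 4, 9, 25, 100]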

Continue reading “When to use While, For, and Map for iterations in Python?”

How to handle Command Line Arguments in Python?

When you run Python programs from the command line, you can pass various arguments to the program, and your program can handle them.

Here is a quick snippet of code that I will explain below:

import sys
if __name__ == "__main__":
    print("You passed: ", sys.argv)

When you run this program from the command line, you will see output like this:

$ python cmdargs.py
You passed:  ['cmdargs.py']

Notice that sys.argv is a list of strings containing all of the arguments passed to the program, and its first value (at index zero) is the name of the program itself. You can put all kinds of checks on it.
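
As a quick illustration of such a check, here is a sketch that assumes the script expects exactly one numeric argument; the usage message and the exit code are just one reasonable convention, not something prescribed by sys.argv itself:

import sys

if __name__ == "__main__":
    # expect exactly one argument besides the program name
    if len(sys.argv) != 2:
        print("Usage: python cmdargs.py <number>")
        sys.exit(1)
    number = float(sys.argv[1])  # arguments always arrive as strings
    print("The square of", number, "is", number * number)

Running python cmdargs.py 7 would then print The square of 7.0 is 49.0, while running it with no arguments shows the usage line instead.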

Continue reading “How to handle Command Line Arguments in Python?”

Coding Backpropagation and Gradient Descent From Scratch without using any libraries

Backpropagation is considered one of the core algorithms in Machine Learning. It is mainly used in training neural networks, and at its core, backpropagation is basically gradient descent. What if we told you that understanding and implementing it is not that hard? Anyone who knows basic mathematics and the basics of the Python language can learn this in 2 hours. Let’s get started.

Though there are many high-level overviews of the backpropagation and gradient descent algorithms, what I found is that unless one implements them from scratch, many of the ideas behind neural networks never really sink in.
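
To give a flavour of the “from scratch, no libraries” approach (the full post presumably builds this up to a complete neural network), here is a minimal sketch of plain gradient descent minimizing a one-variable function; the function, learning rate, and number of steps are illustrative choices, not the ones from the post:

# Minimize f(w) = (w - 3)^2 with plain gradient descent, no libraries.
def f(w):
    return (w - 3) ** 2

def df(w):
    # derivative of f with respect to w
    return 2 * (w - 3)

w = 0.0              # starting guess
learning_rate = 0.1
for step in range(100):
    w = w - learning_rate * df(w)  # take a step against the gradient

print(w)  # very close to 3.0, the minimum of f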

Continue reading “Coding Backpropagation and Gradient Descent From Scratch without using any libraries”