  • Rewriting the cat Command from Scratch

    The cat (short for concatenate) command is one of the most frequently used commands in Linux/Unix-like operating systems. The cat command allows us to create single or multiple files, view the contents of a file, concatenate files, and redirect output to the terminal or to files.
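    The post rebuilds this behaviour from scratch; as a rough illustration of what is being reproduced (not the author's implementation, which presumably targets C), a minimal Python sketch of the core behaviour might look like this:

        import sys

        def cat(paths):
            # With no file arguments, copy standard input to standard output,
            # mirroring cat's default behaviour.
            if not paths:
                sys.stdout.write(sys.stdin.read())
                return
            # Otherwise print each file's contents in order -- the
            # "concatenate" part of cat.
            for path in paths:
                with open(path, "r") as f:
                    sys.stdout.write(f.read())

        if __name__ == "__main__":
            cat(sys.argv[1:])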

  • Finding a Minimum Vertex Cover Using Quantum Computing

    A vertex cover is a set of vertices such that each edge of the graph is incident with at least one vertex in the set. A minimum vertex cover is a vertex cover of the smallest possible size. In graph theory, two vertices are adjacent if they are connected by an edge, and an edge is incident with a vertex if that vertex is one of its endpoints.
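    As a hedged, purely classical illustration of the definition (the full post presumably maps the problem onto a quantum annealer instead), a brute-force search over a made-up four-vertex graph looks like this:

        from itertools import combinations

        # Toy graph given as an edge list; vertex labels are illustrative.
        edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
        vertices = {v for edge in edges for v in edge}

        def is_vertex_cover(cover):
            # Every edge must be incident with at least one vertex in the cover.
            return all(u in cover or v in cover for u, v in edges)

        # Try subsets in order of increasing size and stop at the first one
        # covering every edge, i.e. a minimum vertex cover.
        def minimum_vertex_cover():
            for k in range(len(vertices) + 1):
                for subset in combinations(vertices, k):
                    if is_vertex_cover(set(subset)):
                        return set(subset)

        print(minimum_vertex_cover())  # e.g. {0, 2}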

  • Getting Started with D-Wave Leap

    I recently got access to the D-Wave Leap™ Quantum Application Environment (QAE). Leap is the first cloud-based QAE providing real-time access to a live quantum computer. The QAE features 2038 qubits and has a typical operating qubit temperature of 14.5±1 mK. D-Wave provides the Ocean SDK for Python so that developers can write software that runs on the D-Wave system, and recommends working in a virtual environment when developing software to run on the QAE.
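    Assuming the dwave-ocean-sdk has been pip-installed into such a virtual environment, a minimal sketch of what Ocean code looks like is below; it uses dimod's local ExactSolver rather than the live QPU, and the two-variable QUBO is made up purely for illustration.

        import dimod

        # A tiny, made-up QUBO: minimise x0 + x1 - 2*x0*x1 over binary variables.
        Q = {(0, 0): 1.0, (1, 1): 1.0, (0, 1): -2.0}
        bqm = dimod.BinaryQuadraticModel.from_qubo(Q)

        # ExactSolver enumerates every assignment locally; on Leap you would
        # swap in a sampler backed by the quantum processing unit instead.
        sampleset = dimod.ExactSolver().sample(bqm)
        print(sampleset.first.sample, sampleset.first.energy)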

  • Newton's Method for Finding Roots

    First-order optimization techniques are usually computationally cheaper per iteration and converge reasonably fast on large datasets. Second-order optimization techniques, on the other hand, converge faster when the second derivative is known and easy to compute; but the second derivative is often intractable, requiring a lot of computation. For certain problems, gradient descent can get stuck along paths of slow convergence around saddle points, whereas second-order methods won't.
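    As a quick illustration of the second-order idea applied to root finding (the function and starting point below are just examples, not taken from the post):

        def newton(f, f_prime, x0, tol=1e-10, max_iter=50):
            # Newton's method: jump to the root of the tangent line,
            # x_{n+1} = x_n - f(x_n) / f'(x_n), until the step is tiny.
            x = x0
            for _ in range(max_iter):
                step = f(x) / f_prime(x)
                x -= step
                if abs(step) < tol:
                    break
            return x

        # Example: the positive root of f(x) = x^2 - 2, i.e. sqrt(2).
        print(newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0))  # ~1.4142135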

  • Writing a Minimalistic x86 Bootloader

    In computing, programs are loaded by progressively smaller and smaller programs, and a bootloader is the smallest of such programs. When you power on a computer, it first runs the BIOS (Basic Input-Output System) firmware, which performs some tests and then boots into the Operating System (OS). More precisely, the standard boot-up process is as follows:

  • The Mathematics of Gradient Descent

    In this blog post, I’ll be demystifying the mathematics behind the gradient descent optimizer. The equation of a best-fit line, as you all know, is y = mx + c; the y value changes for different values of x, so let’s write that down as an equation.
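    The full post presumably goes on to differentiate a mean-squared-error loss with respect to m and c; as a hedged illustration of those update rules (my own notation and made-up data, not the post's), a short numpy sketch:

        import numpy as np

        def gradient_step(x, y, m, c, lr=0.01):
            # Residuals of the current line y_hat = m*x + c.
            error = y - (m * x + c)
            # Partial derivatives of E = (1/N) * sum((y - (m*x + c))**2)
            # with respect to m and c.
            dm = -(2.0 / len(x)) * np.sum(x * error)
            dc = -(2.0 / len(x)) * np.sum(error)
            # Move m and c a small step against the gradient.
            return m - lr * dm, c - lr * dc

        x = np.array([1.0, 2.0, 3.0, 4.0])
        y = np.array([3.0, 5.0, 7.0, 9.0])  # generated from y = 2x + 1
        m, c = 0.0, 0.0
        for _ in range(2000):
            m, c = gradient_step(x, y, m, c)
        print(m, c)  # approaches m = 2, c = 1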

  • Deep Feed Forward Neural Networks with TensorFlow

    The deep feed forward neural network is similar to the softmax regression model but adds one or more hidden layers between the input and the output. First, let’s import TensorFlow and our MNIST data set.
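    The original post was presumably written against an older TensorFlow 1.x MNIST helper; with a current TensorFlow install, the same first step plus a placeholder network with two hidden layers (the layer sizes are my assumption, not the post's architecture) might look like:

        import tensorflow as tf

        # Load MNIST: 60,000 training and 10,000 test images of 28x28 pixels,
        # flattened into 784-vectors and scaled to [0, 1].
        (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
        x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
        x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

        # A deep feed forward network: the softmax output layer is preceded
        # by hidden layers, unlike plain softmax regression.
        inputs = tf.keras.Input(shape=(784,))
        h = tf.keras.layers.Dense(256, activation="relu")(inputs)
        h = tf.keras.layers.Dense(128, activation="relu")(h)
        outputs = tf.keras.layers.Dense(10, activation="softmax")(h)
        model = tf.keras.Model(inputs, outputs)
        model.summary()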

  • Softmax Regression with TensorFlow

    A vast majority of us got initiated into programming through the typical “Hello World.” program, where you just learn to print the phrase “Hello World.” onto the terminal. Like programming, machine learning has its own “Hello World.” program, and it is called MNIST. The MNIST (Modified National Institute of Standards and Technology) data set contains 70,000 images of handwritten digits along with labels that tell us which image corresponds to which number. The numbers range from 0 to 9. In this tutorial, we are going to train a model to look at images and predict which digits they are. The prediction is going to be a probability distribution over the ten classes rather than a definitive class label, and to do that we will be using softmax regression.
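    Under the hood, softmax regression is a single dense layer mapping the 784 pixels of each image to 10 class probabilities. Below is a hedged sketch using the current tf.keras API; the original post likely used the lower-level TensorFlow 1.x interface, and the optimizer, epoch count, and batch size here are my own choices.

        import tensorflow as tf

        # Load and flatten MNIST; each image becomes a 784-vector in [0, 1].
        (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
        x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
        x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

        # Softmax regression: one dense layer, no hidden layers at all.
        inputs = tf.keras.Input(shape=(784,))
        outputs = tf.keras.layers.Dense(10, activation="softmax")(inputs)
        model = tf.keras.Model(inputs, outputs)

        model.compile(optimizer="sgd",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        model.fit(x_train, y_train, epochs=5, batch_size=128)
        print(model.evaluate(x_test, y_test))  # [test loss, test accuracy]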

  • Linear Regression with TensorFlow

    In this tutorial, we will be looking at how we can use TensorFlow to implement a linear regression model on a given data set. In our case, the data set is going to be very, very small compared to the real-life data sets we will be looking at later in the series (our data set only has 4 points). Many of us are thrown by the term linear regression (if you have not taken some advanced math classes); in layman’s terms, linear regression is all about drawing the best-fit line that describes a given data set. So now that we know what linear regression is, let’s get started.
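    A hedged sketch of such a fit with the current TensorFlow API is below; the four points are placeholders rather than the post's actual data set, and the learning rate and iteration count are my assumptions.

        import tensorflow as tf

        # Four placeholder points lying roughly on the line y = 2x + 1.
        x = tf.constant([1.0, 2.0, 3.0, 4.0])
        y = tf.constant([3.1, 4.9, 7.2, 8.8])

        m = tf.Variable(0.0)
        c = tf.Variable(0.0)
        optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

        for _ in range(2000):
            with tf.GradientTape() as tape:
                # Mean squared error between the line m*x + c and the targets.
                loss = tf.reduce_mean(tf.square(y - (m * x + c)))
            grads = tape.gradient(loss, [m, c])
            optimizer.apply_gradients(zip(grads, [m, c]))

        print(m.numpy(), c.numpy())  # lands near m = 2, c = 1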

  • Introduction to Linear Regression

    In statistics, linear regression is a linear approach to modelling the relationship between a scalar response and one or more explanatory variables. The case of one explanatory variable is called simple linear regression. For more than one explanatory variable, the process is called multiple linear regression. Put simply, linear regression is all about finding the optimal values of m and c in y = mx + c.
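    For the simple (one-variable) case there is even a closed-form least-squares answer; a quick numpy check with made-up points:

        import numpy as np

        # Made-up data roughly following y = 2x + 1.
        x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
        y = np.array([3.2, 4.9, 7.1, 8.8, 11.1])

        # Closed-form least squares for simple linear regression:
        # m = cov(x, y) / var(x), c = mean(y) - m * mean(x).
        m = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
        c = y.mean() - m * x.mean()
        print(m, c)  # optimal slope and intercept of the best-fit line

        # np.polyfit fits the same line and generalises to higher degrees.
        print(np.polyfit(x, y, 1))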
