Sponsored by AI Academies

Deep Learning

Part 1


What are deep learning and machine learning?

When you first heard the term "deep learning", you probably assumed it was the same thing as machine learning. In fact, the two are related but not interchangeable: deep learning is a subfield of machine learning that uses neural networks with multiple layers.

Before diving deeper into neural networks, let’s go over some important math concepts in machine learning.

A **function** defines a relationship between an independent and dependent variable. Functions are generally represented as y=f(x) where the function takes in some input x and gives output y.
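As a minimal sketch, the function name `f` and the rule y = 2x + 1 below are made up purely for illustration:

```python
# A function maps each input x to exactly one output y.
def f(x):
    return 2 * x + 1  # example rule: y = 2x + 1

print(f(3))  # 7
```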

A **derivative** measures how much y changes for a small change in x. In other words, it is the slope of a function at a point, or the rate of change at that point. The slope is (change in y) / (change in x).
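The slope-at-a-point idea can be estimated numerically: pick a tiny change in x (here called `h`, a name chosen for this sketch) and divide the resulting change in y by it.

```python
# Estimate the derivative of f at a point:
# slope ≈ (change in y) / (change in x) for a small change h.
def f(x):
    return x ** 2

def derivative(f, x, h=1e-6):
    return (f(x + h) - f(x)) / h

print(round(derivative(f, 3.0), 3))  # ≈ 6.0, since the slope of x^2 at x=3 is 2*3
```

Shrinking `h` makes the estimate approach the true derivative.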

Neural networks allow us to model nonlinear and complex relationships within data. Since many real-world problems involve such relationships, neural networks prove especially useful. Their performance also tends to keep improving as more training data becomes available, which is not true of many traditional machine learning algorithms, whose performance often plateaus.

The most basic form of a neural network is the **perceptron**. It is essentially a linear machine-learning algorithm for binary classification tasks.

Think of a linear function y = mx + b that you might have learned in math class; that is similar to what a perceptron is doing. Given three inputs x_{1}, x_{2}, and x_{3}, the importance of each input is determined by the respective weight w_{1}, w_{2}, or w_{3} assigned to it. The formula is output = w_{1}x_{1} + w_{2}x_{2} + w_{3}x_{3}. If the output is above a certain threshold value, the perceptron outputs a 1; if not, it outputs a 0 (an example of binary classification).
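The weighted-sum-and-threshold rule above fits in a few lines. The specific weights, inputs, and threshold value here are made up for illustration, not learned from data:

```python
# A perceptron: weighted sum of inputs, then a threshold test.
def perceptron(x, w, threshold=0.5):
    output = sum(wi * xi for wi, xi in zip(w, x))  # w1*x1 + w2*x2 + w3*x3
    return 1 if output > threshold else 0

x = [1.0, 0.0, 1.0]   # hypothetical inputs
w = [0.4, 0.9, 0.3]   # hypothetical weights
print(perceptron(x, w))  # 0.4 + 0.0 + 0.3 = 0.7 > 0.5, so prints 1
```

In a real perceptron the weights are learned from labeled examples rather than chosen by hand.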

Obviously, this simple model is not ideal. A better model would use multiple layers: an input layer, hidden layer(s), and an output layer. The **input layer** represents the dimensions of the input vector (or list). The **hidden layer** consists of intermediate nodes that divide the input into different regions. Think of the hidden layer as a function f(x) that transforms an input into a given output. The **output layer** is simply the output of the network or the network’s final decision.
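A forward pass through such a layered network can be sketched as follows. The weights and biases are invented for illustration (real networks learn them), and the sigmoid squashing function is one common choice for the hidden-layer transformation f(x):

```python
import math

# Squash any number into the range (0, 1).
def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# One layer: each node takes a weighted sum of the inputs plus a bias,
# then applies the sigmoid.
def layer(inputs, weights, biases):
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Tiny network: 2 inputs -> 2 hidden nodes -> 1 output node.
hidden = layer([0.5, -1.0], [[0.2, 0.8], [-0.5, 0.3]], [0.1, 0.0])
output = layer(hidden, [[1.0, -1.0]], [0.0])
print(output)  # a single value between 0 and 1
```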

In Deep Learning Part 2, we will learn more about how neural networks work.

Join our Discord server for updates, announcements, and future event planning, to ask questions, or just for fun!

Copyright ©2021 Howard County Hour of Code