
Derivatives
Writing derivatives using the Leibniz notation
All of these are different ways to write the derivative of a function.
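As a quick reference, for y = f(x) these notations all mean the same thing:

f'(x) = y' = dy/dx = df(x)/dx = d/dx f(x)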
The derivative of a constant function
The derivative equals 0 since there's no change from one x value to any other x value: there's no slope.
The derivative of a linear function
For f(x) = x the derivative equals 1: y changes by the same amount for every change of x. In general, the derivative of a linear function equals the slope m (e.g. m = 2).
The derivative of a quadratic function
At any point x, the slope of the tangent line is given by the derivative; for the quadratic shown it is 6x.
The derivative of a quadratic function with addition
The derivative of a sum is the sum of the derivatives of its terms.
The derivative of a multidimensional function
When differentiating with respect to one variable, the others are treated as constants. Remember, the derivative of a constant is always 0.
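As a quick sanity check, here's a minimal NumPy sketch (the function f and step size h are my own choices, not from the original illustrations) that approximates a derivative numerically:

import numpy as np

def f(x):
    return 3 * x**2  # analytical derivative: 6x

x = np.array([1.0, 2.0, 3.0])
h = 1e-5  # small step for the finite-difference approximation

# central difference: (f(x + h) - f(x - h)) / (2h)
approx = (f(x + h) - f(x - h)) / (2 * h)
print(approx)  # ~[ 6. 12. 18.], matching 6x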

Softmax Activation Function
Exponential function
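A minimal softmax sketch in NumPy; the exponential is applied first, and subtracting the max beforehand is the usual overflow guard (the variable names are mine):

import numpy as np

logits = np.array([2.0, 1.0, 0.1])

# subtract the max so np.exp never overflows; softmax is shift-invariant
exp_values = np.exp(logits - np.max(logits))
probabilities = exp_values / np.sum(exp_values)
print(probabilities)  # non-negative and sums to 1.0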

Activating the ReLU function in the hidden layers
Our neural network, which is not densely connected, has 2 hidden layers with 8 neurons each. There's only a one-to-one correspondence between the neurons in the input, layer 1, layer 2, and the output. We will use both hidden layers to fit the line to the sine wave function.
We start by assigning baseline values, giving us this output.
Setting the hidden layers' weight values to 1.0 results in this straight line (i.e. weights tend to affect the slope of the line). Increasing the weights increases the slope even more.
Increasing layer 2's bias by a half nudges the entire line up by a half. Flipping layer 1's weight results in flipping the line (*explanation after the illustration).
ReLU note:
y = 0 if x <= 0 else x
That's why the line does not descend below 0.
Code (NumPy):
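The snippet itself was an illustration; a minimal sketch of how the NumPy version might look (names and values are illustrative, not the post's code):

import numpy as np

inputs = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
weight, bias = 1.0, 0.0

# linear step, then ReLU: negative outputs are clipped to 0
z = inputs * weight + bias
output = np.maximum(0, z)
print(output)  # [0. 0. 0. 1. 2.]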
Moving on. Let's flip layer 2's weight to flip the line vertically.
To move that whole line upward by a half, we set the bias of the bottommost neuron of layer 2 to 0.5. At this stage we have completed the first section of the process: the slope of the leftmost line, more or less, follows the contour of that section of the sine wave.
The aim is to get to this stage, where every section follows the form of the sine wave.

ReLU activation on a two-neuron network
Starting with a weight = 0 and bias = 0 on a single neuron:
Setting weight = 1.0 while keeping bias at 0:
Setting bias to 0.5:
To flip the line horizontally, negate the weight from 1.0 to -1.0:
Now let's introduce the second neuron with values weight = 1.0 and bias = 1.0:
This caused a vertical shift of the activation function. Note that the bias moves the line vertically.
Let's negate the second neuron's weight by 2 (i.e. -2.0):
And now to compute the output:
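Putting the steps together, a minimal reconstruction of the two-neuron pair (the weights and biases mirror the walkthrough above; the code itself is my sketch, not the post's):

import numpy as np

def relu(x):
    return np.maximum(0, x)

x = np.linspace(-1.0, 1.0, 5)

# neuron 1: weight flipped to -1.0, bias nudged to 0.5
a1 = relu(-1.0 * x + 0.5)

# neuron 2: weight negated to -2.0, bias 1.0
output = relu(-2.0 * a1 + 1.0)
print(output)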
Forward pass on a dense layer neural network
A dense layer is a fully-connected neural network layer.
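A minimal sketch of a forward pass through one dense layer (the shapes and values are illustrative):

import numpy as np

inputs = np.array([[1.0, 2.0, 3.0, 2.5],
                   [2.0, 5.0, -1.0, 2.0]])  # 2 samples, 4 features

weights = 0.01 * np.random.randn(4, 3)      # 4 inputs -> 3 neurons
biases = np.zeros((1, 3))

# every output neuron sees every input: that's what "fully-connected" means
layer_output = np.dot(inputs, weights) + biases
print(layer_output.shape)  # (2, 3)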

Generating a random 2x4 matrix in NumPy
The 0.01 factor makes the randomly generated values smaller in magnitude (akin to a volume knob that turns down the volume of a sound).
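For example (the 0.01 factor is the volume knob):

import numpy as np

# randn draws from a standard normal; 0.01 scales the values down
weights = 0.01 * np.random.randn(2, 4)
print(weights)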
Neural network with two hidden layers
Hidden layer 1 (with code)
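The pictured code isn't reproduced here; a sketch of chaining two hidden layers might look like this (the layer sizes are my assumption):

import numpy as np

inputs = np.random.randn(5, 4)              # 5 samples, 4 features

w1 = 0.01 * np.random.randn(4, 8)           # hidden layer 1: 8 neurons
b1 = np.zeros((1, 8))
w2 = 0.01 * np.random.randn(8, 8)           # hidden layer 2: 8 neurons
b2 = np.zeros((1, 8))

out1 = np.maximum(0, np.dot(inputs, w1) + b1)  # ReLU after layer 1
out2 = np.maximum(0, np.dot(out1, w2) + b2)    # ReLU after layer 2
print(out2.shape)  # (5, 8)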

A neural network with one hidden layer
This neural network has 4 features in the input layer and 3 neurons in the hidden layer. Each of the three neurons carries 4 weights (one per input), so the layer's weight matrix has shape (3, 4).
Visualizing matrix dot product
Given a pair of variables: inputs (a 3x4 matrix) and weights (a 3x4 matrix). The matrix on the right side (i.e. weights) must be transposed first; only then can we perform the matrix dot product.
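In NumPy, with random placeholder values:

import numpy as np

inputs = np.random.randn(3, 4)
weights = np.random.randn(3, 4)

# (3, 4) @ (4, 3) -> (3, 3); without the .T the shapes wouldn't line up
output = np.dot(inputs, weights.T)
print(output.shape)  # (3, 3)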

Row and column vectors in NumPy
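For instance, reshaping a flat array either way:

import numpy as np

a = np.array([1, 2, 3])

row = a.reshape(1, -1)  # shape (1, 3): a row vector
col = a.reshape(-1, 1)  # shape (3, 1): a column vector
print(row.shape, col.shape)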

Matrix transposition visualization
Transposition modifies a matrix in such a way that rows become columns and columns become rows.
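For example, a 2x3 matrix becomes 3x2:

import numpy as np

m = np.array([[1, 2, 3],
              [4, 5, 6]])  # shape (2, 3)

print(m.T)                 # rows become columns: shape (3, 2)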

Visualizing matrix multiplication

Replacing the front caster wheel with a ball made from a roll-on deodorant applicator.

Update: Light-seeking robot project
Making the wiring pretty.

Installing headless JupyterLab on a remote Linux server
Prerequisites:
python3-pip package
python3-dev package
Installation
This is only done per new project:
remote@~ $ mkdir project_foo && cd project_foo
remote@~/project_foo $ python3 -m venv env_foo
This is done every time you launch the JupyterLab server remotely:
local $ ssh -L 8080:localhost:8080 remote@my.server
remote@~/project_foo $ source ~/project_foo/env_foo/bin/activate
(env_foo) $ jupyter lab --no-browser --port 8080
...
[I 17:13:43.778 LabApp] Serving notebooks from local directory: /home/remote
[I 17:13:43.778 LabApp] Jupyter Notebook 6.1.3 is running at:
[I 17:13:43.779 LabApp] http://localhost:8080/?token=35a68d738b9cbd72a0910452f8d4446c6b250c34c37b2fa2
[I 17:13:43.779 LabApp] or http://127.0.0.1:8080/?token=35a68d738b9cbd72a0910452f8d4446c6b250c34c37b2fa2
[I 17:13:43.779 LabApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
And then in your local browser:
http://localhost:8080/?token=35a68d738b9cbd72a0910452f8d4446c6b250c34c37b2fa2

Using a Python 3 virtual environment for Python projects
$ mkdir proj_foo && cd proj_foo
(or $ take proj_foo)
$ python3 -m venv env_foo
$ source env_foo/bin/activate
(env_foo) proj_foo $

When the SSH connection drops out after a few minutes of inactivity
This is the error message it displays:
… send disconnect: Broken pipe
Solution:
In /etc/ssh/sshd_config, make sure these two lines are uncommented:
ClientAliveInterval 30
ClientAliveCountMax 5

Prototyping the battery holder bracket.

Light-following robot project
It's finally done, testing and all. Now, to make it a little less messy.
#arduino #electronics #sensors