Writing derivatives using the Leibniz notation
All of these are different ways to write the derivative of a function.
The derivative of a constant function equals 0, since there's no change from one x value to any other x value. There is no slope.
The derivative of the identity function equals 1: y changes by the same amount for every change of x.
The derivative of a linear function equals its slope (here, m = 2).
At any point x, the slope of the tangent line will be 2x.
The same holds for a quadratic function with an added constant.
Remember, the derivative of a constant is always 0.
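These rules are easy to sanity-check with SymPy (a quick sketch; any symbolic package would do):

```python
import sympy as sp

x, c, m = sp.symbols('x c m')

print(sp.diff(c, x))         # constant        -> 0
print(sp.diff(x, x))         # identity        -> 1
print(sp.diff(m * x, x))     # linear          -> m
print(sp.diff(x**2, x))      # quadratic       -> 2*x
print(sp.diff(x**2 + c, x))  # added constant  -> 2*x (the constant drops out)
```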
Softmax Activation Function
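For reference, a minimal, numerically stable softmax in NumPy looks something like this (a sketch):

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max before exponentiating: exp() can overflow
    # for large inputs, and softmax is invariant to this shift.
    shifted = z - np.max(z, axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=-1, keepdims=True)

print(softmax(np.array([2.0, 1.0, 0.1])))  # probabilities summing to 1
```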
Our neural network, which is not densely connected, has 2 hidden layers with 8 neurons each. Each neuron has a one-to-one connection to its counterpart across the input, layer 1, layer 2, and the output. We will use both hidden layers to fit a line to the sine wave function.
We start by assigning baseline values, giving us this output.
Setting the hidden layers' weights to 1.0 results in this straight line (i.e. weights tend to affect the slope of the line).
Increasing the weights increases the slope even more.
Increasing layer 2's bias by a half nudges the entire line up by a half.
Flipping layer 1's weight results in flipping the line (*explanation after the illustration).
*Since ReLU is y = 0 if x <= 0 else x, the line does not descend past 0.
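In NumPy terms, ReLU just clamps negatives to zero, which is why the flipped line gets cut off (a minimal sketch):

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)  # y = 0 if z <= 0 else z

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(relu(-1.0 * x))  # flipped weight: rises where x is negative, clamps at 0
```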
Moving on. Let's flip layer 2's weight to flip the line vertically.
To move that whole line upward by a half, we set the bias of the bottommost neuron in layer 2 to 0.5.
At this stage we have completed the first section of the process. The slope of the leftmost line, more or less, follows the contour of that section of the sine wave.
The aim is to get to the stage where every section follows the form of the sine wave.
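For intuition, here is one hand-derived way to express such a piecewise-linear fit as a sum of ReLU units (a sketch; the network in the post learns its values rather than computing them like this):

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

knots = np.linspace(0, 2 * np.pi, 9)        # section boundaries
targets = np.sin(knots)                     # sine values to hit at each boundary
slopes = np.diff(targets) / np.diff(knots)  # slope of each straight section

# Each ReLU switches on at a knot; its output weight is the *change* in slope there.
deltas = np.concatenate(([slopes[0]], np.diff(slopes)))

x = np.linspace(0, 2 * np.pi, 200)
approx = targets[0] + sum(d * relu(x - k) for d, k in zip(deltas, knots[:-1]))
print(np.max(np.abs(approx - np.sin(x))))   # small, and shrinks with more sections
```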
Starting with weight = 0 and bias = 0 on a single neuron:
Setting weight = 1.0 while keeping bias = 0:
To flip the line horizontally, negate the weight:
Now let's introduce the second neuron with weight = 1.0 and bias = 1.0:
This causes a vertical shift of the neuron's output: the bias moves the line vertically.
Let's negate and double the second neuron's weight (i.e. set it to -2.0):
And now to compute the sum of the two neurons' outputs:
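Putting that pairing into code, assuming the weights and biases from the steps above feed a plain ReLU pre-activation (a sketch):

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

x = np.linspace(-1.0, 1.0, 5)
n1 = relu(-1.0 * x)         # first neuron: weight flipped to -1.0
n2 = relu(-2.0 * x + 1.0)   # second neuron: weight -2.0, bias 1.0
print(n1 + n2)              # the summed output of the pair
```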
A dense layer is a fully-connected neural network layer.
Generating a random 2x4 matrix in NumPy
The 0.01 factor makes the randomly generated values a magnitude smaller (akin to a volume knob that turns the volume of a sound down).
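A minimal sketch of that initialization (the layer sizes here are illustrative):

```python
import numpy as np

np.random.seed(0)

# 2 inputs feeding 4 neurons: a 2x4 weight matrix, scaled down by 0.01
weights = 0.01 * np.random.randn(2, 4)
biases = np.zeros((1, 4))  # biases conventionally start at zero
print(weights.shape)       # (2, 4)
```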
Hidden layer 1 (with code)
A neural network with one hidden layer
This neural network has 4 features in the input layer and 3 neurons in the hidden layer. Each of the three neurons is composed of 4 different weights (coming from the 4 inputs), so the layer's weight matrix has a shape of (3, 4).
Given a pair of variables, inputs (a 3x4 matrix) and weights (a 3x4 matrix), the matrix on the right side (i.e. weights) must be transposed first. Only then can we perform the matrix dot product.
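In NumPy that looks like this (a sketch with placeholder values):

```python
import numpy as np

inputs = np.random.randn(3, 4)          # 3 samples x 4 features
weights = 0.01 * np.random.randn(3, 4)  # 3 neurons x 4 weights each

# (3, 4) @ (3, 4) is invalid; transposing weights gives (3, 4) @ (4, 3) -> (3, 3)
outputs = np.dot(inputs, weights.T)
print(outputs.shape)  # (3, 3): 3 samples x 3 neurons
```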
Row and column vectors in NumPy
Matrix transposition visualization
Transposition modifies a matrix in such a way that rows become columns and columns become rows.
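For example (a quick sketch):

```python
import numpy as np

row = np.array([[1, 2, 3]])  # a 1x3 row vector (note the double brackets)
col = row.T                  # transposing turns it into a 3x1 column vector
print(row.shape, col.shape)  # (1, 3) (3, 1)

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # 2x3
print(A.T)                   # 3x2: rows become columns, columns become rows
```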
Visualizing matrix multiplication
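Concretely, each output element is the dot product of a row of the left matrix with a column of the right one:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

C = A @ B
# C[0, 0] = 1*5 + 2*7 = 19 (row 0 of A dotted with column 0 of B)
print(C)  # [[19 22]
          #  [43 50]]
```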
Replacing the front caster wheel with a ball made from a roll-on deodorant applicator.
Update: Light-seeking robot project
Making the wiring pretty.
This only needs to be done once per new project:
remote@~ $ mkdir project_foo && cd project_foo
remote@~/project_foo $ python3 -m venv env_foo
This is done every time you launch the JupyterLab server remotely:
local $ ssh -L 8080:localhost:8080 firstname.lastname@example.org
remote@~/project_foo $ source ~/project_foo/env_foo/bin/activate
(env_foo) $ jupyter lab --no-browser --port 8080
...
[I 17:13:43.778 LabApp] Serving notebooks from local directory: /home/remote
[I 17:13:43.778 LabApp] Jupyter Notebook 6.1.3 is running at:
[I 17:13:43.779 LabApp] http://localhost:8080/?token=35a68d738b9cbd72a0910452f8d4446c6b250c34c37b2fa2
[I 17:13:43.779 LabApp] or http://127.0.0.1:8080/?token=35a68d738b9cbd72a0910452f8d4446c6b250c34c37b2fa2
[I 17:13:43.779 LabApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
And then in your local browser, open http://localhost:8080 with the token shown in the server output.
Using a Python 3 virtual environment for Python projects
$ mkdir proj_foo && cd proj_foo   # or: $ take proj_foo
$ python3 -m venv env_foo
$ source env_foo/bin/activate
(env_foo) proj_foo $
When the SSH connection drops after a few minutes of inactivity
This is the error message it displays:
… send disconnect: Broken pipe
In /etc/ssh/sshd_config, make sure these two lines are uncommented:
ClientAliveInterval 30
ClientAliveCountMax 5
Prototyping the battery holder bracket.
It's finally done, testing and all. Now, to make it a little less messy.
#arduino #electronics #sensors