List element comparison exercises

Given an instance of a class and an instance of object, how do they differ from each other?

So the class instance and the object instance have 23 properties and/or methods in common. Which three extra properties does the class instance have over the object instance?

First attempt

Solution
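The original solution isn't reproduced here; below is a minimal sketch of one way to make the comparison, using dir() and set operations (the class name Example is a placeholder):

class Example:
    pass

class_instance = Example()
object_instance = object()

class_members = set(dir(class_instance))
object_members = set(dir(object_instance))

# Members both instances have in common
print(len(class_members & object_members))

# Extra members the class instance has that the object instance lacks
print(class_members - object_members)

With a bare class like this, the three extras typically come out as __dict__, __module__, and __weakref__.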

Refactoring exercises

Using list comprehension
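The original exercises aren't shown here; a generic before-and-after just to illustrate the shape of this kind of refactor (the names numbers and squares are placeholders):

numbers = [1, 2, 3, 4, 5]

# Before: building a list with an explicit loop
squares = []
for n in numbers:
    squares.append(n * n)

# After: the same result with a list comprehension
squares = [n * n for n in numbers]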

Python Descriptors

Looking up an attribute of a Describe instance [1] from a Client instance or the Client class requires knowing its specific attribute name (i.e. value). Either way, both lookups return the exact same value because they access the same attribute to begin with. Encapsulation does not exist here.

But what if you want to control access to the attributes of the Describe instance, so that the exact value returned depends on whether the lookup was done via the instance object or via the class?

Here you don’t even need to know the specific attribute name. The Descriptor class takes care of that for you. Proper encapsulation exists here, as it should.

Very important note: for this to work, descriptor objects MUST be class attributes of the client class.

__get__(self, instance, owner)

  • instance is the client object the lookup was made on, i.e. the object whose data we wish to encapsulate (it is None when the lookup is done via the class)
  • owner is the owning class (i.e. the class of the instance object)

The descriptor’s __get__ method is the mechanism that lets you influence what gets returned, or how it gets processed, during an attribute lookup on the instance.
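A simplified sketch of that mechanism, leaving out the Describe object and simply varying the returned value based on how the lookup was made (not the original post's implementation):

class Descriptor:
    def __get__(self, instance, owner):
        if instance is None:
            # looked up on the class itself, e.g. Client.desc
            return 'accessed via the class ' + owner.__name__
        # looked up on an instance, e.g. client.desc
        return 'accessed via an instance of ' + owner.__name__

class Client:
    desc = Descriptor()   # must be a class attribute

client = Client()
print(client.desc)   # -> accessed via an instance of Client
print(Client.desc)   # -> accessed via the class Client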

__set__(self, instance, value)

The __set__ method is where you can influence the processing or validation of a value right before it is assigned to the property. [2]

Here we use that descriptor behavior to validate the input values.

If the value is not within the required range, the assignment does not proceed.
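A minimal sketch of that kind of validation (the 0–100 range and the attribute names are illustrative assumptions, not the original code):

class RangedValue:
    def __set_name__(self, owner, name):
        self._name = '_' + name

    def __get__(self, instance, owner):
        if instance is None:
            return self
        return getattr(instance, self._name)

    def __set__(self, instance, value):
        # validate before storing; refuse anything outside the range
        if not 0 <= value <= 100:
            raise ValueError(f'{value} is not within 0..100')
        setattr(instance, self._name, value)

class Client:
    level = RangedValue()   # the descriptor is a class attribute

client = Client()
client.level = 42      # passes validation
# client.level = 500   # would raise ValueError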


  1. client.desc
  2. It gets called whenever you assign something to the descriptor attribute (e.g. client.desc = value).

Regular Expressions

Character match

Matches the exact characters in the sequence [1]:

Beans (capital B) does not match:

Matches beans with either B or b at the beginning:


Excludes all occurrences of bean:

  • s?: Matches zero or one s

  • \w+: One or more word characters (letters, digits, or _)
  • \w+\s: Like the above, but must be followed by one whitespace character
  • Matches any word with [Bb]eans as the second word
  • Beans does not match in this example because the characters before it are a comma and a space (see the sketch below)
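Those patterns can be checked quickly with Python's re module (the sample strings are made up):

import re

print(re.search(r'[Bb]eans', 'Beans on toast'))            # matches 'Beans'
print(re.search(r'\w+\s[Bb]eans', 'baked beans on rice'))  # matches 'baked beans'
print(re.search(r'\w+\s[Bb]eans', 'rice, Beans'))          # no match (comma and space before Beans) -> None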

Range match

  • [a-z]: Any one lowercase character from a to z
  • [a-z]+: One or more lowercase characters from a to z
  • [a-z]+e: Like the above, but must have an e after it
  • [a-z]+e\b: Also like the above, but must end at a word boundary, i.e. no more word characters after it (see the sketch below)
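A quick sketch of the range patterns with Python's re module (the sample sentence is made up):

import re

text = 'we like beans more than rice'

print(re.findall(r'[a-z]+', text))      # every run of lowercase letters (each word here)
print(re.findall(r'[a-z]+e\b', text))   # only the words ending in e: ['we', 'like', 'more', 'rice']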

Meta-characters

  • .: Any character that isn’t \n
  • \s: Any whitespace character
  • \d: Any digit (i.e. [0-9])
  • \D: Not \d
  • \w: Any letter, digit, or _
  • \W: Not \w (i.e. punctuation, whitespace, etc.)

Examples

  • .....: Matches any 5 characters that isn’t \n
  • .{5}: Same as above

  • \w+\d+@: Matches one or more word characters followed by one or more digits before the @ (roughly [A-Za-z0-9_]+[0-9]+@)
  • [a-z]+[^0-9\s]+@: Matches an address whose part before the @ contains no digits or whitespace (see the check below)
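A quick check of those two patterns with re.search (the sample addresses are made up):

import re

print(re.search(r'\w+\d+@', 'agent99@example.com'))           # matches 'agent99@'
print(re.search(r'[a-z]+[^0-9\s]+@', 'beans@example.com'))    # matches 'beans@'
print(re.search(r'[a-z]+[^0-9\s]+@', 'agent99@example.com'))  # no match -> None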

Anchors

  • ^ or \A: At the beginning of the line (\A matches only at the very start of the string)

  • $ or \Z: At the end of the line (\Z matches only at the very end of the string)

  • \b: Word boundary
  • \B: Non-word boundary (i.e. in-between the characters of a word)

Quantifiers

  • *: Zero or more
  • +: One or more
  • ?: Zero or one
  • {m,n}: Between m and n repetitions, inclusive

Remember, only the letter i is matched here.


  1. Beans has a capital B, and bean has no s; that’s why they’re not a match

An Automator workflow for generating random integers with Python 2.7

This workflow generates 10 random integers between 1 and 100 and sends them to the pasteboard.

Script

Uses list comprehension:

import random, subprocess

# Make sure pbcopy treats the piped input as UTF-8 text
utf8 = {'LANG': 'en_US.UTF-8'}
pipe = subprocess.PIPE

# Pipe the output straight into the macOS pasteboard via pbcopy
proc = subprocess.Popen('pbcopy', env=utf8, stdin=pipe)

# Ten random integers between 1 and 100, as strings
randints = [str(random.randint(1, 100)) for _ in range(10)]

# Join them with commas and hand them to pbcopy
proc.communicate(', '.join(randints).encode())

Table of mutable, hashable, and iterable objects in Python

  • Sets can only have hashable objects as values
  • Dicts can only have hashable objects as keys (see the example below)
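A quick illustration of why that matters; hashable objects work as set members and dict keys, while an unhashable (mutable) object like a list does not:

colors = {'red', 'green'}      # strings are hashable, so this is fine
lookup = {('x', 'y'): 1}       # a tuple of hashables works as a dict key

try:
    bad = {['a', 'b']}         # a list is mutable and unhashable
except TypeError as error:
    print(error)               # -> unhashable type: 'list'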

Does the object contain this

Given that the soup is a container:

if fly in soup:
  tell_the_waiter()

We search the soup for any fly (i.e. Is that a fly in my soup?)

Implementation

For our example below, we have a KidsMovie class that can suggest which age range of kids might enjoy this particular movie:

You only need to implement this method:

  • __contains__()
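A minimal sketch of how such a class might look; the attribute names and the age range are assumptions, not the original implementation:

class KidsMovie:
    def __init__(self, title, min_age, max_age):
        self.title = title
        self.min_age = min_age
        self.max_age = max_age

    def __contains__(self, age):
        # 'age in movie' is True when the age falls within the suggested range
        return self.min_age <= age <= self.max_age

movie = KidsMovie('My Neighbor Totoro', 4, 10)
print(7 in movie)    # -> True
print(15 in movie)   # -> False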

Code

Class diagram for people who enjoy their UMLs:

Creating our own subscriptable object

In my previous post, the Basket object can be iterated over (for example, in a for loop). But it is not subscriptable:

To make it subscriptable, we refactor the original code on the right so it becomes like the one on the left:

We only need to implement these methods:

  • __getitem__(index)
  • __len__()
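A minimal sketch of the refactor, assuming the Basket simply wraps a list (not necessarily the author's exact code):

class Basket:
    def __init__(self, *args):
        self._list = list(args)

    def __getitem__(self, index):
        return self._list[index]

    def __len__(self):
        return len(self._list)

basket = Basket('potato', 'apple', 'cheese')
print(basket[1])      # -> apple
print(len(basket))    # -> 3

Implementing __getitem__ also keeps the Basket iterable, since Python falls back to indexing from 0 until an IndexError is raised.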

Implementing our own iterable object

You only need to implement the following methods:

  • __iter__()
  • __next__()

Example

for b in Basket('potato', 'apple', 'cheese'):
  print(b)

-> potato
-> apple
-> cheese

Code

Using list.pop():

class Basket:
  def __init__(self, *arg):
    self._list = list(arg)
    self._list.reverse()   # reverse so pop() hands the items back in the original order

  def __iter__(self):
    return self

  def __next__(self):
    if len(self._list) == 0:
      raise StopIteration()   # nothing left, so stop the loop
    return self._list.pop()   # pop from the end of the reversed list

Without list.pop():

class Basket:
  def __init__(self, *arg):
    self._list = list(arg)
    self._index = len(self._list)
    self._list.reverse()

  def __iter__(self):
    return self

  def __next__(self):
    # walk the reversed list backwards, which yields the items in the original order
    self._index -= 1

    if self._index < 0:
      raise StopIteration()

    return self._list[self._index]

Sequence diagram

  1. Python calls iter() on the Basket instance
  2. Basket’s __iter__() magic method gets called…
  3. …which just returns the Basket itself
  4. From here on, Python calls next() on the Basket in a loop
  5. …which in turn causes Basket to call its __next__() magic method
  6. …which returns the next element
  7. When there are no more elements, the process stops through the StopIteration exception

#iterable #python

Demystifying Python context managers

with open(secret_file) as eyes_only:
	read_discreetly(eyes_only)

The point behind this is that if you opened a secret file, you do not need to 💥 it explicitly.

Here’s the same implementation without a context manager:

eyes_only = open(secret_file)
try:
	read_discreetly(eyes_only)
finally:
	eyes_only.destroy()

Notice that here you MUST NOT forget to 💥 after reading said secret file.

How

A class can become a context manager just by implementing these:

  • __enter__()
  • __exit__()

Example:

with Freezer() as icebox:
	look_for('yummy treats')

class Freezer():

	def __enter__(self):
		# runs when the with block starts
		open_door()
		return self

	def __exit__(self, ex_type, ex_value, ex_traceback):
		# runs when the with block ends, even if an exception was raised
		close_door()

  1. The Freezer’s door opens for you automatically
  2. You look for yummy treats…
  3. The Freezer’s door closes for you automatically

The mechanism that makes this happen:
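Roughly speaking, the with statement expands into something like this (simplified; the real protocol also passes any exception details to __exit__):

freezer = Freezer()
icebox = freezer.__enter__()             # the door opens
try:
    look_for('yummy treats')
finally:
    freezer.__exit__(None, None, None)   # the door closes, no matter what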

The Pythonic way of reading lines of text in a file

The above contents as processed by the for loop line by line:

The 'r' argument means the file is opened in read-only mode
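A minimal sketch, assuming a plain text file (the filename some_file.txt is made up):

with open('some_file.txt', 'r') as lines:
    for line in lines:
        # each line still carries its trailing newline
        print(line.rstrip('\n'))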

Python for loop behind the scenes

Here we have a string iterated by a for loop:

Below is how it was done:

When it goes past the last element, it raises a StopIteration exception (which is handled gracefully by the for loop).
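A sketch of the same mechanism done by hand with iter() and next() (the string is made up):

word = 'ham'
iterator = iter(word)   # what the for loop does first

print(next(iterator))   # -> h
print(next(iterator))   # -> a
print(next(iterator))   # -> m
# one more next(iterator) would raise StopIteration,
# which the for loop catches in order to end the loop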

When your GitHub credentials in Homebrew become invalid

Clear the stale credential from the macOS keychain:

printf \

Generate a new personal access token in your GitHub dashboard (and copy the new token somewhere safe, because GitHub will show it to you only once):

Copy the generated access token and export it to your .rc file:

echo 'export HOMEBREW_GITHUB_API_TOKEN=new_token_xyz' >> ~/.zshrc

Fun with Arduino: Controlling a servo with potentiometer

#electronics, #sensors, #actuators, #IoT

Fun with Electronics: Using a transistor to drive a relay

#electronics

Creating a basic Django app with Github integration and deployed to Heroku

#commandline, #vscode, #backend, #showdonttell

Fun with Arduino: Controlling servo with flex sensor

#electronics, #sensors, #actuators, #microcontrollers, #arduino

Fun with Arduino: Controlling speed and spin direction of DC motor

#electronics, #potentiometer, #pushbutton, #switch, #dc motor, #h-bridge, #transistor, #ic

Fun with Arduino: ‘softpot’ variable resistor

#electronics, #arduino, #IoT, #microcontrollers, #embeddedsystems

Neural Network Backpropagation

Forward pass

The forward pass, leading up to the ReLU activation function, can be summed up by this, where z equals the sum of the weighted inputs (w·x) plus the bias:

Step 1: Multiply the inputs (x) with the weights (w)

Step 2: Group the weighted inputs (xw0-2) and the bias (b)

Step 3: Add them all together

Step 4: Feed the result to the activation function
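A small NumPy sketch of those four steps (the numbers are made up):

import numpy as np

x = np.array([1.0, -2.0, 3.0])   # inputs
w = np.array([0.2, 0.8, -0.5])   # weights
b = 2.0                          # bias

weighted = x * w                 # step 1: multiply the inputs by the weights
z = np.sum(weighted) + b         # steps 2-3: group the weighted inputs and the bias, then add them all together
y = np.maximum(0.0, z)           # step 4: feed the result to the ReLU activation function

print(z, y)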

YouTube video here

Rust: Unpacking functions

Partial Derivatives

The partial derivative of a sum with respect to x or y equals 1 (i.e. if you only see addition, the result is always 1):

The partial derivative of x multiplied by y, with respect to x or y, equals the other variable (i.e. if you only see multiplication between x and y, they exchange values with each other, always: ∂(xy)/∂x = y and ∂(xy)/∂y = x):

The partial derivative of max() of 2 variables with respect to either of them is 1 if that variable is the larger one, otherwise 0:

The derivative of max() of a single variable x (i.e. max(x, 0)) equals 1 if x > 0, otherwise 0:

  • 1 if x > y else 0
  • 1 if x > 0 else 0

The derivative of chained functions equals the product of the partial derivatives of the subsequent functions (the chain rule):
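A quick numerical sanity check of those rules using finite differences (the sample point x = 3, y = 5 is arbitrary):

def partial(f, x, y, wrt='x', h=1e-6):
    # approximate the partial derivative with a finite difference
    if wrt == 'x':
        return (f(x + h, y) - f(x, y)) / h
    return (f(x, y + h) - f(x, y)) / h

x, y = 3.0, 5.0

print(partial(lambda x, y: x + y, x, y, 'x'))      # ~1.0 (addition only)
print(partial(lambda x, y: x * y, x, y, 'x'))      # ~5.0, i.e. the value of y
print(partial(lambda x, y: x * y, x, y, 'y'))      # ~3.0, i.e. the value of x
print(partial(lambda x, y: max(x, y), x, y, 'y'))  # ~1.0, since y is the larger variable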


Gradient

The gradient is a vector of all the possible partial derivatives of a function (the inverted triangle, ∇, is just the notation/symbol for it):

Rust: Ownership

SomeObject (represented by the blue ball) resides in memory at some location (0xabc123beef is the address). It is then assigned to the identifier x, making x its owner:

x is passed as a parameter to the function process_object():

The process_object() function now has ownership. x gets released and ceases to exist:

Now, if the function creates some new object, it becomes that object’s owner:

If the function returns an object, it can be assigned to a new owner:

Here, the function relinquishes ownership to y:

Derivatives

Writing derivatives using the Leibniz notation

All of these are the different ways to write the derivative of a function.

The derivative of

constant function

The derivative equals 0 since there is no change in y from one x value to any other; there is no slope.


linear function

For y = x the derivative equals 1: y changes by the same amount for every change in x.

The derivative of a linear function equals its slope m (in this example, m = 2).


quadratic function

At any point x, the slope of the tangent line will be 6x.
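Assuming the quadratic pictured here is f(x) = 3x² (only the 6x slope is quoted above, so the exact function is an assumption), the power rule gives f'(x) = 3 · 2x = 6x.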


quadratic function with addition


multi-dimensional function


Remember, the derivative of a constant is always 0.

Softmax activation function and exponential function

Softmax function: An activation function for classification

Exponential function

  • the output is always non-negative
  • the more negative the input, the closer the output gets to 0
  • if the input is 0, the output is 1
  • otherwise, the output grows exponentially (which is why its results must be tempered by Softmax; see the sketch below)
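A minimal NumPy sketch of the softmax function; subtracting the maximum before exponentiating is a common numerical-stability trick, not something specific to this post:

import numpy as np

def softmax(values):
    # exponentiate, then normalize so the outputs sum to 1
    exps = np.exp(values - np.max(values))   # shift for numerical stability
    return exps / np.sum(exps)

scores = np.array([2.0, 1.0, 0.1])
print(softmax(scores))          # -> probabilities that add up to 1
print(softmax(scores).sum())    # -> 1.0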

Activating ReLU function in the hidden layers

Our neural network, which is not densely connected, has 2 hidden layers with 8 neurons each. There’s only a one-to-one correspondence between each neuron in the input, layer 1, layer 2, and the output. We will use both hidden layers to fit the line to the sine wave function.

We start by assigning baseline values, giving us this output.

Setting the hidden layers’ weight values to 1.0 results in this linear line (i.e. weights tend to affect the slope of the line).

Increasing the weights increases the slope even more.

Increasing layer 2’s bias by a half nudges the entire line up by a half.

Flipping layer 1’s weight results in flipping the line (*explanation after the illustration).

ReLU note: y = 0 if x <= 0, else y = x; that’s why the line does not descend below 0.

Code

NumPy:
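The post's own snippet isn't reproduced here; below is a rough single-neuron sketch of the ideas in this walkthrough (weights scale the slope, the bias shifts the line, and ReLU clips everything below 0), with made-up numbers:

import numpy as np

def relu(z):
    return np.maximum(0.0, z)

x = np.linspace(-1, 1, 5)

weight, bias = 1.0, 0.0
print(relu(weight * x + bias))   # baseline: a straight line, clipped at 0

weight = 6.0
print(relu(weight * x + bias))   # a bigger weight -> a steeper slope

bias = 0.5
print(relu(weight * x + bias))   # the bias nudges the whole line up by a half

weight = -6.0
print(relu(weight * x + bias))   # flipping the weight flips the line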


Moving on, let’s flip layer 2’s weight to flip the line vertically.

To move the whole line upward by a half, we set the bias of the bottommost neuron in layer 2 to 0.5.

At this stage we have completed the first section of the process. The slope of the leftmost line, more or less, follows the contour of that section of the sine wave.

The aim is to get to this stage, where every section follows the form of the sine wave.