Covid Isolation Week 11 Day 2 – domestication

It says something when I take a day off work and do nothing other than clean the house… that’s how bad this lockdown is getting.

That’s a slight exaggeration – my new-found domesticity started just before lockdown, but it has certainly been cemented as a habit. I now tend to sort things when I spot they need doing instead of leaving them for another time. It has helped massively on the domestic front and has kept all those petty housework arguments to a minimum.

So today has been a day of tidying the kitchen and giving it a good deep clean. In between that, my parents dropped off some shopping from Cristyn, and we arranged to go for a socially responsible, regulation-distance walk over Beechwood Park later in the week. I’m hoping that on Friday the Welsh government starts relaxing the rules slightly – certainly I’m hoping to be able to drive somewhere else for a walk.

Anyway… the normal walk isn’t too bad 🙂

Covid-19 Symptom Check

Still good…

Covid Isolation Week 10 Day 6 – films and box-sets

There are many things keeping us going over the long lockdown – not just debating the actions of politicians and their special advisers. Films and box-sets have been great… some new finds, some old classics.

So for today’s post, here are the top ones from the past 10 weeks.

Films:-

  • High Fidelity
  • Grosse Pointe Blank
  • Knocked Up
  • 10 Things I Hate About You
  • Point Break
  • Old School
  • School of Rock
  • Superbad
  • Léon
  • The Breakfast Club
  • A Knight’s Tale
  • Ferris Bueller’s Day Off
  • The Hangover

With some great series being:-

  • Line of Duty
  • Better Call Saul
  • The Wire
  • Ozark
  • Killing Eve


Covid Isolation Week 10 Day 3/4 – more of the same

More of the same… but things seem to be moving towards lifting restrictions – being able to meet outdoors, not until the end of next week, but there is hope. I’m not sure how wise or how damaging this might be, but I get the feeling there is a move to open things up, as a vaccine might not be here for a long while.
Which leads to the question: what do we do as things start lifting?
I’m 100% sure I don’t want to catch it, and if it were just me I’d happily isolate for the rest of the year.

But it’s not just me, and Steph and Isaac are far more social… it’s a tricky call.

Plus we could all do with a holiday to look forward to… somewhere, sometime soon… somewhere other than the regular route:

Anyway, Isaac continues to FaceTime his friends and nag me about a YouTube channel… which I’ve been putting off, but I might relent next week in half term. In the meantime I’ve been teasing him with some photoshopping…

Covid-19 Symptom Check

Still good – still stir-crazy

Covid Isolation Week 10 Day 2 – The lockdown grind grinds on

It really is getting boring now… I’m even bored of the same walk in the nature reserve. On the plus side, Isaac is now proficient in his times tables, forwards and backwards – except his 12s, but then who uses their 12s? His school work seems a little on the easy side overall, with him normally powering through three days’ worth of maths in an hour. Even his English seems to be flying along – but it’s hard to judge given the lack of contact with other parents.

Steph has started taking him for a run each morning to try and up the amount of activity he’s doing. So far they’ve both enjoyed it – long may it continue.

I think the bit I’m struggling with is the disrupted work pattern.

I start at 7:30… get online… and work through until 12:30. Then it’s lunch and schooling Isaac, the usual walk… and then I try to get back online around 3, finishing about 17:30. What I find is that I’ve just cleared my emails and caught up with people when it’s time for lunch… and after lunch there never seems to be enough time.

First world Covid problems.

Anyway today’s walk…

Covid-19 Symptom Check

A-OK

Machine Learning – Tutorial 28

Visualization and Predicting with our Custom SVM

https://pythonprogramming.net/predictions-svm-machine-learning-tutorial/
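
This part builds on the optimisation from part 2: predict now marks each classified point with a star in the class colour, and a new visualise method scatters the training data and draws the positive and negative support-vector hyperplanes plus the dashed decision boundary.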


import matplotlib.pyplot as plt
from matplotlib import style
import numpy as np
style.use('ggplot')

# build SVM class

class Support_Vector_Machine:
    # The __init__ method of a class is one that runs whenever an object is created with the class
    # calling self in the class allows sharing of variables across the class, so is included in all function defs
    def __init__(self, visualisation=True):
        # sets visualisations to what ever the user specifies (defaults to True)
        self.visualisation = visualisation
        # defines colours for the two states 1 & -1
        self.colors = {1:'r', -1:'b'}
        # sets some standards for the graphs
        if self.visualisation:
            self.fig = plt.figure()
            self.ax = self.fig.add_subplot(1,1,1)
    # train
    def fit(self, data):
        # set up access to the data that's passed when the function is called
        self.data = data
        # { ||w||: [w, b] }
        opt_dict = {}
        # sign transformations to apply to w (checks all four quadrants)
        transforms = [[1,1],
                      [-1,1],
                      [-1,-1],
                      [1,-1]]
        # finding values to work with for our ranges.
        all_data = [] # set up a placeholder for the values
        # for loop to step through data and append it to all_data (list of values)
        for yi in self.data:
            for featureset in self.data[yi]:
                for feature in featureset:
                    all_data.append(feature)
        # next define the max and min value in list
        self.max_feature_value = max(all_data)
        self.min_feature_value = min(all_data)
        # free up memory once we've got the values
        all_data=None
        # define step size for optimisation Big through to small
        step_sizes = [self.max_feature_value * 0.1,
                      self.max_feature_value * 0.01,
                      # starts getting very high cost after this.
                      self.max_feature_value * 0.001]

        # extremely expensive
        b_range_multiple = 5
        b_multiple = 5
        # first element in vector w
        latest_optimum = self.max_feature_value*10

        ## Begin the stepping process
        for step in step_sizes:
            w = np.array([latest_optimum,latest_optimum])
            # we can do this because convex
            optimized = False
            while not optimized:
                # we're not optimising b as much as w (not needed)
                for b in np.arange(-1*(self.max_feature_value*b_range_multiple),
                                   self.max_feature_value*b_range_multiple,
                                   step*b_multiple):
                    for transformation in transforms:
                        w_t = w*transformation
                        found_option = True
                        # weakest link in the SVM fundamentally
                        # SMO attempts to fix this a bit
                        # yi(xi.w+b) >= 1
                        #
                        # #### add a break here later..
                        for i in self.data:
                            for xi in self.data[i]:
                                yi=i
                                if not yi*(np.dot(w_t,xi)+b) >= 1:
                                    found_option = False

                        if found_option:
                            opt_dict[np.linalg.norm(w_t)] = [w_t,b]

                if w[0]<0:
                    optimized = True
                    print('optimised a step')
                else:
                    w = w - step

            # break out of while loop
            # take a list of the magnitudes and sort them
            norms = sorted([n for n in opt_dict]) # sorting lowest to highest
            #||w|| : [w,b]
            opt_choice = opt_dict[norms[0]] # smallest magnitude
            self.w = opt_choice[0] # sets w to first element in the smallest mag
            self.b = opt_choice[1] # sets b to second element in the smallest mag
            latest_optimum = opt_choice[0][0]+step*2  # resetting the opt to the latest

    def predict(self,features):
        # sign( x.w+b )
        classification = np.sign(np.dot(np.array(features),self.w)+self.b)
        if classification !=0 and self.visualisation:
            self.ax.scatter(features[0], features[1], s=100, marker='*', c=self.colors[classification])

        return classification

    def visualise(self):
        # scatter the known featuresets (nested one-line for loop)
        [[self.ax.scatter(x[0], x[1], s=100, color=self.colors[i]) for x in self.data[i]] for i in self.data]
        # hyperplane = x.w+b
        def hyperplane(x,w,b,v):
            # v = (w.x+b)
            return (-w[0]*x-b+v) / w[1]

        datarange = (self.min_feature_value*0.9,self.max_feature_value*1.1) # gives space on the graph
        hyp_x_min = datarange[0]
        hyp_x_max = datarange[1]

        # w.x + b = 1
        # pos sv hyperplane
        psv1 = hyperplane(hyp_x_min, self.w, self.b, 1) # define the ys
        psv2 = hyperplane(hyp_x_max, self.w, self.b, 1) # define the ys
        self.ax.plot([hyp_x_min,hyp_x_max], [psv1,psv2], "k") # plot xs then ys; "k" = solid black line
        # w.x + b = -1
        # negative sv hyperplane
        nsv1 = hyperplane(hyp_x_min, self.w, self.b, -1)
        nsv2 = hyperplane(hyp_x_max, self.w, self.b, -1)
        self.ax.plot([hyp_x_min,hyp_x_max], [nsv1,nsv2], "k")

        # w.x + b = 0
        # decision
        db1 = hyperplane(hyp_x_min, self.w, self.b, 0)
        db2 = hyperplane(hyp_x_max, self.w, self.b, 0)
        self.ax.plot([hyp_x_min,hyp_x_max], [db1,db2], "g--")

        plt.show()

# define data dictionary
data_dict = {-1:np.array([[1,7],
                          [2,8],
                          [3,8],]),

             1:np.array([[5,1],
                         [6,-1],
                         [7,3],])}

svm = Support_Vector_Machine()
svm.fit(data=data_dict)
predict_us = [[0,10],
              [1,3],
              [3,4],
              [3,5],
              [5,5],
              [5,6],
              [6,-5],
              [5,8]]

for p in predict_us:
    svm.predict(p)

svm.visualise()
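
One bit worth unpacking: for a 2-D weight vector, hyperplane() just rearranges w[0]*x + w[1]*y + b = v into y = (-w[0]*x - b + v) / w[1], so each plotted line is the set of points where the decision function equals a fixed v – +1 for the positive support-vector hyperplane, -1 for the negative one, and 0 for the decision boundary itself.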

Machine Learning – Tutorial 27

Support Vector Machine Optimization in Python part 2

https://pythonprogramming.net/svm-optimization-python-2-machine-learning-tutorial/
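
This part fills in the brute-force optimisation inside fit: starting w from a large value, it steps down through progressively finer step sizes, tries the four sign transformations of w against a range of b values, records every (w, b) pair that satisfies yi*(xi.w + b) >= 1 for all the data, and keeps the pair with the smallest ||w||.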


import matplotlib.pyplot as plt
from matplotlib import style
import numpy as np
style.use('ggplot')

# build SVM class

class Support_Vector_Machine:
    # The __init__ method of a class is one that runs whenever an object is created with the class
    # calling self in the class allows sharing of variables across the class, so is included in all function defs
    def __init__(self, visualisation=True):
        # sets visualisations to what ever the user specifies (defaults to True)
        self.visualisation = visualisation
        # defines colours for the two states 1 & -1
        self.colors = {1:'r', -1:'b'}
        # sets some standards for the graphs
        if self.visualisation:
            self.fig = plt.figure()
            self.ax = self.fig.add_subplot(1,1,1)
    # train
    def fit(self, data):
        # set up access to the data that's passed when the function is called
        self.data = data
        # { ||w||: [w, b] }
        opt_dict = {}
        # sign transformations to apply to w (checks all four quadrants)
        transforms = [[1,1],
                      [-1,1],
                      [-1,-1],
                      [1,-1]]
        # finding values to work with for our ranges.
        all_data = [] # set up a placeholder for the values
        # for loop to step through data and append it to all_data (list of values)
        for yi in self.data:
            for featureset in self.data[yi]:
                for feature in featureset:
                    all_data.append(feature)
        # next define the max and min value in list
        self.max_feature_value = max(all_data)
        self.min_feature_value = min(all_data)
        # free up memory once we've got the values
        all_data=None
        # define step size for optimisation Big through to small
        step_sizes = [self.max_feature_value * 0.1,
                      self.max_feature_value * 0.01,
                      # starts getting very high cost after this.
                      self.max_feature_value * 0.001]

        # extremely expensive
        b_range_multiple = 5
        b_multiple = 5
        # first element in vector w
        latest_optimum = self.max_feature_value*10

        ## Begin the stepping process
        for step in step_sizes:
            w = np.array([latest_optimum,latest_optimum])
            # we can do this because convex
            optimized = False
            while not optimized:
                # we're not optimising b as much as w (not needed)
                for b in np.arange(-1*(self.max_feature_value*b_range_multiple),
                                   self.max_feature_value*b_range_multiple,
                                   step*b_multiple):
                    for transformation in transforms:
                        w_t = w*transformation
                        found_option = True
                        # weakest link in the SVM fundamentally
                        # SMO attempts to fix this a bit
                        # yi(xi.w+b) >= 1
                        # 
                        # #### add a break here later..
                        for i in self.data:
                            for xi in self.data[i]:
                                yi=i
                                if not yi*(np.dot(w_t,xi)+b) >= 1:
                                    found_option = False
                                    
                        if found_option:
                            opt_dict[np.linalg.norm(w_t)] = [w_t,b]
                                
                if w[0]<0:
                    optimized = True
                    print('optimised a step')
                else:
                    w = w - step

            # break out of while loop  
            # take a list of the magnitudes and sort them          
            norms = sorted([n for n in opt_dict]) # sorting lowest to highest
            #||w|| : [w,b]
            opt_choice = opt_dict[norms[0]] # smallest magnitude
            self.w = opt_choice[0] # sets w to first element in the smallest mag
            self.b = opt_choice[1] # sets b to second element in the smallest mag
            latest_optimum = opt_choice[0][0]+step*2  # resetting the opt to the latest               

    def predict(self,features):
        # sign( x.w+b )
        classification = np.sign(np.dot(np.array(features),self.w)+self.b)

        return classification

# define data dictionary
data_dict = {-1:np.array([[1,7],
                          [2,8],
                          [3,8],]),

             1:np.array([[5,1],
                         [6,-1],
                         [7,3],])}
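
To sanity-check the class at this stage, a minimal sketch (this mirrors how the next part exercises it – with visualisation off, predict just returns the class):

svm = Support_Vector_Machine(visualisation=False) # skip creating the matplotlib figure
svm.fit(data=data_dict)
print(svm.predict([3, 4])) # np.sign of x.w + b, so -1.0 or 1.0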