So, without further ado...

### **Artificial Neural Networks**

I'm going to assume *some* familiarity with ANNs - what they are, their general structure/philosophy. See here for a comprehensive introduction.

As a first pass, I'll introduce a motivating example, go through the math and then implement (in Python) an ANN from scratch.

### **Definition of the problem**

We'll look at a classification problem in 2 dimensions (to make visualisation easier, read: possible) where our old friend Logistic Regression struggles. It highlights a key skill in any machine learning / statistical problem: understanding when, and with what tools, to attack a problem.

#### **The data:**

We'll use a toy dataset from sklearn called *make_moons*, with the following call:

```
import pandas as pd
import sklearn.datasets
import sklearn.linear_model
import seaborn as sb

# Generate 200 points in two interleaving half-moon classes
X, y = sklearn.datasets.make_moons(200, noise=0.2)

df = pd.DataFrame()
df['x1'] = X[:, 0]
df['x2'] = X[:, 1]
df['z'] = y

# Scatter plot coloured by class, with no regression line
sb.lmplot(x='x1', y='x2', hue='z', data=df, fit_reg=False)
```

#### **The task:**

We want to calculate the decision boundary between the two (blue and green) classes. At a glance it's clearly non-linear, so let's see how a Logistic Regression classification goes!
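A minimal sketch of that fit (assuming the same `X`, `y` from the *make_moons* call above; the exact accuracy will vary with the random draw):

```python
import sklearn.datasets
import sklearn.linear_model

# Same toy data as above: two interleaving half-moons
X, y = sklearn.datasets.make_moons(200, noise=0.2)

# Fit a plain Logistic Regression classifier
clf = sklearn.linear_model.LogisticRegression()
clf.fit(X, y)

# Training accuracy - respectable, but the single linear
# boundary cannot trace the curved class shapes
print(clf.score(X, y))
```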

#### **Results:**

We can see that Logistic Regression can only fit a linear decision boundary (see below), even though the actual boundary is decidedly non-linear. We can see why the decision boundary must be linear from the following:

$$ P_{boundary} \equiv \frac{1}{1+e^{- \theta \cdot x}} = 0.5 $$

$$ \implies 1 = e^{- \theta \cdot x}$$

$$ \implies \theta \cdot x = 0 $$

which defines a plane (a line in 2 dimensions).
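We can check this numerically: any point satisfying $\theta \cdot x = 0$ (including the intercept term) is assigned probability exactly 0.5. A quick sketch, refitting the model from scratch so it stands alone:

```python
import sklearn.datasets
import sklearn.linear_model

# Fit Logistic Regression on the moons data, as before
X, y = sklearn.datasets.make_moons(200, noise=0.2)
clf = sklearn.linear_model.LogisticRegression().fit(X, y)

theta = clf.coef_[0]      # learned weights
b = clf.intercept_[0]     # bias / intercept term

# Construct a point exactly on the boundary theta.x + b = 0:
# pick x1 freely, then solve for x2
x1 = 0.0
x2 = -(b + theta[0] * x1) / theta[1]

p = clf.predict_proba([[x1, x2]])[0, 1]
print(p)  # ~0.5: on the boundary the sigmoid returns 1/2
```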

In part 2 I will define an ANN that we will work through to establish exactly how it works and how it is trained. We will then construct and implement it in Python to get an idea of the decision boundaries it can produce.
