Introduction to Artificial Neural Networks with Keras

Perceptron

$$ X = \begin{bmatrix} X_{0} & X_{1} & X_{2} \end{bmatrix} $$$$ W = \begin{bmatrix} w_{00} & w_{01} & w_{02} \\ w_{10} & w_{11} & w_{12} \\ w_{20} & w_{21} & w_{22} \end{bmatrix}$$

For each weight $w_{ij}$: $i$ is the index of the neuron in layer 0 (the input), and $j$ is the index of the neuron in layer 1 (the output).

$$ L = \begin{bmatrix} L_{0} & L_{1} & L_{2} \end{bmatrix} $$$$ L = XW $$$$ \text{Outputs} = \phi(L) $$

$\phi$ is called the activation function; the classic perceptron uses a step function (e.g., the Heaviside step).
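The forward pass above can be sketched in NumPy. The input vector, weight values, and the choice of a Heaviside step for $\phi$ are illustrative:

```python
import numpy as np

def step(z):
    """Heaviside step activation: 1 where z >= 0, else 0."""
    return (z >= 0).astype(int)

# Illustrative values: a 1x3 input row vector X and a 3x3 weight matrix W,
# matching the shapes in the equations above.
X = np.array([[1.0, 0.5, -0.5]])
W = np.array([[ 0.2, -0.1, 0.4],
              [ 0.7,  0.3, 0.1],
              [-0.5,  0.2, 0.6]])

L = X @ W            # linear combination, L = XW (shape 1x3)
outputs = step(L)    # Outputs = phi(L), one 0/1 value per layer-1 neuron
print(outputs)       # → [[1 0 1]]
```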

Multilayer Perceptron (MLP)

Regression MLPs
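A minimal regression MLP sketch with the Keras Sequential API, assuming hidden layers use ReLU, the output layer is a single neuron with no activation, and the loss is mean squared error; the layer sizes and synthetic data here are arbitrary choices:

```python
import numpy as np
from tensorflow import keras

# Illustrative regression MLP: layer sizes are arbitrary; the single
# output neuron has no activation so it can predict any real value.
model = keras.Sequential([
    keras.layers.Input(shape=(8,)),
    keras.layers.Dense(30, activation="relu"),
    keras.layers.Dense(30, activation="relu"),
    keras.layers.Dense(1),                     # no activation for regression
])
model.compile(loss="mse", optimizer="adam")

# Tiny synthetic data just to show the fit/predict calls.
X = np.random.rand(100, 8).astype("float32")
y = X.sum(axis=1, keepdims=True)
model.fit(X, y, epochs=2, verbose=0)
pred = model.predict(X[:3], verbose=0)
print(pred.shape)  # (3, 1)
```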

Classification MLPs

$$ \sigma(z)_{i} = \frac{e^{z_{i}}}{\sum^{k}_{j=1} e^{z_{j}}} $$
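The softmax above can be implemented directly in NumPy; the max-subtraction step is an assumption added for numerical stability and does not change the result:

```python
import numpy as np

def softmax(z):
    """Softmax over the last axis, with max-subtraction for
    numerical stability (mathematically a no-op)."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

z = np.array([2.0, 1.0, 0.1])
p = softmax(z)
print(p)            # one probability per class
print(p.sum())      # softmax outputs sum to 1
```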

Functional API
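A hedged sketch of the Keras Functional API: layers are called like functions on tensors, which allows non-sequential topologies such as this wide & deep layout. Shapes and layer sizes are illustrative:

```python
from tensorflow import keras

# Functional API: each layer is called on the previous tensor,
# so branches and skip connections are straightforward.
inputs = keras.layers.Input(shape=(8,))
hidden1 = keras.layers.Dense(30, activation="relu")(inputs)
hidden2 = keras.layers.Dense(30, activation="relu")(hidden1)
concat = keras.layers.Concatenate()([inputs, hidden2])  # "wide" skip path
output = keras.layers.Dense(1)(concat)
model = keras.Model(inputs=[inputs], outputs=[output])
model.compile(loss="mse", optimizer="adam")
print(model.output_shape)  # (None, 1)
```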

Subclassing API
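A sketch of the Subclassing API, assuming the usual pattern of creating layers in `__init__` and writing the forward pass imperatively in `call()`; useful when the forward pass needs loops or conditionals. Sizes are illustrative:

```python
import tensorflow as tf
from tensorflow import keras

class SimpleMLP(keras.Model):
    """Minimal subclassed model: layers in __init__, logic in call()."""
    def __init__(self, units=30, **kwargs):
        super().__init__(**kwargs)
        self.hidden = keras.layers.Dense(units, activation="relu")
        self.out = keras.layers.Dense(1)

    def call(self, inputs):
        h = self.hidden(inputs)
        return self.out(h)

model = SimpleMLP()
y = model(tf.random.uniform((4, 8)))  # weights are built on first call
print(y.shape)  # (4, 1)
```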

Save and Restore a Model
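A minimal save/restore sketch with `model.save` and `keras.models.load_model`; the filename is an arbitrary choice, and the `.keras` archive format assumes a recent Keras version:

```python
import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(1),
])
model.compile(loss="mse", optimizer="sgd")

model.save("my_model.keras")                       # architecture + weights
restored = keras.models.load_model("my_model.keras")

# The restored model produces identical predictions.
x = np.ones((1, 4), dtype="float32")
same = np.allclose(model.predict(x, verbose=0),
                   restored.predict(x, verbose=0))
print(same)  # True
```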

Using Callbacks

Early Stopping

Checkpoints

TensorBoard
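The three callbacks above can be wired into `fit()` as a list; a sketch with illustrative paths, patience values, and a tiny synthetic dataset:

```python
import numpy as np
from tensorflow import keras

# EarlyStopping halts training when val_loss stops improving;
# ModelCheckpoint keeps the best model on disk; TensorBoard writes
# event logs viewable with `tensorboard --logdir=./logs`.
early_stop = keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=5, restore_best_weights=True)
checkpoint = keras.callbacks.ModelCheckpoint(
    "best_model.keras", save_best_only=True)
tensorboard = keras.callbacks.TensorBoard(log_dir="./logs")

model = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1),
])
model.compile(loss="mse", optimizer="adam")

X = np.random.rand(64, 4).astype("float32")
y = X.sum(axis=1, keepdims=True)
history = model.fit(X, y, validation_split=0.25, epochs=3,
                    callbacks=[early_stop, checkpoint, tensorboard],
                    verbose=0)
print(sorted(history.history))  # includes 'loss' and 'val_loss'
```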

Fine-Tuning Libraries

Fine-Tuning Services

Grid Search a Regression Model with Sklearn
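A sketch of `GridSearchCV` on a regression model; `Ridge` and the `alpha` grid are illustrative choices, and any sklearn-compatible estimator (including a wrapped Keras model) would fit the same pattern:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV

# Synthetic data so the example is self-contained.
X, y = make_regression(n_samples=200, n_features=5, noise=0.1,
                       random_state=42)

# Cross-validated search over the alpha grid (values are illustrative).
grid = GridSearchCV(
    Ridge(),
    param_grid={"alpha": [0.01, 0.1, 1.0, 10.0]},
    cv=5,
    scoring="neg_mean_squared_error",
)
grid.fit(X, y)
print(grid.best_params_)   # alpha chosen by cross-validation
print(grid.best_score_)    # mean CV score of the best setting
```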

Grid Search a Classification Model with Sklearn
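The same `GridSearchCV` pattern for classification, sketched with `LogisticRegression` and an illustrative `C` grid; the scoring switches to a classification metric such as accuracy:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Synthetic binary-classification data for a self-contained example.
X, y = make_classification(n_samples=200, n_features=5, random_state=42)

grid = GridSearchCV(
    LogisticRegression(max_iter=1000),
    param_grid={"C": [0.01, 0.1, 1.0, 10.0]},
    cv=5,
    scoring="accuracy",
)
grid.fit(X, y)
print(grid.best_params_)   # regularization strength chosen by CV
print(grid.best_score_)    # mean CV accuracy of the best setting
```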

Hyperparameter Selection

Reference