Neural Network
teaching: 20
exercises: 0
questions:
- "How can a neural network be used in a machine learning model?"
objectives:
- "Learn how to use an artificial neural network (ANN) in a machine learning model."
keypoints:
- "Artificial neural networks (ANNs) can be trained and used for prediction in R with the neuralnet package."
11 Neural Network
Formulation of a neural network
Here, x1, x2, …, xn are the input variables and w1, w2, …, wn are the weights of the respective inputs. b is the bias, which is summed with the weighted inputs to form the net input.
The bias and the weights are both adjustable parameters of the neuron.
These parameters are adjusted using a learning rule.
The raw output of a neuron can range from −∞ to +∞, and the neuron itself knows no boundary. We therefore need a mechanism that maps the net input of the neuron to its output.
This mapping mechanism is known as the activation function.
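To make the formulation concrete, here is a minimal R sketch (an illustration added to this lesson, with made-up input, weight, and bias values) that sums the weighted inputs with the bias and passes the net input through a sigmoid activation:

# toy inputs, weights, and bias (arbitrary illustrative values)
x <- c(0.5, -1.2, 0.3)
w <- c(0.8, 0.1, -0.4)
b <- 0.2

net <- sum(w * x) + b             # net input: weighted sum of inputs plus bias
sigmoid <- function(z) 1 / (1 + exp(-z))
y <- sigmoid(net)                 # activation function maps the net input to the output
y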
[Figure: activation functions (see the R sketch below)]
[Figure: neural network formulation]
[Figure: basic types of neural networks]
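For reference, a few commonly used activation functions written out as plain R functions (this snippet is an addition for illustration, not part of the original lesson code):

sigmoid <- function(z) 1 / (1 + exp(-z))   # squashes input into (0, 1)
relu    <- function(z) pmax(0, z)          # zero for negative input, identity otherwise

# evaluate each activation on a few sample net inputs
z <- c(-2, -0.5, 0, 0.5, 2)
data.frame(z, sigmoid = sigmoid(z), tanh = tanh(z), relu = relu(z))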
11.1 Implementation
# install the required packages (only needed once)
install.packages(c("neuralnet", "caret"))
Split the data into training and testing sets, then min-max scale it:
library(caret)
library(neuralnet)

datain <- mtcars
set.seed(123)

# Split into training (60%) and testing (40%) sets
indT <- createDataPartition(y = datain$mpg, p = 0.6, list = FALSE)
training <- datain[indT, ]
testing <- datain[-indT, ]

# Min-max scale both sets, using the training minima and maxima
smax <- apply(training, 2, max)
smin <- apply(training, 2, min)
trainNN <- as.data.frame(scale(training, center = smin, scale = smax - smin))
testNN <- as.data.frame(scale(testing, center = smin, scale = smax - smin))
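As an optional sanity check (an addition, not part of the original lesson code), the min-max scaling above should leave every training column in the range [0, 1]:

# every scaled training column should span exactly [0, 1]
apply(trainNN, 2, range)
# the test set is scaled with the training minima/maxima, so it may fall slightly outside [0, 1]
apply(testNN, 2, range)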
Fit the neural network using one hidden layer with 10 neurons (the neuralnet package uses resilient backpropagation, rprop+, by default):
set.seed(123)
# One hidden layer with 10 neurons; linear.output = TRUE because this is a regression problem
ModNN <- neuralnet(mpg ~ cyl + disp + hp + drat + wt + qsec + carb,
                   data = trainNN, hidden = 10, linear.output = TRUE)
plot(ModNN)
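To see how training went, the fitted object stores the final error, the stopping threshold that was reached, and the number of training steps. The row names below follow neuralnet's result.matrix convention; check your installed version if the names differ:

# training summary: final error, reached stopping threshold, and number of steps
ModNN$result.matrix[c("error", "reached.threshold", "steps"), ]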
# Predict mpg for the test set (columns 2:7 and 11 hold the seven predictors)
predictNN <- compute(ModNN, testNN[, c(2:7, 11)])
# Rescale the predictions from [0, 1] back to the original mpg units (mpg is column 1)
predictmpg <- predictNN$net.result * (smax - smin)[1] + smin[1]
# RMSE, R-squared, and MAE on the test set (postResample expects predictions first, observations second)
postResample(pred = as.vector(predictmpg), obs = testing$mpg)
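As a final visual check (an addition to the lesson), plotting predicted against observed mpg shows how close the test-set predictions lie to the 45-degree reference line:

# predicted vs. observed mpg on the test set
plot(testing$mpg, as.vector(predictmpg),
     xlab = "Observed mpg", ylab = "Predicted mpg")
abline(0, 1, lty = 2)   # perfect-prediction reference line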