nnet in caret. Bootstrapping or cross-validation?

I want to train a shallow neural network with one hidden layer using nnet in caret. In trainControl, I used method = "cv" to perform 3-fold cross-validation. The code snippet and results summary are below.

  myControl <- trainControl(## 3-fold CV
    method = "cv",
    number = 3)
  
  nnGrid <- expand.grid(size = seq(1, 10, 3),
                        decay = c(0, 0.2, 0.4))
  
  set.seed(1234)
  
  nnetFit <- train(choice ~ .,
                   data = db,
                   method = "nnet",
                   maxit = 1000,
                   tuneGrid = nnGrid,
                   trainControl = myControl)

I have a few doubts:

  1. The results (attached below) suggest that it performed bootstrapping (25 reps), not cross-validation (I was expecting the model to be trained only three times per hyperparameter combination for 3-fold CV).

  2. I just want to be 100% sure that the model used the original data for training without any pre-processing such as centering and scaling.

  3. I used verboseIter = FALSE in trainControl, but it still prints all the iterations.

  4. Are other libraries such as neuralnet or mxnet better than nnet, and can I replace it with them here?

  5. I want to be sure whether nnet uses a sigmoid activation function in the hidden layer.

Can someone please advise?

No pre-processing
Resampling: Bootstrapped (25 reps) 
Summary of sample sizes: 3492, 3492, 3492, 3492, 3492, 3492, ... 
Resampling results across tuning parameters:

  size  decay  Accuracy   Kappa       
   1    0.0    0.4947424  -0.002382083
   1    0.2    0.5686601   0.141749447
   1    0.4    0.5711497   0.143637446
   4    0.0    0.5076199   0.022765002
   4    0.2    0.7333516   0.468625768
   4    0.4    0.7253675   0.452584882
   7    0.0    0.5002912   0.006079340
   7    0.2    0.7440360   0.488933678
   7    0.4    0.7676500   0.536547080
  10    0.0    0.5064281   0.013648966
  10    0.2    0.7668795   0.535370693
  10    0.4    0.7566465   0.513652332

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were size = 7 and decay = 0.4.



  1. There is a mistake in your train call: the argument should be trControl, not trainControl. Because trainControl is not a recognized argument, train fell back to its default resampling scheme, which is bootstrapping with 25 repetitions.

  2. By default the train function doesn't do any pre-processing (preProcess = NULL), which is consistent with the "No pre-processing" line in your output.

  3. Set trace = FALSE in the train call to stop nnet from printing its optimization trace to the screen; verboseIter controls caret's own logging, not nnet's.

  4. You can find the models available in caret at https://topepo.github.io/caret/available-models.html

  5. It looks like nnet uses a logistic (sigmoid) activation function for the hidden layer; see https://stats.stackexchange.com/questions/78252/whats-the-activation-function-used-in-the-nodes-of-hidden-layer-from-nnet-libra
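As a quick illustration, the logistic (sigmoid) function applied to hidden units can be sketched in a few lines of base R. This is a minimal stand-alone sketch of the activation itself, not nnet's internal code:

```r
# Logistic (sigmoid) activation: squashes any real input into (0, 1)
sigmoid <- function(x) 1 / (1 + exp(-x))

sigmoid(0)    # 0.5
sigmoid(10)   # close to 1
sigmoid(-10)  # close to 0
```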

Additionally, you might want to set the linout parameter to TRUE for regression and leave it FALSE (the default) for classification.
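Putting these points together, a corrected version of the original call might look like the sketch below. It assumes the same data frame db with a factor outcome choice from the question; I have only renamed the misspelled argument and added trace = FALSE:

```r
library(caret)

# 3-fold cross-validation instead of the default 25-rep bootstrap
myControl <- trainControl(method = "cv",
                          number = 3)

nnGrid <- expand.grid(size = seq(1, 10, 3),
                      decay = c(0, 0.2, 0.4))

set.seed(1234)

nnetFit <- train(choice ~ .,
                 data = db,
                 method = "nnet",
                 maxit = 1000,
                 trace = FALSE,          # silence nnet's iteration printout
                 tuneGrid = nnGrid,
                 trControl = myControl)  # note: trControl, not trainControl
```

With trControl spelled correctly, the resampling line in the output should read "Cross-Validated (3 fold)" rather than "Bootstrapped (25 reps)".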
