
Commit

Consistency in numbered list
Following the #1136 heads-up, @jannes-m, I think this is better but please check
Robinlovelace authored Oct 2, 2024
1 parent d16477c commit e811ac3
Showing 1 changed file with 5 additions and 6 deletions.
12-spatial-cv.Rmd

@@ -565,12 +565,11 @@
Repeated 100 times means fitting a total of 125,000 models to identify optimal hyperparameters.
These are used in the performance estimation, which requires the fitting of another 500 models (5 folds \* 100 repetitions; see Figure \@ref(fig:partitioning)).
To make the performance estimation processing chain even clearer, let us write down the commands we have given to the computer:

-1. Performance level (upper left part of Figure \@ref(fig:inner-outer)) - split the dataset into five spatially disjoint (outer) subfolds
-1. Tuning level (lower left part of Figure \@ref(fig:inner-outer)) - use the first fold of the performance level and split it again spatially into five (inner) subfolds for the hyperparameter tuning.
-Use the 50 randomly selected hyperparameters\index{hyperparameter} in each of these inner subfolds, i.e., fit 250 models
-1. Performance estimation: use the best hyperparameter combination from the previous step (tuning level) and apply it to the first outer fold in the performance level to estimate the performance (AUROC\index{AUROC})
-1. Repeat steps 2 and 3 for the remaining four outer folds
-1. Repeat steps 2 to 4, 100 times
+1. Performance level (upper left part of Figure \@ref(fig:inner-outer)): split the dataset into five spatially disjoint (outer) subfolds.
+1. Tuning level (lower left part of Figure \@ref(fig:inner-outer)): use the first fold of the performance level and split it again spatially into five (inner) subfolds for the hyperparameter tuning. Use the 50 randomly selected hyperparameters\index{hyperparameter} in each of these inner subfolds, i.e., fit 250 models.
+1. Performance estimation: use the best hyperparameter combination from the previous step (tuning level) and apply it to the first outer fold in the performance level to estimate the performance (AUROC\index{AUROC}).
+1. Repeat steps 2 and 3 for the remaining four outer folds.
+1. Repeat steps 2 to 4, 100 times.

The process of hyperparameter tuning and performance estimation is computationally intensive.
To decrease model runtime, **mlr3** offers the possibility to use parallelization\index{parallelization} with the help of the **future** package.
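[Editor's note: to make the fold arithmetic in this hunk concrete, here is a minimal sketch of the nested setup in **mlr3** syntax. It is not the commit's code: the task object `task`, the SVM learner, and its tuning ranges are illustrative assumptions; only the fold counts, the 50 random draws, and the 100 repetitions come from the text above.]

```r
# Sketch of the nested spatial CV described in the list above; assumes a
# spatial classification task `task` exists, learner/ranges are illustrative
library(mlr3)              # core framework: lrn(), rsmp(), msr(), resample()
library(mlr3learners)      # provides classif.svm (wraps e1071)
library(mlr3tuning)        # auto_tuner(), tnr()
library(mlr3spatiotempcv)  # spatial resamplings: spcv_coords, repeated_spcv_coords
library(paradox)           # to_tune()

# Tuning level: 5 spatially disjoint inner folds x 50 random hyperparameter
# draws = 250 models per outer fold
at = auto_tuner(
  tuner      = tnr("random_search"),
  learner    = lrn("classif.svm", predict_type = "prob",
                   kernel = "radial", type = "C-classification",
                   cost   = to_tune(1e-4, 1e4, logscale = TRUE),
                   gamma  = to_tune(1e-4, 1e4, logscale = TRUE)),
  resampling = rsmp("spcv_coords", folds = 5),  # inner, spatial
  measure    = msr("classif.auc"),
  term_evals = 50
)

# Performance level: 5 outer spatial folds repeated 100 times, giving
# 5 * 5 * 50 * 100 = 125,000 tuning models plus 5 * 100 = 500 final models
rr = resample(task, at, rsmp("repeated_spcv_coords", folds = 5, repeats = 100))
rr$aggregate(msr("classif.auc"))  # spatially cross-validated AUROC
```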
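[Editor's note: the parallelization mentioned in the last context line needs no mlr3-specific code, since **mlr3** picks up whatever plan the **future** package has set. A minimal sketch; the worker count is an arbitrary assumption.]

```r
library(future)

# Run the outer resampling iterations in parallel local R sessions;
# workers = 4 is an arbitrary choice, use at most your available core count
plan("multisession", workers = 4)

# Alternative for nested resampling: keep the outer loop sequential and
# parallelize the inner tuning loop instead:
# plan(list("sequential", "multisession"))
```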

1 comment on commit e811ac3

@jannes-m (Collaborator)

thanks 👍
