
Table 1 Short description of the classifiers used in the paper

From: Boosting for high-dimensional two-class prediction

| Name | Base classifier | Boosting method | Number of boosting iterations^a |
|---|---|---|---|
| CART(1) | Stump | - | - |
| CART(5) | CART-5 | - | - |
| AdaBoost.M1(1) | Stump | AdaBoost.M1 | 10, 100, 200, 300 |
| AdaBoost.M1(5) | CART-5 | AdaBoost.M1 | 10, 100, 200, 300 |
| AdaBoost.M1.ICV(5) | CART-5 | AdaBoost.M1^b | 10, 100, 200, 300 |
| GrBoost(1) | Stump | Gradient Boosting | 10, 100, 200, 300 |
| GrBoost(5) | CART-5 | Gradient Boosting | 10, 100, 200, 300 |
| ST-GrBoost(1) | Stump | Gradient Boosting^c | 100, 300, 500, optimal^d |
| ST-GrBoost(5) | CART-5 | Gradient Boosting^c | 100, 300, 500, optimal^d |
| LogitBoost(1) | Stump | LogitBoost | 10, 100, 200, 300 |

  a. The number of boosting iterations considered when evaluating the effect of sample size and between-class difference and in the reanalysis of real data. In the other simulation settings, up to 1000 iterations were considered for each classifier.
  b. The cross-validated error rate was used to update the weights.
  c. In each boosting iteration, 50 % of the training-set samples are randomly selected and used to fit the model.
  d. The optimal number of boosting iterations based on the out-of-bag estimate.
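
To make footnotes c and d concrete, the sketch below shows how a stochastic gradient boosting classifier like ST-GrBoost(1) could be configured with scikit-learn: 50 % of the training samples are drawn at each iteration, and the number of boosting iterations is chosen from the out-of-bag improvement. This is a minimal illustration, not the authors' implementation; the toy data, the scikit-learn estimator, and all parameter values are assumptions made here for demonstration only.

```python
# Illustrative sketch of a stochastic gradient boosting setup in the spirit of
# ST-GrBoost(1): stump base learners, 50 % subsampling per iteration (footnote c),
# and OOB-based choice of the number of iterations (footnote d).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Toy high-dimensional two-class data (placeholder for the paper's data).
X, y = make_classification(n_samples=100, n_features=1000, n_informative=20,
                           random_state=0)

model = GradientBoostingClassifier(
    n_estimators=500,   # upper bound; 100, 300, 500 appear in Table 1
    max_depth=1,        # stumps; max_depth=5 would mimic the CART-5 base classifier
    subsample=0.5,      # footnote c: 50 % of training samples per iteration
    random_state=0,
)
model.fit(X, y)

# Footnote d: take the iteration where the cumulative out-of-bag improvement
# of the loss peaks (oob_improvement_ is available because subsample < 1).
oob_cumsum = np.cumsum(model.oob_improvement_)
optimal_iterations = int(np.argmax(oob_cumsum)) + 1
print("OOB-optimal number of boosting iterations:", optimal_iterations)
```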