
Table 3 Maximum absolute bias or difference between true* and batch effect conditions following batch adjustment

From: Propensity scores as a novel method to guide sample allocation and minimize batch effects during the design of high throughput experiments

| Comparison | Combat adjustment: Average max | Combat adjustment: RMS | Regression adjustment: Average max | Regression adjustment: RMS |
| --- | --- | --- | --- | --- |
| **Case (\(\beta_1\))** | | | | |
| Randomization versus True | 3.16E-01 | 1.64E-01 | 2.89E-01 | 1.35E-01 |
| Stratified Randomization versus True | 2.74E-01 | 1.18E-01 | 2.46E-01 | 9.05E-02 |
| Optimal versus True | **1.76E-01** | **1.31E-02** | **1.71E-01** | **1.25E-02** |
| **Age (\(\beta_2\))** | | | | |
| Randomization versus True | 1.16E-02 | 5.95E-03 | 1.05E-02 | 4.99E-03 |
| Stratified Randomization versus True | 1.14E-02 | 5.86E-03 | 1.03E-02 | 4.62E-03 |
| Optimal versus True | **6.34E-03** | **4.47E-04** | **6.17E-03** | **4.28E-04** |
| **HbA1c (\(\beta_3\))** | | | | |
| Randomization versus True | 3.00E-01 | 1.46E-01 | 2.68E-01 | 1.15E-01 |
| Stratified Randomization versus True | 3.01E-01 | 1.46E-01 | 2.72E-01 | 1.16E-01 |
| Optimal versus True | **1.72E-01** | **1.44E-02** | **1.64E-01** | **1.20E-02** |

  1. The true* condition represents gene expression values from GSE50397 before batch effects were added to the expression sets in each of the 1000 simulation iterations
  2. Within each simulation iteration, absolute bias was calculated as the absolute value of the difference between the observed beta coefficient and the ‘true’ coefficient for each of the 10,000 most variable genes in the expression dataset; maximum bias is the maximum of these absolute biases across genes
  3. Average max: mean of the maximum absolute bias values across all simulation iterations. RMS: root mean square of the maximum absolute bias values across all simulation iterations (a computational sketch follows these notes)
  4. Emboldened values represent the smallest (i.e., lowest-bias) values under each experimental condition
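For readers who want to see how the two summaries in notes 2–3 combine, here is a minimal sketch, assuming the fitted and true coefficients for one term (e.g., \(\beta_1\) for Case) are held in arrays of shape (iterations × genes). The array names and the randomly simulated placeholder values are illustrative assumptions, not the authors' code or data.

```python
import numpy as np

# Placeholder inputs standing in for one coefficient (e.g. beta_1 for Case):
# rows are simulation iterations, columns are the 10,000 most variable genes.
rng = np.random.default_rng(0)
n_iterations, n_genes = 1000, 10_000
beta_true = rng.normal(size=(n_iterations, n_genes))
beta_observed = beta_true + rng.normal(scale=0.05, size=(n_iterations, n_genes))

# Note 2: per gene, absolute bias = |observed beta - 'true' beta|;
# within each iteration, keep the maximum across the genes.
max_abs_bias = np.abs(beta_observed - beta_true).max(axis=1)  # shape: (n_iterations,)

# Note 3: summarize the per-iteration maxima across all iterations.
average_max = max_abs_bias.mean()             # "Average max"
rms = np.sqrt(np.mean(max_abs_bias ** 2))     # "RMS"

print(f"Average max: {average_max:.2e}  RMS: {rms:.2e}")
```

Because RMS squares each iteration's maximum before averaging, it penalizes occasional large maxima more heavily than the plain mean, which is why the two summaries can diverge for the same allocation method.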