CRAN Package Check Results for Package SuperLearner

Last updated on 2018-05-27 10:50:43 CEST.

Flavor                              Version  Tinstall   Tcheck   Ttotal  Status  Flags
r-devel-linux-x86_64-debian-clang   2.0-23       6.31   626.85   633.16  WARN
r-devel-linux-x86_64-debian-gcc     2.0-23       4.75   491.27   496.02  WARN
r-devel-linux-x86_64-fedora-clang   2.0-23                       770.93  OK
r-devel-linux-x86_64-fedora-gcc     2.0-23                       725.67  OK
r-devel-windows-ix86+x86_64         2.0-23       8.00   711.00   719.00  OK
r-patched-linux-x86_64              2.0-23       4.12   588.76   592.88  OK
r-patched-solaris-x86               2.0-23                      1065.50  ERROR
r-release-linux-x86_64              2.0-23       5.79   585.93   591.72  OK
r-release-windows-ix86+x86_64       2.0-23      12.00   846.00   858.00  OK
r-release-osx-x86_64                2.0-23                               OK
r-oldrel-windows-ix86+x86_64        2.0-23       6.00   738.00   744.00  OK
r-oldrel-osx-x86_64                 2.0-23                               NOTE

Check Details

Version: 2.0-23
Check: for unstated dependencies in ‘tests’
Result: WARN
    '::' or ':::' import not declared from: ‘doMC’
Flavors: r-devel-linux-x86_64-debian-clang, r-devel-linux-x86_64-debian-gcc
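This WARN means code under ‘tests’ calls doMC via `::` (or `:::`) while doMC is not declared in any dependency field of the package's DESCRIPTION. Declaring it in Suggests would typically clear the warning; the fragment below is an illustrative sketch of that field (the surrounding entries are assumptions, not the package's actual DESCRIPTION):

```
Suggests:
    testthat,
    doMC
```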

Version: 2.0-23
Check: package dependencies
Result: NOTE
    Package suggested but not available for checking: ‘xgboost’
Flavor: r-patched-solaris-x86
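This NOTE is informational: a package listed in Suggests (xgboost) simply could not be installed on that platform. Package code that relies on a suggested package conventionally guards its use, along these lines (a minimal sketch; the learner call inside the branch is a placeholder, not the package's actual code):

```r
# Guard use of a suggested package so code degrades gracefully on
# platforms where it cannot be installed.
if (requireNamespace("xgboost", quietly = TRUE)) {
  # safe to use xgboost-based functionality here
} else {
  message("xgboost not available; skipping xgboost-based learners.")
}
```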

Version: 2.0-23
Check: tests
Result: ERROR
     Running ‘testthat.R’ [282s/300s]
    Running the tests in ‘tests/testthat.R’ failed.
    Complete output:
     > library(testthat)
     > library(SuperLearner)
     Loading required package: nnls
     Super Learner
     Version: 2.0-23
     Package created on 2018-03-09
    
     >
     > test_check("SuperLearner")
     ── 1. Error: (unknown) (@test-XGBoost.R#2) ────────────────────────────────────
     there is no package called 'xgboost'
     1: library(xgboost) at testthat/test-XGBoost.R:2
     2: stop(txt, domain = NA)
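The ERROR above follows directly from the missing suggested package: test-XGBoost.R calls `library(xgboost)` unconditionally. A common remedy is to skip the test when the package is absent, using testthat's `skip_if_not_installed()`; a minimal sketch (the test name and body are illustrative, not the package's actual test):

```r
# Hypothetical guard for tests/testthat/test-XGBoost.R: skip rather
# than error when the suggested package is unavailable.
library(testthat)

test_that("SL.xgboost works", {
  skip_if_not_installed("xgboost")
  library(xgboost)
  # ... xgboost-based SuperLearner checks ...
})
```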
    
     lasso-penalized linear regression with n=506, p=13
     At minimum cross-validation error (lambda=0.0068):
     -------------------------------------------------
     Nonzero coefficients: 12
     Cross-validation error (deviance): 23.55
     R-squared: 0.72
     Signal-to-noise ratio: 2.58
     Scale estimate (sigma): 4.853
     lasso-penalized logistic regression with n=506, p=13
     At minimum cross-validation error (lambda=0.0024):
     -------------------------------------------------
     Nonzero coefficients: 12
     Cross-validation error (deviance): 0.65
     R-squared: 0.51
     Signal-to-noise ratio: 1.03
     Prediction error: 0.128
     lasso-penalized linear regression with n=506, p=13
     At minimum cross-validation error (lambda=0.0238):
     -------------------------------------------------
     Nonzero coefficients: 11
     Cross-validation error (deviance): 24.01
     R-squared: 0.72
     Signal-to-noise ratio: 2.52
     Scale estimate (sigma): 4.900
     lasso-penalized logistic regression with n=506, p=13
     At minimum cross-validation error (lambda=0.0026):
     -------------------------------------------------
     Nonzero coefficients: 12
     Cross-validation error (deviance): 0.67
     R-squared: 0.50
     Signal-to-noise ratio: 0.98
     Prediction error: 0.128
    
     Call:
     SuperLearner(Y = Y_gaus, X = X, family = gaussian(), SL.library = c("SL.mean",
     "SL.biglasso"), cvControl = list(V = 2))
    
    
     Risk Coef
     SL.mean_All 84.77675 0.00326959
     SL.biglasso_All 23.67154 0.99673041
    
     Call:
     SuperLearner(Y = Y_bin, X = X, family = binomial(), SL.library = c("SL.mean",
     "SL.biglasso"), cvControl = list(V = 2))
    
    
     Risk Coef
     SL.mean_All 0.2382946 0.02063139
     SL.biglasso_All 0.1024175 0.97936861
     Y
     0 1
     66 34
     $grid
     NULL
    
     $names
     [1] "SL.randomForest_1"
    
     $base_learner
     [1] "SL.randomForest"
    
     $params
     $params$ntree
     [1] 100
    
    
     [1] "SL.randomForest_1" "X" "Y"
     [4] "create_rf" "data"
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = create_rf$names,
     cvControl = list(V = 2))
    
    
     Risk Coef
     SL.randomForest_1_All 0.050813 1
     $grid
     mtry
     1 1
     2 4
     3 20
    
     $names
     [1] "SL.randomForest_1" "SL.randomForest_2" "SL.randomForest_3"
    
     $base_learner
     [1] "SL.randomForest"
    
     $params
     list()
    
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = create_rf$names,
     cvControl = list(V = 2))
    
    
     Risk Coef
     SL.randomForest_1_All 0.05805154 0.2008891
     SL.randomForest_2_All 0.04498379 0.6278342
     SL.randomForest_3_All 0.04960304 0.1712766
     $grid
     alpha
     1 0.00
     2 0.25
     3 0.50
     4 0.75
     5 1.00
    
     $names
     [1] "SL.glmnet_0" "SL.glmnet_0.25" "SL.glmnet_0.5" "SL.glmnet_0.75"
     [5] "SL.glmnet_1"
    
     $base_learner
     [1] "SL.glmnet"
    
     $params
     list()
    
     [1] "SL.glmnet_0" "SL.glmnet_0.25" "SL.glmnet_0.5" "SL.glmnet_0.75"
     [5] "SL.glmnet_1"
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = ls(learners),
     cvControl = list(V = 2), env = learners)
    
    
     Risk Coef
     SL.glmnet_0_All 0.04764882 0.5990378
     SL.glmnet_0.25_All 0.05046096 0.0000000
     SL.glmnet_0.5_All 0.04818493 0.4009622
     SL.glmnet_0.75_All 0.05707002 0.0000000
     SL.glmnet_1_All 0.06232678 0.0000000
    
     Call:
     SuperLearner(Y = Y, X = X_clean, family = binomial(), SL.library = c("SL.mean",
     svm$names), cvControl = list(V = 3))
    
    
     Risk Coef
     SL.mean_All 0.2355795 0.1272125
     SL.svm_polynomial_All 0.1746844 0.0000000
     SL.svm_radial_All 0.1662527 0.1505339
     SL.svm_sigmoid_All 0.1588301 0.7222536
     Length Class Mode
     call 24 -none- call
     first.sigma 100 -none- numeric
     sigma 1000 -none- numeric
     sigest 1 -none- numeric
     yhat.train 30000 -none- numeric
     yhat.train.mean 30 -none- numeric
     yhat.test 30000 -none- numeric
     yhat.test.mean 30 -none- numeric
     varcount 13000 -none- numeric
     y 30 -none- numeric
     Length Class Mode
     call 24 -none- call
     yhat.train 30000 -none- numeric
     yhat.test 30000 -none- numeric
     varcount 13000 -none- numeric
     binaryOffset 30 -none- numeric
     Length Class Mode
     call 24 -none- call
     first.sigma 100 -none- numeric
     sigma 1000 -none- numeric
     sigest 1 -none- numeric
     yhat.train 30000 -none- numeric
     yhat.train.mean 30 -none- numeric
     yhat.test 30000 -none- numeric
     yhat.test.mean 30 -none- numeric
     varcount 13000 -none- numeric
     y 30 -none- numeric
     Length Class Mode
     call 24 -none- call
     yhat.train 30000 -none- numeric
     yhat.test 30000 -none- numeric
     varcount 13000 -none- numeric
     binaryOffset 30 -none- numeric
    
     Call:
     SuperLearner(Y = Y_gaus, X = X, family = gaussian(), SL.library = c("SL.mean",
     "SL.dbarts", "SL.bartMachine"), cvControl = list(V = 2))
    
    
     Risk Coef
     SL.mean_All 39.02547 0.0078632
     SL.dbarts_All 25.97348 0.9921368
     SL.bartMachine_All 36.05066 0.0000000
    
     Call:
     SuperLearner(Y = Y_bin, X = X, family = binomial(), SL.library = c("SL.mean",
     "SL.dbarts", "SL.bartMachine"), cvControl = list(V = 2))
    
    
     Risk Coef
     SL.mean_All 0.3288889 0
     SL.dbarts_All 0.1980883 1
     SL.bartMachine_All 0.2282789 0
     ExtraTrees:
     - # of trees: 500
     - node size: 5
     - # of dim: 13
     - # of tries: 4
     - type: numeric (regression)
     - multi-task: no
     ExtraTrees:
     - # of trees: 500
     - node size: 1
     - # of dim: 13
     - # of tries: 3
     - type: factor (classification)
     - multi-task: no
    
     Call:
     SuperLearner(Y = Y_bin, X = X, family = binomial(), SL.library = sl_lib,
     cvControl = list(V = 2))
    
    
     Risk Coef
     SL.extraTrees_All 0.1038481 0.5531783
     SL.ranger_All 0.1047875 0.4468217
     SL.mean_All 0.2436000 0.0000000
     $grid
     NULL
    
     $names
     [1] "SL.extraTrees_1"
    
     $base_learner
     [1] "SL.extraTrees"
    
     $params
     list()
    
    
     Call:
     SuperLearner(Y = Y_bin, X = X, family = binomial(), SL.library = c("SL.mean",
     lib$names), cvControl = list(V = 2))
    
    
     Risk Coef
     SL.mean_All 0.2436000 0
     SL.extraTrees_1_All 0.1124902 1
     SL.extraTrees_1 <- function(...) SL.extraTrees(...)
     $grid
     NULL
    
     $names
     [1] "SL.extraTrees_1"
    
     $base_learner
     [1] "SL.extraTrees"
    
     $params
     list()
    
     [1] "SL.extraTrees_1"
     [1] 1
    
     Call:
     SuperLearner(Y = Y_bin, X = X, family = binomial(), SL.library = c("SL.mean",
     lib$names), cvControl = list(V = 2), env = sl_env)
    
    
     Risk Coef
     SL.mean_All 0.2436000 0
     SL.extraTrees_1_All 0.1130368 1
     $grid
     mtry
     1 1
     2 2
    
     $names
     [1] "SL.extraTrees_1" "SL.extraTrees_2"
    
     $base_learner
     [1] "SL.extraTrees"
    
     $params
     list()
    
     [1] "SL.extraTrees_1" "SL.extraTrees_2"
    
     Call:
     SuperLearner(Y = Y_bin, X = X, family = binomial(), SL.library = c("SL.mean",
     lib$names), cvControl = list(V = 2), env = sl_env)
    
    
     Risk Coef
     SL.mean_All 0.2436000 0
     SL.extraTrees_1_All 0.1133418 0
     SL.extraTrees_2_All 0.1059143 1
     $grid
     mtry
     1 1
     2 2
    
     $names
     [1] "SL.extraTrees_1" "SL.extraTrees_2"
    
     $base_learner
     [1] "SL.extraTrees"
    
     $params
     list()
    
     [1] "SL.extraTrees_1" "SL.extraTrees_2"
    
     Call:
     SuperLearner(Y = Y_bin, X = X, family = binomial(), SL.library = c("SL.mean",
     lib$names), cvControl = list(V = 2), env = sl_env)
    
    
     Risk Coef
     SL.mean_All 0.2544000 0
     SL.extraTrees_1_All 0.1450968 0
     SL.extraTrees_2_All 0.1313301 1
    
     Call: glm(formula = Y ~ ., family = family, data = X, weights = obsWeights,
     model = model)
    
     Coefficients:
     (Intercept) crim zn indus chas nox
     3.646e+01 -1.080e-01 4.642e-02 2.056e-02 2.687e+00 -1.777e+01
     rm age dis rad tax ptratio
     3.810e+00 6.922e-04 -1.476e+00 3.060e-01 -1.233e-02 -9.527e-01
     black lstat
     9.312e-03 -5.248e-01
    
     Degrees of Freedom: 505 Total (i.e. Null); 492 Residual
     Null Deviance: 42720
     Residual Deviance: 11080 AIC: 3028
    
     Call:
     glm(formula = Y ~ ., family = family, data = X, weights = obsWeights,
     model = model)
    
     Deviance Residuals:
     Min 1Q Median 3Q Max
     -15.595 -2.730 -0.518 1.777 26.199
    
     Coefficients:
     Estimate Std. Error t value Pr(>|t|)
     (Intercept) 3.646e+01 5.103e+00 7.144 3.28e-12 ***
     crim -1.080e-01 3.286e-02 -3.287 0.001087 **
     zn 4.642e-02 1.373e-02 3.382 0.000778 ***
     indus 2.056e-02 6.150e-02 0.334 0.738288
     chas 2.687e+00 8.616e-01 3.118 0.001925 **
     nox -1.777e+01 3.820e+00 -4.651 4.25e-06 ***
     rm 3.810e+00 4.179e-01 9.116 < 2e-16 ***
     age 6.922e-04 1.321e-02 0.052 0.958229
     dis -1.476e+00 1.995e-01 -7.398 6.01e-13 ***
     rad 3.060e-01 6.635e-02 4.613 5.07e-06 ***
     tax -1.233e-02 3.760e-03 -3.280 0.001112 **
     ptratio -9.527e-01 1.308e-01 -7.283 1.31e-12 ***
     black 9.312e-03 2.686e-03 3.467 0.000573 ***
     lstat -5.248e-01 5.072e-02 -10.347 < 2e-16 ***
     ---
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    
     (Dispersion parameter for gaussian family taken to be 22.51785)
    
     Null deviance: 42716 on 505 degrees of freedom
     Residual deviance: 11079 on 492 degrees of freedom
     AIC: 3027.6
    
     Number of Fisher Scoring iterations: 2
    
    
     Call:
     glm(formula = Y ~ ., family = family, data = X, weights = obsWeights,
     model = model)
    
     Deviance Residuals:
     Min 1Q Median 3Q Max
     -2.9747 -0.4250 -0.0911 0.2720 3.5929
    
     Coefficients:
     Estimate Std. Error z value Pr(>|z|)
     (Intercept) 10.682635 3.921395 2.724 0.006446 **
     crim -0.040649 0.049796 -0.816 0.414321
     zn 0.012134 0.010678 1.136 0.255786
     indus -0.040715 0.045615 -0.893 0.372078
     chas 0.248209 0.653283 0.380 0.703989
     nox -3.601085 2.924365 -1.231 0.218170
     rm 1.155157 0.374843 3.082 0.002058 **
     age -0.018660 0.009319 -2.002 0.045252 *
     dis -0.518934 0.146286 -3.547 0.000389 ***
     rad 0.255522 0.061391 4.162 3.15e-05 ***
     tax -0.009500 0.003107 -3.057 0.002233 **
     ptratio -0.409317 0.103191 -3.967 7.29e-05 ***
     black -0.001451 0.002558 -0.567 0.570418
     lstat -0.318436 0.054735 -5.818 5.96e-09 ***
     ---
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    
     (Dispersion parameter for binomial family taken to be 1)
    
     Null deviance: 669.76 on 505 degrees of freedom
     Residual deviance: 296.39 on 492 degrees of freedom
     AIC: 324.39
    
     Number of Fisher Scoring iterations: 7
    
     [1] "coefficients" "residuals" "fitted.values"
     [4] "effects" "R" "rank"
     [7] "qr" "family" "linear.predictors"
     [10] "deviance" "aic" "null.deviance"
     [13] "iter" "weights" "prior.weights"
     [16] "df.residual" "df.null" "y"
     [19] "converged" "boundary" "call"
     [22] "formula" "terms" "data"
     [25] "offset" "control" "method"
     [28] "contrasts" "xlevels"
    
     Call:
     glm(formula = Y ~ ., family = family, data = X, weights = obsWeights,
     model = model)
    
     Deviance Residuals:
     Min 1Q Median 3Q Max
     -15.595 -2.730 -0.518 1.777 26.199
    
     Coefficients:
     Estimate Std. Error t value Pr(>|t|)
     (Intercept) 3.646e+01 5.103e+00 7.144 3.28e-12 ***
     crim -1.080e-01 3.286e-02 -3.287 0.001087 **
     zn 4.642e-02 1.373e-02 3.382 0.000778 ***
     indus 2.056e-02 6.150e-02 0.334 0.738288
     chas 2.687e+00 8.616e-01 3.118 0.001925 **
     nox -1.777e+01 3.820e+00 -4.651 4.25e-06 ***
     rm 3.810e+00 4.179e-01 9.116 < 2e-16 ***
     age 6.922e-04 1.321e-02 0.052 0.958229
     dis -1.476e+00 1.995e-01 -7.398 6.01e-13 ***
     rad 3.060e-01 6.635e-02 4.613 5.07e-06 ***
     tax -1.233e-02 3.760e-03 -3.280 0.001112 **
     ptratio -9.527e-01 1.308e-01 -7.283 1.31e-12 ***
     black 9.312e-03 2.686e-03 3.467 0.000573 ***
     lstat -5.248e-01 5.072e-02 -10.347 < 2e-16 ***
     ---
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    
     (Dispersion parameter for gaussian family taken to be 22.51785)
    
     Null deviance: 42716 on 505 degrees of freedom
     Residual deviance: 11079 on 492 degrees of freedom
     AIC: 3027.6
    
     Number of Fisher Scoring iterations: 2
    
    
     Call:
     glm(formula = Y ~ ., family = family, data = X, weights = obsWeights,
     model = model)
    
     Deviance Residuals:
     Min 1Q Median 3Q Max
     -2.9747 -0.4250 -0.0911 0.2720 3.5929
    
     Coefficients:
     Estimate Std. Error z value Pr(>|z|)
     (Intercept) 10.682635 3.921395 2.724 0.006446 **
     crim -0.040649 0.049796 -0.816 0.414321
     zn 0.012134 0.010678 1.136 0.255786
     indus -0.040715 0.045615 -0.893 0.372078
     chas 0.248209 0.653283 0.380 0.703989
     nox -3.601085 2.924365 -1.231 0.218170
     rm 1.155157 0.374843 3.082 0.002058 **
     age -0.018660 0.009319 -2.002 0.045252 *
     dis -0.518934 0.146286 -3.547 0.000389 ***
     rad 0.255522 0.061391 4.162 3.15e-05 ***
     tax -0.009500 0.003107 -3.057 0.002233 **
     ptratio -0.409317 0.103191 -3.967 7.29e-05 ***
     black -0.001451 0.002558 -0.567 0.570418
     lstat -0.318436 0.054735 -5.818 5.96e-09 ***
     ---
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    
     (Dispersion parameter for binomial family taken to be 1)
    
     Null deviance: 669.76 on 505 degrees of freedom
     Residual deviance: 296.39 on 492 degrees of freedom
     AIC: 324.39
    
     Number of Fisher Scoring iterations: 7
    
    
     Call:
     SuperLearner(Y = Y_gaus, X = X, family = gaussian(), SL.library = c("SL.mean",
     "SL.glm"))
    
    
     Risk Coef
     SL.mean_All 84.83418 0.01409163
     SL.glm_All 23.62860 0.98590837
     V1
     Min. :-3.903
     1st Qu.:17.517
     Median :22.124
     Mean :22.533
     3rd Qu.:27.341
     Max. :44.361
    
     Call:
     SuperLearner(Y = Y_bin, X = X, family = binomial(), SL.library = c("SL.mean",
     "SL.glm"))
    
    
     Risk Coef
     SL.mean_All 0.23508982 0.009634095
     SL.glm_All 0.09334622 0.990365905
     V1
     Min. :0.003619
     1st Qu.:0.034209
     Median :0.195582
     Mean :0.375494
     3rd Qu.:0.783137
     Max. :0.993512
     Got an error, as expected.
     <simpleError in cbind2(1, newx) %*% nbeta: Cholmod error 'X and/or Y have wrong dimensions' at file ../MatrixOps/cholmod_sdmult.c, line 90>
     Got an error, as expected.
     <simpleError in cbind2(1, newx) %*% nbeta: Cholmod error 'X and/or Y have wrong dimensions' at file ../MatrixOps/cholmod_sdmult.c, line 90>
     Call:
     lda(X, grouping = Y, prior = prior, method = method, tol = tol,
     CV = CV, nu = nu)
    
     Prior probabilities of groups:
     0 1
     0.6245059 0.3754941
    
     Group means:
     crim zn indus chas nox rm age dis
     0 5.2936824 4.708861 13.622089 0.05379747 0.5912399 5.985693 77.93228 3.349307
     1 0.8191541 22.431579 7.003316 0.09473684 0.4939153 6.781821 53.01211 4.536371
     rad tax ptratio black lstat
     0 11.588608 459.9209 19.19968 340.6392 16.042468
     1 6.157895 322.2789 17.21789 383.3425 7.015947
    
     Coefficients of linear discriminants:
     LD1
     crim 0.0012515925
     zn 0.0095179029
     indus -0.0166376334
     chas 0.1399207112
     nox -2.9934367740
     rm 0.5612713068
     age -0.0128420045
     dis -0.3095403096
     rad 0.0695027989
     tax -0.0027771271
     ptratio -0.2059853828
     black 0.0006058031
     lstat -0.0816668897
     Call:
     lda(X, grouping = Y, prior = prior, method = method, tol = tol,
     CV = CV, nu = nu)
    
     Prior probabilities of groups:
     0 1
     0.6245059 0.3754941
    
     Group means:
     crim zn indus chas nox rm age dis
     0 5.2936824 4.708861 13.622089 0.05379747 0.5912399 5.985693 77.93228 3.349307
     1 0.8191541 22.431579 7.003316 0.09473684 0.4939153 6.781821 53.01211 4.536371
     rad tax ptratio black lstat
     0 11.588608 459.9209 19.19968 340.6392 16.042468
     1 6.157895 322.2789 17.21789 383.3425 7.015947
    
     Coefficients of linear discriminants:
     LD1
     crim 0.0012515925
     zn 0.0095179029
     indus -0.0166376334
     chas 0.1399207112
     nox -2.9934367740
     rm 0.5612713068
     age -0.0128420045
     dis -0.3095403096
     rad 0.0695027989
     tax -0.0027771271
     ptratio -0.2059853828
     black 0.0006058031
     lstat -0.0816668897
    
     Call:
     stats::lm(formula = Y ~ ., data = X, weights = obsWeights, model = model)
    
     Coefficients:
     (Intercept) crim zn indus chas nox
     3.646e+01 -1.080e-01 4.642e-02 2.056e-02 2.687e+00 -1.777e+01
     rm age dis rad tax ptratio
     3.810e+00 6.922e-04 -1.476e+00 3.060e-01 -1.233e-02 -9.527e-01
     black lstat
     9.312e-03 -5.248e-01
    
    
     Call:
     stats::lm(formula = Y ~ ., data = X, weights = obsWeights, model = model)
    
     Residuals:
     Min 1Q Median 3Q Max
     -15.595 -2.730 -0.518 1.777 26.199
    
     Coefficients:
     Estimate Std. Error t value Pr(>|t|)
     (Intercept) 3.646e+01 5.103e+00 7.144 3.28e-12 ***
     crim -1.080e-01 3.286e-02 -3.287 0.001087 **
     zn 4.642e-02 1.373e-02 3.382 0.000778 ***
     indus 2.056e-02 6.150e-02 0.334 0.738288
     chas 2.687e+00 8.616e-01 3.118 0.001925 **
     nox -1.777e+01 3.820e+00 -4.651 4.25e-06 ***
     rm 3.810e+00 4.179e-01 9.116 < 2e-16 ***
     age 6.922e-04 1.321e-02 0.052 0.958229
     dis -1.476e+00 1.995e-01 -7.398 6.01e-13 ***
     rad 3.060e-01 6.635e-02 4.613 5.07e-06 ***
     tax -1.233e-02 3.760e-03 -3.280 0.001112 **
     ptratio -9.527e-01 1.308e-01 -7.283 1.31e-12 ***
     black 9.312e-03 2.686e-03 3.467 0.000573 ***
     lstat -5.248e-01 5.072e-02 -10.347 < 2e-16 ***
     ---
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    
     Residual standard error: 4.745 on 492 degrees of freedom
     Multiple R-squared: 0.7406, Adjusted R-squared: 0.7338
     F-statistic: 108.1 on 13 and 492 DF, p-value: < 2.2e-16
    
    
     Call:
     stats::lm(formula = Y ~ ., data = X, weights = obsWeights, model = model)
    
     Residuals:
     Min 1Q Median 3Q Max
     -0.80469 -0.23612 -0.03105 0.23080 1.05224
    
     Coefficients:
     Estimate Std. Error t value Pr(>|t|)
     (Intercept) 1.6675402 0.3662392 4.553 6.67e-06 ***
     crim 0.0003028 0.0023585 0.128 0.897888
     zn 0.0023028 0.0009851 2.338 0.019808 *
     indus -0.0040254 0.0044131 -0.912 0.362135
     chas 0.0338534 0.0618295 0.548 0.584264
     nox -0.7242540 0.2741160 -2.642 0.008501 **
     rm 0.1357981 0.0299915 4.528 7.48e-06 ***
     age -0.0031071 0.0009480 -3.278 0.001121 **
     dis -0.0748924 0.0143135 -5.232 2.48e-07 ***
     rad 0.0168160 0.0047612 3.532 0.000451 ***
     tax -0.0006719 0.0002699 -2.490 0.013110 *
     ptratio -0.0498376 0.0093885 -5.308 1.68e-07 ***
     black 0.0001466 0.0001928 0.760 0.447370
     lstat -0.0197591 0.0036395 -5.429 8.91e-08 ***
     ---
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    
     Residual standard error: 0.3405 on 492 degrees of freedom
     Multiple R-squared: 0.5192, Adjusted R-squared: 0.5065
     F-statistic: 40.86 on 13 and 492 DF, p-value: < 2.2e-16
    
     [1] "coefficients" "residuals" "fitted.values" "effects"
     [5] "weights" "rank" "assign" "qr"
     [9] "df.residual" "xlevels" "call" "terms"
    
     Call:
     stats::lm(formula = Y ~ ., data = X, weights = obsWeights, model = model)
    
     Residuals:
     Min 1Q Median 3Q Max
     -15.595 -2.730 -0.518 1.777 26.199
    
     Coefficients:
     Estimate Std. Error t value Pr(>|t|)
     (Intercept) 3.646e+01 5.103e+00 7.144 3.28e-12 ***
     crim -1.080e-01 3.286e-02 -3.287 0.001087 **
     zn 4.642e-02 1.373e-02 3.382 0.000778 ***
     indus 2.056e-02 6.150e-02 0.334 0.738288
     chas 2.687e+00 8.616e-01 3.118 0.001925 **
     nox -1.777e+01 3.820e+00 -4.651 4.25e-06 ***
     rm 3.810e+00 4.179e-01 9.116 < 2e-16 ***
     age 6.922e-04 1.321e-02 0.052 0.958229
     dis -1.476e+00 1.995e-01 -7.398 6.01e-13 ***
     rad 3.060e-01 6.635e-02 4.613 5.07e-06 ***
     tax -1.233e-02 3.760e-03 -3.280 0.001112 **
     ptratio -9.527e-01 1.308e-01 -7.283 1.31e-12 ***
     black 9.312e-03 2.686e-03 3.467 0.000573 ***
     lstat -5.248e-01 5.072e-02 -10.347 < 2e-16 ***
     ---
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    
     Residual standard error: 4.745 on 492 degrees of freedom
     Multiple R-squared: 0.7406, Adjusted R-squared: 0.7338
     F-statistic: 108.1 on 13 and 492 DF, p-value: < 2.2e-16
    
    
     Call:
     stats::lm(formula = Y ~ ., data = X, weights = obsWeights, model = model)
    
     Residuals:
     Min 1Q Median 3Q Max
     -0.80469 -0.23612 -0.03105 0.23080 1.05224
    
     Coefficients:
     Estimate Std. Error t value Pr(>|t|)
     (Intercept) 1.6675402 0.3662392 4.553 6.67e-06 ***
     crim 0.0003028 0.0023585 0.128 0.897888
     zn 0.0023028 0.0009851 2.338 0.019808 *
     indus -0.0040254 0.0044131 -0.912 0.362135
     chas 0.0338534 0.0618295 0.548 0.584264
     nox -0.7242540 0.2741160 -2.642 0.008501 **
     rm 0.1357981 0.0299915 4.528 7.48e-06 ***
     age -0.0031071 0.0009480 -3.278 0.001121 **
     dis -0.0748924 0.0143135 -5.232 2.48e-07 ***
     rad 0.0168160 0.0047612 3.532 0.000451 ***
     tax -0.0006719 0.0002699 -2.490 0.013110 *
     ptratio -0.0498376 0.0093885 -5.308 1.68e-07 ***
     black 0.0001466 0.0001928 0.760 0.447370
     lstat -0.0197591 0.0036395 -5.429 8.91e-08 ***
     ---
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    
     Residual standard error: 0.3405 on 492 degrees of freedom
     Multiple R-squared: 0.5192, Adjusted R-squared: 0.5065
     F-statistic: 40.86 on 13 and 492 DF, p-value: < 2.2e-16
    
    
     Call:
     SuperLearner(Y = Y_gaus, X = X, family = gaussian(), SL.library = c("SL.mean",
     "SL.lm"))
    
    
     Risk Coef
     SL.mean_All 84.83418 0.01409163
     SL.lm_All 23.62860 0.98590837
     V1
     Min. :-3.903
     1st Qu.:17.517
     Median :22.124
     Mean :22.533
     3rd Qu.:27.341
     Max. :44.361
    
     Call:
     SuperLearner(Y = Y_bin, X = X, family = binomial(), SL.library = c("SL.mean",
     "SL.lm"))
    
    
     Risk Coef
     SL.mean_All 0.2350898 0
     SL.lm_All 0.1115445 1
     V1
     Min. :0.0000
     1st Qu.:0.1281
     Median :0.3530
     Mean :0.3899
     3rd Qu.:0.6091
     Max. :1.0000
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = SL.library,
     method = "method.NNLS", verbose = F, cvControl = list(V = 2))
    
    
     Risk Coef
     SL.rpart_All 0.2417014 0.02808554
     SL.glmnet_All 0.1759297 0.97191446
     SL.mean_All 0.2666500 0.00000000
     Error in (function (Y, X, newX, ...) : bad algorithm
     Error in (function (Y, X, newX, ...) : bad algorithm
     Error in (function (Y, X, newX, ...) : bad algorithm
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = c(SL.library,
     "SL.bad_algorithm"), method = "method.NNLS", verbose = T, cvControl = list(V = 2))
    
    
    
     Risk Coef
     SL.rpart_All 0.1730096 0.35627490
     SL.glmnet_All 0.1647030 0.61741243
     SL.mean_All 0.2504500 0.02631267
     SL.bad_algorithm_All NA 0.00000000
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = SL.library,
     method = "method.NNLS2", verbose = F, cvControl = list(V = 2))
    
    
     Risk Coef
     SL.rpart_All 0.1959772 0.4596517
     SL.glmnet_All 0.1943974 0.4326680
     SL.mean_All 0.2558500 0.1076803
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = SL.library,
     method = "method.NNloglik", verbose = F, cvControl = list(V = 2))
    
    
     Risk Coef
     SL.rpart_All Inf 0.06137199
     SL.glmnet_All 0.4907020 0.93862801
     SL.mean_All 0.6964542 0.00000000
     Error in (function (Y, X, newX, ...) : bad algorithm
     Error in (function (Y, X, newX, ...) : bad algorithm
     Error in (function (Y, X, newX, ...) : bad algorithm
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = c(SL.library,
     "SL.bad_algorithm"), method = "method.NNloglik", verbose = T, cvControl = list(V = 2))
    
    
    
     Risk Coef
     SL.rpart_All Inf 0
     SL.glmnet_All 0.4877618 1
     SL.mean_All 0.6928473 0
     SL.bad_algorithm_All NA 0
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = SL.library,
     method = "method.CC_LS", verbose = F, cvControl = list(V = 2))
    
    
     Risk Coef
     SL.rpart_All 0.1851302 0.42598535
     SL.glmnet_All 0.1768747 0.54102157
     SL.mean_All 0.2558500 0.03299307
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = SL.library,
     method = "method.CC_nloglik", verbose = F, cvControl = list(V = 2))
    
    
     Risk Coef
     SL.rpart_All 244.2850 0.2339504
     SL.glmnet_All 205.9486 0.7660496
     SL.mean_All 284.3800 0.0000000
     Error in (function (Y, X, newX, ...) : bad algorithm
     Error in (function (Y, X, newX, ...) : bad algorithm
     Error in (function (Y, X, newX, ...) : bad algorithm
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = c(SL.library,
     "SL.bad_algorithm"), method = "method.CC_nloglik", verbose = T, cvControl = list(V = 2))
    
    
    
     Risk Coef
     SL.rpart_All 259.5446 0.1345917
     SL.glmnet_All 198.5286 0.7396111
     SL.mean_All 277.1389 0.1257971
     SL.bad_algorithm_All NA 0.0000000
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = SL.library,
     method = "method.AUC", verbose = F, cvControl = list(V = 2))
    
    
     Risk Coef
     SL.rpart_All 0.2332599 0.4562568
     SL.glmnet_All 0.1828646 0.2982123
     SL.mean_All 0.5250225 0.2455309
     Error in (function (Y, X, newX, ...) : bad algorithm
     Error in (function (Y, X, newX, ...) : bad algorithm
     Removing failed learners: SL.bad_algorithm_All
     Error in (function (Y, X, newX, ...) : bad algorithm
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = c(SL.library,
     "SL.bad_algorithm"), method = "method.AUC", verbose = T, cvControl = list(V = 2))
    
    
    
     Risk Coef
     SL.rpart_All 0.2827044 0.3333333
     SL.glmnet_All 0.2284056 0.3333333
     SL.mean_All 0.5150135 0.3333333
     SL.bad_algorithm_All NA 0.0000000
     Call:
     qda(X, grouping = Y, prior = prior, method = method, tol = tol,
     CV = CV, nu = nu)
    
     Prior probabilities of groups:
     0 1
     0.6245059 0.3754941
    
     Group means:
     crim zn indus chas nox rm age dis
     0 5.2936824 4.708861 13.622089 0.05379747 0.5912399 5.985693 77.93228 3.349307
     1 0.8191541 22.431579 7.003316 0.09473684 0.4939153 6.781821 53.01211 4.536371
     rad tax ptratio black lstat
     0 11.588608 459.9209 19.19968 340.6392 16.042468
     1 6.157895 322.2789 17.21789 383.3425 7.015947
     Call:
     qda(X, grouping = Y, prior = prior, method = method, tol = tol,
     CV = CV, nu = nu)
    
     Prior probabilities of groups:
     0 1
     0.6245059 0.3754941
    
     Group means:
     crim zn indus chas nox rm age dis
     0 5.2936824 4.708861 13.622089 0.05379747 0.5912399 5.985693 77.93228 3.349307
     1 0.8191541 22.431579 7.003316 0.09473684 0.4939153 6.781821 53.01211 4.536371
     rad tax ptratio black lstat
     0 11.588608 459.9209 19.19968 340.6392 16.042468
     1 6.157895 322.2789 17.21789 383.3425 7.015947
     Y
     0 1
     68 32
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = sl_lib, cvControl = list(V = 2))
    
    
    
     Risk Coef
     SL.randomForest_All 0.08543132 1
     SL.mean_All 0.24760000 0
     $grid
     NULL
    
     $names
     [1] "SL.randomForest_1"
    
     $base_learner
     [1] "SL.randomForest"
    
     $params
     list()
    
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = create_rf$names,
     cvControl = list(V = 2))
    
    
     Risk Coef
     SL.randomForest_1_All 0.1237247 1
     SL.randomForest_1 <- function(...) SL.randomForest(...)
     $grid
     NULL
    
     $names
     [1] "SL.randomForest_1"
    
     $base_learner
     [1] "SL.randomForest"
    
     $params
     list()
    
     [1] "SL.randomForest_1"
     [1] 1
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = create_rf$names,
     cvControl = list(V = 2), env = sl_env)
    
    
     Risk Coef
     SL.randomForest_1_All 0.1286959 1
     $grid
     mtry
     1 1
     2 2
    
     $names
     [1] "SL.randomForest_1" "SL.randomForest_2"
    
     $base_learner
     [1] "SL.randomForest"
    
     $params
     list()
    
     [1] "SL.randomForest_1" "SL.randomForest_2"
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = create_rf$names,
     cvControl = list(V = 2), env = sl_env)
    
    
     Risk Coef
     SL.randomForest_1_All 0.07415690 0.3566581
     SL.randomForest_2_All 0.06937001 0.6433419
     $grid
     mtry
     1 1
     2 2
    
     $names
     [1] "SL.randomForest_1" "SL.randomForest_2"
    
     $base_learner
     [1] "SL.randomForest"
    
     $params
     list()
    
     [1] "SL.randomForest_1" "SL.randomForest_2"
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = create_rf$names,
     cvControl = list(V = 2), env = sl_env)
    
    
     Risk Coef
     SL.randomForest_1_All 0.05962605 0.1247237
     SL.randomForest_2_All 0.05350255 0.8752763
     $grid
     mtry nodesize maxnodes
     1 1 NULL NULL
     2 2 NULL NULL
    
     $names
     [1] "SL.randomForest_1_NULL_NULL" "SL.randomForest_2_NULL_NULL"
    
     $base_learner
     [1] "SL.randomForest"
    
     $params
     list()
    
     [1] "SL.randomForest_1_NULL_NULL" "SL.randomForest_2_NULL_NULL"
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = create_rf$names,
     cvControl = list(V = 2), env = sl_env)
    
    
     Risk Coef
     SL.randomForest_1_NULL_NULL_All 0.06451327 0.6179109
     SL.randomForest_2_NULL_NULL_All 0.05916622 0.3820891
     $grid
       mtry maxnodes
     1    1        5
     2    2        5
     3    1       10
     4    2       10
     5    1     NULL
     6    2     NULL
    
     $names
     [1] "SL.randomForest_1_5" "SL.randomForest_2_5" "SL.randomForest_1_10"
     [4] "SL.randomForest_2_10" "SL.randomForest_1_NULL" "SL.randomForest_2_NULL"
    
     $base_learner
     [1] "SL.randomForest"
    
     $params
     list()
    
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = create_rf$names,
     cvControl = list(V = 2), env = sl_env)
    
    
                                      Risk Coef
     SL.randomForest_1_5_All    0.08783373    0
     SL.randomForest_2_5_All    0.08487843    0
     SL.randomForest_1_10_All   0.08546784    0
     SL.randomForest_2_10_All   0.07830362    1
     SL.randomForest_1_NULL_All 0.08360061    0
     SL.randomForest_2_NULL_All 0.08112253    0
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = create_rf$names,
     cvControl = list(V = 2))
    
    
                                      Risk Coef
     SL.randomForest_1_5_All    0.06490989    0
     SL.randomForest_2_5_All    0.05400114    0
     SL.randomForest_1_10_All   0.06018865    0
     SL.randomForest_2_10_All   0.05188455    0
     SL.randomForest_1_NULL_All 0.05936604    0
     SL.randomForest_2_NULL_All 0.05173980    1
    
     Call:
     SuperLearner(Y = Y, X = X, family = binomial(), SL.library = create_rf$names,
     cvControl = list(V = 2))
    
    
                                      Risk Coef
     SL.randomForest_1_5_All    0.06896458    0
     SL.randomForest_2_5_All    0.06124560    1
     SL.randomForest_1_10_All   0.06885161    0
     SL.randomForest_2_10_All   0.06549691    0
     SL.randomForest_1_NULL_All 0.06775305    0
     SL.randomForest_2_NULL_All 0.06500317    0
     Ranger result
    
     Call:
     ranger::ranger(`_Y` ~ ., data = cbind(`_Y` = Y, X), num.trees = num.trees, mtry = mtry, min.node.size = min.node.size, replace = replace, sample.fraction = sample.fraction, case.weights = obsWeights, write.forest = write.forest, probability = probability, num.threads = num.threads, verbose = verbose)
    
     Type: Regression
     Number of trees: 500
     Sample size: 506
     Number of independent variables: 13
     Mtry: 3
     Target node size: 5
     Variable importance mode: none
     OOB prediction error (MSE): 10.54595
     R squared (OOB): 0.8753238
     Ranger result
    
     Call:
     ranger::ranger(`_Y` ~ ., data = cbind(`_Y` = Y, X), num.trees = num.trees, mtry = mtry, min.node.size = min.node.size, replace = replace, sample.fraction = sample.fraction, case.weights = obsWeights, write.forest = write.forest, probability = probability, num.threads = num.threads, verbose = verbose)
    
     Type: Probability estimation
     Number of trees: 500
     Sample size: 506
     Number of independent variables: 13
     Mtry: 3
     Target node size: 1
     Variable importance mode: none
     OOB prediction error: 0.08262419
     Ranger result
    
     Call:
     ranger::ranger(`_Y` ~ ., data = cbind(`_Y` = Y, X), num.trees = num.trees, mtry = mtry, min.node.size = min.node.size, replace = replace, sample.fraction = sample.fraction, case.weights = obsWeights, write.forest = write.forest, probability = probability, num.threads = num.threads, verbose = verbose)
    
     Type: Regression
     Number of trees: 500
     Sample size: 506
     Number of independent variables: 13
     Mtry: 3
     Target node size: 5
     Variable importance mode: none
     OOB prediction error (MSE): 10.45612
     R squared (OOB): 0.8763858
     Ranger result
    
     Call:
     ranger::ranger(`_Y` ~ ., data = cbind(`_Y` = Y, X), num.trees = num.trees, mtry = mtry, min.node.size = min.node.size, replace = replace, sample.fraction = sample.fraction, case.weights = obsWeights, write.forest = write.forest, probability = probability, num.threads = num.threads, verbose = verbose)
    
     Type: Probability estimation
     Number of trees: 500
     Sample size: 506
     Number of independent variables: 13
     Mtry: 3
     Target node size: 1
     Variable importance mode: none
     OOB prediction error: 0.08395011
     Generalized Linear Model of class 'speedglm':
    
     Call: speedglm::speedglm(formula = Y ~ ., data = X, family = family, weights = obsWeights, maxit = maxit, k = k)
    
     Coefficients:
     (Intercept)         crim           zn        indus         chas          nox
       3.646e+01   -1.080e-01    4.642e-02    2.056e-02    2.687e+00   -1.777e+01
              rm          age          dis          rad          tax      ptratio
       3.810e+00    6.922e-04   -1.476e+00    3.060e-01   -1.233e-02   -9.527e-01
           black        lstat
       9.312e-03   -5.248e-01
    
     Generalized Linear Model of class 'speedglm':
    
     Call: speedglm::speedglm(formula = Y ~ ., data = X, family = family, weights = obsWeights, maxit = maxit, k = k)
    
     Coefficients:
     ------------------------------------------------------------------
                   Estimate Std. Error  t value Pr(>|t|)
     (Intercept)  3.646e+01   5.103459   7.1441 3.28e-12 ***
     crim        -1.080e-01   0.032865  -3.2865 1.09e-03 **
     zn           4.642e-02   0.013728   3.3816 7.78e-04 ***
     indus        2.056e-02   0.061496   0.3343 7.38e-01
     chas         2.687e+00   0.861580   3.1184 1.93e-03 **
     nox         -1.777e+01   3.819744  -4.6513 4.25e-06 ***
     rm           3.810e+00   0.417925   9.1161 1.98e-18 ***
     age          6.922e-04   0.013210   0.0524 9.58e-01
     dis         -1.476e+00   0.199455  -7.3980 6.01e-13 ***
     rad          3.060e-01   0.066346   4.6129 5.07e-06 ***
     tax         -1.233e-02   0.003760  -3.2800 1.11e-03 **
     ptratio     -9.527e-01   0.130827  -7.2825 1.31e-12 ***
     black        9.312e-03   0.002686   3.4668 5.73e-04 ***
     lstat       -5.248e-01   0.050715 -10.3471 7.78e-23 ***
    
     -------------------------------------------------------------------
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    
     ---
     null df: 505; null deviance: 42716.3;
     residuals df: 492; residuals deviance: 11078.78;
     # obs.: 506; # non-zero weighted obs.: 506;
     AIC: 3027.609; log Likelihood: -1498.804;
     RSS: 11078.8; dispersion: 22.51785; iterations: 1;
     rank: 14; max tolerance: 1e+00; convergence: FALSE.
     Generalized Linear Model of class 'speedglm':
    
     Call: speedglm::speedglm(formula = Y ~ ., data = X, family = family, weights = obsWeights, maxit = maxit, k = k)
    
     Coefficients:
     ------------------------------------------------------------------
                   Estimate Std. Error z value Pr(>|z|)
     (Intercept)  10.682635   3.921395  2.7242 6.45e-03 **
     crim         -0.040649   0.049796 -0.8163 4.14e-01
     zn            0.012134   0.010678  1.1364 2.56e-01
     indus        -0.040715   0.045615 -0.8926 3.72e-01
     chas          0.248209   0.653283  0.3799 7.04e-01
     nox          -3.601085   2.924365 -1.2314 2.18e-01
     rm            1.155156   0.374843  3.0817 2.06e-03 **
     age          -0.018661   0.009319 -2.0023 4.53e-02 *
     dis          -0.518934   0.146286 -3.5474 3.89e-04 ***
     rad           0.255522   0.061391  4.1622 3.15e-05 ***
     tax          -0.009500   0.003107 -3.0574 2.23e-03 **
     ptratio      -0.409317   0.103191 -3.9666 7.29e-05 ***
     black        -0.001452   0.002558 -0.5674 5.70e-01
     lstat        -0.318436   0.054735 -5.8178 5.96e-09 ***
    
     -------------------------------------------------------------------
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    
     ---
     null df: 505; null deviance: 669.76;
     residuals df: 492; residuals deviance: 296.39;
     # obs.: 506; # non-zero weighted obs.: 506;
     AIC: 324.3944; log Likelihood: -148.1972;
     RSS: 1107.5; dispersion: 1; iterations: 7;
     rank: 14; max tolerance: 7.55e-12; convergence: TRUE.
     Generalized Linear Model of class 'speedglm':
    
     Call: speedglm::speedglm(formula = Y ~ ., data = X, family = family, weights = obsWeights, maxit = maxit, k = k)
    
     Coefficients:
     ------------------------------------------------------------------
                   Estimate Std. Error  t value Pr(>|t|)
     (Intercept)  3.646e+01   5.103459   7.1441 3.28e-12 ***
     crim        -1.080e-01   0.032865  -3.2865 1.09e-03 **
     zn           4.642e-02   0.013728   3.3816 7.78e-04 ***
     indus        2.056e-02   0.061496   0.3343 7.38e-01
     chas         2.687e+00   0.861580   3.1184 1.93e-03 **
     nox         -1.777e+01   3.819744  -4.6513 4.25e-06 ***
     rm           3.810e+00   0.417925   9.1161 1.98e-18 ***
     age          6.922e-04   0.013210   0.0524 9.58e-01
     dis         -1.476e+00   0.199455  -7.3980 6.01e-13 ***
     rad          3.060e-01   0.066346   4.6129 5.07e-06 ***
     tax         -1.233e-02   0.003760  -3.2800 1.11e-03 **
     ptratio     -9.527e-01   0.130827  -7.2825 1.31e-12 ***
     black        9.312e-03   0.002686   3.4668 5.73e-04 ***
     lstat       -5.248e-01   0.050715 -10.3471 7.78e-23 ***
    
     -------------------------------------------------------------------
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    
     ---
     null df: 505; null deviance: 42716.3;
     residuals df: 492; residuals deviance: 11078.78;
     # obs.: 506; # non-zero weighted obs.: 506;
     AIC: 3027.609; log Likelihood: -1498.804;
     RSS: 11078.8; dispersion: 22.51785; iterations: 1;
     rank: 14; max tolerance: 1e+00; convergence: FALSE.
     Generalized Linear Model of class 'speedglm':
    
     Call: speedglm::speedglm(formula = Y ~ ., data = X, family = family, weights = obsWeights, maxit = maxit, k = k)
    
     Coefficients:
     ------------------------------------------------------------------
                   Estimate Std. Error z value Pr(>|z|)
     (Intercept)  10.682635   3.921395  2.7242 6.45e-03 **
     crim         -0.040649   0.049796 -0.8163 4.14e-01
     zn            0.012134   0.010678  1.1364 2.56e-01
     indus        -0.040715   0.045615 -0.8926 3.72e-01
     chas          0.248209   0.653283  0.3799 7.04e-01
     nox          -3.601085   2.924365 -1.2314 2.18e-01
     rm            1.155156   0.374843  3.0817 2.06e-03 **
     age          -0.018661   0.009319 -2.0023 4.53e-02 *
     dis          -0.518934   0.146286 -3.5474 3.89e-04 ***
     rad           0.255522   0.061391  4.1622 3.15e-05 ***
     tax          -0.009500   0.003107 -3.0574 2.23e-03 **
     ptratio      -0.409317   0.103191 -3.9666 7.29e-05 ***
     black        -0.001452   0.002558 -0.5674 5.70e-01
     lstat        -0.318436   0.054735 -5.8178 5.96e-09 ***
    
     -------------------------------------------------------------------
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    
     ---
     null df: 505; null deviance: 669.76;
     residuals df: 492; residuals deviance: 296.39;
     # obs.: 506; # non-zero weighted obs.: 506;
     AIC: 324.3944; log Likelihood: -148.1972;
     RSS: 1107.5; dispersion: 1; iterations: 7;
     rank: 14; max tolerance: 7.55e-12; convergence: TRUE.
     Linear Regression Model of class 'speedlm':
    
     Call: speedglm::speedlm(formula = Y ~ ., data = X, weights = obsWeights)
    
     Coefficients:
     (Intercept)         crim           zn        indus         chas          nox
       3.646e+01   -1.080e-01    4.642e-02    2.056e-02    2.687e+00   -1.777e+01
              rm          age          dis          rad          tax      ptratio
       3.810e+00    6.922e-04   -1.476e+00    3.060e-01   -1.233e-02   -9.527e-01
           black        lstat
       9.312e-03   -5.248e-01
    
     Linear Regression Model of class 'speedlm':
    
     Call: speedglm::speedlm(formula = Y ~ ., data = X, weights = obsWeights)
    
     Coefficients:
     ------------------------------------------------------------------
                       coef       se       t   p.value
     (Intercept)  36.459488 5.103459   7.144 3.283e-12 ***
     crim         -0.108011 0.032865  -3.287 1.087e-03 **
     zn            0.046420 0.013727   3.382 7.781e-04 ***
     indus         0.020559 0.061496   0.334 7.383e-01
     chas          2.686734 0.861580   3.118 1.925e-03 **
     nox         -17.766611 3.819744  -4.651 4.246e-06 ***
     rm            3.809865 0.417925   9.116 1.979e-18 ***
     age           0.000692 0.013210   0.052 9.582e-01
     dis          -1.475567 0.199455  -7.398 6.013e-13 ***
     rad           0.306049 0.066346   4.613 5.071e-06 ***
     tax          -0.012335 0.003761  -3.280 1.112e-03 **
     ptratio      -0.952747 0.130827  -7.283 1.309e-12 ***
     black         0.009312 0.002686   3.467 5.729e-04 ***
     lstat        -0.524758 0.050715 -10.347 7.777e-23 ***
    
     -------------------------------------------------------------------
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
     ---
     Residual standard error: 4.745298 on 492 degrees of freedom;
     observations: 506; R^2: 0.741; adjusted R^2: 0.734;
     F-statistic: 108.1 on 13 and 492 df; p-value: 0.
     Linear Regression Model of class 'speedlm':
    
     Call: speedglm::speedlm(formula = Y ~ ., data = X, weights = obsWeights)
    
     Coefficients:
     ------------------------------------------------------------------
                      coef       se      t   p.value
     (Intercept)  1.667540 0.366239  4.553 6.670e-06 ***
     crim         0.000303 0.002358  0.128 8.979e-01
     zn           0.002303 0.000985  2.338 1.981e-02 *
     indus       -0.004025 0.004413 -0.912 3.621e-01
     chas         0.033853 0.061829  0.548 5.843e-01
     nox         -0.724254 0.274116 -2.642 8.501e-03 **
     rm           0.135798 0.029992  4.528 7.483e-06 ***
     age         -0.003107 0.000948 -3.278 1.121e-03 **
     dis         -0.074892 0.014313 -5.232 2.482e-07 ***
     rad          0.016816 0.004761  3.532 4.515e-04 ***
     tax         -0.000672 0.000270 -2.490 1.311e-02 *
     ptratio     -0.049838 0.009389 -5.308 1.677e-07 ***
     black        0.000147 0.000193  0.760 4.474e-01
     lstat       -0.019759 0.003639 -5.429 8.912e-08 ***
    
     -------------------------------------------------------------------
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
     ---
     Residual standard error: 0.340537 on 492 degrees of freedom;
     observations: 506; R^2: 0.519; adjusted R^2: 0.506;
     F-statistic: 40.86 on 13 and 492 df; p-value: 0.
     Linear Regression Model of class 'speedlm':
    
     Call: speedglm::speedlm(formula = Y ~ ., data = X, weights = obsWeights)
    
     Coefficients:
     ------------------------------------------------------------------
                       coef       se       t   p.value
     (Intercept)  36.459488 5.103459   7.144 3.283e-12 ***
     crim         -0.108011 0.032865  -3.287 1.087e-03 **
     zn            0.046420 0.013727   3.382 7.781e-04 ***
     indus         0.020559 0.061496   0.334 7.383e-01
     chas          2.686734 0.861580   3.118 1.925e-03 **
     nox         -17.766611 3.819744  -4.651 4.246e-06 ***
     rm            3.809865 0.417925   9.116 1.979e-18 ***
     age           0.000692 0.013210   0.052 9.582e-01
     dis          -1.475567 0.199455  -7.398 6.013e-13 ***
     rad           0.306049 0.066346   4.613 5.071e-06 ***
     tax          -0.012335 0.003761  -3.280 1.112e-03 **
     ptratio      -0.952747 0.130827  -7.283 1.309e-12 ***
     black         0.009312 0.002686   3.467 5.729e-04 ***
     lstat        -0.524758 0.050715 -10.347 7.777e-23 ***
    
     -------------------------------------------------------------------
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
     ---
     Residual standard error: 4.745298 on 492 degrees of freedom;
     observations: 506; R^2: 0.741; adjusted R^2: 0.734;
     F-statistic: 108.1 on 13 and 492 df; p-value: 0.
     Linear Regression Model of class 'speedlm':
    
     Call: speedglm::speedlm(formula = Y ~ ., data = X, weights = obsWeights)
    
     Coefficients:
     ------------------------------------------------------------------
                      coef       se      t   p.value
     (Intercept)  1.667540 0.366239  4.553 6.670e-06 ***
     crim         0.000303 0.002358  0.128 8.979e-01
     zn           0.002303 0.000985  2.338 1.981e-02 *
     indus       -0.004025 0.004413 -0.912 3.621e-01
     chas         0.033853 0.061829  0.548 5.843e-01
     nox         -0.724254 0.274116 -2.642 8.501e-03 **
     rm           0.135798 0.029992  4.528 7.483e-06 ***
     age         -0.003107 0.000948 -3.278 1.121e-03 **
     dis         -0.074892 0.014313 -5.232 2.482e-07 ***
     rad          0.016816 0.004761  3.532 4.515e-04 ***
     tax         -0.000672 0.000270 -2.490 1.311e-02 *
     ptratio     -0.049838 0.009389 -5.308 1.677e-07 ***
     black        0.000147 0.000193  0.760 4.474e-01
     lstat       -0.019759 0.003639 -5.429 8.912e-08 ***
    
     -------------------------------------------------------------------
     Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
     ---
     Residual standard error: 0.340537 on 492 degrees of freedom;
     observations: 506; R^2: 0.519; adjusted R^2: 0.506;
     F-statistic: 40.86 on 13 and 492 df; p-value: 0.
     ══ testthat results ═══════════════════════════════════════════════════════════
     OK: 89 SKIPPED: 0 FAILED: 1
     1. Error: (unknown) (@test-XGBoost.R#2)
    
     Error: testthat unit tests failed
     Execution halted
Flavor: r-patched-solaris-x86

Version: 2.0-23
Check: package dependencies
Result: NOTE
    Packages suggested but not available for checking: ‘genefilter’ ‘sva’
Flavor: r-oldrel-osx-x86_64