A. Candel, The definitive performance tuning guide for H2O deep learning, 2015.

R. Balzer, A 15 Year Perspective on Automatic Programming, IEEE Transactions on Software Engineering, vol. 11, no. 11, pp. 1257-1268, 1985.
DOI : 10.1109/TSE.1985.231877

G. Biau, Analysis of a random forests model, Journal of Machine Learning Research, vol. 13, pp. 1063-1095, 2012.
URL : https://hal.archives-ouvertes.fr/hal-00704947

L. Breiman, Some infinity theory for predictor ensembles, Technical Report, University of California, Berkeley, 2000.

L. Breiman, Consistency for a simple model of random forests, Technical Report, University of California, Berkeley, 2004.

N. Chawla, K. Bowyer, L. Hall, and W. Kegelmeyer, SMOTE: Synthetic Minority Over-sampling Technique, Journal of Artificial Intelligence Research, vol. 16, pp. 321-357, 2002.

Y. Deville and K. K. Lau, Logic program synthesis, The Journal of Logic Programming, vol. 19-20, pp. 321-350, 1994.
DOI : 10.1016/0743-1066(94)90029-9

P. Geurts, D. Ernst, and L. Wehenkel, Extremely randomized trees, Machine Learning, vol. 63, pp. 3-42, 2006.
URL : https://hal.archives-ouvertes.fr/hal-00341932

J. Fan and R. Li, Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties, Journal of the American Statistical Association, vol. 96, no. 456, pp. 1348-1360, 2001.
DOI : 10.1198/016214501753382273

URL : http://www.stat.psu.edu/~rli/research/penlike.pdf

J. Friedman, T. Hastie, and R. Tibshirani, The Elements of Statistical Learning, Springer Series in Statistics, pp. 337-387, 2001.

J. Friedman, T. Hastie, and R. Tibshirani, Regularization Paths for Generalized Linear Models via Coordinate Descent, Journal of Statistical Software, vol. 33, no. 1, 2010.
DOI : 10.18637/jss.v033.i01

J. H. Friedman, Greedy function approximation: A gradient boosting machine, The Annals of Statistics, vol. 29, no. 5, pp. 1189-1232, 2001.
DOI : 10.1214/aos/1013203451

J. Gastwirth, The estimation of the Lorenz curve and the Gini index, The Review of Economics and Statistics, vol. 54, pp. 306-316, 1972.

T. Gedeon, Data Mining of Inputs: Analysing Magnitude and Functional Measures, International Journal of Neural Systems, vol. 8, no. 2, pp. 209-217, 1997.
DOI : 10.1142/S0129065797000227

R. Genuer, J.-M. Poggi, and C. Tuleau, Random Forests: some methodological insights, Research Report, INRIA, 2008.
URL : https://hal.archives-ouvertes.fr/inria-00340725

G. Hinton and R. Salakhutdinov, Reducing the Dimensionality of Data with Neural Networks, Science, vol. 313, no. 5786, pp. 504-507, 2006.
DOI : 10.1126/science.1127647

M. Kubat, R. Holte, and S. Matwin, Machine learning in the detection of oil spills in satellite radar images, Machine Learning, vol. 30, no. 2-3, pp. 195-215, 1998.
DOI : 10.1023/A:1007452223027

M. Kubat and S. Matwin, Addressing the curse of imbalanced training sets: one-sided selection, Proceedings of the Fourteenth International Conference on Machine Learning, pp. 179-186, 1997.

R. Lerman and S. Yitzhaki, A note on the calculation and interpretation of the Gini index, Economics Letters, vol. 15, pp. 363-368, 1984.

D. Mladenic and M. Grobelnik, Feature selection for unbalanced class distribution and naive Bayes, Proceedings of the 16th International Conference on Machine Learning, pp. 258-267, 1999.

L. Raileanu and K. Stoffel, Theoretical Comparison between the Gini Index and Information Gain Criteria, Annals of Mathematics and Artificial Intelligence, vol. 41, no. 1, pp. 77-93, 2004.
DOI : 10.1023/B:AMAI.0000018580.96245.c6

J. Schmidhuber, Deep learning in neural networks: An overview, Neural Networks, vol. 61, pp. 85-117, 2015.
DOI : 10.1016/j.neunet.2014.09.003

B. Schölkopf, C. J. C. Burges, and A. J. Smola (Eds.), Advances in Kernel Methods: Support Vector Learning, MIT Press, 1998.

R. Tibshirani, Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B, vol. 58, no. 1, pp. 267-288, 1996.

R. Tibshirani, Regression shrinkage and selection via the lasso: a retrospective, Journal of the Royal Statistical Society, Series B, vol. 73, no. 3, pp. 273-282, 2011.

V. Vapnik, The Nature of Statistical Learning Theory, Springer, 1995.

S. Yitzhaki, On an Extension of the Gini Inequality Index, International Economic Review, vol. 24, no. 3, pp. 617-628, 1983.
DOI : 10.2307/2648789

H. Zou, The Adaptive Lasso and Its Oracle Properties, Journal of the American Statistical Association, vol. 101, no. 476, pp. 1418-1429, 2006.
DOI : 10.1198/016214506000000735

H. Zou and T. Hastie, Regularization and variable selection via the elastic net, Journal of the Royal Statistical Society, Series B, vol. 67, no. 2, pp. 301-320, 2005.