Horenko, I. (2020) On a Scalable Entropic Breaching of the Overfitting Barrier for Small Data Problems in Machine Learning. Neural Computation, 32 (8). pp. 1563-1579.
Full text not available from this repository.
Official URL: https://pubmed.ncbi.nlm.nih.gov/32521216/
Abstract
Overfitting and the treatment of small data are among the most challenging problems in machine learning (ML), arising when a relatively small data statistics size T is not enough to provide a robust ML fit for a relatively large data feature dimension D. Deploying a massively parallel ML analysis of generic classification problems for different D and T, we demonstrate the existence of statistically significant linear overfitting barriers for common ML methods. The results reveal that for a robust classification of bioinformatics-motivated generic problems with the long short-term memory deep learning classifier (LSTM), one needs, in the best case, a statistics size T that is at least 13.8 times larger than the feature dimension D. We show that this overfitting barrier can be breached at a 10⁻¹² fraction of the computational cost by means of the entropy-optimal scalable probabilistic approximations algorithm (eSPA), performing a joint solution of the entropy-optimal Bayesian network inference and feature space segmentation problems. Application of eSPA to experimental single-cell RNA sequencing data exhibits a 30-fold classification performance boost when compared to standard bioinformatics tools and a 7-fold boost when compared to the deep learning LSTM classifier.
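The abstract does not spell out eSPA's update equations, so the following is a minimal Python sketch of the idea it describes (a joint solution of feature-space segmentation and per-segment probabilistic label inference), not the published algorithm. The function names (espa_like_fit, espa_like_predict), the K-means-style segmentation step, and the Laplace-smoothed conditional label probabilities are all illustrative assumptions.

```python
import numpy as np

def espa_like_fit(X, y, K=8, n_iter=50, rng=None):
    """Illustrative sketch in the spirit of eSPA (not the published method):
    (1) segment the feature space into K boxes via K-means-style updates,
    (2) estimate smoothed conditional label probabilities per box."""
    rng = np.random.default_rng(rng)
    T, D = X.shape
    centers = X[rng.choice(T, K, replace=False)]
    for _ in range(n_iter):
        # Assignment step: attach each point to its nearest box center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        gamma = d2.argmin(axis=1)
        # Update step: move each box center to the mean of its members
        for k in range(K):
            if (gamma == k).any():
                centers[k] = X[gamma == k].mean(axis=0)
    # Conditional label probabilities per box, with Laplace smoothing
    classes = np.unique(y)
    Lambda = np.ones((K, classes.size))  # smoothing prior avoids empty counts
    for k in range(K):
        for j, c in enumerate(classes):
            Lambda[k, j] += ((gamma == k) & (y == c)).sum()
    Lambda /= Lambda.sum(axis=1, keepdims=True)
    return centers, Lambda, classes

def espa_like_predict(X, centers, Lambda, classes):
    # Classify by the most probable label of the nearest box
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return classes[Lambda[d2.argmin(axis=1)].argmax(axis=1)]

# Small-data regime from the abstract: statistics size T close to feature dimension D
rng = np.random.default_rng(0)
D, T = 50, 200
w = rng.normal(size=D)
X = rng.normal(size=(T, D))
y = (X @ w > 0).astype(int)
centers, Lambda, classes = espa_like_fit(X, y, K=8, rng=0)
acc = (espa_like_predict(X, centers, Lambda, classes) == y).mean()
print(f"training accuracy at T/D = {T/D:.1f}: {acc:.2f}")
```

The intuition, under the assumptions of this sketch, is that segmenting the T points into K boxes and fitting only per-box label probabilities keeps the number of fitted parameters small relative to T, which is one way a method can operate below the T/D barriers reported for the LSTM.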
| Item Type: | Article |
|---|---|
| Subjects: | Mathematical and Computer Sciences > Mathematics > Applied Mathematics |
| Divisions: | Department of Mathematics and Computer Science > Institute of Mathematics |
| ID Code: | 2557 |
| Deposited By: | Monika Drueck |
| Deposited On: | 27 Apr 2021 12:41 |
| Last Modified: | 27 Apr 2021 12:41 |