A MATLAB version of our new classification algorithm with guaranteed sensitivity and specificity has been released.
The main feature of this classification algorithm is that it is
"self-testing", i.e., the sensitivity and specificity of the trained
classifier can be tested and certified by means of a rigorous
statistical method, without the need for an independent test set.
Therefore, strange as it may seem, if you trust the math (and your data are
i.i.d.), then you need no validation set: all of your precious data can be used in the training phase!
Moreover, the algorithm allows the user to control the sensitivity-specificity balance by means of two input parameters.
The new algorithm uses a simpler construction than its noble ancestor GEM,
which makes it easier to analyse and super-easy to implement.
On the other hand, the super-simplicity of the construction might
negatively impact the algorithm's ability to adapt to the
data distribution: data pre-processing (normalization) is of particular importance.
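As an illustration of why normalization matters for a ball-based classifier, here is a minimal z-score normalization sketch. It is written in Python for illustration only — it is not part of the MATLAB package, and all function and variable names here are our own. The key point is that the statistics are estimated on the training set and then applied to any new data:

```python
import numpy as np

def zscore_normalize(X_train, X_new):
    """Rescale features to zero mean and unit variance, using
    statistics estimated on the training set only."""
    mu = X_train.mean(axis=0)
    sigma = X_train.std(axis=0)
    sigma[sigma == 0] = 1.0  # guard against constant features
    return (X_train - mu) / sigma, (X_new - mu) / sigma

# Toy data: two features on very different scales, so that
# unnormalized Euclidean distances would be dominated by feature 2
X_train = np.array([[1.0, 1000.0],
                    [2.0, 3000.0],
                    [3.0, 2000.0]])
X_new = np.array([[2.0, 2000.0]])

Xt, Xn = zscore_normalize(X_train, X_new)
print(Xt.std(axis=0))  # both features now have unit scale: [1. 1.]
```

After this step, no single feature dominates the distances that define the balls of the classifier.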
Try it immediately on your problem, and let us know! :)
By downloading files from this website, you are accepting the following agreement:
we (the licensee) understand that the GEM package is supplied "as is",
without expressed or implied warranty. We agree on the following:
- The licensers do not have any obligation to provide any maintenance or consulting help with respect to GEM-BALLS.
- The licensers do not accept any responsibility for the use of classifiers built through GEM-BALLS, nor for the correctness of the results.
- We will only use GEM-BALLS for non-profit purposes. This implies that neither GEM-BALLS nor any part of its code should be used or modified for any commercial software product.

REFERENCE
Please cite the following paper (bibtex) when referring to our algorithm: "A New Classification Algorithm With Guaranteed Sensitivity and Specificity for Medical Applications", by A. Carè, F.A. Ramponi, M.C. Campi. IEEE Control Systems Letters, vol. 2, no. 3, pp. 393-398, July 2018.
(pdf copy here)
Type >> help traingemballs
at the MATLAB prompt for a general description of the MATLAB functions.
Here you can find another MATLAB example where a pool of GEM-BALLS classifiers is built from the same training set. Using many
GEM-BALLS classifiers together might considerably improve performance; we are currently researching this:
"A study on majority-voting classifiers with guarantees on the probability of error"
by A. Carè, M.C. Campi, F.A. Ramponi, S. Garatti, A.T.J.R. Cobbenhagen
Accepted for IFAC World Congress 2020 (pdf copy here)
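The majority-voting idea behind a pool of classifiers can be sketched as follows. This is Python for illustration only: the actual pool of GEM-BALLS classifiers is built by the MATLAB code linked above, so the 0/1 prediction arrays below are placeholders, and the function name is our own:

```python
import numpy as np

def majority_vote(predictions):
    """Combine 0/1 labels from a pool of classifiers by majority vote.

    predictions: (n_classifiers, n_samples) array-like of 0/1 labels.
    Use an odd number of classifiers to avoid ties."""
    P = np.asarray(predictions)
    return (P.sum(axis=0) > P.shape[0] / 2).astype(int)

# Three hypothetical classifiers voting on four test samples
pool_predictions = [
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 1, 0],
]
print(majority_vote(pool_predictions))  # [1 1 1 0]
```

Each sample gets the label chosen by the majority of the pool; how the guarantees on the probability of error extend to the combined classifier is the subject of the paper cited above.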
Here is an instance of a GEM-BALLS classifier (red=1, white=0) that was trained by Roy Cobbenhagen.
F.A. Ramponi argued that GEM-BALLS classifiers bear similarities to some of Umberto Boccioni's sculptures. Shall we start talking about Boccioni classifiers? (In Italian, "boccioni" also means "big bowls/balls"!)