*My work is in the Public Domain for all to share freely.*
Šarūnas Raudys described a hierarchy of statistical methods that accord with divisions of everything and perhaps relate to Clifford algebras and Bott periodicity. He develops this in a three-part series:

- Evolution and generalization of a single neurone: I. Single-layer perceptron as seven statistical classifiers
- Evolution and generalization of a single neurone: II. Complexity of statistical classifiers and sample size considerations
- Evolution and generalization of a single neurone: III. Primitive, regularized, standard, robust and minimax regressions

The seven classifiers are:
- (1) the Euclidean distance classifier;
- (2) the standard Fisher linear discriminant function (DF);
- (3) the Fisher linear DF with pseudo-inversion of the covariance matrix;
- (4) regularized linear discriminant analysis;
- (5) the generalized Fisher DF;
- (6) the minimum empirical error classifier;
- (7) the maximum margin classifier.
- Šarūnas Raudys. *Statistical and Neural Classifiers: An Integrated Approach to Design*. Google Books. VU Matematikos ir informatikos skaitykla (004.9/Ra256).
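The first two classifiers in the hierarchy can be made concrete. The following is a minimal sketch (the toy Gaussian data and all variable names are mine, not Raudys's) contrasting the Euclidean distance classifier, which uses only the class means, with the standard Fisher linear discriminant, which additionally whitens by the pooled covariance:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-class Gaussian data with anisotropic noise.
n = 200
X0 = rng.normal(loc=[0.0, 0.0], scale=[1.0, 3.0], size=(n, 2))
X1 = rng.normal(loc=[3.0, 1.0], scale=[1.0, 3.0], size=(n, 2))

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)

# (1) Euclidean distance classifier: assign x to the nearest class mean.
# Its linear discriminant is w = m1 - m0, thresholded at the midpoint.
w_euc = m1 - m0
b_euc = -w_euc @ (m0 + m1) / 2

# (2) Standard Fisher linear DF: whiten the direction by the pooled covariance.
S = 0.5 * (np.cov(X0.T) + np.cov(X1.T))
w_fis = np.linalg.solve(S, m1 - m0)
b_fis = -w_fis @ (m0 + m1) / 2

def accuracy(w, b):
    # Class 0 should fall on the negative side, class 1 on the positive side.
    correct0 = ((X0 @ w + b) < 0).sum()
    correct1 = ((X1 @ w + b) >= 0).sum()
    return (correct0 + correct1) / (2 * n)

acc_euc = accuracy(w_euc, b_euc)
acc_fis = accuracy(w_fis, b_fis)
print("Euclidean:", acc_euc, " Fisher:", acc_fis)
```

On data like this, where the two features have very different variances, the Fisher discriminant typically outperforms the Euclidean classifier, since the latter ignores covariance structure entirely.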
Statistical classification

Overview

Consider the nonlinear single-layer perceptron (SLP) as a process in which
- the weights of the perceptron grow in magnitude, and
- the sum-of-squares cost function changes gradually.

During backpropagation training, the decision boundary of the SLP becomes identical or close to that of each of the seven statistical classifiers in turn. A classifier distinguishes two classes. This is similar to a linear complex structure, which distinguishes what commutes with it from what does not commute with it.
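The process above can be illustrated with a minimal sketch (assumed toy data, not Raudys's experiments): an SLP with tanh activation trained by gradient descent on the sum-of-squares cost. The weight norm grows during training; while the weights are small, tanh is nearly linear, and as they grow the unit saturates, which is what lets the boundary evolve through the succession of classifiers.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy data: two Gaussian classes, targets -1 / +1.
n = 100
X = np.vstack([rng.normal([0.0, 0.0], 1.0, (n, 2)),
               rng.normal([2.0, 2.0], 1.0, (n, 2))])
t = np.concatenate([-np.ones(n), np.ones(n)])

# Nonlinear SLP: output tanh(w.x + b), sum-of-squares cost,
# trained by batch gradient descent (backpropagation for one layer).
w = np.zeros(2)
b = 0.0
lr = 0.05
norms = []
for epoch in range(200):
    y = np.tanh(X @ w + b)
    err = y - t                   # dC/dy for C = 0.5 * sum((y - t)^2)
    grad_z = err * (1 - y**2)     # chain rule through tanh
    w -= lr * (X.T @ grad_z) / len(X)
    b -= lr * grad_z.mean()
    norms.append(np.linalg.norm(w))

# The weight norm grows as training proceeds.
acc = np.mean(np.sign(np.tanh(X @ w + b)) == t)
print("initial norm:", norms[0], " final norm:", norms[-1], " accuracy:", acc)
```

Tracking `norms` makes the growth of the weights visible, which is the quantity whose increase drives the boundary from the mean-based classifiers at the start of training toward minimum-error and maximum-margin behavior at the end.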

This page was last changed on May 28, 2024, at 03:34 PM