In this paper we introduce a new family of hyperplane classifiers.
In contrast to Support Vector Machines (SVMs), whose training requires solving a
constrained quadratic optimization problem, some of the proposed methods lead to
the unconstrained minimization of convex functions, while others merely require
solving a linear system of equations. To assess the efficiency of these methods,
classification tests were conducted on standard benchmark databases. In this
evaluation, the classification results of SVMs served as the general point of
reference, and the proposed methods outperformed them in many cases.
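For concreteness, the sketch below illustrates the kind of classifier whose training reduces to a single linear solve: a regularized least-squares hyperplane fit. This is only an illustration under our own assumptions, not the specific formulation proposed in the paper; the function names and the regularization parameter lam are hypothetical.

```python
# Illustrative sketch only: a least-squares hyperplane classifier whose
# training reduces to one symmetric linear system (no constrained QP as in SVM).
# Not the paper's exact method; names and parameters are assumptions.
import numpy as np

def fit_hyperplane(X, y, lam=1e-2):
    """Fit (w, b) so that sign(X @ w + b) approximates labels y in {-1, +1}
    by minimizing ||X w + b - y||^2 + lam * ||w||^2, which amounts to
    solving a single (d+1) x (d+1) linear system."""
    n, d = X.shape
    Xa = np.hstack([X, np.ones((n, 1))])   # append a bias column of ones
    A = Xa.T @ Xa + lam * np.eye(d + 1)
    A[-1, -1] -= lam                        # leave the bias term unregularized
    wb = np.linalg.solve(A, Xa.T @ y)       # one linear solve, no iterations
    return wb[:-1], wb[-1]

def predict(X, w, b):
    return np.sign(X @ w + b)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(+1.0, 1.0, size=(50, 2)),
                   rng.normal(-1.0, 1.0, size=(50, 2))])
    y = np.concatenate([np.ones(50), -np.ones(50)])
    w, b = fit_hyperplane(X, y)
    print("training accuracy:", np.mean(predict(X, w, b) == y))
```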