Quality Metrics for Multi-class Classification Algorithms
For \(l\) classes \(C_1, \ldots, C_l\), given a vector \(X = (x_1, \ldots, x_n)\) of class labels computed at the prediction stage of the classification algorithm and a vector \(Y = (y_1, \ldots, y_n)\) of expected class labels, the problem is to evaluate the classifier by computing the confusion matrix and related quality metrics, such as precision and error rate.
The QualityMetricsId for multi-class classification is confusionMatrix.
Details
Further definitions use the following notations:
Notation | Name | Definition |
---|---|---|
\(\text{tp}_i\) | true positive | the number of correctly recognized observations for class \(C_i\) |
\(\text{tn}_i\) | true negative | the number of correctly recognized observations that do not belong to the class \(C_i\) |
\(\text{fp}_i\) | false positive | the number of observations that were incorrectly assigned to the class \(C_i\) |
\(\text{fn}_i\) | false negative | the number of observations that were not recognized as belonging to the class \(C_i\) |
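As an illustration only (this is not the library's API), the four counts can be computed directly from the predicted and expected label vectors. The function name `per_class_counts` is a hypothetical name for this sketch; labels are assumed to be integers \(0, \ldots, l-1\):

```python
# Illustrative sketch, not the library interface: per-class
# tp/tn/fp/fn counts from predicted and expected label vectors.
def per_class_counts(predicted, expected, n_classes):
    counts = []
    n = len(predicted)
    for i in range(n_classes):
        # tp: predicted as class i and actually class i
        tp = sum(1 for p, y in zip(predicted, expected) if p == i and y == i)
        # fp: predicted as class i but actually another class
        fp = sum(1 for p, y in zip(predicted, expected) if p == i and y != i)
        # fn: actually class i but predicted as another class
        fn = sum(1 for p, y in zip(predicted, expected) if p != i and y == i)
        # tn: everything else
        tn = n - tp - fp - fn
        counts.append((tp, tn, fp, fn))
    return counts
```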
The library uses the following quality metrics for multi-class classifiers:
Quality Metric | Definition |
---|---|
Average accuracy | \(\frac {\sum _{i = 1}^{l} \frac {\text{tp}_i + \text{tn}_i}{\text{tp}_i + \text{fn}_i + \text{fp}_i + \text{tn}_i}}{l}\) |
Error rate | \(\frac {\sum _{i = 1}^{l} \frac {\text{fp}_i + \text{fn}_i}{\text{tp}_i + \text{fn}_i + \text{fp}_i + \text{tn}_i}}{l}\) |
Micro precision (\(\text{Precision}_\mu\)) | \(\frac {\sum _{i = 1}^{l} \text{tp}_i} {\sum _{i = 1}^{l} (\text{tp}_i + \text{fp}_i)}\) |
Micro recall (\(\text{Recall}_\mu\)) | \(\frac {\sum _{i = 1}^{l} \text{tp}_i} {\sum _{i = 1}^{l} (\text{tp}_i + \text{fn}_i)}\) |
Micro F-score (\(\text{F-score}_\mu\)) | \(\frac {(\beta^2 + 1)(\text{Precision}_\mu \times \text{Recall}_\mu)}{\beta^2 \times \text{Precision}_\mu + \text{Recall}_\mu}\) |
Macro precision (\(\text{Precision}_M\)) | \(\frac {\sum _{i = 1}^{l} \frac {\text{tp}_i}{\text{tp}_i + \text{fp}_i}}{l}\) |
Macro recall (\(\text{Recall}_M\)) | \(\frac {\sum _{i = 1}^{l} \frac {\text{tp}_i}{\text{tp}_i + \text{fn}_i}}{l}\) |
Macro F-score (\(\text{F-score}_M\)) | \(\frac {(\beta^2 + 1)(\text{Precision}_M \times \text{Recall}_M)}{\beta^2 \times \text{Precision}_M + \text{Recall}_M}\) |
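The micro- and macro-averaged formulas can be sketched in plain Python from per-class counts. This is an illustrative implementation of the table above, not the library interface; the function name `micro_macro` and the input layout (a list of `(tp, tn, fp, fn)` tuples) are assumptions of this sketch:

```python
# Illustrative sketch of the micro/macro precision, recall, and
# F-score formulas; counts is a list of (tp, tn, fp, fn) per class.
def micro_macro(counts, beta=1.0):
    l = len(counts)
    tp = [c[0] for c in counts]
    fp = [c[2] for c in counts]
    fn = [c[3] for c in counts]
    # Micro averages: pool the counts over all classes first.
    p_mu = sum(tp) / (sum(tp) + sum(fp))
    r_mu = sum(tp) / (sum(tp) + sum(fn))
    # Macro averages: average the per-class ratios.
    p_M = sum(t / (t + f) for t, f in zip(tp, fp)) / l
    r_M = sum(t / (t + f) for t, f in zip(tp, fn)) / l
    b2 = beta * beta
    f_mu = (b2 + 1) * p_mu * r_mu / (b2 * p_mu + r_mu)
    f_M = (b2 + 1) * p_M * r_M / (b2 * p_M + r_M)
    return p_mu, r_mu, f_mu, p_M, r_M, f_M
```

Note that the sketch assumes every class has at least one prediction and one actual observation, so no denominator is zero.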
For more details on these metrics, including their evaluation focus, refer to [Sokolova09].
The following is the confusion matrix:
 | Classified as Class \(C_1\) | \(\ldots\) | Classified as Class \(C_i\) | \(\ldots\) | Classified as Class \(C_l\) |
---|---|---|---|---|---|
Actual Class \(C_1\) | \(c_{11}\) | \(\ldots\) | \(c_{1i}\) | \(\ldots\) | \(c_{1l}\) |
\(\ldots\) | \(\ldots\) | \(\ldots\) | \(\ldots\) | \(\ldots\) | \(\ldots\) |
Actual Class \(C_i\) | \(c_{i1}\) | \(\ldots\) | \(c_{ii}\) | \(\ldots\) | \(c_{il}\) |
\(\ldots\) | \(\ldots\) | \(\ldots\) | \(\ldots\) | \(\ldots\) | \(\ldots\) |
Actual Class \(C_l\) | \(c_{l1}\) | \(\ldots\) | \(c_{li}\) | \(\ldots\) | \(c_{ll}\) |
The positives and negatives are defined through the elements of the confusion matrix as follows:

- \(\text{tp}_i = c_{ii}\)
- \(\text{fn}_i = \sum_{j=1}^{l} c_{ij} - \text{tp}_i\)
- \(\text{fp}_i = \sum_{j=1}^{l} c_{ji} - \text{tp}_i\)
- \(\text{tn}_i = \sum_{j=1}^{l} \sum_{k=1}^{l} c_{jk} - \text{tp}_i - \text{fp}_i - \text{fn}_i\)
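The standard relations between the confusion matrix and the per-class counts can be sketched in Python (illustrative only, not the library API; the function name is hypothetical). Here `C[i][j]` counts observations of actual class \(C_{i+1}\) classified as \(C_{j+1}\), with zero-based indexing:

```python
# Illustrative sketch: recover per-class tp/tn/fp/fn from a
# confusion matrix C, where C[i][j] counts actual class i
# classified as class j.
def counts_from_confusion(C):
    l = len(C)
    total = sum(sum(row) for row in C)
    out = []
    for i in range(l):
        tp = C[i][i]                               # diagonal element
        fn = sum(C[i]) - tp                        # row i, off-diagonal
        fp = sum(C[j][i] for j in range(l)) - tp   # column i, off-diagonal
        tn = total - tp - fn - fp                  # everything else
        out.append((tp, tn, fp, fn))
    return out
```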
Batch Processing
Algorithm Input
The quality metric algorithm for multi-class classifiers accepts the input described below. Pass the Input ID as a parameter to the methods that provide input for your algorithm. For more details, see Algorithms.
Input ID | Input |
---|---|
 | Pointer to the \(n \times 1\) numeric table that contains labels computed at the prediction stage of the classification algorithm. This input can be an object of any class derived from NumericTable. |
 | Pointer to the \(n \times 1\) numeric table that contains expected labels. This input can be an object of any class derived from NumericTable except |
Algorithm Parameters
The quality metric algorithm has the following parameters:
Parameter | Default Value | Description |
---|---|---|
 |  | The floating-point type that the algorithm uses for intermediate computations. Can be |
 |  | Performance-oriented computation method, the only method supported by the algorithm. |
 | \(0\) | The number of classes (\(l\)). |
 |  | A flag that defines a need to compute the default metrics provided by the library. |
 | \(1\) | The \(\beta\) parameter of the F-score quality metric provided by the library. |
Algorithm Output
The quality metric algorithm calculates the result described below. Pass the Result ID as a parameter to the methods that access the results of your algorithm. For more details, see Algorithms.
Result ID | Result |
---|---|
 | Pointer to the \(\text{nClasses} \times \text{nClasses}\) numeric table with the confusion matrix. Note: By default, this result is an object of the |
 | Pointer to the \(1 \times 8\) numeric table that contains quality metrics, which you can access by an appropriate Multi-class Metrics ID. Note: By default, this result is an object of the |
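For illustration, a self-contained Python sketch that mirrors the batch computation end to end: it builds the confusion matrix from the two label vectors and assembles the eight quality metrics into one row, matching the \(1 \times 8\) output shape. The function name and the ordering of the metrics in the returned list are assumptions of this sketch, not the library's Multi-class Metrics IDs:

```python
# Hedged end-to-end sketch (not the library API): confusion matrix
# plus a row of 8 metrics. Assumed order: average accuracy, error
# rate, micro precision/recall/F-score, macro precision/recall/F-score.
def evaluate(predicted, expected, n_classes, beta=1.0):
    # C[actual][predicted]; assumes every class occurs and is predicted
    # at least once, so no denominator below is zero.
    C = [[0] * n_classes for _ in range(n_classes)]
    for p, y in zip(predicted, expected):
        C[y][p] += 1
    total = len(predicted)
    l = n_classes
    tp = [C[i][i] for i in range(l)]
    fn = [sum(C[i]) - tp[i] for i in range(l)]
    fp = [sum(C[j][i] for j in range(l)) - tp[i] for i in range(l)]
    tn = [total - tp[i] - fn[i] - fp[i] for i in range(l)]
    avg_acc = sum((tp[i] + tn[i]) / total for i in range(l)) / l
    err = sum((fp[i] + fn[i]) / total for i in range(l)) / l
    p_mu = sum(tp) / (sum(tp) + sum(fp))
    r_mu = sum(tp) / (sum(tp) + sum(fn))
    p_M = sum(tp[i] / (tp[i] + fp[i]) for i in range(l)) / l
    r_M = sum(tp[i] / (tp[i] + fn[i]) for i in range(l)) / l
    b2 = beta * beta
    f_mu = (b2 + 1) * p_mu * r_mu / (b2 * p_mu + r_mu)
    f_M = (b2 + 1) * p_M * r_M / (b2 * p_M + r_M)
    return C, [avg_acc, err, p_mu, r_mu, f_mu, p_M, r_M, f_M]
```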
Examples

Batch Processing: