
Calculate Confusion Matrices

Use a confusion matrix to show the accuracy of a classification result by comparing it with ground truth information. ENVI can calculate a confusion matrix (contingency matrix) using either a ground truth image or ground truth ROIs. In each case, the report includes the overall accuracy, kappa coefficient, confusion matrix, producer and user accuracies, and errors of commission and omission.
See the following sections:
Use a Ground Truth Image
Use Ground Truth Regions of Interest
Confusion Matrix Example
Overall Accuracy
Kappa Coefficient
Confusion Matrix (Pixels)
Confusion Matrix (Percent)
Commission
Omission
Producer Accuracy
User Accuracy

Use a Ground Truth Image


When using a ground truth image, you can also calculate error mask images for each class
showing which pixels were incorrectly classified.
Reference
J. R. Jensen, 1986, Introductory Digital Image Processing, Prentice-Hall, Englewood Cliffs, New
Jersey, p. 379.
1. From the Toolbox, select Classification > Post Classification > Confusion Matrix Using Ground Truth Image. The Classification Input File dialog appears.
2. Select an input file and perform optional spatial and spectral subsetting, then click OK. The Ground Truth Input File dialog appears.
3. Select the ground truth image and perform any spatial subsetting, then click OK. The Match Classes Parameters dialog appears.
4. Match the ground truth classes with the classification result classes by selecting the matching names in the two lists and clicking Add Combination. The class combinations are shown in a list at the bottom of the dialog. If the ground truth and classification classes have the same names, they are automatically matched.
To remove a class match from the list, select the combination name. The two class names reappear in the lists at the top of the dialog.
5. Click OK. The Confusion Matrix Parameters dialog appears.
6. Next to the Output Confusion Matrix in label, select the Pixels and/or the Percent check boxes. If you select both check boxes, both are reported in the same window.
7. Next to the Report Accuracy Assessment label, select the Yes or No toggle.
8. Next to the Output Error Images label, click the toggle button to select Yes or No. The output error images are mask images, one for each class, in which all correctly classified pixels have a value of 0 and all incorrectly classified pixels have a value of 1. The last error image band shows the incorrectly classified pixels for all the classes combined.
9. Select output to File or Memory. If you select File, enter an output filename.
10. Click OK.
The report shows the overall accuracy, kappa coefficient, confusion matrix, errors of commission (percentage of pixels erroneously included in a class), errors of omission (percentage of pixels erroneously excluded from a class), producer accuracy, and user accuracy for each class. Producer accuracy is the probability that a pixel in the classification image is put into class x given that the ground truth class is x. User accuracy is the probability that the ground truth class is x given that a pixel is put into class x in the classification image.

Use Ground Truth Regions of Interest


You can display a confusion matrix using ROIs for ground truth. This report pairs ROIs with the
classes of a classification image to show what percentage of the ROI pixels were or were not
contained in a resulting class. To display a confusion matrix report using ROIs for ground truth:
1. From the Toolbox, select Classification > Post Classification > Confusion Matrix Using Ground Truth ROIs. The Classification Input File dialog appears.
2. Select a classification input file and perform optional spatial and spectral subsetting, then click OK. The Match Classes Parameters dialog appears.
3. Match the ground truth ROIs with the classification result classes by selecting the matching names in the two lists and clicking Add Combination. The class combinations are shown in a list at the bottom of the dialog. If the ground truth ROIs and classification classes have the same names, they are automatically matched.
To remove a class match from the list, select the combination name. The two class names reappear in the lists at the top of the dialog.
4. Click OK. The Confusion Matrix Parameters dialog appears.
5. Select the Pixels and/or the Percent check boxes.
6. Click the Yes or No toggle for Report Accuracy Assessment, and click OK.
The report shows the overall accuracy, kappa coefficient, confusion matrix, errors of commission (percentage of extra pixels in a class), errors of omission (percentage of pixels left out of a class), producer accuracy, and user accuracy for each class. Producer accuracy is the probability that a pixel in the classification image is put into class x given that the ground truth class is x. User accuracy is the probability that the ground truth class is x given that a pixel is put into class x in the classification image.

Confusion Matrix Example


The following example explains the items calculated for the confusion matrix, including the overall accuracy, kappa coefficient, confusion (contingency) matrix, errors of commission, errors of omission, producer accuracy, and user accuracy. ENVI can calculate a confusion matrix using either a ground truth image or ground truth ROIs; both produce output similar to the following example.
Confusion Matrix: {M6} (640x400x1)
Overall Accuracy = (131003/256000) 51.1730%
Kappa Coefficient = 0.2648
Ground Truth (Pixels)

Class          Unclassified      Grass   Forest    Swamp     Total
Unclassified          43689      26949       40    18001     88679
Grass                 32835      64516     1741     3329    102421
Forest                 8202       7277     4096      654     20229
Swamp                 15227      10742        0    18702     44671
Total                 99953     109484     5877    40686    256000

Ground Truth (Percent)

Class          Unclassified      Grass   Forest    Swamp     Total
Unclassified          43.71      24.61     0.68    44.24     34.64
Grass                 32.85      58.93    29.62     8.18     40.01
Forest                 8.21       6.65    69.70     1.61      7.90
Swamp                 15.23       9.81     0.00    45.97     17.45
Total                100.00     100.00   100.00   100.00    100.00

Class          Commission   Omission    Commission      Omission
                (Percent)  (Percent)      (Pixels)      (Pixels)
Unclassified        50.73      56.29   44990/88679   56264/99953
Grass               37.01      41.07  37905/102421  44968/109484
Forest              79.75      30.30   16133/20229     1781/5877
Swamp               58.13      54.03   25969/44671   21984/40686

Class          Prod. Acc.  User Acc.    Prod. Acc.     User Acc.
                (Percent)  (Percent)      (Pixels)      (Pixels)
Unclassified        43.71      49.27   43689/99953   43689/88679
Grass               58.93      62.99  64516/109484  64516/102421
Forest              69.70      20.25     4096/5877    4096/20229
Swamp               45.97      41.87   18702/40686   18702/44671

Overall Accuracy
The overall accuracy is calculated by summing the number of pixels classified correctly and
dividing by the total number of pixels. The ground truth image or ground truth ROIs define the
true class of the pixels. The pixels classified correctly are found along the diagonal of the
confusion matrix table, which lists the number of pixels that were classified into the correct
ground truth class. The total number of pixels is the sum of all the pixels in all the ground truth
classes.
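
To make the arithmetic concrete, here is a minimal Python sketch (assuming NumPy is available) that hard-codes the example matrix above, with rows as classified classes and columns as ground truth classes, and reproduces the reported overall accuracy:

    import numpy as np

    # Example confusion matrix from the Ground Truth (Pixels) table above.
    # Rows: classified class; columns: ground truth class
    # (Unclassified, Grass, Forest, Swamp).
    cm = np.array([
        [43689, 26949,    40, 18001],
        [32835, 64516,  1741,  3329],
        [ 8202,  7277,  4096,   654],
        [15227, 10742,     0, 18702],
    ])

    # Correctly classified pixels lie on the diagonal; divide their sum
    # by the total number of pixels to get the overall accuracy.
    overall = np.trace(cm) / cm.sum()
    print(f"Overall Accuracy = ({np.trace(cm)}/{cm.sum()}) {overall:.4%}")
    # Overall Accuracy = (131003/256000) 51.1730%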

Kappa Coefficient
The kappa (κ) coefficient measures the agreement between classification and ground truth pixels. A kappa value of 1 represents perfect agreement, while a value of 0 represents no agreement. The coefficient is computed as:

\[
\kappa = \frac{N \sum_{i} m_{i,i} - \sum_{i} C_i G_i}{N^2 - \sum_{i} C_i G_i}
\]

where:

i is the class number
N is the total number of classified pixels that are being compared to ground truth
m_{i,i} is the number of pixels belonging to ground truth class i that have also been classified as class i (i.e., the values found along the diagonal of the confusion matrix)
C_i is the total number of classified pixels belonging to class i
G_i is the total number of ground truth pixels belonging to class i

This formula returns a kappa coefficient of 0.2648 for the example matrix above.
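
A minimal Python sketch of this calculation (assuming NumPy is available), reusing the example matrix, where rows are classified classes and columns are ground truth classes:

    import numpy as np

    cm = np.array([
        [43689, 26949,    40, 18001],
        [32835, 64516,  1741,  3329],
        [ 8202,  7277,  4096,   654],
        [15227, 10742,     0, 18702],
    ])

    N = cm.sum()           # total pixels compared to ground truth
    diag = np.trace(cm)    # sum of m_ii along the diagonal
    C = cm.sum(axis=1)     # classified pixel totals per class (rows)
    G = cm.sum(axis=0)     # ground truth pixel totals per class (columns)

    kappa = (N * diag - (C * G).sum()) / (N**2 - (C * G).sum())
    print(f"Kappa Coefficient = {kappa:.4f}")  # Kappa Coefficient = 0.2648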

Confusion Matrix (Pixels)


The confusion matrix is calculated by comparing the location and class of each ground truth pixel with the corresponding location and class in the classification image. Each column of the confusion matrix represents a ground truth class, and the values in the column correspond to the classification image's labeling of the ground truth pixels. For example, look at the ground truth column for the Forest class in the Ground Truth (Pixels) table above. The ground truth shows 5,877 pixels in this class. The classification was able to classify 4,096 of these pixels properly, but 40 pixels were Unclassified and 1,741 were classified as Grass.
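
For illustration only, here is a minimal sketch of how such a matrix can be accumulated from two co-registered arrays of class codes; the function and array names are hypothetical, not part of ENVI:

    import numpy as np

    def confusion_matrix(classified, truth, n_classes):
        """Count pixels per (classified, ground truth) class pair.

        Rows index the class assigned by the classifier; columns index
        the ground truth class, matching the tables above."""
        cm = np.zeros((n_classes, n_classes), dtype=np.int64)
        for c, t in zip(classified.ravel(), truth.ravel()):
            cm[c, t] += 1
        return cm

    # Toy usage with three hypothetical classes coded 0, 1, 2:
    classified = np.array([0, 1, 1, 2, 2, 2])
    truth      = np.array([0, 1, 2, 2, 2, 1])
    print(confusion_matrix(classified, truth, 3))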

Confusion Matrix (Percent)


The Ground Truth (Percent) table shows the class distribution in percent for each ground truth
class. The values are calculated by dividing the pixel counts in each ground truth column by the
total number of pixels in a given ground truth class. For example, in the Forest class the
percentage of pixels classified correctly is 4,096/5,877=0.697 or 69.7%.
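
In code, the percent table is a column-wise normalization of the pixel matrix. A short sketch reusing the example matrix:

    import numpy as np

    cm = np.array([
        [43689, 26949,    40, 18001],
        [32835, 64516,  1741,  3329],
        [ 8202,  7277,  4096,   654],
        [15227, 10742,     0, 18702],
    ])

    # Divide each ground truth column by its column total, so every
    # column sums to 100 percent.
    percent = 100.0 * cm / cm.sum(axis=0, keepdims=True)
    print(f"Forest pixels classified correctly: {percent[2, 2]:.2f}%")
    # Forest pixels classified correctly: 69.70%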

Commission
Errors of commission represent pixels that belong to another class but are labeled as belonging to the class of interest. The errors of commission are shown in the rows of the confusion matrix. In the confusion matrix example, the Grass class has a total of 102,421 classified pixels, of which 64,516 are classified correctly and 37,905 belong to other classes but are classified incorrectly as Grass (37,905 is the sum of all the other values in the Grass row of the confusion matrix). The ratio of the number of incorrectly classified pixels to the total number of pixels classified as the class gives the error of commission. For the Grass class, the error of commission is 37,905/102,421, which equals 37.0%.
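
Expressed in code, commission errors come from the row totals of the matrix. A sketch with the same example values:

    import numpy as np

    cm = np.array([
        [43689, 26949,    40, 18001],
        [32835, 64516,  1741,  3329],
        [ 8202,  7277,  4096,   654],
        [15227, 10742,     0, 18702],
    ])

    correct = np.diag(cm)               # correctly classified pixels
    classified_totals = cm.sum(axis=1)  # pixels assigned to each class
    commission = 100.0 * (classified_totals - correct) / classified_totals
    print(f"Grass commission: {commission[1]:.2f}%")  # Grass commission: 37.01%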

Omission
Errors of omission represent pixels that belong to the ground truth class but that the classification technique has failed to assign to the proper class. The errors of omission are shown in the columns of the confusion matrix. In the confusion matrix example, the Grass class has a total of 109,484 ground truth pixels, of which 64,516 are classified correctly and 44,968 are classified incorrectly (44,968 is the sum of all the other values in the Grass column of the confusion matrix). The ratio of the number of incorrectly classified pixels to the total number of pixels in the ground truth class gives the error of omission. For the Grass class, the error of omission is 44,968/109,484, which equals 41.1%.
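
Analogously, omission errors come from the column (ground truth) totals:

    import numpy as np

    cm = np.array([
        [43689, 26949,    40, 18001],
        [32835, 64516,  1741,  3329],
        [ 8202,  7277,  4096,   654],
        [15227, 10742,     0, 18702],
    ])

    correct = np.diag(cm)          # correctly classified pixels
    truth_totals = cm.sum(axis=0)  # ground truth pixels per class
    omission = 100.0 * (truth_totals - correct) / truth_totals
    print(f"Grass omission: {omission[1]:.2f}%")  # Grass omission: 41.07%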

Producer Accuracy

The producer accuracy is a measure indicating the probability that the classifier has labeled an image pixel as Class A given that the ground truth is Class A. In the confusion matrix example, the Grass class has a total of 109,484 ground truth pixels, of which 64,516 are classified correctly. The producer accuracy is the ratio 64,516/109,484, or 58.9%.

User Accuracy
User accuracy is a measure indicating the probability that a pixel is Class A given that the classifier has labeled the pixel as Class A. In the confusion matrix example, the classifier labeled 102,421 pixels as the Grass class, of which 64,516 are classified correctly. The user accuracy is the ratio 64,516/102,421, or 63.0%.
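
Because producer and user accuracy are the complements of omission and commission, both fall out of the same row and column totals. A final sketch with the example matrix:

    import numpy as np

    cm = np.array([
        [43689, 26949,    40, 18001],
        [32835, 64516,  1741,  3329],
        [ 8202,  7277,  4096,   654],
        [15227, 10742,     0, 18702],
    ])

    correct = np.diag(cm)
    producer = 100.0 * correct / cm.sum(axis=0)  # per ground truth class
    user     = 100.0 * correct / cm.sum(axis=1)  # per classified class
    print(f"Grass producer accuracy: {producer[1]:.2f}%")  # 58.93%
    print(f"Grass user accuracy:     {user[1]:.2f}%")      # 62.99%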
