Confusion matrix
Compare the actual values in your data against predicted values in a confusion matrix to understand how your model is performing.
The Confusion matrix tool takes a set of classification predictions and produces a 2x2 confusion matrix that compares the actual values in your data against the predicted values. In a 2x2 matrix, you'll see:
true positives, where the actual value and predicted value are both true.
true negatives, where the actual value and predicted value are both false.
false positives, where the actual value was false, but the prediction was true.
false negatives, where the actual value was true but the prediction was false.
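The four cells above can be sketched in code. This is a minimal illustration of the counting logic, not the tool's implementation; the function name and dictionary keys are chosen for the example.

```python
def confusion_matrix(actual, predicted):
    """Count the four cells of a 2x2 confusion matrix from boolean labels."""
    counts = {"TP": 0, "TN": 0, "FP": 0, "FN": 0}
    for a, p in zip(actual, predicted):
        if a and p:
            counts["TP"] += 1  # true positive: actual and predicted both true
        elif not a and not p:
            counts["TN"] += 1  # true negative: actual and predicted both false
        elif not a and p:
            counts["FP"] += 1  # false positive: actual false, predicted true
        else:
            counts["FN"] += 1  # false negative: actual true, predicted false
    return counts

actual    = [True, True, False, False, True]
predicted = [True, False, False, True, True]
print(confusion_matrix(actual, predicted))
# {'TP': 2, 'TN': 1, 'FP': 1, 'FN': 1}
```

Each prediction falls into exactly one cell, so the four counts always sum to the number of predictions.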
When to use this tool
Use a confusion matrix when you want to evaluate a binary classification model, particularly to see how often it produces false positives and false negatives rather than just an overall accuracy figure.
Configuration
Use the following steps to add and configure the Confusion matrix tool in your pipe.
1. Go to the Pipes module from the side navigation bar.
2. On the Pipes tab, find the pipe you want to work with, then click it to open it. For more information about pipes, see the Creating a pipe documentation.
3. In your Pipe builder, add your data source.
4. Click + Add tool.
5. In the search bar, search for Confusion matrix, then click + Tool.
Note
You can also find the Confusion matrix tool in the Learn section.
6. Connect the tool to your data set.
7. In the configuration pane, enter the following information:
Table 66. Confusion matrix configuration

Field       Description
Actual      Select the column to use as the actual value of the data.
Predicted   Select the column that contains the predicted value of the data.
The Predicted positive, Predicted negative, and Label columns appear with the data.
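As a hypothetical illustration of that output layout: the column names Predicted positive, Predicted negative, and Label come from the documentation above, but the row labels and counts below are invented for the example and are not the tool's actual values.

```python
# Hypothetical sketch of the tool's output layout. Column names are from
# the docs; row labels ("Actual positive"/"Actual negative") and counts
# are assumptions made for illustration only.
output = [
    # Actual-positive row: true positives land under "Predicted positive",
    # false negatives under "Predicted negative".
    {"Label": "Actual positive", "Predicted positive": 40, "Predicted negative": 10},
    # Actual-negative row: false positives and true negatives.
    {"Label": "Actual negative", "Predicted positive": 5, "Predicted negative": 45},
]
for row in output:
    print(row["Label"], row["Predicted positive"], row["Predicted negative"])
```

Reading across a row shows how the model classified one actual class; reading down a column shows everything the model assigned to one predicted class.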