Confusion matrix

Abstract

Compare the actual values in your data against predicted values in a confusion matrix to understand how your model is performing.

The Confusion matrix tool takes a set of classification predictions and produces a 2x2 confusion matrix, which compares the actual values in your data against the predicted values. In a 2x2 matrix, you'll see the following (the sketch after this list shows how each count is derived):

  • true positives, where the actual value and the predicted value are both true.

  • true negatives, where the actual value and the predicted value are both false.

  • false positives, where the actual value is false but the predicted value is true.

  • false negatives, where the actual value is true but the predicted value is false.
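
For reference, here is a minimal Python sketch of how these four counts are derived from a set of actual and predicted values. The example data is made up for illustration and is not produced by the tool.

  # Count the four cells of a 2x2 confusion matrix from example data.
  actual    = [True, True, False, False, True, False]
  predicted = [True, False, False, True, True, False]

  true_positives  = sum(a and p for a, p in zip(actual, predicted))
  true_negatives  = sum(not a and not p for a, p in zip(actual, predicted))
  false_positives = sum(not a and p for a, p in zip(actual, predicted))
  false_negatives = sum(a and not p for a, p in zip(actual, predicted))

  print(true_positives, true_negatives, false_positives, false_negatives)
  # Prints: 2 2 1 1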

When to use this tool

Use a confusion matrix to evaluate how a classification model is performing. It shows not only how often the predictions are correct, but also which kinds of errors (false positives or false negatives) the model makes.

Configuration

Use the following steps to add and configure the Confusion matrix tool in your pipe.

Configuring the Confusion matrix tool
  1. Go to the Pipes module from the side navigation bar.

  2. On the Pipes tab, find the pipe you want to work with and click it to open it. For more information about pipes, see the Creating a pipe documentation.

  3. In your Pipe builder, add your data source.

  4. Click + Add tool.

  5. In the search bar, search for Confusion matrix. Click + Tool.

    Note

    You can also find the Confusion matrix tool in the Learn section.

  6. Connect the tool to your data set.

  7. In the configuration pane, enter the following information:

    Table 66. Confusion matrix configuration

    Field      Description
    Actual     Select the column to use as the actual value of the data.
    Predicted  Select the column to use as the predicted value of the data.



The Predicted positive, Predicted negative, and Label columns appear with the data.
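
As a rough illustration only, the following pandas sketch approximates how the Actual and Predicted columns are cross-tabulated into that output. The column layout and example data here are assumptions based on the description above, not the tool's implementation.

  import pandas as pd

  # Stand-ins for the columns selected as Actual and Predicted.
  data = pd.DataFrame({
      "actual":    [True, True, False, False, True, False],
      "predicted": [True, False, False, True, True, False],
  })

  # Cross-tabulate actual values (rows) against predicted values (columns).
  matrix = pd.crosstab(data["actual"], data["predicted"])
  matrix.columns = ["Predicted negative", "Predicted positive"]  # False, True
  matrix.index.name = "Label"
  print(matrix)

Each row shows, for one actual label, how many records were predicted as positive and how many as negative.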