Default Evaluation Dataset

Default evaluation datasets are user-created datasets that every model trained in the project is automatically evaluated against, providing a common benchmark for measuring improvement across the models created within the project. They are defined at the project level, and each project can have up to 3 default evaluation datasets.
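The project-level limit described above can be illustrated with a short sketch. This is a hypothetical illustration only; `Project` and `add_default_eval_dataset` are made-up names for the purpose of the example, not part of the AI Studio API.

```python
# Hypothetical sketch (not AI Studio code): a project holds at most
# 3 default evaluation datasets, which every model trained in the
# project is automatically evaluated against.

MAX_DEFAULT_EVAL_DATASETS = 3

class Project:
    def __init__(self, name):
        self.name = name
        self.default_eval_datasets = []

    def add_default_eval_dataset(self, dataset_name):
        """Register a dataset used to automatically evaluate
        every model trained in this project."""
        if len(self.default_eval_datasets) >= MAX_DEFAULT_EVAL_DATASETS:
            raise ValueError(
                f"Project '{self.name}' already has "
                f"{MAX_DEFAULT_EVAL_DATASETS} default evaluation datasets"
            )
        self.default_eval_datasets.append(dataset_name)

project = Project("image-tagging-demo")
project.add_default_eval_dataset("holdout-v1")
project.add_default_eval_dataset("holdout-v2")
```

Attempting to add a fourth dataset to a full project would raise an error, mirroring the per-project cap.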

Adding New Evaluation Datasets

In addition to the project-level defaults, users can add evaluation datasets to an individual model, if required, on the Model – Deployment page.

Step 1: Go to the Models page.

(Insert Image)

Step 2: Click the info button on the chosen model.

Hover over the right side of the chosen model to reveal the info button, which opens the deployment page.

(Insert Image)

Step 3: Add an evaluation dataset here.

(Insert Image)
