
Introduction

In this section we finetune a pretrained CNN model using the cross-entropy loss for K=2 classes: the machine setting C carries the anomalous label FAIL, and all other settings belong to the nominal category PASS. We report the relevant classification metrics and save the finetuned model for future use. UMAP is used to visualize the embeddings of the supervised model.
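The kind of classification metrics reported here can be computed with scikit-learn. This is a minimal sketch with hypothetical labels and predictions (not the actual results of the experiment), where 0 encodes PASS and 1 encodes FAIL:

```python
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Hypothetical ground-truth and predicted labels: 0 = PASS, 1 = FAIL.
y_true = [0, 0, 0, 1, 1, 1]
y_pred = [0, 0, 1, 1, 1, 1]

accuracy = accuracy_score(y_true, y_pred)

# Treat FAIL (label 1) as the positive class, since it is the rare,
# safety-relevant category in anomaly detection.
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="binary", pos_label=1
)
```

For anomaly detection, precision and recall on the FAIL class are usually more informative than raw accuracy, because the nominal class dominates the data.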

Approach

The supervised approach differs from the unsupervised approach in that we directly use the anomaly labels during training:
  1. Training Data: Images labeled as either PASS (nominal) or FAIL (anomalous)
  2. Model: Pretrained ResNet-50 finetuned with a binary classification head
  3. Loss Function: Cross-entropy loss for K=2 classes
  4. Output: Binary classification (PASS/FAIL)

Advantages and Limitations

Advantages:
  • Direct optimization for the anomaly detection task
  • Can achieve high accuracy when sufficient labeled data is available
  • Interpretable decision boundary
Limitations:
  • Requires labeled anomaly data which may be scarce
  • May not generalize well to novel anomaly types not seen during training
  • Risk of overfitting to specific anomaly patterns

Results

The supervised model shows strong separation between PASS and FAIL classes when visualized using UMAP dimensionality reduction on the learned embeddings. However, the unsupervised approach was ultimately preferred due to its ability to detect novel anomalies without requiring labeled training data.