medsiglip-448-scin-classification

This model is a fine-tuned version of google/medsiglip-448 on the SCIN (Skin Condition Image Network) dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3754
  • ROC AUC: 0.9760

Model description

google/medsiglip-448 is a SigLIP-style image-text model adapted for medical imaging; this checkpoint fine-tunes it for skin condition classification on SCIN images. Details of the classification head and label set are not documented.

Intended uses & limitations

More information needed
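No usage snippet is provided with the card. The sketch below is a hypothetical inference example: it assumes the checkpoint loads through the standard transformers image-classification auto classes and that the reported ROC AUC reflects multi-label evaluation. Verify both assumptions against the repository config before relying on it.

```python
# Hypothetical inference sketch -- assumes the checkpoint exposes a standard
# image-classification head loadable via the transformers auto classes.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "amoghajnalens/medsiglip-448-scin-classification"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)
model.eval()

image = Image.open("skin_photo.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# The reported ROC AUC suggests multi-label evaluation, so per-label
# sigmoid probabilities are assumed here rather than a softmax.
probs = torch.sigmoid(logits).squeeze(0)
top = probs.topk(5)
for score, idx in zip(top.values, top.indices):
    print(model.config.id2label.get(int(idx), str(int(idx))), float(score))
```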

Training and evaluation data

Fine-tuning and evaluation appear to use the SCIN (Skin Condition Image Network) dermatology dataset, per the model name; split sizes, label set, and preprocessing are not documented.

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: adamw_torch_fused with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 5
  • num_epochs: 3
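
The hyperparameters above map onto transformers.TrainingArguments roughly as in this sketch; output_dir is a placeholder, and the numeric values are taken directly from the list.

```python
# Sketch mapping the listed hyperparameters onto transformers.TrainingArguments.
# output_dir is a hypothetical path; numeric values come from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="medsiglip-448-scin-classification",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 16 * 2 = 32
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    warmup_steps=5,
    num_train_epochs=3,
)
```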

Training results

| Training Loss | Epoch  | Step | Validation Loss | ROC AUC |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 2.8561        | 0.1985 | 40   | 1.3265          | 0.7629  |
| 2.4018        | 0.3970 | 80   | 0.9869          | 0.8771  |
| 1.6941        | 0.5955 | 120  | 0.7613          | 0.9162  |
| 1.4797        | 0.7940 | 160  | 0.6767          | 0.9398  |
| 1.1814        | 0.9926 | 200  | 0.5849          | 0.9560  |
| 0.8840        | 1.1886 | 240  | 0.4970          | 0.9630  |
| 0.7607        | 1.3871 | 280  | 0.4639          | 0.9663  |
| 0.7558        | 1.5856 | 320  | 0.4327          | 0.9694  |
| 0.6405        | 1.7841 | 360  | 0.4078          | 0.9721  |
| 0.6299        | 1.9826 | 400  | 0.3851          | 0.9751  |
| 0.4243        | 2.1787 | 440  | 0.3819          | 0.9753  |
| 0.3755        | 2.3772 | 480  | 0.3830          | 0.9745  |
| 0.3719        | 2.5757 | 520  | 0.3793          | 0.9756  |
| 0.3600        | 2.7742 | 560  | 0.3756          | 0.9761  |
| 0.3645        | 2.9727 | 600  | 0.3754          | 0.9760  |
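
The card does not state how the ROC AUC is averaged. A common choice for multi-label skin condition classification, and a reasonable guess here, is macro-averaging over per-label scores, as in this toy example:

```python
# Toy example of macro-averaged ROC AUC over multi-label predictions
# (an assumption; the card does not specify the averaging scheme).
import numpy as np
from sklearn.metrics import roc_auc_score

y_true = np.array([[1, 0, 1],
                   [0, 1, 0],
                   [1, 1, 0]])             # binary label matrix
y_score = np.array([[0.9, 0.2, 0.7],
                    [0.1, 0.8, 0.3],
                    [0.6, 0.7, 0.2]])      # per-label sigmoid scores
print(roc_auc_score(y_true, y_score, average="macro"))
```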

Framework versions

  • Transformers 4.57.3
  • PyTorch 2.9.1+cu128
  • Datasets 4.4.1
  • Tokenizers 0.22.1