GigaCheck-Detector-Multi

๐ŸŒ LLMTrace Website | ๐Ÿ“œ LLMTrace Paper on arXiv | ๐Ÿค— LLMTrace - Detection Dataset | Github |

Model Card

Model Description

This is the official GigaCheck-Detector-Multi model from the LLMTrace project. It is a multilingual transformer-based model trained for AI interval detection. Its purpose is to identify and localize the specific spans of text within a document that were generated by an AI.

The model was trained jointly on the English and Russian portions of the LLMTrace Detection dataset, which includes human, fully AI, and mixed-authorship texts with character-level annotations.
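
To give an intuition for what a mixed-authorship example with character-level annotations looks like, here is a small illustrative sketch. The field names and structure below are hypothetical and are not the actual LLMTrace schema; they only show how AI-written spans can be encoded as character intervals.

```python
# Illustrative only: field names are hypothetical, not the actual LLMTrace schema.
human_part = "I drafted the opening paragraph myself. "
ai_part = ("The rest of this summary was produced by a language model "
           "and lightly edited afterwards.")

example = {
    "text": human_part + ai_part,
    "label": "mixed",  # human / ai / mixed authorship
    # [start, end) character offsets of the AI-written span
    "ai_intervals": [(len(human_part), len(human_part) + len(ai_part))],
}

# Recover the annotated AI-written span from its character offsets.
start, end = example["ai_intervals"][0]
print(example["text"][start:end])
```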

For complete details on the training data, methodology, and evaluation, please refer to our research paper, LLMTrace (arXiv:2509.21269).
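
The snippet below is a minimal inference sketch, not the official usage code. It assumes the checkpoint can be loaded as a transformers token-classification model (with `trust_remote_code` for any custom architecture) and that per-token labels distinguish AI-written from human-written tokens; the actual interface and label mapping may differ, so treat the class choice and label ids as assumptions.

```python
# Minimal sketch only. Assumes a token-classification style interface and a binary
# human/AI label per token; the real GigaCheck-Detector-Multi interface may differ.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

model_id = "iitolstykh/GigaCheck-Detector-Multi"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForTokenClassification.from_pretrained(model_id, trust_remote_code=True)
model.eval()

text = "Some document that may mix human-written and AI-written passages."

# Fast tokenizers can return character offsets for each token, which lets us
# map token-level predictions back to character-level intervals.
enc = tokenizer(text, return_offsets_mapping=True, return_tensors="pt", truncation=True)
offsets = enc.pop("offset_mapping")[0].tolist()

with torch.no_grad():
    logits = model(**enc).logits            # (1, seq_len, num_labels)
labels = logits.argmax(dim=-1)[0].tolist()  # assumed: 1 = AI-written, 0 = human-written

# Merge consecutive tokens predicted as AI-written into character intervals.
intervals, current = [], None
for (start, end), label in zip(offsets, labels):
    if start == end:                        # special tokens have empty offsets
        continue
    if label == 1:
        current = [start, end] if current is None else [current[0], end]
    elif current is not None:
        intervals.append(tuple(current))
        current = None
if current is not None:
    intervals.append(tuple(current))

for start, end in intervals:
    print(f"[{start}:{end}] {text[start:end]!r}")
```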

Intended Use & Limitations

This model is intended for fine-grained analysis of documents, academic integrity tools, and research into human-AI collaboration.

Limitations:

  • The model's performance may degrade on text generated by LLMs released after its training date (September 2025).
  • It is not infallible and may miss some AI-generated spans or incorrectly flag human-written parts.
  • The boundary predictions may not be perfectly precise in all cases.

Evaluation

The model was evaluated on the test split of the LLMTrace Detection dataset. The performance is measured using standard mean Average Precision (mAP) metrics for object detection, adapted for text spans.

| Metric             | Value  |
|--------------------|--------|
| mAP @ IoU=0.5      | 0.8976 |
| mAP @ IoU=0.5:0.95 | 0.7921 |
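
For reference, the IoU used here is the one-dimensional analogue of box IoU from object detection: the overlap between a predicted and a ground-truth character interval divided by their union. The sketch below (with hypothetical interval values) shows how a prediction is counted as correct at a given IoU threshold; mAP @ IoU=0.5:0.95 averages the AP over thresholds 0.5, 0.55, ..., 0.95.

```python
def span_iou(pred: tuple[int, int], gold: tuple[int, int]) -> float:
    """1-D intersection-over-union between two character intervals [start, end)."""
    intersection = max(0, min(pred[1], gold[1]) - max(pred[0], gold[0]))
    union = (pred[1] - pred[0]) + (gold[1] - gold[0]) - intersection
    return intersection / union if union > 0 else 0.0

# Hypothetical example: the predicted AI span is slightly shorter than the annotated one.
pred, gold = (100, 240), (90, 250)
iou = span_iou(pred, gold)
print(f"IoU = {iou:.3f}")   # 0.875

# At threshold 0.5 this prediction is a true positive; under the 0.5:0.95 setting
# it must also survive the stricter thresholds, up to 0.875 in this case.
print(iou >= 0.5)           # True
```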

Citation

If you use this model in your research, please cite our papers:

@article{Layer2025LLMTrace,
  title={{LLMTrace: A Corpus for Classification and Fine-Grained Localization of AI-Written Text}},
  author={Irina Tolstykh and Aleksandra Tsybina and Sergey Yakubson and Maksim Kuprashevich},
  journal={arXiv preprint arXiv:2509.21269},
  year={2025}
}

@article{tolstykh2024gigacheck,
  title={{GigaCheck: Detecting LLM-generated Content}},
  author={Irina Tolstykh and Aleksandra Tsybina and Sergey Yakubson and Aleksandr Gordeev and Vladimir Dokholyan and Maksim Kuprashevich},
  journal={arXiv preprint arXiv:2410.23728},
  year={2024}
}