Datasets:
Update README.md (#2)
- Update README.md (14633e2bfd869b8a79707f9a081de033de298c57)
Co-authored-by: C2C <[email protected]>
README.md
CHANGED
---
license: apache-2.0
size_categories:
- 1K<n<10K
task_categories:
- question-answering
- image-text-to-text
tags:
- chemistry
---

[Paper page](https://huggingface.co/papers/2511.16205)
[arXiv:2511.16205](https://arxiv.org/abs/2511.16205)

### Key Features

- **Olympic-Level Benchmark** - Challenging problems from IChO 2025 for advanced AI reasoning
- **Multimodal Symbolic Language** - Addresses chemistry's unique combination of text, formulas, and molecular structures
- **Two Novel Assessment Methods**:
  - **AER (Assessment-Equivalent Reformulation)** - Converts visual output requirements (e.g., drawing molecules) into computationally tractable formats (see the illustrative sketch after this list)
  - **SVE (Structured Visual Enhancement)** - Diagnostic mechanism to separate visual perception from core chemical reasoning capabilities
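
To make the AER idea concrete, here is a minimal sketch of how a reformulated answer could be auto-graded, assuming a visual task such as "draw the product" has been restated as "give the product's SMILES string". The use of RDKit and the exact matching rule are illustrative assumptions, not the evaluation code used for ChemO.

```python
# Illustrative sketch only: an AER-style check for a "draw the molecule" task
# reformulated as "give the SMILES string". RDKit usage and the scoring rule
# are assumptions for illustration, not the ChemO/ChemLabs implementation.
from rdkit import Chem

def smiles_match(predicted: str, reference: str) -> bool:
    """Return True if two SMILES strings denote the same molecule."""
    pred_mol = Chem.MolFromSmiles(predicted)
    ref_mol = Chem.MolFromSmiles(reference)
    if pred_mol is None or ref_mol is None:
        return False  # an unparsable answer counts as incorrect
    # Canonical SMILES compares structures rather than strings, so it ignores
    # how the answer was written (atom order, aromatic vs. Kekulé form).
    return Chem.MolToSmiles(pred_mol) == Chem.MolToSmiles(ref_mol)

# Example: benzene written two different ways still matches.
print(smiles_match("c1ccccc1", "C1=CC=CC=C1"))  # True
```

Canonical-SMILES equality is only one possible equivalence check; depending on the problem, stereochemistry or tautomers may call for stricter or looser handling.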
### What's Included

The current release includes:

- **Original Problems** - Complete problem sets with additional chapter markers for the Problems and Solutions sections (no other modifications to the original content)
- **Well-structured JSON Files** - Clean, organized data (see the loading sketch after this list) designed for:
  - **MLLM Benchmarking** - Olympic-level chemistry reasoning evaluation
  - **Multi-Agent System Testing** - Hierarchical agent collaboration assessment
  - **Multimodal Reasoning** - Text, formula, and molecular structure understanding
- **Original CDXML Files** - Coming soon (see the note below)
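
As a quick-start illustration, the snippet below shows one way to peek at a downloaded JSON problem file before building an evaluation harness. The file name and the assumption that it contains a list of problem records are hypothetical; consult the actual files in this repository for the real layout.

```python
# Minimal sketch: inspect one of the released JSON problem files before
# wiring it into an MLLM or multi-agent evaluation harness.
# "chemo_problems.json" and the list-of-records layout are hypothetical;
# check the dataset repository for the real file names and schema.
import json
from pathlib import Path

data_path = Path("chemo_problems.json")  # hypothetical local path
with data_path.open(encoding="utf-8") as f:
    problems = json.load(f)

print(f"Loaded {len(problems)} problem records")
print("Fields in the first record:", sorted(problems[0].keys()))
```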
### Data Source

All problems are sourced from **IChO 2025**: https://www.icho2025.ae/problems

### Note on CDXML Files

Due to compatibility issues across different ChemDraw versions, the CDXML files for molecular structures are not included in the initial v1.0 release. We are actively working to resolve these compatibility challenges and will supplement the dataset with CDXML files in a future update.

### Citation

If you use ChemO in your research, please cite our paper:

```bibtex
@article{qiang2025chemlabs,
  title={ChemLabs on ChemO: A Multi-Agent System for Multimodal Reasoning on IChO 2025},
  author={Qiang, Xu and Bai, Shengyuan and Chen, Leqing and Liu, Zijing and Li, Yu},
  journal={arXiv preprint arXiv:2511.16205},
  year={2025}
}
```

### State-of-the-Art Results

Our ChemLabs multi-agent system combined with SVE achieves **93.6/100** on ChemO, surpassing the estimated human gold-medal threshold and establishing a new benchmark in automated chemical problem-solving.

### Community

We appreciate your patience and look forward to your feedback as we continue to improve this resource for the community.