| Column | Type | Range / distinct values |
|---|---|---|
| model_id | string | 120 distinct values |
| model_name | string | 118 distinct values |
| author | string | 43 distinct values |
| created_at | date string | 2022-03-02 23:29:04 to 2025-11-01 12:45:39 |
| downloads | int64 | 0 to 11.4M |
| likes | int64 | 6 to 12.8k |
| library | string | 4 distinct values |
| tags | string | 106 distinct values |
| trending_score | int64 | 6 to 912 |
| trending_rank | int64 | 1 to 1k |
| architecture | string | 33 distinct values |
| model_type | string | 33 distinct values |
| num_parameters | float64, nullable | 268M to 65.7B |
| max_position_embeddings | float64, nullable | 2.05k to 10.2M |
| hidden_size | float64, nullable | 768 to 8.19k |
| num_attention_heads | float64, nullable | 12 to 128 |
| num_hidden_layers | float64, nullable | 16 to 92 |
| vocab_size | float64, nullable | 32k to 256k |
| primary_category | string | 5 distinct values |
| secondary_categories | string | 25 distinct values |
| task_types | string | 37 distinct values |
| language_support | string | 20 distinct values |
| use_cases | string | 39 distinct values |
| performance_metrics | string | 1 distinct value |
| a2ap_compatibility_score | float64 | 40 to 96 |
| merge_difficulty | string | 4 distinct values |
| evolution_potential | float64 | 0.4 to 0.96 |
| analysis_timestamp | date string | 2025-11-02 05:59:02 to 2025-11-02 06:09:35 |
| readme_summary | string | 96 distinct values |
| special_features | string | 43 distinct values |
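The records are easiest to work with as a DataFrame. Below is a minimal sketch, assuming pandas is available; it uses three rows copied verbatim from the table that follows, since the dataset's repository id is not given here and no download call can be shown.

```python
# Minimal sketch: inspecting a few rows of this model-catalog table with pandas.
# The three records are copied from the table below; loading the full dataset
# (e.g. with datasets.load_dataset) is omitted because the repo id is not stated here.
import pandas as pd

records = [
    {
        "model_id": "meta-llama/Llama-3.2-3B-Instruct",
        "num_parameters": None,  # config fields are null for this row
        "a2ap_compatibility_score": 40,
        "merge_difficulty": "Critical",
        "evolution_potential": 0.40,
    },
    {
        "model_id": "Qwen/Qwen3-30B-A3B-Instruct-2507",
        "num_parameters": 2_727_084_032,
        "a2ap_compatibility_score": 96,
        "merge_difficulty": "Easy",
        "evolution_potential": 0.96,
    },
    {
        "model_id": "unsloth/MiniMax-M2",
        "num_parameters": 7_635_861_504,
        "a2ap_compatibility_score": 91,
        "merge_difficulty": "Easy",
        "evolution_potential": 0.91,
    },
]

df = pd.DataFrame(records)

# Keep models that are marked easy to merge and score well on compatibility,
# then rank them by evolution potential.
easy_merges = df[(df["merge_difficulty"] == "Easy") & (df["a2ap_compatibility_score"] >= 85)]
print(easy_merges.sort_values("evolution_potential", ascending=False))
```

The same filter applied to the full table would surface the rows flattened below, one field per line in schema order and delimited by `|`.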
meta-llama/Llama-3.2-3B-Instruct
|
Llama-3.2-3B-Instruct
|
meta-llama
|
2024-09-18T15:19:20+00:00
| 1,924,026
| 1,789
|
transformers
|
['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'conversational', 'en', 'de', 'fr', 'it', 'pt', 'hi', 'es', 'th', 'arxiv:2204.05149', 'arxiv:2405.16406', 'license:llama3.2', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 11
| 301
|
Unknown
|
unknown
| null | null | null | null | null | null |
General Language Model
|
[]
|
['text-generation']
|
['English']
|
['General Purpose']
|
{}
| 40
|
Critical
| 0.4
|
2025-11-02T06:02:15.725359
|
No README available
|
[]
|
MiniMaxAI/MiniMax-M1-80k
|
MiniMax-M1-80k
|
MiniMaxAI
|
2025-06-13T08:21:14+00:00
| 262
| 685
|
transformers
|
['transformers', 'safetensors', 'minimax_m1', 'text-generation', 'vllm', 'conversational', 'custom_code', 'arxiv:2506.13585', 'license:apache-2.0', 'autotrain_compatible', 'region:us']
| 11
| 302
|
MiniMaxM1ForCausalLM
|
minimax_m1
| 37,467,979,776
| 10,240,000
| 6,144
| 64
| 80
| 200,064
|
Language Model
|
['RLHF', 'Merged']
|
['text-generation', 'question-answering', 'translation', 'summarization', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Production', 'Education', 'Creative Writing', 'Business']
|
{}
| 66
|
Medium
| 0.66
|
2025-11-02T06:02:16.314644
|
pipeline_tag: text-generation license: apache-2.0 library_name: transformers tags: - vllm <div align="center"> <svg width="60%" height="auto" viewBox="0 0 144 48" fill="none" xmlns="http://www.w3.org/...
|
['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned']
|
Qwen/Qwen3-30B-A3B-Instruct-2507
|
Qwen3-30B-A3B-Instruct-2507
|
Qwen
|
2025-07-28T07:31:27+00:00
| 1,212,976
| 638
|
transformers
|
['transformers', 'safetensors', 'qwen3_moe', 'text-generation', 'conversational', 'arxiv:2402.17463', 'arxiv:2407.02490', 'arxiv:2501.15383', 'arxiv:2404.06654', 'arxiv:2505.09388', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 11
| 303
|
Qwen3MoeForCausalLM
|
qwen3_moe
| 2,727,084,032
| 262,144
| 2,048
| 32
| 48
| 151,936
|
Language Model
|
['Specialized']
|
['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Production', 'Creative Writing']
|
{}
| 96
|
Easy
| 0.96
|
2025-11-02T06:02:16.520393
|
library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-30B-A3B-Instruct-2507/blob/main/LICENSE pipeline_tag: text-generation <a href="https://chat.qwen.ai/?mode...
|
['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned']
|
unsloth/gpt-oss-20b-GGUF
|
gpt-oss-20b-GGUF
|
unsloth
|
2025-08-05T17:12:17+00:00
| 211,159
| 451
|
transformers
|
['transformers', 'gguf', 'gpt_oss', 'text-generation', 'openai', 'unsloth', 'base_model:openai/gpt-oss-20b', 'base_model:quantized:openai/gpt-oss-20b', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us', 'conversational']
| 11
| 304
|
GptOssForCausalLM
|
gpt_oss
| 2,967,920,640
| 131,072
| 2,880
| 64
| 24
| 201,088
|
Language Model
|
['Fine-tuned', 'Quantized', 'Specialized']
|
['text-generation', 'reasoning', 'conversation']
|
['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Research', 'Production', 'Business']
|
{}
| 85
|
Easy
| 0.85
|
2025-11-02T06:02:16.879061
|
base_model: - openai/gpt-oss-20b license: apache-2.0 pipeline_tag: text-generation library_name: transformers tags: - openai - unsloth > [!NOTE] > GGUF uploads with our fixes. More details and [Read o...
|
['Function Calling', 'Fast Inference', 'Multi-turn']
|
Qwen/Qwen3-Next-80B-A3B-Instruct
|
Qwen3-Next-80B-A3B-Instruct
|
Qwen
|
2025-09-09T15:40:56+00:00
| 1,663,221
| 845
|
transformers
|
['transformers', 'safetensors', 'qwen3_next', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2404.06654', 'arxiv:2505.09388', 'arxiv:2501.15383', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 11
| 305
|
Qwen3NextForCausalLM
|
qwen3_next
| 2,727,084,032
| 262,144
| 2,048
| 16
| 48
| 151,936
|
Language Model
|
['Merged']
|
['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Research', 'Production', 'Creative Writing']
|
{}
| 86
|
Easy
| 0.86
|
2025-11-02T06:02:17.030640
|
library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-Next-80B-A3B-Instruct/blob/main/LICENSE pipeline_tag: text-generation <a href="https://chat.qwen.ai/" tar...
|
['Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned']
|
inclusionAI/Ring-1T
|
Ring-1T
|
inclusionAI
|
2025-10-10T16:39:04+00:00
| 1,807
| 216
|
transformers
|
['transformers', 'safetensors', 'bailing_moe', 'text-generation', 'conversational', 'custom_code', 'arxiv:2510.18855', 'license:mit', 'autotrain_compatible', 'region:us']
| 11
| 306
|
BailingMoeV2ForCausalLM
|
bailing_moe
| 65,712,160,768
| 65,536
| 8,192
| 64
| 80
| 157,184
|
Language Model
|
['RLHF']
|
['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Research', 'Production', 'Education', 'Creative Writing', 'Business', 'Healthcare']
|
{}
| 63
|
Hard
| 0.63
|
2025-11-02T06:02:17.310285
|
pipeline_tag: text-generation license: mit library_name: transformers <p align="center"> <img src="https://mdn.alipayobjects.com/huamei_qa8qxu/afts/img/A*4QxcQrBlTiAAAAAAQXAAAAgAemJ7AQ/original" width...
|
['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient']
|
cerebras/GLM-4.6-REAP-218B-A32B-FP8
|
GLM-4.6-REAP-218B-A32B-FP8
|
cerebras
|
2025-10-23T20:29:59+00:00
| 757
| 34
|
transformers
|
['transformers', 'safetensors', 'glm4_moe', 'text-generation', 'glm', 'MOE', 'pruning', 'compression', 'conversational', 'en', 'arxiv:2510.13999', 'base_model:zai-org/GLM-4.6-FP8', 'base_model:quantized:zai-org/GLM-4.6-FP8', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'compressed-tensors', 'region:us']
| 11
| 307
|
Glm4MoeForCausalLM
|
glm4_moe
| 29,716,643,840
| 202,752
| 5,120
| 96
| 92
| 151,552
|
Language Model
|
['Specialized']
|
['text-generation', 'question-answering', 'code-generation', 'reasoning']
|
['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Research', 'Production', 'Creative Writing']
|
{}
| 60
|
Hard
| 0.6
|
2025-11-02T06:02:17.586305
|
language: - en library_name: transformers tags: - glm - MOE - pruning - compression license: mit name: cerebras/GLM-4.6-REAP-218B-A32B-FP8 description: > This model was obtained by uniformly pruning 4...
|
['Function Calling', 'Fast Inference', 'Memory Efficient', 'Multi-turn']
|
internlm/JanusCoder-8B
|
JanusCoder-8B
|
internlm
|
2025-10-27T09:33:54+00:00
| 139
| 11
|
transformers
|
['transformers', 'safetensors', 'qwen3', 'text-generation', 'image-text-to-text', 'conversational', 'arxiv:2510.23538', 'arxiv:2403.14734', 'arxiv:2510.09724', 'arxiv:2507.22080', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 11
| 308
|
Qwen3ForCausalLM
|
qwen3
| 7,870,087,168
| 32,768
| 4,096
| 32
| 36
| 151,936
|
Language Model
|
['Specialized']
|
['text-generation', 'code-generation', 'conversation']
|
['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Research']
|
{}
| 68
|
Medium
| 0.68
|
2025-11-02T06:02:17.742509
|
license: apache-2.0 pipeline_tag: image-text-to-text library_name: transformers [💻Github Repo](https://github.com/InternLM/JanusCoder) • [🤗Model Collections](https://huggingface.co/collections/internl...
|
['Safety Aligned']
|
unsloth/MiniMax-M2
|
MiniMax-M2
|
unsloth
|
2025-10-28T12:05:08+00:00
| 264
| 11
|
transformers
|
['transformers', 'safetensors', 'minimax', 'text-generation', 'conversational', 'arxiv:2504.07164', 'arxiv:2509.06501', 'arxiv:2509.13160', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'fp8', 'region:us']
| 11
| 309
|
MiniMaxM2ForCausalLM
|
minimax
| 7,635,861,504
| 196,608
| 3,072
| 48
| 62
| 200,064
|
Language Model
|
['LoRA']
|
['text-generation', 'question-answering', 'code-generation', 'conversation']
|
['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Research', 'Production']
|
{}
| 91
|
Easy
| 0.91
|
2025-11-02T06:02:17.948843
|
pipeline_tag: text-generation license: mit library_name: transformers <div align="center"> <svg width="60%" height="auto" viewBox="0 0 144 48" fill="none" xmlns="http://www.w3.org/2000/svg"> <path d="...
|
['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned']
|
ByteDance/Ouro-1.4B-Thinking
|
Ouro-1.4B-Thinking
|
ByteDance
|
2025-10-28T22:14:40+00:00
| 34
| 11
|
transformers
|
['transformers', 'safetensors', 'ouro', 'text-generation', 'looped-language-model', 'reasoning', 'recurrent-depth', 'thinking', 'chain-of-thought', 'conversational', 'custom_code', 'arxiv:2510.25741', 'license:apache-2.0', 'autotrain_compatible', 'region:us']
| 11
| 310
|
OuroForCausalLM
|
ouro
| 1,308,622,848
| 65,536
| 2,048
| 16
| 24
| 49,152
|
Language Model
|
['Specialized']
|
['text-generation', 'reasoning', 'conversation']
|
['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Research', 'Production', 'Creative Writing']
|
{}
| 83
|
Medium
| 0.83
|
2025-11-02T06:02:18.140222
|
library_name: transformers license: apache-2.0 pipeline_tag: text-generation tags: - looped-language-model - reasoning - recurrent-depth - thinking - chain-of-thought  **⚠...
|
['RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient']
|
mistralai/Mistral-7B-Instruct-v0.2
|
Mistral-7B-Instruct-v0.2
|
mistralai
|
2023-12-11T13:18:44+00:00
| 2,963,552
| 2,997
|
transformers
|
['transformers', 'pytorch', 'safetensors', 'mistral', 'text-generation', 'finetuned', 'mistral-common', 'conversational', 'arxiv:2310.06825', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'region:us']
| 10
| 311
|
MistralForCausalLM
|
mistral
| 6,573,522,944
| 32,768
| 4,096
| 32
| 32
| 32,000
|
Language Model
|
['Fine-tuned']
|
['text-generation', 'conversation']
|
['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Production', 'Education']
|
{}
| 78
|
Medium
| 0.78
|
2025-11-02T06:02:20.334604
|
library_name: transformers license: apache-2.0 tags: - finetuned - mistral-common new_version: mistralai/Mistral-7B-Instruct-v0.3 inference: false widget: - messages: - role: user content: What is you...
|
['RAG Support', 'Long Context']
|
fdtn-ai/Foundation-Sec-8B
|
Foundation-Sec-8B
|
fdtn-ai
|
2025-04-26T17:20:37+00:00
| 8,906
| 265
|
transformers
|
['transformers', 'safetensors', 'llama', 'text-generation', 'security', 'en', 'arxiv:2504.21039', 'base_model:meta-llama/Llama-3.1-8B', 'base_model:finetune:meta-llama/Llama-3.1-8B', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 10
| 312
|
LlamaForCausalLM
|
llama
| 6,968,311,808
| 131,072
| 4,096
| 32
| 32
| 128,384
|
Language Model
|
['Specialized']
|
['text-generation', 'question-answering', 'text-classification', 'summarization', 'reasoning']
|
['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Research', 'Production', 'Healthcare', 'Legal']
|
{}
| 78
|
Medium
| 0.78
|
2025-11-02T06:02:20.541840
|
base_model: - meta-llama/Llama-3.1-8B language: - en library_name: transformers license: apache-2.0 pipeline_tag: text-generation tags: - security Foundation-Sec-8B (Llama-3.1-FoundationAI-SecurityLLM...
|
['RAG Support', 'Fast Inference', 'Memory Efficient', 'Safety Aligned']
|
katanemo/Arch-Router-1.5B
|
Arch-Router-1.5B
|
katanemo
|
2025-05-30T18:16:23+00:00
| 3,508
| 216
|
transformers
|
['transformers', 'safetensors', 'qwen2', 'text-generation', 'routing', 'preference', 'arxiv:2506.16655', 'llm', 'conversational', 'en', 'base_model:Qwen/Qwen2.5-1.5B-Instruct', 'base_model:finetune:Qwen/Qwen2.5-1.5B-Instruct', 'license:other', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 10
| 313
|
Qwen2ForCausalLM
|
qwen2
| 1,026,097,152
| 32,768
| 1,536
| 12
| 28
| 151,936
|
Language Model
|
['Specialized']
|
['text-generation', 'question-answering', 'translation', 'summarization', 'code-generation', 'conversation']
|
['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Research', 'Production', 'Healthcare', 'Legal']
|
{}
| 83
|
Medium
| 0.83
|
2025-11-02T06:02:20.787630
|
base_model: - Qwen/Qwen2.5-1.5B-Instruct language: - en library_name: transformers license: other license_name: katanemo-research license_link: https://huggingface.co/katanemo/Arch-Router-1.5B/blob/ma...
|
['Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned']
|
dphn/Dolphin-Mistral-24B-Venice-Edition
|
Dolphin-Mistral-24B-Venice-Edition
|
dphn
|
2025-06-12T05:29:16+00:00
| 7,206
| 279
|
transformers
|
['transformers', 'safetensors', 'mistral', 'text-generation', 'conversational', 'base_model:mistralai/Mistral-Small-24B-Instruct-2501', 'base_model:finetune:mistralai/Mistral-Small-24B-Instruct-2501', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 10
| 314
|
MistralForCausalLM
|
mistral
| 13,254,000,640
| 32,768
| 5,120
| 32
| 40
| 131,072
|
Language Model
|
['Specialized']
|
['text-generation', 'conversation']
|
['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Production', 'Business', 'Legal']
|
{}
| 63
|
Hard
| 0.63
|
2025-11-02T06:02:20.933638
|
license: apache-2.0 base_model: - mistralai/Mistral-Small-24B-Instruct-2501 pipeline_tag: text-generation library_name: transformers Website: https://dphn.ai Twitter: https://x.com/dphnAI Web Chat: ht...
|
['Multi-turn', 'Safety Aligned']
|
LiquidAI/LFM2-350M-Math
|
LFM2-350M-Math
|
LiquidAI
|
2025-08-25T17:06:02+00:00
| 1,296
| 49
|
transformers
|
['transformers', 'safetensors', 'lfm2', 'text-generation', 'liquid', 'edge', 'conversational', 'en', 'base_model:LiquidAI/LFM2-350M', 'base_model:finetune:LiquidAI/LFM2-350M', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 10
| 315
|
Lfm2ForCausalLM
|
lfm2
| 268,435,456
| 128,000
| 1,024
| 16
| 16
| 65,536
|
Chat/Instruct
|
['RLHF', 'LoRA']
|
['text-generation', 'translation', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Research', 'Production', 'Education']
|
{}
| 70
|
Medium
| 0.7
|
2025-11-02T06:02:21.114830
|
library_name: transformers license: other license_name: lfm1.0 license_link: LICENSE language: - en pipeline_tag: text-generation tags: - liquid - lfm2 - edge base_model: LiquidAI/LFM2-350M <center> <...
|
['RAG Support', 'Multi-turn']
|
facebook/MobileLLM-Pro
|
MobileLLM-Pro
|
facebook
|
2025-09-10T18:40:06+00:00
| 4,486
| 138
|
transformers
|
['transformers', 'safetensors', 'llama4_text', 'text-generation', 'facebook', 'meta', 'pytorch', 'conversational', 'custom_code', 'en', 'base_model:facebook/MobileLLM-Pro-base', 'base_model:finetune:facebook/MobileLLM-Pro-base', 'license:fair-noncommercial-research-license', 'autotrain_compatible', 'region:us']
| 10
| 316
|
Unknown
|
unknown
| null | null | null | null | null | null |
General Language Model
|
[]
|
['text-generation']
|
['English']
|
['General Purpose']
|
{}
| 40
|
Critical
| 0.4
|
2025-11-02T06:02:22.274794
|
No README available
|
[]
|
nineninesix/kani-tts-370m
|
kani-tts-370m
|
nineninesix
|
2025-09-30T07:31:22+00:00
| 8,444
| 136
|
transformers
|
['transformers', 'safetensors', 'lfm2', 'text-generation', 'text-to-speech', 'en', 'de', 'ar', 'zh', 'es', 'ko', 'arxiv:2505.20506', 'base_model:nineninesix/kani-tts-450m-0.2-pt', 'base_model:finetune:nineninesix/kani-tts-450m-0.2-pt', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 10
| 317
|
Lfm2ForCausalLM
|
lfm2
| 283,798,528
| 128,000
| 1,024
| 16
| 16
| 80,539
|
Language Model
|
['Fine-tuned', 'Specialized']
|
['conversation']
|
['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Research', 'Production', 'Education', 'Legal']
|
{}
| 75
|
Medium
| 0.75
|
2025-11-02T06:02:22.404526
|
license: apache-2.0 language: - en - de - ar - zh - es - ko pipeline_tag: text-to-speech library_name: transformers base_model: - nineninesix/kani-tts-450m-0.2-pt <p> <img src="https://www.nineninesix...
|
['Fast Inference', 'Memory Efficient', 'Multi-turn']
|
vandijklab/C2S-Scale-Gemma-2-27B
|
C2S-Scale-Gemma-2-27B
|
vandijklab
|
2025-10-06T20:22:30+00:00
| 9,715
| 137
|
transformers
|
['transformers', 'safetensors', 'gemma2', 'text-generation', 'biology', 'scRNAseq', 'genomics', 'computational-biology', 'bioinformatics', 'gene-expression', 'cell-biology', 'pytorch', 'cell-type-annotation', 'Question Answering', 'en', 'base_model:google/gemma-2-27b', 'base_model:finetune:google/gemma-2-27b', 'license:cc-by-4.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 10
| 318
|
Gemma2ForCausalLM
|
gemma2
| 12,900,630,528
| 8,192
| 4,608
| 32
| 46
| 256,000
|
Language Model
|
['Fine-tuned', 'RLHF', 'Specialized']
|
['text-generation', 'question-answering', 'text-classification', 'reasoning']
|
['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Research', 'Education', 'Legal']
|
{}
| 60
|
Hard
| 0.6
|
2025-11-02T06:02:22.561589
|
license: cc-by-4.0 language: - en base_model: google/gemma-2-27b library_name: transformers pipeline_tag: text-generation tags: - biology - scRNAseq - gemma2 - genomics - computational-biology - bioin...
|
['RAG Support', 'Fast Inference']
|
openai-community/gpt2
|
gpt2
|
openai-community
|
2022-03-02T23:29:04+00:00
| 11,402,243
| 3,004
|
transformers
|
['transformers', 'pytorch', 'tf', 'jax', 'tflite', 'rust', 'onnx', 'safetensors', 'gpt2', 'text-generation', 'exbert', 'en', 'doi:10.57967/hf/0039', 'license:mit', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 9
| 319
|
GPT2LMHeadModel
|
gpt2
| null | null | null | null | null | 50,257
|
Language Model
|
['Fine-tuned']
|
['text-generation', 'code-generation']
|
['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['General Purpose']
|
{}
| 40
|
Critical
| 0.4
|
2025-11-02T06:02:22.730271
|
language: en tags: - exbert license: mit Test the whole generation capabilities here: https://transformer.huggingface.co/doc/gpt2-large Pretrained model on English language using a causal language mod...
|
[]
|
google/gemma-7b
|
gemma-7b
|
google
|
2024-02-08T22:36:43+00:00
| 31,556
| 3,228
|
transformers
|
['transformers', 'safetensors', 'gguf', 'gemma', 'text-generation', 'arxiv:2305.14314', 'arxiv:2312.11805', 'arxiv:2009.03300', 'arxiv:1905.07830', 'arxiv:1911.11641', 'arxiv:1904.09728', 'arxiv:1905.10044', 'arxiv:1907.10641', 'arxiv:1811.00937', 'arxiv:1809.02789', 'arxiv:1911.01547', 'arxiv:1705.03551', 'arxiv:2107.03374', 'arxiv:2108.07732', 'arxiv:2110.14168', 'arxiv:2304.06364', 'arxiv:2206.04615', 'arxiv:1804.06876', 'arxiv:2110.08193', 'arxiv:2009.11462', 'arxiv:2101.11718', 'arxiv:1804.09301', 'arxiv:2109.07958', 'arxiv:2203.09509', 'license:gemma', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 9
| 320
|
Unknown
|
unknown
| null | null | null | null | null | null |
General Language Model
|
[]
|
['text-generation']
|
['English']
|
['General Purpose']
|
{}
| 40
|
Critical
| 0.4
|
2025-11-02T06:02:23.929381
|
No README available
|
[]
|
meta-llama/Llama-3.3-70B-Instruct
|
Llama-3.3-70B-Instruct
|
meta-llama
|
2024-11-26T16:08:47+00:00
| 764,118
| 2,549
|
transformers
|
['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'conversational', 'en', 'fr', 'it', 'pt', 'hi', 'es', 'th', 'de', 'arxiv:2204.05149', 'base_model:meta-llama/Llama-3.1-70B', 'base_model:finetune:meta-llama/Llama-3.1-70B', 'license:llama3.3', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 9
| 321
|
Unknown
|
unknown
| null | null | null | null | null | null |
General Language Model
|
[]
|
['text-generation']
|
['English']
|
['General Purpose']
|
{}
| 40
|
Critical
| 0.4
|
2025-11-02T06:02:27.147707
|
No README available
|
[]
|
Qwen/Qwen3-1.7B
|
Qwen3-1.7B
|
Qwen
|
2025-04-27T03:41:05+00:00
| 1,284,936
| 305
|
transformers
|
['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'arxiv:2505.09388', 'base_model:Qwen/Qwen3-1.7B-Base', 'base_model:finetune:Qwen/Qwen3-1.7B-Base', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 9
| 322
|
Qwen3ForCausalLM
|
qwen3
| 1,720,451,072
| 40,960
| 2,048
| 16
| 28
| 151,936
|
Language Model
|
[]
|
['text-generation', 'question-answering', 'translation', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Production', 'Creative Writing']
|
{}
| 78
|
Medium
| 0.78
|
2025-11-02T06:02:27.470281
|
library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-1.7B/blob/main/LICENSE pipeline_tag: text-generation base_model: - Qwen/Qwen3-1.7B-Base <a href="https://...
|
['Fast Inference', 'Multi-turn', 'Safety Aligned']
|
LiquidAI/LFM2-350M
|
LFM2-350M
|
LiquidAI
|
2025-07-10T12:01:24+00:00
| 26,822
| 170
|
transformers
|
['transformers', 'safetensors', 'lfm2', 'text-generation', 'liquid', 'edge', 'conversational', 'en', 'ar', 'zh', 'fr', 'de', 'ja', 'ko', 'es', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 9
| 323
|
Lfm2ForCausalLM
|
lfm2
| 268,435,456
| 128,000
| 1,024
| 16
| 16
| 65,536
|
Language Model
|
['LoRA']
|
['text-generation', 'question-answering', 'translation', 'code-generation', 'conversation']
|
['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Research', 'Production', 'Creative Writing', 'Healthcare']
|
{}
| 88
|
Easy
| 0.88
|
2025-11-02T06:02:29.375529
|
library_name: transformers license: other license_name: lfm1.0 license_link: LICENSE language: - en - ar - zh - fr - de - ja - ko - es pipeline_tag: text-generation tags: - liquid - lfm2 - edge <cente...
|
['Function Calling', 'RAG Support', 'Fast Inference', 'Multi-turn', 'Safety Aligned']
|
moonshotai/Kimi-K2-Instruct
|
Kimi-K2-Instruct
|
moonshotai
|
2025-07-11T00:55:12+00:00
| 81,900
| 2,198
|
transformers
|
['transformers', 'safetensors', 'kimi_k2', 'text-generation', 'conversational', 'custom_code', 'doi:10.57967/hf/5976', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'fp8', 'region:us']
| 9
| 324
|
DeepseekV3ForCausalLM
|
kimi_k2
| 38,784,729,088
| 131,072
| 7,168
| 64
| 61
| 163,840
|
Language Model
|
[]
|
['question-answering', 'summarization', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Research', 'Production']
|
{}
| 63
|
Hard
| 0.63
|
2025-11-02T06:02:29.591562
|
license: other license_name: modified-mit library_name: transformers new_version: moonshotai/Kimi-K2-Instruct-0905 <div align="center"> <picture> <img src="figures/kimi-logo.png" width="30%" alt="Kimi...
|
['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient', 'Multi-turn']
|
google/gemma-3-270m-it
|
gemma-3-270m-it
|
google
|
2025-07-30T18:06:27+00:00
| 206,547
| 448
|
transformers
|
['transformers', 'safetensors', 'gemma3_text', 'text-generation', 'gemma3', 'gemma', 'google', 'conversational', 'arxiv:2503.19786', 'arxiv:1905.07830', 'arxiv:1905.10044', 'arxiv:1911.11641', 'arxiv:1705.03551', 'arxiv:1911.01547', 'arxiv:1907.10641', 'arxiv:2311.07911', 'arxiv:2311.12022', 'arxiv:2411.04368', 'arxiv:1904.09728', 'arxiv:1903.00161', 'arxiv:2009.03300', 'arxiv:2304.06364', 'arxiv:2103.03874', 'arxiv:2110.14168', 'arxiv:2108.07732', 'arxiv:2107.03374', 'arxiv:2403.07974', 'arxiv:2305.03111', 'arxiv:2405.04520', 'arxiv:2210.03057', 'arxiv:2106.03193', 'arxiv:1910.11856', 'arxiv:2502.12404', 'arxiv:2502.21228', 'arxiv:2404.16816', 'arxiv:2104.12756', 'arxiv:2311.16502', 'arxiv:2203.10244', 'arxiv:2404.12390', 'arxiv:1810.12440', 'arxiv:1908.02660', 'arxiv:2310.02255', 'arxiv:2312.11805', 'base_model:google/gemma-3-270m', 'base_model:finetune:google/gemma-3-270m', 'license:gemma', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 9
| 325
|
Unknown
|
unknown
| null | null | null | null | null | null |
General Language Model
|
[]
|
['text-generation']
|
['English']
|
['General Purpose']
|
{}
| 40
|
Critical
| 0.4
|
2025-11-02T06:02:30.758737
|
No README available
|
[]
|
nvidia/NVIDIA-Nemotron-Nano-9B-v2
|
NVIDIA-Nemotron-Nano-9B-v2
|
nvidia
|
2025-08-12T22:43:32+00:00
| 190,478
| 419
|
transformers
|
['transformers', 'safetensors', 'nvidia', 'pytorch', 'text-generation', 'conversational', 'en', 'es', 'fr', 'de', 'it', 'ja', 'dataset:nvidia/Nemotron-Post-Training-Dataset-v1', 'dataset:nvidia/Nemotron-Post-Training-Dataset-v2', 'dataset:nvidia/Nemotron-Pretraining-Dataset-sample', 'dataset:nvidia/Nemotron-CC-v2', 'dataset:nvidia/Nemotron-CC-Math-v1', 'dataset:nvidia/Nemotron-Pretraining-SFT-v1', 'arxiv:2504.03624', 'arxiv:2508.14444', 'arxiv:2412.02595', 'base_model:nvidia/NVIDIA-Nemotron-Nano-12B-v2', 'base_model:finetune:nvidia/NVIDIA-Nemotron-Nano-12B-v2', 'license:other', 'endpoints_compatible', 'region:us']
| 9
| 326
|
NemotronHForCausalLM
|
nemotron_h
| 14,074,511,360
| 131,072
| 4,480
| 40
| 56
| 131,072
|
Language Model
|
['Quantized', 'Merged', 'Specialized']
|
['text-generation', 'question-answering', 'translation', 'summarization', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Research', 'Production', 'Education', 'Business', 'Legal', 'Finance']
|
{}
| 71
|
Medium
| 0.71
|
2025-11-02T06:02:31.286170
|
license: other license_name: nvidia-open-model-license license_link: >- https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license/ pipeline_tag: text-generation datasets: -...
|
['RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned']
|
LiquidAI/LFM2-8B-A1B
|
LFM2-8B-A1B
|
LiquidAI
|
2025-10-07T13:55:39+00:00
| 13,348
| 231
|
transformers
|
['transformers', 'safetensors', 'lfm2_moe', 'text-generation', 'liquid', 'lfm2', 'edge', 'moe', 'conversational', 'en', 'ar', 'zh', 'fr', 'de', 'ja', 'ko', 'es', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 9
| 327
|
Lfm2MoeForCausalLM
|
lfm2_moe
| 1,342,177,280
| 128,000
| 2,048
| 32
| 24
| 65,536
|
Language Model
|
['Quantized', 'LoRA']
|
['text-generation', 'question-answering', 'translation', 'code-generation', 'conversation']
|
['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Research', 'Production', 'Creative Writing', 'Healthcare']
|
{}
| 88
|
Easy
| 0.88
|
2025-11-02T06:02:31.730781
|
library_name: transformers license: other license_name: lfm1.0 license_link: LICENSE language: - en - ar - zh - fr - de - ja - ko - es pipeline_tag: text-generation tags: - liquid - lfm2 - edge - moe ...
|
['Function Calling', 'RAG Support', 'Fast Inference', 'Multi-turn', 'Safety Aligned']
|
cerebras/Qwen3-Coder-REAP-25B-A3B
|
Qwen3-Coder-REAP-25B-A3B
|
cerebras
|
2025-10-20T15:40:03+00:00
| 971
| 32
|
transformers
|
['transformers', 'qwen3_moe', 'text-generation', 'qwen-coder', 'MOE', 'pruning', 'compression', 'conversational', 'en', 'arxiv:2510.13999', 'base_model:Qwen/Qwen3-Coder-30B-A3B-Instruct', 'base_model:finetune:Qwen/Qwen3-Coder-30B-A3B-Instruct', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 9
| 328
|
Qwen3MoeForCausalLM
|
qwen3_moe
| 2,727,084,032
| 262,144
| 2,048
| 32
| 48
| 151,936
|
Language Model
|
['Specialized']
|
['text-generation', 'question-answering', 'code-generation', 'reasoning']
|
['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Research', 'Production', 'Creative Writing']
|
{}
| 85
|
Easy
| 0.85
|
2025-11-02T06:02:32.037782
|
language: - en library_name: transformers tags: - qwen-coder - MOE - pruning - compression license: apache-2.0 name: cerebras/Qwen3-Coder-REAP-25B-A3B description: > This model was obtained by uniform...
|
['Function Calling', 'Fast Inference', 'Memory Efficient', 'Multi-turn']
|
meta-llama/Llama-3.2-1B-Instruct
|
Llama-3.2-1B-Instruct
|
meta-llama
|
2024-09-18T15:12:47+00:00
| 3,843,095
| 1,141
|
transformers
|
['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'conversational', 'en', 'de', 'fr', 'it', 'pt', 'hi', 'es', 'th', 'arxiv:2204.05149', 'arxiv:2405.16406', 'license:llama3.2', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 8
| 329
|
Unknown
|
unknown
| null | null | null | null | null | null |
General Language Model
|
[]
|
['text-generation']
|
['English']
|
['General Purpose']
|
{}
| 40
|
Critical
| 0.4
|
2025-11-02T06:02:33.286959
|
No README available
|
[]
|
microsoft/Phi-4-mini-instruct
|
Phi-4-mini-instruct
|
microsoft
|
2025-02-19T01:00:58+00:00
| 249,315
| 623
|
transformers
|
['transformers', 'safetensors', 'phi3', 'text-generation', 'nlp', 'code', 'conversational', 'custom_code', 'multilingual', 'ar', 'zh', 'cs', 'da', 'nl', 'en', 'fi', 'fr', 'de', 'he', 'hu', 'it', 'ja', 'ko', 'no', 'pl', 'pt', 'ru', 'es', 'sv', 'th', 'tr', 'uk', 'arxiv:2503.01743', 'license:mit', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 8
| 330
|
Phi3ForCausalLM
|
phi3
| 4,238,475,264
| 131,072
| 3,072
| 24
| 32
| 200,064
|
Language Model
|
['Specialized']
|
['text-generation', 'question-answering', 'summarization', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Research', 'Production', 'Education', 'Business', 'Healthcare', 'Legal']
|
{}
| 86
|
Easy
| 0.86
|
2025-11-02T06:02:33.514613
|
language: - multilingual - ar - zh - cs - da - nl - en - fi - fr - de - he - hu - it - ja - ko - 'no' - pl - pt - ru - es - sv - th - tr - uk library_name: transformers license: mit license_link: http...
|
['Function Calling', 'RAG Support', 'Long Context', 'Multi-turn', 'Safety Aligned']
|
Qwen/Qwen3-30B-A3B
|
Qwen3-30B-A3B
|
Qwen
|
2025-04-27T03:43:05+00:00
| 421,099
| 808
|
transformers
|
['transformers', 'safetensors', 'qwen3_moe', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2505.09388', 'base_model:Qwen/Qwen3-30B-A3B-Base', 'base_model:finetune:Qwen/Qwen3-30B-A3B-Base', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 8
| 331
|
Qwen3MoeForCausalLM
|
qwen3_moe
| 2,727,084,032
| 40,960
| 2,048
| 32
| 48
| 151,936
|
Language Model
|
[]
|
['text-generation', 'question-answering', 'translation', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Production', 'Creative Writing']
|
{}
| 86
|
Easy
| 0.86
|
2025-11-02T06:02:35.665325
|
library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-30B-A3B/blob/main/LICENSE pipeline_tag: text-generation base_model: - Qwen/Qwen3-30B-A3B-Base <a href="ht...
|
['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned']
|
Qwen/Qwen3-Embedding-8B
|
Qwen3-Embedding-8B
|
Qwen
|
2025-06-03T14:39:10+00:00
| 751,756
| 419
|
sentence-transformers
|
['sentence-transformers', 'safetensors', 'qwen3', 'text-generation', 'transformers', 'sentence-similarity', 'feature-extraction', 'text-embeddings-inference', 'arxiv:2506.05176', 'base_model:Qwen/Qwen3-8B-Base', 'base_model:finetune:Qwen/Qwen3-8B-Base', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 8
| 332
|
Qwen3ForCausalLM
|
qwen3
| 7,868,977,152
| 40,960
| 4,096
| 32
| 36
| 151,665
|
Language Model
|
[]
|
['text-classification', 'code-generation', 'reasoning']
|
['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['General Purpose']
|
{}
| 76
|
Medium
| 0.76
|
2025-11-02T06:02:35.814612
|
license: apache-2.0 base_model: - Qwen/Qwen3-8B-Base tags: - transformers - sentence-transformers - sentence-similarity - feature-extraction - text-embeddings-inference <p align="center"> <img src="ht...
|
['Long Context', 'Safety Aligned']
|
unsloth/Qwen3-Coder-30B-A3B-Instruct-GGUF
|
Qwen3-Coder-30B-A3B-Instruct-GGUF
|
unsloth
|
2025-07-31T10:27:38+00:00
| 128,072
| 304
|
transformers
|
['transformers', 'gguf', 'unsloth', 'qwen3', 'qwen', 'text-generation', 'arxiv:2505.09388', 'base_model:Qwen/Qwen3-Coder-30B-A3B-Instruct', 'base_model:quantized:Qwen/Qwen3-Coder-30B-A3B-Instruct', 'license:apache-2.0', 'endpoints_compatible', 'region:us', 'imatrix', 'conversational']
| 8
| 333
|
Unknown
|
unknown
| null | null | null | null | null | null |
Language Model
|
[]
|
['text-generation', 'question-answering', 'code-generation', 'conversation']
|
['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Research']
|
{}
| 48
|
Hard
| 0.48
|
2025-11-02T06:02:37.112658
|
tags: - unsloth - qwen3 - qwen base_model: - Qwen/Qwen3-Coder-30B-A3B-Instruct library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-Coder-30B-A3B-Instruct/blo...
|
['Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned']
|
tencent/DeepSeek-V3.1-Terminus-W4AFP8
|
DeepSeek-V3.1-Terminus-W4AFP8
|
tencent
|
2025-10-28T03:13:16+00:00
| 25
| 8
|
transformers
|
['transformers', 'safetensors', 'deepseek_v3', 'text-generation', 'quantized', 'TensorRT-Model-Optimizer', 'int4', 'fp8', 'conversational', 'custom_code', 'base_model:deepseek-ai/DeepSeek-V3.1-Terminus', 'base_model:finetune:deepseek-ai/DeepSeek-V3.1-Terminus', 'license:mit', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', '8-bit', 'region:us']
| 8
| 334
|
DeepseekV3ForCausalLM
|
deepseek_v3
| 38,537,003,008
| 163,840
| 7,168
| 128
| 61
| 129,280
|
Code Generation
|
['Quantized', 'Merged']
|
['question-answering']
|
['English', 'Spanish', 'German', 'Russian', 'Arabic']
|
['General Purpose']
|
{}
| 40
|
Critical
| 0.4
|
2025-11-02T06:02:37.292979
|
license: mit library_name: transformers base_model: - deepseek-ai/DeepSeek-V3.1-Terminus tags: - quantized - TensorRT-Model-Optimizer - int4 - fp8 This model is a mixed-precision quantized version of ...
|
[]
|
unsloth/MiniMax-M2-GGUF
|
MiniMax-M2-GGUF
|
unsloth
|
2025-11-01T12:45:39+00:00
| 0
| 8
|
transformers
|
['transformers', 'gguf', 'text-generation', 'arxiv:2504.07164', 'arxiv:2509.06501', 'arxiv:2509.13160', 'base_model:MiniMaxAI/MiniMax-M2', 'base_model:quantized:MiniMaxAI/MiniMax-M2', 'license:mit', 'endpoints_compatible', 'region:us', 'imatrix', 'conversational']
| 8
| 335
|
Unknown
|
unknown
| null | null | null | null | null | null |
Language Model
|
['LoRA']
|
['text-generation', 'question-answering', 'code-generation', 'conversation']
|
['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Research', 'Production']
|
{}
| 66
|
Medium
| 0.66
|
2025-11-02T06:02:38.492234
|
pipeline_tag: text-generation license: mit library_name: transformers base_model: - MiniMaxAI/MiniMax-M2 <div align="center"> <svg width="60%" height="auto" viewBox="0 0 144 48" fill="none" xmlns="htt...
|
['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned']
|
TinyLlama/TinyLlama-1.1B-Chat-v1.0
|
TinyLlama-1.1B-Chat-v1.0
|
TinyLlama
|
2023-12-30T06:27:30+00:00
| 4,184,525
| 1,438
|
transformers
|
['transformers', 'safetensors', 'llama', 'text-generation', 'conversational', 'en', 'dataset:cerebras/SlimPajama-627B', 'dataset:bigcode/starcoderdata', 'dataset:HuggingFaceH4/ultrachat_200k', 'dataset:HuggingFaceH4/ultrafeedback_binarized', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 7
| 336
|
LlamaForCausalLM
|
llama
| 1,172,832,256
| 2,048
| 2,048
| 32
| 22
| 32,000
|
Language Model
|
['Fine-tuned']
|
['text-generation', 'conversation']
|
['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['General Purpose']
|
{}
| 86
|
Easy
| 0.86
|
2025-11-02T06:02:38.719228
|
license: apache-2.0 datasets: - cerebras/SlimPajama-627B - bigcode/starcoderdata - HuggingFaceH4/ultrachat_200k - HuggingFaceH4/ultrafeedback_binarized language: - en widget: - example_title: Fibonacc...
|
['Long Context', 'Multi-turn', 'Safety Aligned']
|
meta-llama/Meta-Llama-3-8B
|
Meta-Llama-3-8B
|
meta-llama
|
2024-04-17T09:35:16+00:00
| 1,760,997
| 6,358
|
transformers
|
['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'en', 'license:llama3', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 7
| 337
|
Unknown
|
unknown
| null | null | null | null | null | null |
General Language Model
|
[]
|
['text-generation']
|
['English']
|
['General Purpose']
|
{}
| 40
|
Critical
| 0.4
|
2025-11-02T06:02:39.914430
|
No README available
|
[]
|
microsoft/Phi-3-mini-4k-instruct
|
Phi-3-mini-4k-instruct
|
microsoft
|
2024-04-22T16:18:17+00:00
| 1,546,273
| 1,321
|
transformers
|
['transformers', 'safetensors', 'phi3', 'text-generation', 'nlp', 'code', 'conversational', 'custom_code', 'en', 'fr', 'license:mit', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 7
| 338
|
Phi3ForCausalLM
|
phi3
| 3,722,379,264
| 4,096
| 3,072
| 32
| 32
| 32,064
|
Language Model
|
['Fine-tuned', 'Quantized', 'Specialized']
|
['text-generation', 'question-answering', 'summarization', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Research', 'Production', 'Education', 'Business', 'Legal']
|
{}
| 81
|
Medium
| 0.81
|
2025-11-02T06:02:42.325387
|
license: mit license_link: https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/resolve/main/LICENSE language: - en - fr pipeline_tag: text-generation tags: - nlp - code inference: parameters: temp...
|
['RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned']
|
AlicanKiraz0/Cybersecurity-BaronLLM_Offensive_Security_LLM_Q6_K_GGUF
|
Cybersecurity-BaronLLM_Offensive_Security_LLM_Q6_K_GGUF
|
AlicanKiraz0
|
2025-01-21T03:41:56+00:00
| 632
| 110
|
transformers
|
['transformers', 'gguf', 'llama-cpp', 'gguf-my-repo', 'text-generation', 'en', 'base_model:meta-llama/Llama-3.1-8B-Instruct', 'base_model:quantized:meta-llama/Llama-3.1-8B-Instruct', 'license:mit', 'endpoints_compatible', 'region:us', 'conversational']
| 7
| 339
|
Unknown
|
unknown
| null | null | null | null | null | null |
General Language Model
|
[]
|
['text-generation']
|
['English']
|
['General Purpose']
|
{}
| 40
|
Critical
| 0.4
|
2025-11-02T06:02:43.883863
|
No README available
|
[]
|
Qwen/Qwen3-14B
|
Qwen3-14B
|
Qwen
|
2025-04-27T03:42:45+00:00
| 737,394
| 300
|
transformers
|
['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2505.09388', 'base_model:Qwen/Qwen3-14B-Base', 'base_model:finetune:Qwen/Qwen3-14B-Base', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 7
| 340
|
Qwen3ForCausalLM
|
qwen3
| 13,360,824,320
| 40,960
| 5,120
| 40
| 40
| 151,936
|
Language Model
|
[]
|
['text-generation', 'question-answering', 'translation', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Production', 'Creative Writing']
|
{}
| 71
|
Medium
| 0.71
|
2025-11-02T06:02:44.064325
|
library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-14B/blob/main/LICENSE pipeline_tag: text-generation base_model: - Qwen/Qwen3-14B-Base <a href="https://ch...
|
['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned']
|
Qwen/Qwen3-30B-A3B-Thinking-2507
|
Qwen3-30B-A3B-Thinking-2507
|
Qwen
|
2025-07-29T11:05:11+00:00
| 172,593
| 310
|
transformers
|
['transformers', 'safetensors', 'qwen3_moe', 'text-generation', 'conversational', 'arxiv:2402.17463', 'arxiv:2407.02490', 'arxiv:2501.15383', 'arxiv:2404.06654', 'arxiv:2505.09388', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 7
| 341
|
Qwen3MoeForCausalLM
|
qwen3_moe
| 2,727,084,032
| 262,144
| 2,048
| 32
| 48
| 151,936
|
Language Model
|
['Specialized']
|
['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Research', 'Production', 'Creative Writing']
|
{}
| 86
|
Easy
| 0.86
|
2025-11-02T06:02:46.313187
|
library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-30B-A3B-Thinking-2507/blob/main/LICENSE pipeline_tag: text-generation <a href="https://chat.qwen.ai/" tar...
|
['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned']
|
google/gemma-3-270m
|
gemma-3-270m
|
google
|
2025-08-05T18:50:31+00:00
| 132,968
| 893
|
transformers
|
['transformers', 'safetensors', 'gemma3_text', 'text-generation', 'gemma3', 'gemma', 'google', 'arxiv:2503.19786', 'arxiv:1905.07830', 'arxiv:1905.10044', 'arxiv:1911.11641', 'arxiv:1705.03551', 'arxiv:1911.01547', 'arxiv:1907.10641', 'arxiv:2311.07911', 'arxiv:2311.12022', 'arxiv:2411.04368', 'arxiv:1904.09728', 'arxiv:1903.00161', 'arxiv:2009.03300', 'arxiv:2304.06364', 'arxiv:2103.03874', 'arxiv:2110.14168', 'arxiv:2108.07732', 'arxiv:2107.03374', 'arxiv:2403.07974', 'arxiv:2305.03111', 'arxiv:2405.04520', 'arxiv:2210.03057', 'arxiv:2106.03193', 'arxiv:1910.11856', 'arxiv:2502.12404', 'arxiv:2502.21228', 'arxiv:2404.16816', 'arxiv:2104.12756', 'arxiv:2311.16502', 'arxiv:2203.10244', 'arxiv:2404.12390', 'arxiv:1810.12440', 'arxiv:1908.02660', 'arxiv:2310.02255', 'arxiv:2312.11805', 'license:gemma', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 7
| 342
|
Unknown
|
unknown
| null | null | null | null | null | null |
General Language Model
|
[]
|
['text-generation']
|
['English']
|
['General Purpose']
|
{}
| 40
|
Critical
| 0.4
|
2025-11-02T06:02:47.444961
|
No README available
|
[]
|
inference-net/Schematron-3B
|
Schematron-3B
|
inference-net
|
2025-08-21T19:15:21+00:00
| 2,248,942
| 98
|
transformers
|
['transformers', 'safetensors', 'llama', 'text-generation', 'conversational', 'base_model:meta-llama/Llama-3.2-3B-Instruct', 'base_model:finetune:meta-llama/Llama-3.2-3B-Instruct', 'license:llama3.2', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 7
| 343
|
LlamaForCausalLM
|
llama
| 3,564,896,256
| 131,072
| 3,072
| 24
| 28
| 128,256
|
Language Model
|
['Specialized']
|
['text-generation', 'question-answering', 'code-generation']
|
['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Production', 'Legal']
|
{}
| 86
|
Easy
| 0.86
|
2025-11-02T06:02:47.648310
|
library_name: transformers license: llama3.2 base_model: meta-llama/Llama-3.2-3B-Instruct <p align="center"> <img alt="Schematron" src="https://huggingface.co/inference-net/Schematron-3B/resolve/main/...
|
['Long Context', 'Fast Inference', 'Safety Aligned']
|
meta-llama/Llama-2-7b-hf
|
Llama-2-7b-hf
|
meta-llama
|
2023-07-13T16:16:13+00:00
| 658,332
| 2,184
|
transformers
|
['transformers', 'pytorch', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'llama-2', 'en', 'arxiv:2307.09288', 'license:llama2', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 6
| 344
|
Unknown
|
unknown
| null | null | null | null | null | null |
General Language Model
|
[]
|
['text-generation']
|
['English']
|
['General Purpose']
|
{}
| 40
|
Critical
| 0.4
|
2025-11-02T06:02:48.833648
|
No README available
|
[]
|
mistralai/Mistral-7B-v0.1
|
Mistral-7B-v0.1
|
mistralai
|
2023-09-20T13:03:50+00:00
| 509,066
| 3,999
|
transformers
|
['transformers', 'pytorch', 'safetensors', 'mistral', 'text-generation', 'pretrained', 'mistral-common', 'en', 'arxiv:2310.06825', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'region:us']
| 6
| 345
|
MistralForCausalLM
|
mistral
| 6,573,522,944
| 32,768
| 4,096
| 32
| 32
| 32,000
|
Language Model
|
[]
|
['text-generation']
|
['English', 'Spanish', 'German', 'Arabic']
|
['General Purpose']
|
{}
| 70
|
Medium
| 0.7
|
2025-11-02T06:02:49.080851
|
library_name: transformers language: - en license: apache-2.0 tags: - pretrained - mistral-common inference: false extra_gated_description: >- If you want to learn more about how we process your perso...
|
[]
|
TheBloke/Mistral-7B-Instruct-v0.2-GGUF
|
Mistral-7B-Instruct-v0.2-GGUF
|
TheBloke
|
2023-12-11T22:18:46+00:00
| 60,731
| 475
|
transformers
|
['transformers', 'gguf', 'mistral', 'finetuned', 'text-generation', 'arxiv:2310.06825', 'base_model:mistralai/Mistral-7B-Instruct-v0.2', 'base_model:quantized:mistralai/Mistral-7B-Instruct-v0.2', 'license:apache-2.0', 'region:us', 'conversational']
| 6
| 346
|
Unknown
|
mistral
| null | null | null | null | null | null |
Language Model
|
['Fine-tuned', 'Quantized']
|
['text-generation', 'question-answering', 'summarization', 'conversation']
|
['English', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Production', 'Creative Writing']
|
{}
| 63
|
Hard
| 0.63
|
2025-11-02T06:02:49.275008
|
base_model: mistralai/Mistral-7B-Instruct-v0.2 inference: false license: apache-2.0 model_creator: Mistral AI_ model_name: Mistral 7B Instruct v0.2 model_type: mistral pipeline_tag: text-generation pr...
|
['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn']
|
Qwen/Qwen3-32B
|
Qwen3-32B
|
Qwen
|
2025-04-27T03:52:59+00:00
| 1,610,353
| 561
|
transformers
|
['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2505.09388', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 6
| 347
|
Qwen3ForCausalLM
|
qwen3
| 20,910,571,520
| 40,960
| 5,120
| 64
| 64
| 151,936
|
Language Model
|
[]
|
['text-generation', 'question-answering', 'translation', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Production', 'Creative Writing']
|
{}
| 61
|
Hard
| 0.61
|
2025-11-02T06:02:49.430634
|
library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-32B/blob/main/LICENSE pipeline_tag: text-generation <a href="https://chat.qwen.ai/" target="_blank" style...
|
['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned']
|
Qwen/Qwen3-Reranker-0.6B
|
Qwen3-Reranker-0.6B
|
Qwen
|
2025-05-29T13:30:45+00:00
| 1,066,755
| 249
|
transformers
|
['transformers', 'safetensors', 'qwen3', 'text-generation', 'text-ranking', 'arxiv:2506.05176', 'base_model:Qwen/Qwen3-0.6B-Base', 'base_model:finetune:Qwen/Qwen3-0.6B-Base', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us']
| 6
| 348
|
Qwen3ForCausalLM
|
qwen3
| 507,630,592
| 40,960
| 1,024
| 16
| 28
| 151,669
|
Language Model
|
[]
|
['text-generation', 'text-classification', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['General Purpose']
|
{}
| 81
|
Medium
| 0.81
|
2025-11-02T06:02:49.894521
|
license: apache-2.0 base_model: - Qwen/Qwen3-0.6B-Base library_name: transformers pipeline_tag: text-ranking <p align="center"> <img src="https://qianwen-res.oss-accelerate-overseas.aliyuncs.com/logo_...
|
['Long Context', 'Safety Aligned']
|
zai-org/GLM-4.5-Air
|
GLM-4.5-Air
|
zai-org
|
2025-07-20T03:25:55+00:00
| 507,067
| 504
|
transformers
|
['transformers', 'safetensors', 'glm4_moe', 'text-generation', 'conversational', 'en', 'zh', 'arxiv:2508.06471', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 6
| 349
|
Glm4MoeForCausalLM
|
glm4_moe
| 9,881,780,224
| 131,072
| 4,096
| 96
| 46
| 151,552
|
Language Model
|
[]
|
['text-generation', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Chinese', 'Spanish', 'German', 'Arabic']
|
['Business']
|
{}
| 55
|
Hard
| 0.55
|
2025-11-02T06:02:50.041933
|
language: - en - zh library_name: transformers license: mit pipeline_tag: text-generation <div align="center"> <img src=https://raw.githubusercontent.com/zai-org/GLM-4.5/refs/heads/main/resources/logo...
|
[]
|
swiss-ai/Apertus-8B-Instruct-2509
|
Apertus-8B-Instruct-2509
|
swiss-ai
|
2025-08-13T09:30:23+00:00
| 376,921
| 384
|
transformers
|
['transformers', 'safetensors', 'apertus', 'text-generation', 'multilingual', 'compliant', 'swiss-ai', 'conversational', 'arxiv:2509.14233', 'base_model:swiss-ai/Apertus-8B-2509', 'base_model:finetune:swiss-ai/Apertus-8B-2509', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 6
| 350
|
ApertusForCausalLM
|
apertus
| 6,979,321,856
| 65,536
| 4,096
| 32
| 32
| 131,072
|
Language Model
|
['Specialized']
|
['text-generation', 'question-answering', 'summarization', 'reasoning', 'conversation']
|
['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Production', 'Legal']
|
{}
| 86
|
Easy
| 0.86
|
2025-11-02T06:02:50.217908
|
license: apache-2.0 base_model: - swiss-ai/Apertus-8B-2509 pipeline_tag: text-generation library_name: transformers tags: - multilingual - compliant - swiss-ai - apertus extra_gated_prompt: "### Apert...
|
['Function Calling', 'RAG Support', 'Long Context', 'Safety Aligned']
|
LiquidAI/LFM2-1.2B-Tool
|
LFM2-1.2B-Tool
|
LiquidAI
|
2025-09-03T17:35:21+00:00
| 924
| 81
|
transformers
|
['transformers', 'safetensors', 'lfm2', 'text-generation', 'liquid', 'edge', 'conversational', 'en', 'ar', 'zh', 'fr', 'de', 'ja', 'ko', 'es', 'base_model:LiquidAI/LFM2-1.2B', 'base_model:finetune:LiquidAI/LFM2-1.2B', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 6
| 351
|
Lfm2ForCausalLM
|
lfm2
| 939,524,096
| 128,000
| 2,048
| 32
| 16
| 65,536
|
Chat/Instruct
|
['LoRA']
|
['text-generation', 'translation', 'code-generation', 'conversation']
|
['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic']
|
['Research', 'Production', 'Healthcare']
|
{}
| 85
|
Easy
| 0.85
|
2025-11-02T06:02:52.348493
|
library_name: transformers license: other license_name: lfm1.0 license_link: LICENSE language: - en - ar - zh - fr - de - ja - ko - es pipeline_tag: text-generation tags: - liquid - lfm2 - edge base_m...
|
['Function Calling', 'Fast Inference', 'Multi-turn']
|
Qwen/Qwen3-Next-80B-A3B-Thinking
|
Qwen3-Next-80B-A3B-Thinking
|
Qwen
|
2025-09-09T15:45:31+00:00
| 230,661
| 436
|
transformers
|
['transformers', 'safetensors', 'qwen3_next', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2505.09388', 'arxiv:2501.15383', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us']
| 6
| 352
|
Qwen3NextForCausalLM
|
qwen3_next
| 2,727,084,032
| 262,144
| 2,048
| 16
| 48
| 151,936
|
Language Model
|
['Merged']
|
['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation']
|
['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual']
|
['Research', 'Production', 'Creative Writing']
|
{}
| 86
|
Easy
| 0.86
|
2025-11-02T06:02:52.544495
|
library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-Next-80B-A3B-Thinking/blob/main/LICENSE pipeline_tag: text-generation <a href="https://chat.qwen.ai/" tar...
|
['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned']
|
moondream/moondream3-preview | moondream3-preview | moondream | 2025-09-11T19:48:53+00:00 | 23,271 | 463 | transformers | ['transformers', 'safetensors', 'moondream3', 'text-generation', 'image-text-to-text', 'custom_code', 'doi:10.57967/hf/6761', 'license:other', 'autotrain_compatible', 'region:us'] | 6 | 353 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:02:53.721389 | No README available | []
ibm-granite/granite-4.0-micro | granite-4.0-micro | ibm-granite | 2025-09-16T19:47:09+00:00 | 25,720 | 220 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'conversational', 'arxiv:0000.00000', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 6 | 354 | GraniteMoeHybridForCausalLM | granitemoehybrid | 3,402,629,120 | 131,072 | 2,560 | 40 | 40 | 100,352 | Language Model | ['Fine-tuned', 'RLHF'] | ['text-generation', 'question-answering', 'text-classification', 'summarization', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Business'] | {} | 81 | Medium | 0.81 | 2025-11-02T06:02:53.888316 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 📣 **Update [10-07-2025]:** Added a *default system prompt* to the chat template to guide the model towards more *professio... | ['RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned']
Qwen/Qwen3-Next-80B-A3B-Instruct-FP8 | Qwen3-Next-80B-A3B-Instruct-FP8 | Qwen | 2025-09-22T03:48:53+00:00 | 163,485 | 56 | transformers | ['transformers', 'safetensors', 'qwen3_next', 'text-generation', 'conversational', 'arxiv:2309.00071', 'arxiv:2404.06654', 'arxiv:2505.09388', 'arxiv:2501.15383', 'base_model:Qwen/Qwen3-Next-80B-A3B-Instruct', 'base_model:quantized:Qwen/Qwen3-Next-80B-A3B-Instruct', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'fp8', 'region:us'] | 6 | 355 | Qwen3NextForCausalLM | qwen3_next | 2,727,084,032 | 262,144 | 2,048 | 16 | 48 | 151,936 | Language Model | ['Quantized'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Creative Writing'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:02:54.059809 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3-Next-80B-A3B-Instruct-FP8/blob/main/LICENSE pipeline_tag: text-generation base_model: - Qwen/Qwen3-Next-8... | ['Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned']
Qwen/Qwen3Guard-Gen-8B | Qwen3Guard-Gen-8B | Qwen | 2025-09-23T11:40:09+00:00 | 31,113 | 61 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'arxiv:2510.14276', 'base_model:Qwen/Qwen3-8B', 'base_model:finetune:Qwen/Qwen3-8B', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 6 | 356 | Qwen3ForCausalLM | qwen3 | 7,870,087,168 | 32,768 | 4,096 | 32 | 36 | 151,936 | Language Model | ['Specialized'] | ['text-generation', 'text-classification', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Creative Writing', 'Healthcare', 'Legal', 'Finance'] | {} | 68 | Medium | 0.68 | 2025-11-02T06:02:54.278515 | library_name: transformers license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen3Guard-Gen-8B/blob/main/LICENSE pipeline_tag: text-generation base_model: - Qwen/Qwen3-8B <p align="center"... | ['RAG Support', 'Safety Aligned']
ZJU-AI4H/Hulu-Med-7B | Hulu-Med-7B | ZJU-AI4H | 2025-10-09T02:27:37+00:00 | 1,626 | 24 | transformers | ['transformers', 'safetensors', 'hulumed_qwen2', 'text-generation', 'medical', 'multimodal', 'vision-language-model', 'image-to-text', 'video-understanding', '3d-understanding', 'qwen', 'pytorch', 'image-text-to-text', 'conversational', 'custom_code', 'arxiv:2510.08668', 'license:apache-2.0', 'autotrain_compatible', 'region:us'] | 6 | 357 | HulumedQwen2ForCausalLM | hulumed_qwen2 | 4,860,936,192 | 32,768 | 3,584 | 28 | 28 | 152,064 | Language Model | ['Merged', 'Specialized'] | ['text-generation', 'question-answering', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Healthcare'] | {} | 70 | Medium | 0.7 | 2025-11-02T06:02:54.424265 | license: apache-2.0 tags: - medical - multimodal - vision-language-model - image-to-text - video-understanding - 3d-understanding - qwen - pytorch frameworks: - pytorch pipeline_tag: image-text-to-tex... | ['RAG Support', 'Fast Inference', 'Multi-turn']
ZJU-AI4H/Hulu-Med-32B | Hulu-Med-32B | ZJU-AI4H | 2025-10-09T02:28:34+00:00 | 423 | 29 | transformers | ['transformers', 'safetensors', 'hulumed_qwen2', 'text-generation', 'medical', 'multimodal', 'vision-language-model', 'image-to-text', 'video-understanding', '3d-understanding', 'qwen', 'pytorch', 'image-text-to-text', 'conversational', 'custom_code', 'arxiv:2510.08668', 'license:apache-2.0', 'autotrain_compatible', 'region:us'] | 6 | 358 | HulumedQwen2ForCausalLM | hulumed_qwen2 | 20,911,226,880 | 32,768 | 5,120 | 40 | 64 | 152,064 | Language Model | ['Merged', 'Specialized'] | ['text-generation', 'question-answering', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Healthcare'] | {} | 50 | Hard | 0.5 | 2025-11-02T06:02:54.618400 | license: apache-2.0 tags: - medical - multimodal - vision-language-model - image-to-text - video-understanding - 3d-understanding - qwen - pytorch frameworks: - pytorch pipeline_tag: image-text-to-tex... | ['RAG Support', 'Fast Inference', 'Multi-turn']
nineninesix/kani-tts-400m-0.3-pt | kani-tts-400m-0.3-pt | nineninesix | 2025-10-18T12:17:29+00:00 | 711 | 6 | transformers | ['transformers', 'safetensors', 'lfm2', 'text-generation', 'text-to-speech', 'en', 'ja', 'de', 'ar', 'zh', 'es', 'ko', 'ky', 'dataset:laion/Emolia', 'dataset:NightPrince/MasriSpeech-Full', 'arxiv:2501.15907', 'arxiv:2506.09827', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 6 | 359 | Lfm2ForCausalLM | lfm2 | 283,798,528 | 128,000 | 1,024 | 16 | 16 | 80,539 | Language Model | ['Specialized'] | ['text-generation', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Legal'] | {} | 75 | Medium | 0.75 | 2025-11-02T06:02:54.831196 | license: apache-2.0 language: - en - ja - de - ar - zh - es - ko - ky pipeline_tag: text-to-speech library_name: transformers datasets: - laion/Emolia - NightPrince/MasriSpeech-Full <p> <img src="http... | ['Fast Inference', 'Memory Efficient', 'Multi-turn']
meituan-longcat/LongCat-Flash-Omni-FP8 | LongCat-Flash-Omni-FP8 | meituan-longcat | 2025-10-24T00:48:24+00:00 | 22 | 6 | LongCat-Flash-Omni | ['LongCat-Flash-Omni', 'safetensors', 'text-generation', 'transformers', 'conversational', 'custom_code', 'license:mit', 'fp8', 'region:us'] | 6 | 360 | LongcatFlashOmniForCausalLM | unknown | null | 131,072 | 6,144 | 64 | null | 131,072 | Language Model | [] | ['text-generation', 'question-answering', 'translation', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Legal'] | {} | 71 | Medium | 0.71 | 2025-11-02T06:02:55.108351 | license: mit library_name: LongCat-Flash-Omni pipeline_tag: text-generation tags: - transformers <div align="center"> <img src="https://raw.githubusercontent.com/meituan-longcat/LongCat-Flash-Omni/mai... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned']
MiniMaxAI/MiniMax-M2 | MiniMax-M2 | MiniMaxAI | 2025-10-22T13:45:10+00:00 | 529,835 | 912 | transformers | ['transformers', 'safetensors', 'minimax', 'text-generation', 'conversational', 'arxiv:2504.07164', 'arxiv:2509.06501', 'arxiv:2509.13160', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'fp8', 'region:us'] | 912 | 361 | MiniMaxM2ForCausalLM | minimax | 7,635,861,504 | 196,608 | 3,072 | 48 | 62 | 200,064 | Language Model | ['LoRA'] | ['text-generation', 'question-answering', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production'] | {} | 91 | Easy | 0.91 | 2025-11-02T06:02:57.327219 | pipeline_tag: text-generation license: mit library_name: transformers <div align="center"> <svg width="60%" height="auto" viewBox="0 0 144 48" fill="none" xmlns="http://www.w3.org/2000/svg"> <path d="... | ['Function Calling', 'RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned']
moonshotai/Kimi-Linear-48B-A3B-Instruct | Kimi-Linear-48B-A3B-Instruct | moonshotai | 2025-10-30T12:37:31+00:00 | 7,914 | 284 | transformers | ['transformers', 'safetensors', 'kimi_linear', 'text-generation', 'conversational', 'custom_code', 'arxiv:2510.26692', 'arxiv:2412.06464', 'license:mit', 'autotrain_compatible', 'region:us'] | 284 | 362 | KimiLinearForCausalLM | kimi_linear | 2,097,414,144 | null | 2,304 | 32 | 27 | 163,840 | Language Model | ['RLHF'] | ['text-generation', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:02:57.476868 | license: mit pipeline_tag: text-generation library_name: transformers <div align="center"> <a href="https://huggingface.co/papers/2510.26692"><img width="80%" src="figures/banner.png"></a> </div> <div... | ['Long Context', 'Fast Inference', 'Safety Aligned']
openai/gpt-oss-safeguard-20b | gpt-oss-safeguard-20b | openai | 2025-09-18T18:51:08+00:00 | 4,071 | 99 | transformers | ['transformers', 'safetensors', 'gpt_oss', 'text-generation', 'vllm', 'conversational', 'arxiv:2508.10925', 'base_model:openai/gpt-oss-20b', 'base_model:finetune:openai/gpt-oss-20b', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', '8-bit', 'mxfp4', 'region:us'] | 99 | 363 | GptOssForCausalLM | gpt_oss | 2,967,920,640 | 131,072 | 2,880 | 64 | 24 | 201,088 | Language Model | [] | ['text-generation', 'reasoning'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Business'] | {} | 73 | Medium | 0.73 | 2025-11-02T06:02:57.812813 | license: apache-2.0 pipeline_tag: text-generation library_name: transformers tags: - vllm base_model: - openai/gpt-oss-20b base_model_relation: finetune <p align="center"> <img alt="gpt-oss-safeguard-... | ['Safety Aligned']
ibm-granite/granite-4.0-h-1b | granite-4.0-h-1b | ibm-granite | 2025-10-07T20:21:46+00:00 | 2,392 | 86 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'conversational', 'arxiv:0000.00000', 'base_model:ibm-granite/granite-4.0-h-1b-base', 'base_model:finetune:ibm-granite/granite-4.0-h-1b-base', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 86 | 364 | GraniteMoeHybridForCausalLM | granitemoehybrid | 1,286,602,752 | 131,072 | 1,536 | 12 | 40 | 100,352 | Language Model | ['Fine-tuned', 'RLHF', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:02:58.014609 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 base_model: - ibm-granite/granite-4.0-h-1b-base **Model Summary:** Granite-4.0-H-1B is a lightweight instruct model finetu... | ['RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned']
openai/gpt-oss-20b | gpt-oss-20b | openai | 2025-08-04T22:33:29+00:00 | 4,652,383 | 3,841 | transformers | ['transformers', 'safetensors', 'gpt_oss', 'text-generation', 'vllm', 'conversational', 'arxiv:2508.10925', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', '8-bit', 'mxfp4', 'region:us'] | 62 | 365 | GptOssForCausalLM | gpt_oss | 2,967,920,640 | 131,072 | 2,880 | 64 | 24 | 201,088 | Language Model | ['Fine-tuned', 'Specialized'] | ['text-generation', 'reasoning', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Business'] | {} | 85 | Easy | 0.85 | 2025-11-02T06:02:58.237746 | license: apache-2.0 pipeline_tag: text-generation library_name: transformers tags: - vllm <p align="center"> <img alt="gpt-oss-20b" src="https://raw.githubusercontent.com/openai/gpt-oss/main/docs/gpt-... | ['Function Calling', 'Fast Inference', 'Multi-turn']
zai-org/GLM-4.6 | GLM-4.6 | zai-org | 2025-09-29T18:22:51+00:00 | 65,184 | 939 | transformers | ['transformers', 'safetensors', 'glm4_moe', 'text-generation', 'conversational', 'en', 'zh', 'arxiv:2508.06471', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 60 | 366 | Glm4MoeForCausalLM | glm4_moe | 29,716,643,840 | 202,752 | 5,120 | 96 | 92 | 151,552 | Language Model | [] | ['text-generation', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Arabic'] | ['Creative Writing'] | {} | 63 | Hard | 0.63 | 2025-11-02T06:02:58.448600 | language: - en - zh library_name: transformers license: mit pipeline_tag: text-generation <div align="center"> <img src=https://raw.githubusercontent.com/zai-org/GLM-4.5/refs/heads/main/resources/logo... | ['Function Calling', 'Long Context']
ibm-granite/granite-4.0-h-350m | granite-4.0-h-350m | ibm-granite | 2025-10-07T20:23:17+00:00 | 3,059 | 56 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'conversational', 'arxiv:0000.00000', 'base_model:ibm-granite/granite-4.0-h-350m-base', 'base_model:finetune:ibm-granite/granite-4.0-h-350m-base', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 56 | 367 | GraniteMoeHybridForCausalLM | granitemoehybrid | 303,562,752 | 32,768 | 768 | 12 | 32 | 100,352 | Language Model | ['Fine-tuned', 'RLHF', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:02:58.657652 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 base_model: - ibm-granite/granite-4.0-h-350m-base **Model Summary:** Granite-4.0-H-350M is a lightweight instruct model fi... | ['RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned']
openai/gpt-oss-safeguard-120b | gpt-oss-safeguard-120b | openai | 2025-09-18T18:50:45+00:00 | 1,640 | 50 | transformers | ['transformers', 'safetensors', 'gpt_oss', 'text-generation', 'vllm', 'conversational', 'arxiv:2508.10925', 'base_model:openai/gpt-oss-120b', 'base_model:finetune:openai/gpt-oss-120b', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', '8-bit', 'mxfp4', 'region:us'] | 50 | 368 | GptOssForCausalLM | gpt_oss | 4,162,314,240 | 131,072 | 2,880 | 64 | 36 | 201,088 | Language Model | [] | ['text-generation', 'reasoning'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Business'] | {} | 68 | Medium | 0.68 | 2025-11-02T06:02:59.242237 | license: apache-2.0 pipeline_tag: text-generation library_name: transformers tags: - vllm base_model: - openai/gpt-oss-120b base_model_relation: finetune <p align="center"> <img alt="gpt-oss-safeguard... | ['Safety Aligned']
inclusionAI/LLaDA2.0-flash-preview | LLaDA2.0-flash-preview | inclusionAI | 2025-10-25T09:22:29+00:00 | 803 | 53 | transformers | ['transformers', 'safetensors', 'llada2_moe', 'text-generation', 'dllm', 'diffusion', 'llm', 'text_generation', 'conversational', 'custom_code', 'license:apache-2.0', 'autotrain_compatible', 'region:us'] | 46 | 369 | LLaDA2MoeModelLM | llada2_moe | 7,086,276,608 | 16,384 | 4,096 | 32 | 32 | 157,184 | Language Model | ['Fine-tuned', 'RLHF'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Korean', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Education'] | {} | 65 | Medium | 0.65 | 2025-11-02T06:02:59.395981 | license: apache-2.0 library_name: transformers tags: - dllm - diffusion - llm - text_generation **LLaDA2.0-flash-preview** is a diffusion language model featuring a 100BA6B Mixture-of-Experts (MoE) ar... | ['Function Calling', 'RAG Support', 'Fast Inference', 'Memory Efficient']
utter-project/EuroLLM-9B | EuroLLM-9B | utter-project | 2024-11-22T10:44:55+00:00 | 12,514 | 134 | transformers | ['transformers', 'pytorch', 'safetensors', 'llama', 'text-generation', 'en', 'de', 'es', 'fr', 'it', 'pt', 'pl', 'nl', 'tr', 'sv', 'cs', 'el', 'hu', 'ro', 'fi', 'uk', 'sl', 'sk', 'da', 'lt', 'lv', 'et', 'bg', 'no', 'ca', 'hr', 'ga', 'mt', 'gl', 'zh', 'ru', 'ko', 'ja', 'ar', 'hi', 'arxiv:2202.03799', 'arxiv:2402.17733', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 43 | 370 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:00.658179 | No README available | []
meta-llama/Llama-3.1-8B-Instruct | Llama-3.1-8B-Instruct | meta-llama | 2024-07-18T08:56:00+00:00 | 5,243,778 | 4,852 | transformers | ['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'conversational', 'en', 'de', 'fr', 'it', 'pt', 'hi', 'es', 'th', 'arxiv:2204.05149', 'base_model:meta-llama/Llama-3.1-8B', 'base_model:finetune:meta-llama/Llama-3.1-8B', 'license:llama3.1', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 39 | 371 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:03.804054 | No README available | []
meituan-longcat/LongCat-Flash-Omni | LongCat-Flash-Omni | meituan-longcat | 2025-10-23T09:42:24+00:00 | 32 | 39 | LongCat-Flash-Omni | ['LongCat-Flash-Omni', 'safetensors', 'text-generation', 'transformers', 'conversational', 'custom_code', 'license:mit', 'region:us'] | 39 | 372 | LongcatFlashOmniForCausalLM | unknown | null | 131,072 | 6,144 | 64 | null | 131,072 | Language Model | [] | ['text-generation', 'question-answering', 'translation', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Legal'] | {} | 71 | Medium | 0.71 | 2025-11-02T06:03:03.996299 | license: mit library_name: LongCat-Flash-Omni pipeline_tag: text-generation tags: - transformers <div align="center"> <img src="https://raw.githubusercontent.com/meituan-longcat/LongCat-Flash-Omni/mai... | ['RAG Support', 'Long Context', 'Fast Inference', 'Multi-turn', 'Safety Aligned']
openai/gpt-oss-120b | gpt-oss-120b | openai | 2025-08-04T22:33:06+00:00 | 3,697,129 | 4,085 | transformers | ['transformers', 'safetensors', 'gpt_oss', 'text-generation', 'vllm', 'conversational', 'arxiv:2508.10925', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', '8-bit', 'mxfp4', 'region:us'] | 38 | 373 | GptOssForCausalLM | gpt_oss | 4,162,314,240 | 131,072 | 2,880 | 64 | 36 | 201,088 | Language Model | ['Fine-tuned', 'Specialized'] | ['text-generation', 'reasoning', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Business'] | {} | 80 | Medium | 0.8 | 2025-11-02T06:03:04.150662 | license: apache-2.0 pipeline_tag: text-generation library_name: transformers tags: - vllm <p align="center"> <img alt="gpt-oss-120b" src="https://raw.githubusercontent.com/openai/gpt-oss/main/docs/gpt... | ['Function Calling', 'Fast Inference', 'Multi-turn']
ibm-granite/granite-4.0-350m | granite-4.0-350m | ibm-granite | 2025-10-07T20:25:05+00:00 | 1,392 | 38 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'conversational', 'arxiv:0000.00000', 'base_model:ibm-granite/granite-4.0-350m-base', 'base_model:finetune:ibm-granite/granite-4.0-350m-base', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 38 | 374 | GraniteMoeHybridForCausalLM | granitemoehybrid | 455,081,984 | 32,768 | 1,024 | 16 | 28 | 100,352 | Language Model | ['Fine-tuned', 'RLHF', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:03:04.274091 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 base_model: - ibm-granite/granite-4.0-350m-base **Model Summary:** Granite-4.0-350M is a lightweight instruct model finetu... | ['RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned']
ibm-granite/granite-4.0-1b | granite-4.0-1b | ibm-granite | 2025-10-07T20:24:27+00:00 | 1,469 | 37 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'conversational', 'arxiv:0000.00000', 'base_model:ibm-granite/granite-4.0-1b-base', 'base_model:finetune:ibm-granite/granite-4.0-1b-base', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 37 | 375 | GraniteMoeHybridForCausalLM | granitemoehybrid | 2,218,786,816 | 131,072 | 2,048 | 16 | 40 | 100,352 | Language Model | ['Fine-tuned', 'RLHF', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:03:05.047576 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 base_model: - ibm-granite/granite-4.0-1b-base **Model Summary:** Granite-4.0-1B is a lightweight instruct model finetuned ... | ['RAG Support', 'Long Context', 'Fast Inference', 'Safety Aligned']
moonshotai/Kimi-Linear-48B-A3B-Base | Kimi-Linear-48B-A3B-Base | moonshotai | 2025-10-30T12:39:14+00:00 | 71 | 36 | transformers | ['transformers', 'safetensors', 'kimi_linear', 'text-generation', 'conversational', 'custom_code', 'arxiv:2510.26692', 'arxiv:2412.06464', 'license:mit', 'autotrain_compatible', 'region:us'] | 36 | 376 | KimiLinearForCausalLM | kimi_linear | 2,097,414,144 | null | 2,304 | 32 | 27 | 163,840 | Language Model | ['RLHF'] | ['text-generation', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Production', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:03:05.636188 | license: mit pipeline_tag: text-generation library_name: transformers This model is presented in the paper [Kimi Linear: An Expressive, Efficient Attention Architecture](https://huggingface.co/papers/... | ['Long Context', 'Fast Inference', 'Safety Aligned']
AvitoTech/avibe | avibe | AvitoTech | 2025-10-20T10:43:13+00:00 | 2,043 | 35 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'conversational', 'ru', 'en', 'base_model:Qwen/Qwen3-8B', 'base_model:finetune:Qwen/Qwen3-8B', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 35 | 377 | Qwen3ForCausalLM | qwen3 | 7,724,507,136 | 32,768 | 4,096 | 32 | 36 | 116,394 | Language Model | [] | ['text-generation', 'question-answering', 'summarization', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['General Purpose'] | {} | 75 | Medium | 0.75 | 2025-11-02T06:03:05.824245 | license: apache-2.0 language: - ru - en base_model: - Qwen/Qwen3-8B pipeline_tag: text-generation library_name: transformers A-vibe is a large language model created by Avito Tech, a subsidiary technolog... | ['Function Calling']
Alibaba-NLP/Tongyi-DeepResearch-30B-A3B | Tongyi-DeepResearch-30B-A3B | Alibaba-NLP | 2025-09-16T06:56:49+00:00 | 13,787 | 731 | transformers | ['transformers', 'safetensors', 'qwen3_moe', 'text-generation', 'conversational', 'en', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 32 | 378 | Qwen3MoeForCausalLM | qwen3_moe | 2,727,084,032 | 131,072 | 2,048 | 32 | 48 | 151,936 | Language Model | ['RLHF'] | ['text-generation', 'question-answering', 'reasoning'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Education'] | {} | 70 | Medium | 0.7 | 2025-11-02T06:03:07.314545 | license: apache-2.0 language: - en pipeline_tag: text-generation library_name: transformers We present **Tongyi DeepResearch**, an agentic large language model featuring 30 billion total parameters, ... | ['RAG Support']
deepseek-ai/DeepSeek-V3.2-Exp | DeepSeek-V3.2-Exp | deepseek-ai | 2025-09-29T06:07:26+00:00 | 104,688 | 760 | transformers | ['transformers', 'safetensors', 'deepseek_v32', 'text-generation', 'conversational', 'base_model:deepseek-ai/DeepSeek-V3.2-Exp-Base', 'base_model:finetune:deepseek-ai/DeepSeek-V3.2-Exp-Base', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'fp8', 'region:us'] | 31 | 379 | DeepseekV32ForCausalLM | deepseek_v32 | 38,537,003,008 | 163,840 | 7,168 | 128 | 61 | 129,280 | Language Model | [] | ['text-generation', 'question-answering', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research'] | {} | 58 | Hard | 0.58 | 2025-11-02T06:03:07.713742 | license: mit library_name: transformers base_model: - deepseek-ai/DeepSeek-V3.2-Exp-Base base_model_relation: finetune <!-- markdownlint-disable first-line-h1 --> <!-- markdownlint-disable html --> <!... | ['Function Calling', 'Fast Inference', 'Safety Aligned']
nineninesix/kani-tts-400m-en | kani-tts-400m-en | nineninesix | 2025-10-23T20:52:55+00:00 | 617 | 28 | transformers | ['transformers', 'safetensors', 'lfm2', 'text-generation', 'text-to-speech', 'en', 'arxiv:2501.15907', 'arxiv:2506.09827', 'base_model:nineninesix/kani-tts-400m-0.3-pt', 'base_model:finetune:nineninesix/kani-tts-400m-0.3-pt', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 28 | 380 | Lfm2ForCausalLM | lfm2 | 283,798,528 | 128,000 | 1,024 | 16 | 16 | 80,539 | Language Model | ['Specialized'] | ['text-generation', 'conversation'] | ['English', 'Chinese', 'Korean', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Production', 'Education', 'Legal'] | {} | 75 | Medium | 0.75 | 2025-11-02T06:03:07.961888 | license: apache-2.0 language: - en pipeline_tag: text-to-speech library_name: transformers base_model: - nineninesix/kani-tts-400m-0.3-pt <p> <img src="https://cdn-uploads.huggingface.co/production/up... | ['Fast Inference', 'Memory Efficient', 'Multi-turn']
ibm-granite/granite-4.0-h-1b-base | granite-4.0-h-1b-base | ibm-granite | 2025-10-07T20:22:43+00:00 | 480 | 27 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 27 | 381 | GraniteMoeHybridForCausalLM | granitemoehybrid | 1,286,602,752 | 131,072 | 1,536 | 12 | 40 | 100,352 | Language Model | ['LoRA', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:03:10.157330 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 **Model Summary:** Granite-4.0-H-1B-Base is a lightweight decoder-only language model designed for scenarios where efficie... | ['Long Context', 'Fast Inference', 'Safety Aligned']
ibm-granite/granite-4.0-1b-base | granite-4.0-1b-base | ibm-granite | 2025-10-07T20:24:47+00:00 | 965 | 26 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 26 | 382 | GraniteMoeHybridForCausalLM | granitemoehybrid | 2,218,786,816 | 131,072 | 2,048 | 16 | 40 | 100,352 | Language Model | ['LoRA', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:03:10.314272 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 **Model Summary:** Granite-4.0-1B-Base is a lightweight decoder-only language model designed for scenarios where efficienc... | ['Long Context', 'Fast Inference', 'Safety Aligned']
moonshotai/Kimi-K2-Instruct-0905 | Kimi-K2-Instruct-0905 | moonshotai | 2025-09-03T03:34:36+00:00 | 39,104 | 535 | transformers | ['transformers', 'safetensors', 'kimi_k2', 'text-generation', 'conversational', 'custom_code', 'license:other', 'autotrain_compatible', 'endpoints_compatible', 'fp8', 'region:us'] | 25 | 383 | DeepseekV3ForCausalLM | kimi_k2 | 38,784,729,088 | 262,144 | 7,168 | 64 | 61 | 163,840 | Language Model | [] | ['question-answering', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Production', 'Creative Writing'] | {} | 48 | Hard | 0.48 | 2025-11-02T06:03:10.476065 | license: other license_name: modified-mit library_name: transformers <div align="center"> <picture> <img src="figures/kimi-logo.png" width="30%" alt="Kimi K2: Open Agentic Intelligence"> </picture> </... | ['Long Context']
ByteDance/Ouro-1.4B | Ouro-1.4B | ByteDance | 2025-10-28T22:15:18+00:00 | 489 | 25 | transformers | ['transformers', 'safetensors', 'ouro', 'text-generation', 'looped-language-model', 'reasoning', 'recurrent-depth', 'conversational', 'custom_code', 'arxiv:2510.25741', 'license:apache-2.0', 'autotrain_compatible', 'region:us'] | 25 | 384 | OuroForCausalLM | ouro | 1,308,622,848 | 65,536 | 2,048 | 16 | 24 | 49,152 | Language Model | [] | ['text-generation', 'reasoning'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Education'] | {} | 78 | Medium | 0.78 | 2025-11-02T06:03:10.844752 | library_name: transformers license: apache-2.0 pipeline_tag: text-generation tags: - looped-language-model - reasoning - recurrent-depth 📚 [Paper](https://huggingface.co/papers/2510.25741) • 🏠 [Projec... | ['RAG Support', 'Long Context']
ibm-granite/granite-4.0-h-350m-base | granite-4.0-h-350m-base | ibm-granite | 2025-10-07T20:24:04+00:00 | 509 | 24 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 24 | 385 | GraniteMoeHybridForCausalLM | granitemoehybrid | 303,562,752 | 32,768 | 768 | 12 | 32 | 100,352 | Language Model | ['LoRA', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:03:10.979950 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 **Model Summary:** Granite-4.0-H-350M-Base is a lightweight decoder-only language model designed for scenarios where effic... | ['Long Context', 'Fast Inference', 'Safety Aligned']
nvidia/Qwen3-Nemotron-32B-RLBFF | Qwen3-Nemotron-32B-RLBFF | nvidia | 2025-10-12T05:55:24+00:00 | 396 | 24 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'nvidia', 'conversational', 'en', 'dataset:nvidia/HelpSteer3', 'arxiv:2509.21319', 'arxiv:2306.05685', 'arxiv:2503.04378', 'arxiv:2505.11475', 'arxiv:2410.01257', 'arxiv:2310.05344', 'arxiv:2311.09528', 'arxiv:2406.08673', 'base_model:Qwen/Qwen3-32B', 'base_model:finetune:Qwen/Qwen3-32B', 'license:other', 'autotrain_compatible', 'text-generation-inference', 'region:us'] | 24 | 386 | Qwen3ForCausalLM | qwen3 | 20,910,571,520 | 40,960 | 5,120 | 64 | 64 | 151,936 | Language Model | ['Fine-tuned', 'RLHF'] | ['text-generation', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Creative Writing', 'Business'] | {} | 61 | Hard | 0.61 | 2025-11-02T06:03:11.225122 | license: other license_name: nvidia-open-model-license license_link: >- https://www.nvidia.com/en-us/agreements/enterprise-software/nvidia-open-model-license/ inference: false fine-tuning: false langu... | ['RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned']
PokeeAI/pokee_research_7b | pokee_research_7b | PokeeAI | 2025-10-17T20:38:37+00:00 | 5,574 | 95 | transformers | ['transformers', 'safetensors', 'qwen2', 'text-generation', 'agent', 'deepresearch', 'llm', 'rl', 'reinforcementlearning', 'conversational', 'en', 'dataset:miromind-ai/MiroRL-GenQA', 'arxiv:2510.15862', 'base_model:Qwen/Qwen2.5-7B-Instruct', 'base_model:finetune:Qwen/Qwen2.5-7B-Instruct', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 24 | 387 | Qwen2ForCausalLM | qwen2 | 4,860,936,192 | 32,768 | 3,584 | 28 | 28 | 152,064 | Language Model | ['Fine-tuned', 'RLHF', 'Merged', 'Specialized'] | ['text-generation', 'question-answering', 'summarization', 'reasoning'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Education', 'Healthcare', 'Legal', 'Finance'] | {} | 78 | Medium | 0.78 | 2025-11-02T06:03:11.532085 | base_model: - Qwen/Qwen2.5-7B-Instruct datasets: - miromind-ai/MiroRL-GenQA language: - en license: apache-2.0 tags: - agent - deepresearch - llm - rl - reinforcementlearning pipeline_tag: text-genera... | ['RAG Support', 'Fast Inference', 'Memory Efficient', 'Multi-turn', 'Safety Aligned']
ibm-granite/granite-4.0-350m-base | granite-4.0-350m-base | ibm-granite | 2025-10-07T20:25:24+00:00 | 495 | 23 | transformers | ['transformers', 'safetensors', 'granitemoehybrid', 'text-generation', 'language', 'granite-4.0', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 23 | 388 | GraniteMoeHybridForCausalLM | granitemoehybrid | 455,081,984 | 32,768 | 1,024 | 16 | 28 | 100,352 | Language Model | ['LoRA', 'Specialized'] | ['text-generation', 'question-answering', 'text-classification', 'summarization'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Research', 'Education'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:03:11.713311 | license: apache-2.0 library_name: transformers tags: - language - granite-4.0 **Model Summary:** Granite-4.0-350M-Base is a lightweight decoder-only language model designed for scenarios where efficie... | ['Long Context', 'Fast Inference', 'Safety Aligned']
internlm/JanusCoder-14B | JanusCoder-14B | internlm | 2025-10-27T09:34:49+00:00 | 165 | 23 | transformers | ['transformers', 'safetensors', 'qwen3', 'text-generation', 'image-text-to-text', 'conversational', 'arxiv:2510.23538', 'arxiv:2403.14734', 'arxiv:2510.09724', 'arxiv:2507.22080', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 23 | 389 | Qwen3ForCausalLM | qwen3 | 13,360,824,320 | 32,768 | 5,120 | 40 | 40 | 151,936 | Language Model | ['Specialized'] | ['text-generation', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research'] | {} | 58 | Hard | 0.58 | 2025-11-02T06:03:11.847754 | license: apache-2.0 pipeline_tag: image-text-to-text library_name: transformers [💻Github Repo](https://github.com/InternLM/JanusCoder) • [🤗Model Collections](https://huggingface.co/collections/internl... | ['Safety Aligned']
ByteDance/Ouro-2.6B | Ouro-2.6B | ByteDance | 2025-10-28T22:19:46+00:00 | 227 | 21 | transformers | ['transformers', 'safetensors', 'ouro', 'text-generation', 'looped-language-model', 'reasoning', 'recurrent-depth', 'conversational', 'custom_code', 'arxiv:2510.25741', 'license:apache-2.0', 'autotrain_compatible', 'region:us'] | 21 | 390 | OuroForCausalLM | ouro | 2,516,582,400 | 65,536 | 2,048 | 16 | 48 | 49,152 | Language Model | [] | ['text-generation', 'reasoning'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Education'] | {} | 78 | Medium | 0.78 | 2025-11-02T06:03:12.171111 | library_name: transformers license: apache-2.0 pipeline_tag: text-generation tags: - looped-language-model - reasoning - recurrent-depth  [📚 Paper (Hugging Face)](https://... | ['RAG Support', 'Long Context']
HuggingFaceTB/SmolLM3-3B | SmolLM3-3B | HuggingFaceTB | 2025-07-08T10:11:45+00:00 | 56,521 | 768 | transformers | ['transformers', 'safetensors', 'smollm3', 'text-generation', 'conversational', 'en', 'fr', 'es', 'it', 'pt', 'zh', 'ar', 'ru', 'base_model:HuggingFaceTB/SmolLM3-3B-Base', 'base_model:finetune:HuggingFaceTB/SmolLM3-3B-Base', 'license:apache-2.0', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 20 | 391 | SmolLM3ForCausalLM | smollm3 | 2,074,607,616 | 65,536 | 2,048 | 16 | 36 | 128,256 | Language Model | ['Quantized'] | ['text-generation', 'question-answering', 'summarization', 'code-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Korean', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Production'] | {} | 83 | Medium | 0.83 | 2025-11-02T06:03:14.683845 | library_name: transformers license: apache-2.0 language: - en - fr - es - it - pt - zh - ar - ru base_model: - HuggingFaceTB/SmolLM3-3B-Base  arch... | ['Function Calling', 'RAG Support', 'Fast Inference', 'Memory Efficient']
cerebras/GLM-4.5-Air-REAP-82B-A12B | GLM-4.5-Air-REAP-82B-A12B | cerebras | 2025-10-20T15:44:06+00:00 | 1,718 | 81 | transformers | ['transformers', 'safetensors', 'glm4_moe', 'text-generation', 'glm', 'MOE', 'pruning', 'compression', 'conversational', 'en', 'arxiv:2510.13999', 'base_model:zai-org/GLM-4.5-Air', 'base_model:finetune:zai-org/GLM-4.5-Air', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'region:us'] | 18 | 395 | Glm4MoeForCausalLM | glm4_moe | 9,881,780,224 | 131,072 | 4,096 | 96 | 46 | 151,552 | Language Model | ['Specialized'] | ['text-generation', 'question-answering', 'code-generation', 'reasoning'] | ['English', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Creative Writing'] | {} | 70 | Medium | 0.7 | 2025-11-02T06:03:15.376603 | language: - en library_name: transformers tags: - glm - MOE - pruning - compression license: mit name: cerebras/GLM-4.5-Air-REAP-82B-A12B description: > This model was obtained by uniformly pruning 25... | ['Function Calling', 'Fast Inference', 'Memory Efficient', 'Multi-turn']
Qwen/Qwen2.5-7B-Instruct | Qwen2.5-7B-Instruct | Qwen | 2024-09-16T11:55:40+00:00 | 8,169,288 | 849 | transformers | ['transformers', 'safetensors', 'qwen2', 'text-generation', 'chat', 'conversational', 'en', 'arxiv:2309.00071', 'arxiv:2407.10671', 'base_model:Qwen/Qwen2.5-7B', 'base_model:finetune:Qwen/Qwen2.5-7B', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 17 | 396 | Qwen2ForCausalLM | qwen2 | 4,860,936,192 | 32,768 | 3,584 | 28 | 28 | 152,064 | Language Model | ['Specialized'] | ['text-generation', 'question-answering', 'code-generation', 'conversation'] | ['English', 'Chinese', 'Korean', 'Japanese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['Production'] | {} | 86 | Easy | 0.86 | 2025-11-02T06:03:15.566097 | license: apache-2.0 license_link: https://huggingface.co/Qwen/Qwen2.5-7B-Instruct/blob/main/LICENSE language: - en pipeline_tag: text-generation base_model: Qwen/Qwen2.5-7B tags: - chat library_name: ... | ['Long Context', 'Fast Inference', 'Safety Aligned']
deepseek-ai/DeepSeek-R1 | DeepSeek-R1 | deepseek-ai | 2025-01-20T03:46:07+00:00 | 425,526 | 12,820 | transformers | ['transformers', 'safetensors', 'deepseek_v3', 'text-generation', 'conversational', 'custom_code', 'arxiv:2501.12948', 'license:mit', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'fp8', 'region:us'] | 17 | 397 | DeepseekV3ForCausalLM | deepseek_v3 | 38,537,003,008 | 163,840 | 7,168 | 128 | 61 | 129,280 | Language Model | ['Fine-tuned', 'RLHF', 'Merged'] | ['text-generation', 'question-answering', 'summarization', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Education', 'Business'] | {} | 48 | Hard | 0.48 | 2025-11-02T06:03:15.873743 | license: mit library_name: transformers <!-- markdownlint-disable first-line-h1 --> <!-- markdownlint-disable html --> <!-- markdownlint-disable no-duplicate-header --> <div align="center"> <img src="... | ['RAG Support', 'Long Context']
Qwen/Qwen3-Embedding-0.6B | Qwen3-Embedding-0.6B | Qwen | 2025-06-03T14:25:32+00:00 | 4,372,213 | 699 | sentence-transformers | ['sentence-transformers', 'safetensors', 'qwen3', 'text-generation', 'transformers', 'sentence-similarity', 'feature-extraction', 'text-embeddings-inference', 'arxiv:2506.05176', 'base_model:Qwen/Qwen3-0.6B-Base', 'base_model:finetune:Qwen/Qwen3-0.6B-Base', 'license:apache-2.0', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 17 | 398 | Qwen3ForCausalLM | qwen3 | 507,630,592 | 32,768 | 1,024 | 16 | 28 | 151,669 | Language Model | [] | ['text-classification', 'code-generation', 'reasoning'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic', 'Multilingual'] | ['General Purpose'] | {} | 81 | Medium | 0.81 | 2025-11-02T06:03:16.072802 | license: apache-2.0 base_model: - Qwen/Qwen3-0.6B-Base tags: - transformers - sentence-transformers - sentence-similarity - feature-extraction - text-embeddings-inference <p align="center"> <img src="... | ['Long Context', 'Safety Aligned']
ByteDance/Ouro-2.6B-Thinking | Ouro-2.6B-Thinking | ByteDance | 2025-10-28T22:27:40+00:00 | 174 | 17 | transformers | ['transformers', 'safetensors', 'ouro', 'text-generation', 'looped-language-model', 'reasoning', 'recurrent-depth', 'thinking', 'chain-of-thought', 'conversational', 'custom_code', 'arxiv:2510.25741', 'license:apache-2.0', 'autotrain_compatible', 'region:us'] | 17 | 399 | OuroForCausalLM | ouro | 2,516,582,400 | 65,536 | 2,048 | 16 | 48 | 49,152 | Language Model | ['Specialized'] | ['text-generation', 'reasoning', 'conversation'] | ['English', 'Chinese', 'Spanish', 'French', 'German', 'Russian', 'Arabic'] | ['Research', 'Production', 'Creative Writing'] | {} | 83 | Medium | 0.83 | 2025-11-02T06:03:16.277447 | library_name: transformers license: apache-2.0 pipeline_tag: text-generation tags: - looped-language-model - reasoning - recurrent-depth - thinking - chain-of-thought  **⚠... | ['RAG Support', 'Long Context', 'Fast Inference', 'Memory Efficient']
meta-llama/Llama-3.2-1B | Llama-3.2-1B | meta-llama | 2024-09-18T15:03:14+00:00 | 1,788,591 | 2,143 | transformers | ['transformers', 'safetensors', 'llama', 'text-generation', 'facebook', 'meta', 'pytorch', 'llama-3', 'en', 'de', 'fr', 'it', 'pt', 'hi', 'es', 'th', 'arxiv:2204.05149', 'arxiv:2405.16406', 'license:llama3.2', 'autotrain_compatible', 'text-generation-inference', 'endpoints_compatible', 'region:us'] | 16 | 400 | Unknown | unknown | null | null | null | null | null | null | General Language Model | [] | ['text-generation'] | ['English'] | ['General Purpose'] | {} | 40 | Critical | 0.4 | 2025-11-02T06:03:17.510247 | No README available | []
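Most rows above list transformers as their library and text-generation as a task, so they can be loaded through the standard transformers pipeline. The snippet below is a minimal sketch, not part of the dataset: it assumes the metadata of the ibm-granite/granite-4.0-350m row (library transformers, task text-generation) is accurate, and the prompt and generation settings are illustrative choices, not values taken from the table.

```python
# Minimal sketch: load one of the smaller catalogued checkpoints and generate text.
# Model id and task come from the table above; max_new_tokens and the prompt are
# illustrative assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="ibm-granite/granite-4.0-350m")
result = generator("Granite 4.0 is a lightweight instruct model that", max_new_tokens=40)
print(result[0]["generated_text"])
```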