Model
whisper-large-v3-turbo
by openai
whisper-large-v3-turbo is an open-source speech-recognition model published by OpenAI.
Updated Dec 31, 2025
Technical Specifications
Parameters: 0.81B
Architecture: WhisperForConditionalGeneration
Config (3 entries)
{
  "architectures": [
    "WhisperForConditionalGeneration"
  ],
  "model_type": "whisper",
  "tokenizer_config": {
    "bos_token": "<|endoftext|>",
    "eos_token": "<|endoftext|>",
    "pad_token": "<|endoftext|>",
    "unk_token": "<|endoftext|>"
  }
}
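As a quick sanity check, the config snippet above can be parsed directly with Python's standard json module. This is a minimal sketch over the three entries shown here, not the model's full config.json on the Hub:

```python
import json

# The config snippet displayed above (3 entries).
CONFIG_JSON = """
{
  "architectures": ["WhisperForConditionalGeneration"],
  "model_type": "whisper",
  "tokenizer_config": {
    "bos_token": "<|endoftext|>",
    "eos_token": "<|endoftext|>",
    "pad_token": "<|endoftext|>",
    "unk_token": "<|endoftext|>"
  }
}
"""

config = json.loads(CONFIG_JSON)
print(config["model_type"])        # whisper
print(config["architectures"][0])  # WhisperForConditionalGeneration

# All four special tokens map to the same <|endoftext|> string.
assert set(config["tokenizer_config"].values()) == {"<|endoftext|>"}
```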
Est. VRAM Required: ~3 GB
Estimation Formula
VRAM (GB) = params (in billions) × 0.6 + 2
Based on FP16 precision.
Note: does not account for KV cache or parallel overhead.
Estimate only; actual requirements may vary.
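The heuristic above is simple enough to reproduce. A minimal sketch, using the page's stated formula; the function name and the round-up to the displayed "~3 GB" are illustrative assumptions:

```python
import math

def estimate_vram_gb(params_billion: float) -> float:
    """Page's heuristic: VRAM (GB) = params (B) * 0.6 + 2, at FP16.

    Ignores KV cache and parallelism overhead, as noted above.
    """
    return params_billion * 0.6 + 2.0

raw = estimate_vram_gb(0.81)    # 0.81 * 0.6 + 2 = 2.486 GB
print(f"~{math.ceil(raw)} GB")  # rounding up gives the displayed ~3 GB
```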
Daily sync (11:00 Beijing time)
Based on open-source metadata snapshot. Last synced: Dec 31, 2025
Limitations & Considerations
- Benchmark scores may vary based on evaluation methodology and hardware configuration.
- VRAM requirements are estimates; actual usage depends on quantization and batch size.
- FNI scores are relative rankings and may change as new models are added.
- License unknown: verify licensing terms before commercial use.
- Source: Hugging Face
Related Resources
Related Papers
No related papers linked yet. Check the model's official documentation for research papers.
Training Datasets
Training data information is not available. Refer to the original model card for details.