🧠 Model

Kokoro-82M by hexgrad
ID: hf-model--hexgrad--kokoro-82m
Scale: 0.082B · Downloads: 4.3M · FNI Rank: 35 · Percentile: Top 0% · Activity: 0.0%

**Kokoro** is an open-weight TTS model with 82 million parameters. Despite its lightweight architecture, it delivers comparable quality to larger models while being significantly faster and more cost-efficient. With Apache-licensed weights, Kokoro can be deployed anywhere from production environment...

Audited: 35 FNI Score
Tiny: 0.082B Params
Context: n/a
Hot: 4.3M Downloads
8G GPU: ~2GB est. VRAM
Model Information Summary
Entity Passport
Registry ID hf-model--hexgrad--kokoro-82m
Provider huggingface
💾

Compute Threshold

~1.4GB VRAM


* Static estimate assuming 4-bit quantization.

📜

Cite this model

Academic & Research Attribution

BibTeX
@misc{hf_model__hexgrad__kokoro_82m,
  author = {hexgrad},
  title = {Kokoro-82M Model},
  year = {2026},
  howpublished = {\url{https://huggingface.co/hexgrad/Kokoro-82M}},
  note = {Accessed via Free2AITools Knowledge Fortress}
}
APA Style
hexgrad. (2026). Kokoro-82M [Model]. Free2AITools. https://huggingface.co/hexgrad/Kokoro-82M

🔬 Technical Deep Dive

Full Specifications

⚡ Quick Commands

🦙 Ollama Run
ollama run kokoro-82m
🤗 HF Download
huggingface-cli download hexgrad/Kokoro-82M

βš–οΈ Free2AI Nexus Index

Methodology → 📘 What is FNI?
35.0
Top 0% Overall Impact
🔥 Popularity (P) 0
🚀 Velocity (V) 0
🛡️ Credibility (C) 0
🔧 Utility (U) 0
Nexus Verified Data

💬 Why this score?

The Nexus Index for Kokoro 82m aggregates Popularity (P:0), Velocity (V:0), and Credibility (C:0). The Utility score (U:0) represents deployment readiness, context efficiency, and structural reliability within the Nexus ecosystem.
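The aggregation described above can be sketched as a weighted average. The weights below are purely illustrative — the actual FNI weighting is not disclosed on this card — but the sketch makes one point checkable: with all four sub-scores at 0, any weighted average of P/V/C/U is 0, so the headline 35.0 must come from components beyond these four.

```python
# Illustrative FNI-style aggregation. The weights are hypothetical;
# the real Nexus Index methodology is not published on this card.
def nexus_index(p, v, c, u, weights=(0.35, 0.15, 0.25, 0.25)):
    """Combine Popularity, Velocity, Credibility, and Utility
    (each on a 0-100 scale) into one composite via a weighted sum."""
    return sum(w * s for w, s in zip(weights, (p, v, c, u)))

# All four sub-scores shown on this card are 0:
print(nexus_index(0, 0, 0, 0))  # 0.0
```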

Data Verified 🕐 Last Updated: Not calculated
Free2AI Nexus Index | Fair · Transparent · Explainable | Full Methodology
---


πŸ“ Limitations & Considerations

• Benchmark scores may vary based on evaluation methodology and hardware configuration.
• VRAM requirements are estimates; actual usage depends on quantization and batch size.
• FNI scores are relative rankings and may change as new models are added.
• Source: Unknown
Top Tier

Social Proof

HuggingFace Hub
5.4K Likes
4.3M Downloads
🔄 Daily sync (03:00 UTC)

AI Summary: Based on Hugging Face metadata. Not a recommendation.

📊 FNI Methodology · 📚 Knowledge Base · ℹ️ Verify with original source

πŸ›‘οΈ Model Transparency Report

Verified data manifest for traceability and transparency.

100% Data Disclosure Active

🆔 Identity & Source

id: hf-model--hexgrad--kokoro-82m
source: huggingface
author: hexgrad
tags: text-to-speech, en, arxiv:2306.07691, arxiv:2203.02395, base_model:yl4579/styletts2-ljspeech, base_model:finetune:yl4579/styletts2-ljspeech, doi:10.57967/hf/4329, license:apache-2.0, region:us

βš™οΈ Technical Specs

architecture: null
params (billions): 0.082
context length: null
pipeline tag: text-to-speech
vram (gb): 1.4
vram is estimated: true
vram formula: VRAM ≈ (params * 0.75) + 0.8GB (KV) + 0.5GB (OS)
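The card's VRAM heuristic can be checked directly. The coefficients below are the site's, not independently derived: 0.75 GB per billion parameters for quantized weights, plus fixed allowances for the KV cache and runtime overhead. For 0.082B parameters this reproduces the stated ~1.4 GB figure.

```python
# Sketch of the card's stated VRAM heuristic:
#   VRAM ≈ (params_in_billions * 0.75) + 0.8 GB (KV) + 0.5 GB (OS)
# Coefficients are taken verbatim from the card's "vram formula" field.
def estimate_vram_gb(params_billion: float) -> float:
    weights_gb = params_billion * 0.75  # quantized weights allowance
    kv_cache_gb = 0.8                   # fixed KV-cache allowance
    overhead_gb = 0.5                   # fixed OS/runtime overhead
    return round(weights_gb + kv_cache_gb + overhead_gb, 1)

print(estimate_vram_gb(0.082))  # 1.4
```

Note that for a model this small the fixed 1.3 GB of cache and overhead dominates the estimate; the weights themselves contribute only ~0.06 GB.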

📊 Engagement & Metrics

likes: 5,378
downloads: 4,344,156

Free2AITools Constitutional Data Pipeline: Curated disclosure mode active. (V15.x Standard)