Hyperbolic

hyperbolic.xyz
Platform Stats
Total Models: 9
Organizations: 3
Verified Benchmarks: 0
Multimodal Models: 1
Pricing Overview
Avg Input (per 1M tokens): $1.08
Avg Output (per 1M tokens): $1.08
Cheapest Model: $0.10/1M
Premium Model: $4.00/1M
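Per-1M-token pricing scales linearly with usage, so estimating a bill is a single multiplication. The sketch below is a hypothetical helper (not part of Hyperbolic's API) that assumes one flat rate applies to all tokens in a request; for models with separate input and output rates, run it once per rate and sum.

```python
def estimate_cost(tokens: int, price_per_million: float) -> float:
    """Estimate the dollar cost of processing `tokens` at a per-1M-token rate."""
    return tokens / 1_000_000 * price_per_million

# 120,000 tokens through the cheapest listed model ($0.10 per 1M tokens):
cheapest = estimate_cost(120_000, 0.10)  # 0.012 -> about 1.2 cents
# The same volume through the premium model ($4.00 per 1M tokens):
premium = estimate_cost(120_000, 4.00)   # 0.48 -> 48 cents
```

The 40x spread between the cheapest and premium rates is why the pricing-distribution buckets below matter when choosing a default model.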
Supported Features
Number of models supporting each feature
Web Search: 0
Function Calling: 9
Structured Output: 9
Code Execution: 0
Batch Inference: 9
Fine-tuning: 0
Input Modalities
Models supporting different input types
Text: 9 (100%)
Image: 1 (11%)
Audio: 0 (0%)
Video: 0 (0%)
Models Overview
Top performers and pricing distribution

Pricing Distribution

Input pricing per 1M tokens
$0-1: 6 models
$1-5: 3 models

Top Performing Models

By benchmark avg
#1 Llama 3.3 70B Instruct: 79.9%
#2 Llama 3.1 405B Instruct: 79.2%
#3 Qwen2.5 72B Instruct: 77.4%
#4 Llama 3.1 70B Instruct: 74.7%
#5 Llama 3.2 90B Instruct: 71.3%

Most Affordable Models

Llama 3.1 8B Instruct: $0.10/1M
QwQ-32B-Preview: $0.20/1M
Qwen2.5-Coder 32B Instruct: $0.20/1M

Available Models

9 models available through Hyperbolic

DeepSeek-V2.5 (DeepSeek)
DeepSeek-V2.5 is an upgraded version that combines DeepSeek-V2-Chat and DeepSeek-Coder-V2-Instruct, integrating general and coding abilities. It better aligns with human preferences and has been optimized in various aspects, including writing and instruction following.
May 8, 2024
deepseek
Benchmark scores: 16.8%, -, 89.0%, -, -
Llama 3.1 8B Instruct (Meta)
Llama 3.1 8B Instruct is a multilingual large language model optimized for dialogue use cases. It features a 128K context length, state-of-the-art tool use, and strong reasoning capabilities.
Jul 23, 2024
Llama 3.1 Community License
Benchmark scores: -, -, 72.6%, -, -
QwQ-32B-Preview (Alibaba)
An experimental research model focused on advancing AI reasoning capabilities, particularly excelling in mathematics and programming. Features deep introspection and self-questioning abilities while having some limitations in language mixing and recursive reasoning patterns.
Nov 28, 2024
Apache 2.0
Benchmark scores: -, -, -, 50.0%, -
Qwen2.5-Coder 32B Instruct (Alibaba)
Qwen2.5-Coder is a specialized coding model trained on 5.5 trillion tokens of code data, supporting 92 programming languages with a 128K context window. It excels in code generation, completion, repair, and multi-language programming tasks while maintaining strong performance in mathematics and general capabilities.
Sep 19, 2024
Apache 2.0
Benchmark scores: -, -, 92.7%, 31.4%, 90.2%
Llama 3.3 70B Instruct (Meta)
Llama 3.3 is a multilingual large language model optimized for dialogue use cases across multiple languages. It is a pretrained and instruction-tuned generative model with 70 billion parameters, outperforming many open-source and closed chat models on common industry benchmarks. Llama 3.3 supports a context length of 128,000 tokens and is designed for commercial and research use in multiple languages.
Dec 6, 2024
Llama 3.3 Community License Agreement
Benchmark scores: -, -, 88.4%, -, -
Qwen2.5 72B Instruct (Alibaba)
Qwen2.5-72B-Instruct is an instruction-tuned 72 billion parameter language model, part of the Qwen2.5 series. It is designed to follow instructions, generate long texts (over 8K tokens), understand structured data (e.g., tables), and generate structured outputs, especially JSON. The model supports multilingual capabilities across over 29 languages.
Sep 19, 2024
Qwen
Benchmark scores: -, -, 86.6%, 55.5%, 88.2%
Llama 3.1 70B Instruct (Meta)
Llama 3.1 70B Instruct is a large language model optimized for multilingual dialogue use cases. It outperforms many available open source and closed chat models on common industry benchmarks.
Jul 23, 2024
Llama 3.1 Community License
Benchmark scores: -, -, 80.5%, -, -
Llama 3.2 90B Instruct (Meta)
Llama 3.2 90B is a large multimodal language model optimized for visual recognition, image reasoning, and captioning tasks. It supports a context length of 128,000 tokens and is designed for deployment on edge and mobile devices, offering state-of-the-art performance in image understanding and generative tasks.
Sep 25, 2024
Llama 3.2
Benchmark scores: -, -, -, -, -
Llama 3.1 405B Instruct (Meta)
Llama 3.1 405B Instruct is a large language model optimized for multilingual dialogue use cases. It outperforms many available open source and closed chat models on common industry benchmarks. The model supports 8 languages and has a 128K token context length.
Jul 23, 2024
Llama 3.1 Community License
Benchmark scores: -, -, 89.0%, -, -