Update README.md
- en
tags:
- code
- llama-cpp
- wheels
- linux
---

# 🏭 Llama-cpp-python Mega-Factory

> **"Stop waiting for `pip` to compile. Just install and run."**

This repository is a high-performance archive of pre-compiled `llama-cpp-python` wheels. Specifically built for **Debian/Ubuntu**, these binaries eliminate "compilation hell" and unlock the full potential of your hardware.
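Picking the right file out of a large wheel archive mostly comes down to reading the filename tags. As a minimal sketch (the filename below is hypothetical; the naming scheme is the standard PEP 427 wheel format), the tags can be pulled apart like this:

```python
def parse_wheel_filename(filename):
    """Split a standard (PEP 427) wheel filename into its tag components.

    Assumes the common 5-part form; a rare optional build tag would add
    a sixth component.
    """
    stem = filename[: -len(".whl")]
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "name": name,
        "version": version,
        "python": python_tag,      # e.g. cp312 == CPython 3.12
        "abi": abi_tag,
        "platform": platform_tag,  # e.g. linux_x86_64
    }


# Hypothetical filename in the style this archive would use:
info = parse_wheel_filename("llama_cpp_python-0.3.16-cp312-cp312-linux_x86_64.whl")
print(info["python"], info["platform"])  # → cp312 linux_x86_64
```

The `cp312` tag must match your interpreter and the platform tag must match your OS/CPU, otherwise `pip` will refuse the wheel and fall back to compiling from source.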

---

## 🚀 Why These Wheels?

Standard wheels target the "lowest common denominator" to avoid crashes on old hardware. This factory uses a massive **Everything Preset** to target specific server-grade instruction sets, maximizing your **Tokens per Second (T/s)**.

* **Zero Dependencies:** No `cmake`, `gcc`, or `nvcc` required on your target machine.
* **Server-Grade Power:** Optimized builds for architectures like `Sapphire Rapids`, `Icelake`, `Alderlake`, and `Haswell`.
* **Full Backend Support:** Pre-configured for `Vulkan`, `OpenBLAS`, `CLBlast`, and `MKL`.
* **Cutting Edge:** Supporting Python versions from `3.10` up to experimental `3.14`.
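To decide which optimization variant to download, check which of these instruction sets your CPU actually exposes. A minimal sketch, assuming Linux (flags are read from `/proc/cpuinfo`) and the feature names the kernel reports (`avx2`, `avx512f`, `avx512_vnni`, `amx_tile`); the preference order here is illustrative:

```python
def pick_optimization(flags):
    """Return the most advanced build variant this CPU can run.

    `flags` is a set of lowercase feature names, e.g. parsed from the
    "flags" line of /proc/cpuinfo on Linux.
    """
    if "amx_tile" in flags:      # Sapphire Rapids class
        return "AMX"
    if "avx512_vnni" in flags:   # Icelake class
        return "VNNI"
    if "avx512f" in flags:       # generic AVX-512 foundation
        return "AVX-512"
    if "avx2" in flags:          # Haswell and newer
        return "AVX2"
    return "Basic (CPU)"


def cpu_flags():
    """Read the CPU feature-flag set from /proc/cpuinfo (Linux only)."""
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()


# Example with a hard-coded flag set; on Linux, use pick_optimization(cpu_flags()).
print(pick_optimization({"avx2", "avx512f", "avx512_vnni"}))  # → VNNI
```

Running a build compiled for an instruction set your CPU lacks crashes with `SIGILL` (illegal instruction), which is exactly why generic wheels avoid these flags.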

---

## 📊 The "Everything" Matrix

Our private distributed build farm currently maintains **3,600+ combinations**:

| Category | Coverage |
| :--- | :--- |
| **Llama-cpp-python** | v0.3.12 - v0.3.16+ |
| **Python Versions** | 3.10, 3.11, 3.12, 3.13, 3.14 |
| **Backends** | `Basic (CPU)`, `OpenBLAS`, `Vulkan`, `CLBlast`, `MKL` |
| **Optimizations** | `AVX-512`, `VNNI`, `AMX`, `AVX2` |

---