PromptForge
framework · LLM · inference-optimization · microsoft · quantization

BitNet

Microsoft's official inference framework for 1-bit LLMs. It drastically reduces deployment costs and hardware requirements, enabling efficient execution of large models on CPUs.

50 views · 33,231 stars · 3/13/2026

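To illustrate the idea behind 1-bit (more precisely, 1.58-bit ternary) weight quantization, here is a minimal sketch of absmean quantization, where each weight is scaled by the mean absolute value and rounded into {-1, 0, 1}. This is an illustrative NumPy sketch, not BitNet's actual optimized CPU kernels; the function name and epsilon are assumptions.

```python
import numpy as np

def quantize_ternary(W, eps=1e-6):
    """Quantize a weight matrix to ternary values {-1, 0, 1} via absmean scaling."""
    # gamma is the mean absolute weight; it becomes the per-tensor scale factor.
    gamma = np.abs(W).mean()
    # Scale, round to the nearest integer, and clip into the ternary set.
    Wq = np.clip(np.round(W / (gamma + eps)), -1, 1).astype(np.int8)
    return Wq, gamma

W = np.array([[0.8, -0.05, -1.2],
              [0.3,  1.1,  -0.4]])
Wq, gamma = quantize_ternary(W)
# Wq holds only -1, 0, or 1; Wq * gamma approximates W.
```

Because the quantized weights are ternary, matrix multiplication reduces to additions and subtractions (no weight multiplications), which is what makes CPU inference at this precision practical.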