Tags: model, on-device-ai, llm-inference, edge-ai, google, cpp
LiteRT-LM - Google On-Device LLM Inference Engine
On-device LLM inference engine by Google AI Edge team, built in C++ for efficient large language model execution on mobile and edge devices.
3 views · 1305 stars · 4/5/2026