WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
WebAssembly binding for llama.cpp — run large language models in the browser with on-device inference.