LLM local inference on a consumer-grade machine is now possible with the latest Qwen-3.5 model and the llama-cpp library.
In this stream, I wrote personal notes while learning about Large Language Models (LLMs).
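The claim above is easy to try yourself. Below is a minimal sketch using the llama-cpp-python bindings, assuming you have already downloaded a GGUF quant of a Qwen model; the file path and prompt are illustrative, not official artifacts.

```python
# Minimal sketch: local chat inference with llama-cpp-python.
# Assumes a GGUF quant of a Qwen model is already on disk
# (the path below is hypothetical).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/qwen-instruct-q4_k_m.gguf",  # hypothetical local file
    n_ctx=4096,        # context window size
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain KV caching in one paragraph."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```

On a consumer machine the main knobs are the quantization level of the GGUF file and n_gpu_layers; a 4-bit quant typically fits a mid-size model into common desktop RAM or VRAM.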
Quickly testing DeepSeek R1 on your local machine
Reproducible Jupyter notebook sharing using Nix and VSCode. Useful for sandboxing.