4 points | by nvader 2 days ago
2 comments
Windows and macOS both ship a small model for generating text completions. You can write a wrapper around them so your own TUI can access them platform-agnostically.
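A minimal sketch of such a wrapper, dispatching on the host OS. The per-OS backend functions here are hypothetical stubs; each would need its own native binding to the OS's on-device model API.

```python
import platform

def complete(prompt: str, system: str = "") -> str:
    """Route a completion request to a per-OS backend (stubs below)."""
    system = system or platform.system()
    if system == "Darwin":
        return _macos_complete(prompt)    # hypothetical native binding
    if system == "Windows":
        return _windows_complete(prompt)  # hypothetical native binding
    raise NotImplementedError(f"no local model backend for {system}")
```

The dispatch layer keeps the TUI code identical across platforms; only the stubbed backends differ.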
For consistent LLM behaviour across platforms, you can use the Ollama API with your model of choice to generate completions: https://docs.ollama.com/api/generate
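A minimal sketch of calling that endpoint, assuming Ollama is running locally on its default port with some model (here `llama3`, as an example) already pulled:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(model: str, prompt: str) -> bytes:
    # stream=False returns one JSON object instead of a stream of tokens
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # the non-streaming response carries the full completion in "response"
        return json.loads(resp.read())["response"]
```

Swapping the `model` field is all it takes to change behaviour, which is what makes it consistent across machines.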
Chrome has a built-in Gemini Nano too, but there isn't an official way to use it outside Chrome yet.
Is there a Linux-y standard brewing?