>a model will probably know its user so well that it will be painful to switch to a different one.
Very keen to avoid this sort of life lock-in personally.
I'd much rather build a local version that is half as capable: use the cloud for inference via API, but keep the core data storage etc. local. I've already made some progress replacing pieces with my own MCP servers.
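Roughly the split I mean, as a minimal sketch (not my actual MCP servers): the "memory" lives in a local SQLite file, and only the prompt ever goes to the hosted model. The file name and call_cloud_model are placeholders for whatever store and inference API you actually use.

    import sqlite3

    DB_PATH = "assistant_memory.db"  # local store; stays on my machine

    def init_db():
        # Everything the model "knows about me" lives in a local SQLite file,
        # so switching cloud providers never means losing accumulated context.
        conn = sqlite3.connect(DB_PATH)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS memory (key TEXT PRIMARY KEY, value TEXT)"
        )
        conn.commit()
        return conn

    def remember(conn, key, value):
        conn.execute("INSERT OR REPLACE INTO memory VALUES (?, ?)", (key, value))
        conn.commit()

    def recall(conn, key):
        row = conn.execute(
            "SELECT value FROM memory WHERE key = ?", (key,)
        ).fetchone()
        return row[0] if row else None

    def call_cloud_model(prompt):
        # Placeholder: swap in whichever hosted inference API you use.
        # Only the assembled prompt crosses the wire, not the whole store.
        raise NotImplementedError("plug in your cloud inference API here")

    def answer(conn, question):
        # Pull only the relevant local context into the prompt before calling out.
        context = recall(conn, "preferences") or ""
        return call_cloud_model(f"Context: {context}\n\nQuestion: {question}")

    if __name__ == "__main__":
        conn = init_db()
        remember(conn, "preferences", "prefers concise answers; works mostly in Python")
        print(recall(conn, "preferences"))

The half-as-capable trade-off is that you have to decide yourself what context to send per request, but the lock-in goes away: the provider only ever sees individual prompts.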
>A.I. companions
Another thing I'm absolutely not doing hosted, and certainly not with the emotional connotations of "companion".
I can see both strategies working commercially, though.
https://archive.is/ndw6X