2 points | by lidangzzz 6 hours ago
4 comments
If you want to save tokens, how about using a source language in the APL family?
APL may save a lot of tokens for computation-heavy tasks, but probably not for more general work such as backend development.
Also, I suspect LLMs don't have much APL code in their training data, which could be a big problem.
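Whether APL actually saves tokens depends heavily on how a given tokenizer treats its glyphs, so it's worth measuring rather than assuming. Here is a minimal sketch (not from the thread, counts purely illustrative) using OpenAI's tiktoken to compare token counts of the classic APL "mean" train against a Python equivalent:

    import tiktoken

    # Compare token counts for the same computation written in APL and Python.
    # (+⌿÷≢)x is the classic APL "mean" train; results depend on the tokenizer.
    enc = tiktoken.get_encoding("cl100k_base")

    snippets = {
        "APL":    "(+⌿÷≢)x",
        "Python": "sum(x) / len(x)",
    }

    for label, src in snippets.items():
        print(label, len(enc.encode(src)), "tokens")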
LLMs are still very good at popular languages, so moving to APL for general tasks is probably a bad choice.
As for how much APL there is in the training set: yes, that's a real issue. Still, it's worth watching how well MoonBit [1, 2] works with LLMs while facing exactly the same problem -- it integrates the LLM directly into the parser pipeline.
1: https://www.youtube.com/watch?v=SnY0F9w1xdM 2: https://www.moonbitlang.com/blog/moonbit-ai
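For readers unfamiliar with the idea, "integrating the LLM into the parser pipeline" generally means constraining generation so syntactically invalid continuations are rejected as they are produced. The linked MoonBit post describes their actual approach; the sketch below is only a generic illustration of constrained decoding, with next_token_candidates and accepts_prefix as hypothetical stand-in names, not MoonBit's API:

    # Hypothetical sketch of grammar-constrained decoding -- the general idea
    # behind hooking an LLM into a parser pipeline. next_token_candidates and
    # accepts_prefix are stand-in names, NOT MoonBit's actual API.
    def generate_constrained(model, parser, prompt, max_tokens=256):
        output = ""
        for _ in range(max_tokens):
            # Model proposes candidate next tokens, ranked by probability.
            candidates = model.next_token_candidates(prompt, output)
            for token in candidates:
                # Keep the first candidate the parser accepts as a valid prefix,
                # so the generated code never goes syntactically off the rails.
                if parser.accepts_prefix(output + token):
                    output += token
                    break
            else:
                break  # no syntactically valid continuation found
        return output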
Hongbo blocked me on Twitter, lol