It depends on the scale you're talking about, but local LLMs are fairly capable of translating between programming languages, particularly if you're not too concerned about external library support.
Pasting a function in and asking for it in a different programming language will get you an implementation in the target language. Running `ollama run llama2:13b` on my Mac handles this kind of conversion.
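If you'd rather script it than paste into the REPL, here's a rough sketch against Ollama's local HTTP API (assuming the server is running on its default port 11434; the model name and prompt wording are just examples):

```python
import json
import urllib.request

def translate_function(source_code: str, target_lang: str, model: str = "llama2:13b") -> str:
    """Ask a locally running Ollama model to translate a function to target_lang."""
    prompt = (
        f"Translate the following function to {target_lang}. "
        f"Return only the code.\n\n{source_code}"
    )
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # single JSON response instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    python_src = "def add(a, b):\n    return a + b\n"
    print(translate_function(python_src, "Rust"))
```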
It might not be the best code, but this is true of machine translations as well.
> Running `ollama run llama2:13b` on my Mac handles this kind of conversion
okay, i'll have to try this out
> It might not be the best code, but this is true of machine translations as well.
true, one thing i'm hoping for is a local version of Copilot/ChatGPT for code that's trained on the local libraries and the local project (and can translate between them in some scenarios)