Super cool. Big fan of LLMs for explaining code/helping debug as opposed to autocomplete.
Have you tried playing with the prompt? Curious whether something as silly as "how would an experienced engineer fix this error: <stacktrace>" would make a difference.
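Something like this, purely as a hypothetical sketch of the two phrasings (the function and wording below are made up, not the app's actual prompt):

    # Hypothetical prompt builder -- only illustrates the persona-prefix idea above.
    def build_prompt(stacktrace: str, persona: bool = True) -> str:
        if persona:
            return f"How would an experienced engineer fix this error?\n\n{stacktrace}"
        return f"Fix this error:\n\n{stacktrace}"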
Just rebuilt the app with more examples, so the queue should stay a lot shorter now. This was built using Gradio[0] and is hosted on Spaces[1]. Check out the paper[2] and github repo[3].
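(Not the actual source, just a minimal sketch of that Gradio pattern; the handler name and example text here are made up.)

    import gradio as gr

    # Hypothetical handler -- the real app's logic lives in the linked repo.
    def explain_error(stacktrace: str) -> str:
        return f"Explanation for: {stacktrace[:80]}"

    demo = gr.Interface(
        fn=explain_error,
        inputs=gr.Textbox(lines=8, label="Paste a stack trace"),
        outputs=gr.Textbox(label="Explanation"),
        # Clickable examples rendered under the inputs; clicking one pre-fills the box.
        examples=["ZeroDivisionError: division by zero"],
    )

    # queue() serializes incoming requests; Spaces runs launch() on its own server.
    demo.queue().launch()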
They are the numbers at the bottom left of the interface, under the 'clear' button. Try clicking one and it will auto-fill the input; then you just click submit. Or what do you see?
In particular, one of the most useful pieces of feedback we heard early on was to add support for rendering the GUI in Jupyter/Colab notebooks. It's an extremely common use case (and one that also let us distinguish ourselves from some competitors).
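(A minimal sketch, assuming this refers to Gradio's notebook support: in a Jupyter/Colab cell, launch(inline=True) embeds the GUI directly in the cell output.)

    import gradio as gr

    def greet(name: str) -> str:
        return f"Hello, {name}!"

    demo = gr.Interface(fn=greet, inputs="text", outputs="text")
    # inline=True renders the interface inside the notebook instead of only at a localhost URL.
    demo.launch(inline=True)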