
Yeah! I want to use the logprobs API, but you can't, for example:

- sample several candidate tokens from the logits and branch (maybe we could with the old text completions API, but that no longer exists)

- add in a reasoning token on the fly

- stop execution, ask the user, etc.

But a visualization of logprobs in a query seems like it might be useful.
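
A minimal sketch of what that could look like, assuming the current OpenAI Python SDK (the model name is just a placeholder):

    from openai import OpenAI
    import math

    client = OpenAI()

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": "What is the colour of love?"}],
        logprobs=True,
        max_tokens=30,
    )

    # One entry per sampled token: the token text and its log probability.
    for tok in resp.choices[0].logprobs.content:
        bar = "#" * round(math.exp(tok.logprob) * 20)  # crude confidence bar
        print(f"{tok.token!r:<15} {bar}")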



Can't you?

1- The top_logprobs option lets you get not just the single most likely token, but the top several most likely tokens at each position (see the sketch after this list).

2- You can branch by choosing any point in your generated string and feeding it back to the LLM, for example: { "user": "what is the colour of love?", "assistant": "the colour of love is" }

It's true that this will add an "assistant" tag, and the old completions API was better for this.
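
Roughly, both points in one sketch, again assuming the OpenAI Python SDK (the model name and fork position are placeholders, and the assistant-tag caveat above means the continuation may not splice cleanly):

    from openai import OpenAI

    client = OpenAI()
    question = {"role": "user", "content": "what is the colour of love?"}

    # 1- get the top 5 alternatives at every generated position
    first = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder
        messages=[question],
        logprobs=True,
        top_logprobs=5,
        max_tokens=20,
    )
    tokens = first.choices[0].logprobs.content
    for alt in tokens[0].top_logprobs:
        print(f"{alt.token!r}: {alt.logprob:.2f}")

    # 2- branch: keep a prefix, swap in the second-best token at some
    #    position, and feed it back as a partial assistant message
    fork_at = 5  # arbitrary fork point; assumes more than 5 tokens came back
    prefix = "".join(t.token for t in tokens[:fork_at])
    prefix += tokens[fork_at].top_logprobs[1].token
    branch = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[question, {"role": "assistant", "content": prefix}],
        max_tokens=50,
    )
    print(prefix + branch.choices[0].message.content)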



