andoando | 2 days ago | on: History LLMs: Models trained exclusively on pre-19...
This is my curiosity as well. It would be a great test of how intelligent LLMs actually are: can they follow a completely logical train of thought and invent something totally outside their learned scope?
int_19h | 2 days ago
You definitely won't get that out of a 4B model, though.
raddan | 2 days ago
Brilliant. I love this idea!