
Yes but the critical difference is that AI couldn’t pass these tests if they weren’t trained on a very similar set of questions and answers.

Not every person takes a SAT prep class to improve their test score. There are lots of people who are truly above average in terms of intelligence and can score very high on the first try.



> Yes but the critical difference is that AI couldn’t pass these tests if they weren’t trained on a very similar set of questions and answers.

First, I don't think that's strictly true. Obviously they wouldn't do as well but they still do better than chance.

Second, there's evidence that this is a big part of the Flynn effect, which means humans are susceptible to a similar phenomenon.


“Obviously they wouldn't do as well but they still do better than chance” is a huge understatement. If a model wasn't trained with any SAT questions and answers but was trained with the same verbal and mathematical knowledge a high school student would have, the AI would do extremely poorly on an actual test. In contrast, the vast majority of human test takers would score leagues above picking answers by chance.

Your original reply insinuated that AI learns in a way very similar to how humans do, and that's just not true. Yes, humans do pattern matching based on prior experiences/knowledge like AI does when you train a model, but human intelligence goes way beyond that.


> Yes, humans do pattern matching based on prior experiences/knowledge like AI does when you train a model, but human intelligence goes way beyond that.

Humans are trained on orders of magnitude more multimodal data over their lifetimes. Also, humans are not born as an unbiased model; billions of years of evolution have crafted many implicit biases into our cognition (like a propensity for language, facial recognition, etc.). All machine learning models are true blank slates, so it takes a lot more data just to build up to the same starting point as a newborn human.

All that's to say that you have no basis upon which to claim that AI learning is NOT similar to how humans do it, or that human intelligence "goes way beyond that"; it's just that humans have a head start and a lot more data to work with.


“Humans are trained on orders of magnitude more multimodal data over their lifetimes. Also, humans are not born as an unbiased model; billions of years of evolution have crafted many implicit biases into our cognition (like a propensity for language, facial recognition, etc.).”

What in the world are you talking about? I must be talking to ChatGPT and am done with this thread. We were originally discussing the differences in methodology between AI and humans for passing standardized exams. Those involve tasks like applying well-defined mathematical concepts to a brand new problem, not “multimodal” data or facial recognition.


If you don't understand what I'm talking about, then you don't understand how these transformer AIs learn and solve problems, so maybe you shouldn't opine about how AI couldn't pass these tests. Your claimed differences between how humans and AIs work are conjecture that can be explained by what I described rather than by fundamental differences in how these systems work.


Here's a simple example: (x - x + c) = c. GPT struggles with such examples if `x` is a large number, e.g., x = 123_000_000_456, and `c` is some specific number, but it is easy for humans.
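
A minimal sketch of that probe in Python, assuming a made-up value for `c` (the comment only says "some specific number"): the identity holds exactly for any integer x, which is what makes it trivial for a person yet a reported stumbling block for GPT.

    # For any x, (x - x + c) reduces to c, no matter how many digits x has.
    # c = 7 is a hypothetical placeholder; the comment only says
    # "some specific number".
    x = 123_000_000_456
    c = 7
    result = x - x + c
    print(f"({x} - {x} + {c}) = {result}")  # prints the value of c: 7
    assert result == c  # exact integer arithmetic makes this trivial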


It's easy for humans once they're taught what variables mean and after 10 or so years of exposure to a real-world multimodal training set that's orders of magnitude more data than GPT has seen. Also, algebra is not so easy for people with IQs lower than 90, so not exactly all humans, right? What exactly am I supposed to infer about how GPT or other AIs and human brains operate from this apples-to-cars comparison?

You don't have to point out failure modes of GPT, I know what they are. The question we're discussing here is what this indicates, if anything, about how these systems operate as compared to human brains, and whether the differences come down to training data or the fundamental architecture.


ChatGPT is not the only AI in the world.



