> They took data from so many outside sources that the “state” of the software could not be easily replicated at a later time.
By definition, that's a complex system, and reproducing errors would be equally complex.
A GPT author would produce that kind of irreproducible state for every system. Worse, you would not be able to reproduce bugs in the author itself.
While humans do have bugs that cause them to misunderstand the problem, at least humans are similar enough for us to look at their wrong code and say "Hah, he thought the foobar worked with all frobzes, but it doesn't work with bazzed-up frobzes at all".
IOW, we can point to the reason the bug was written in the first place. With GPT systems it's all opaque - there's no rhyme or reason for why it emitted code that tried to work on bazzed-up frobzes the second time but not the first, or why it alternates between the two seemingly at random ...
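To make the non-determinism concrete, here's a toy sketch of temperature sampling, the mechanism that lets the same prompt yield different tokens on different runs. The token names and probabilities are made up for illustration; none of this reflects any real model's internals:

    import random

    # Toy temperature sampling: the same prompt (probability table) can
    # yield a different next token on each run, so the emitted code is
    # not reproducible. Probabilities below are hypothetical.
    def sample_next_token(probs, temperature=1.0):
        # Sharpen or flatten the distribution by temperature, then draw.
        weights = [p ** (1.0 / temperature) for p in probs.values()]
        return random.choices(list(probs.keys()), weights=weights)[0]

    probs = {"handles_all_frobzes": 0.55, "handles_bazzed_up_frobzes": 0.45}
    for run in range(3):
        print(run, sample_next_token(probs))  # may differ every run

And even at temperature 0, hosted models don't necessarily give you reproducibility in practice: batching effects, floating-point non-determinism, and silent model updates behind the API all get in the way.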