It’s wrong to frame this as an either/or; neither position is really true, and thinking in those terms is a trap. I’ve never believed GPT was just regurgitating, but it also still doesn’t actually understand, and likely never will. Even with step-by-step thinking prompts it often gets simple things wrong, and just because the number of things it gets wrong keeps shrinking doesn’t mean it understands any of it. I think people who ascribe “understanding” and actual cognition to it are experiencing a bit of pareidolia. Perhaps for them what GPT does passes for true cognition, but many of us have a much higher standard. For me it’s not there yet. Maybe it will be eventually, but I’m not about to ascribe properties to a system just because it appears to have them.