You don't understand Yann's argument. It's similar to Richard Sutton's: these systems aren't thinking, they're emulating thinking, and the weak implicit world models built into the weights are insufficient for true "AGI."
This is orthogonal to the issue of whether all ideas are essentially "remixes." For the record, I agree that they are.