people most often believe what they want to believe, so what this says to me is that even though he is very smart, he wants to believe we have figured out/are on the very cusp of figuring out AGI, so he is much more inclined to take evidence that supports what he wants to believe (OMG LOOK THEY PASSED THE TURING TEST!?!?!?) at face value rather than thinking about whether the turing test is actually even a valid way of determining this kind of thing or not.
also, even more cynically, agi being around the corner would increase his own station as an AIvangelist and tech person of note (more TED-style talks in front of other bespectacled 30-50 year olds)
yeah he could always be literally lying to make more money if he’s involved in the industry lol