☆ Yσɠƚԋσʂ ☆@lemmygrad.ml to technology@hexbear.net · 8 days ago
GPT-5: Overdue, overhyped and underwhelming. And that’s not the worst of it. (garymarcus.substack.com)
cross-posted to: technology@lemmy.ml
☆ Yσɠƚԋσʂ ☆@lemmygrad.ml (OP) · 8 days ago
We can look at examples of video-generating models. I’d argue they have to maintain a meaningful and persistent internal representation of the world. Consider something like Genie as an example: https://deepmind.google/discover/blog/genie-3-a-new-frontier-for-world-models/

It doesn’t have volition, but it does have intelligence in the domain of creating consistent simulations. So it does seem like you can get domain-specific intelligence through reinforcement training.