• But evolution is actively participated in and directed by the unbroken process of life.

    Yes. And?

    The need to avoid death is prior to the existence of evolution. It can’t be just the result of an imposition on sentient life, because it’s a necessary condition of the autopoietic processes that define life itself, of which evolution is an extension.

    I’m not seeing how this contradicts anything I said. In fact it supports what I said by recognizing the necessity for a directionality that precedes (and is a prerequisite for) any kind of sentient desire or “wants.”

    A replicator that is too effective at replicating can dissolve its environment and destroy the conditions that made its existence possible.

    @purpleworm@hexbear.net addressed this really well and gave a thoughtful, completely correct response. Not much more for me to say on it.

    When the dissipative structures that formed proto-life cordoned off from the world through cell boundaries, it really did become a need to avoid death to continue. It really is a kind of want, not just its appearance (but not mentally, because there is no mind yet) - to maintain tension between the world and itself and propagate itself.

    I think you’re splitting hairs here between ever so slightly different aspects of what I have been calling directionality. Desires or “wants” by definition require a mind capable of having a want or desire. Where you say “it really is a kind of want but not mentally because there is no mind yet,” that’s simply not the kind of “want” we are talking about here: the thing that a self-aware (mind-possessing) AI would have if it were genuinely self-aware and in possession of a mind. Everything else really is just an appearance of want and is a result of what I’ve been calling directionality. What you’re talking about as the mindless “need to avoid death to continue” is still just the mindless, non-intelligent, non-sentient directionality of evolution. And to specifically address this piece:

    to maintain tension between the world and itself and propagate itself.

    But it is part of the world (dialectics ftw!). There is a tension between inside and outside the individual cell (and also a tension between the “self” and “outside the self” of a sentient mind, which is addressed further down, but this is not the same thing as the tension between the cell and the world, as proven by the fact that we aren’t aware of all our cells and frequently kill them by doing such things as scratching), but the cell still isn’t the most basic unit of replication in evolution; that would be the gene: strands of RNA or DNA. Genes (often but not always) use cells as part of the vehicle for their replication, and either way they are still just chemicals reacting with the environment they exist within. There’s no more intentionality behind what they do than there is behind, say, a magnet clinging to a fridge. That magnet does not “want” to cling to your fridge; like genes, it is reacting to its environment, and this will be true regardless of where you draw the boundary between the “self” of the magnet and “the outside world.” To actually desire something the way we are talking about here requires the complexity of a brain capable of producing a mind.

    I don’t think it’s as much from the neurons themselves as it is the whole inference/action dialectic and the world/organism dialectic. […] Self-awareness resulted from real material pressures, actually existing relations between organisms, and the need to distinguish the self and the other for appropriate action

    Agreed. The emergent property of the mind and sentience comes out of the complexity of the interaction of the firing of neurons in a brain and the world they exist within, at least in all likelihood. We still don’t know exactly what produces our ability to experience, or where exactly qualia originate (i.e. why we aren’t just philosophical zombies), but I think most neuroscientists (and philosophers who work on this stuff) would agree, as I do too, that without an outside non-self world for those neurons to interact with, there would be no actual mind; perhaps even that the mind is the drawing of the distinction between self and non-self. But since that complex neural structure could never even begin to come about without that outside world and all the mechanisms of evolution (aside from a Boltzmann brain!), always having to include the phrase “and with the outside world” when describing the neurological origin of qualia and experience is some severe philosophical hair-splitting.

    I’d also argue that the genuine desire to survive as a psychic phenomenon has always existed at least from the first time a neural organism perceived the world, identical to qualia.

    Um, yeah… that’s pretty much what my argument was for the necessity of any genuine AI to have wants and desires: those “wants” would necessarily have had to be built in for it to even become AI.

    It’s not necessary to have self-awareness for that. Want as a mental phenomenon exists prior to self-awareness

    Disagree. Again, if you want to split hairs on exactly where in that ladder of complexity self-awareness arises, or where in the fuzzy chain we can draw a line between organisms capable of self-awareness versus those not, or even exactly what constitutes self-awareness, then feel free. But for a thing to have an actual desire as something genuinely experienced, there must be some sense of selfhood for that experience to happen to.

    • semioticbreakdown [she/her]@hexbear.net

      But since that complex neural structure could never even begin to come about without that outside world and all the mechanisms of evolution (aside from a Boltzmann brain!), always having to include the phrase “and with the outside world” when describing the neurological origin of qualia and experience is some severe philosophical hair-splitting.

      You’d think, but damn, the techbros really have forgotten about this one tbh. I think it’s still very relevant to the topic of AI honestly, because the people who make them keep ignoring that fact. I’ve seen emergent complexity get thrown around as justification for LLM sentience in some circles, and I don’t understand why, when nearly everything in neuroscience and philosophy, as you said, contradicts that. Very frustrating, frankly. Even the term “world model” gets thrown around with LLMs too, and it’s soooo aggravating.

      I agree with your thoughts on directionality; I was just quibbling on evolution and, yeah, splitting hairs really. Like, I have other thoughts on world models and sentience and selfhood, but they’re probably pretty fringe, so I’m not going to share them here.