• BigWeed [none/use name]@hexbear.net · 4 days ago

    The creator of ELIZA found that people would sneak into his office late at night to have secret conversations with it. They would say the chatbot understood them, even those who knew it was just a program, and he would be asked to leave the room so they could talk to it in private. He found this deeply alarming, and it was one of the reasons he wrote the book; these stories appear in it.

      • BigWeed [none/use name]@hexbear.net · 3 days ago

        Sure, but the argument is that we shouldn’t be so quick to accept technology that has negative consequences. This thread is all about layoffs and the loss of positions for people first entering the labor market, driven by AI speculation and the replacement of labor on low-productivity tasks. This specific technology has consequences, and maybe we shouldn’t be so quick to embrace it with open arms.

        One big theme of the book is that we have a moral obligation to withhold our labor from developing technology that uniquely benefits governments and large corporations. Similarly, you’re defending using AI to ‘stylize text’ even though it disproportionately benefits a Fortune 500 news firm and hurts new entrants to the labor market. The technology is not neutral, so which side are you on?

        • ☆ Yσɠƚԋσʂ ☆@lemmygrad.ml (OP) · 3 days ago

          I mean, any new technology can have negative consequences in a dystopian society. My point is that we should focus on the root causes rather than the symptoms.

          • BigWeed [none/use name]@hexbear.net · 3 days ago

            I hoped the takeaway was not to evangelize the tech, because it hurts people, and to withhold your labor from furthering this kind of work, rather than simply “don’t use it”. People no longer have the option of not using it: the expectation of productivity has gone up, and it’s either use it or be replaced by someone who will.

            • If someone wants to use an AI video editor to make rough cuts of socialist agitprop that they couldn’t otherwise afford to make solo, then I would encourage them to do just that. It’s better than the alternative: burying our heads in the sand and ceding that ground to some right-winger. It’s just a tool.