• Chuck@sh.itjust.works · 1 year ago

    Imagine a mine that could move around, target seek, refuel, rearm, and kill hundreds of people without human intervention. Comparing an autonomous murder machine to a mine is like comparing a flintlock pistol to the fucking Gatling cannon in an A-10.

    • gibmiser@lemmy.world · 1 year ago

      Well, an important point you and he both forget to mention is that mines are considered inhumane. Perhaps that means AI murder machines should also be considered inhumane, and we should just not build them instead of allowing them the way we allowed landmines.

      • livus@kbin.social · 1 year ago

        This. Jesus, we’re still losing limbs and clearing mines from wars that ended decades ago.

        An autonomous field of those is horror movie stuff.

    • Chozo@kbin.social · 1 year ago

      Imagine a mine that could move around, target seek, refuel, rearm, and kill hundreds of people without human intervention.

      Pretty sure the entire DOD got a collective boner reading this.

    • Sterile_Technique@lemmy.world · 1 year ago

      Imagine a mine that could move around, target seek, refuel, rearm, and kill hundreds of people without human intervention. Comparing an autonomous murder machine to a mine is like comparing a flintlock pistol to the fucking Gatling cannon in an A-10.

      For what it’s worth, there’s footage on YouTube of drone swarm demonstrations posted 6 years ago. The military doesn’t typically release footage of its cutting-edge tech to the public, so that demonstration was likely of a product already going obsolete; and the 6 years since have brought lightning-fast developments in things like facial recognition. At this point I’d be surprised if we weren’t already at the very least field testing the murder machines you described.

    • FaceDeer@kbin.social · 1 year ago

      Imagine a mine that could recognize “that’s just a child/civilian/medic stepping on me, I’m going to save myself for an enemy soldier.” Or a mine that could recognize “ah, CENTCOM just announced a ceasefire, I’m going to take a little nap.” Or “the enemy soldier that just stepped on me is unarmed and frantically calling out that he’s surrendered, I’ll let this one go through. Not the barrier troops chasing him, though.”

      There’s opportunities for good here.

      • key@lemmy.keychat.org · 1 year ago

        Maybe it starts that way, but once that’s accepted as a thing, the result will be increased usage of mines. Where before there were too many civilians to consider using mines, now the soldiers say “it’s smart, it won’t blow up children” and put down more and more in more dangerous situations. And maybe those mines have only a 0.1% failure rate in tested situations but a 10% failure rate over the course of decades. Usage increases tenfold, and you quickly end up with a lot more dead kids.

        Plus it won’t just be mines; it’ll be automated turrets where previously there were none, or even more drone strikes with less oversight required, because the automated system is supposed to prevent unintended casualties.

        Availability drives usage.