• Ebby@lemmy.ssba.com
    9 months ago

    I’m not going to comment on the racial bias part as I have no data on any of that, but I’m not sure they use “AI” in any modern sense of the term. It’s basically the same tech that triggers on the activation words of our voice-activated home assistants, but massively scaled.

    I lived in a city with ShotSpotter long before modern AI was popular. As far as I’m aware, it’s simply sound detection and triangulation.
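    For the curious, that kind of triangulation can be sketched as a time-difference-of-arrival (TDOA) search: each microphone hears the bang at a slightly different time, and the source is wherever those differences line up. The sketch below is a toy 2-D grid search; the sensor layout, spacing, and search step are made-up numbers for illustration, not anything ShotSpotter actually uses.

```python
# Hypothetical square layout of 4 acoustic sensors (metres).
SENSORS = [(0.0, 0.0), (400.0, 0.0), (0.0, 400.0), (400.0, 400.0)]
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def arrival_times(source):
    """Time for a sound at `source` to reach each sensor."""
    sx, sy = source
    return [((sx - x) ** 2 + (sy - y) ** 2) ** 0.5 / SPEED_OF_SOUND
            for x, y in SENSORS]

def locate(times, step=5):
    """Brute-force grid search minimising TDOA error.

    Works on differences relative to sensor 0, so the unknown
    emission time (we never know when the shot was fired) cancels.
    """
    tdoa = [t - times[0] for t in times]
    best, best_err = None, float("inf")
    for gx in range(0, 401, step):
        for gy in range(0, 401, step):
            cand = arrival_times((gx, gy))
            cand_tdoa = [t - cand[0] for t in cand]
            err = sum((a - b) ** 2 for a, b in zip(tdoa, cand_tdoa))
            if err < best_err:
                best, best_err = (gx, gy), err
    return best

print(locate(arrival_times((120.0, 275.0))))  # → (120, 275)
```

    A real system would refine this with a proper nonlinear solver and has to contend with echoes and multipath, which is exactly where the false positives mentioned elsewhere in this thread come from.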

    The night it was activated, police fired blanks to calibrate the network. The next morning, an article reported that the system had detected and triangulated 4 additional shots.

    • sylver_dragon@lemmy.world
      9 months ago

      I work in a field where “AI” has been all the rage for the last few years (cybersecurity). In my experience, if a vendor touts that their product uses “AI”, run. Run far, far away. The one thing AI is really good at is turning noisy data into a fuck ton of false positives.

      And I can’t imagine any noisier data than the noise in a city (pun not intended). Cities are a 24x7 cacophony of loud noise, and you expect anything to pick out and triangulate gunshots? Sure, they’re as loud as can be, but that sound also reflects, and there are lots of other loud sounds to deal with.

      And that doesn’t even touch on the problem of unscrupulous police forces using either bad data, or just making shit up about the data, to go harass people. Good riddance to bad rubbish.