Ban the Scan Townhall - Shared screen with speaker view
Lisa Jackson
40:36
Hi Ryan, Aidan, Michelle, Molly, Travis and everyone!
Molly Kleinman (she/her)
41:53
What is necessary to make sure everyone, not just some, feels unsafe?
John Smith
42:29
Government Propaganda!
Mike
42:29
lack of accountability
John Smith
43:46
Being Gaslighted
John Smith
43:52
Lack of lighting at night
John Smith
44:00
Feeling unseen
John Smith
44:16
Economic issues: housing, food insecurity
John Smith
44:55
Almost all of them.
Natalie Holbrook
44:55
feeling like you can’t call on anyone if there is a real emergency, because the folks who are supposed to make you “safe” might harm you or your loved ones
John Smith
45:03
Unlawful policing
Ryan Henyard - he/him
45:33
it's a whole other world
Viva Rosenfeld
46:05
Lack of health care
John Smith
47:36
Internet Data Collection by Private Companies
Ryan Henyard - he/him
58:08
https://www.redefinesafety.org/
Ryan Henyard - he/him
58:18
^Detroit Safety Team website
Aidan Sova
59:55
True!
Kwame Hooker
01:04:08
Psychological safety vs. physical safety and the threat of future trauma due to both.
John Smith
01:07:19
If he had said he could only speak to the police through an attorney when he was called, would that have made a big difference?
CM Travis Radina (he|him|his)
01:07:26
Nash mentioned the need to protect against both governmental AND CORPORATE surveillance. As we explore a local ban, the model ordinance we’ve seen from the ACLU focuses mostly on government use of facial recognition technology. Have any of you seen municipal examples that push into the sphere prohibiting the use of this technology by other entities in the city (apartment buildings, businesses, etc.)?
John Smith
01:08:15
I believe anybody can submit a picture to Google and have it scan for a name.
chris w (rt4mn)
01:08:16
Portland takes an interesting approach.
Kwame Hooker
01:09:30
The empirical evidence is stark. A groundbreaking study published in December by the US-based National Institute of Standards and Technology, which analysed 189 software algorithms from 99 developers – the majority of the industry – saw higher rates of inaccuracy for Asian and African-American faces relative to images of Caucasians, often by a factor of ten to one hundred times.
Ryan Henyard - he/him
01:09:32
major AI vendors also broke many user agreements on websites like flickr to include their photos in scans
chris w (rt4mn)
01:09:45
@johnsmith and amazon lets anyone use their rekognition frt platform as well.
Kwame Hooker
01:09:55
https://www.raconteur.net/technology/biometrics-ethics-bias/
John Smith
01:09:58
https://support.google.com/photos/answer/6128838?co=GENIE.Platform%3DAndroid&hl=en
John Smith
01:10:07
Search by people, things & places in your photos
Ryan Henyard - he/him
01:12:43
Louise Seamster has a concept called Predatory Inclusion - training with more marginalized folks will technically be inclusion, but towards predatory means
Linh Song
01:13:03
https://www.ajl.org/ FYI on the Algorithmic Justice League, the org connected to the film "Coded Bias."
Phil (he/him) - ACLU
01:13:10
^^^to Ryan's comment
Kwame Hooker
01:15:04
Should BIPOC contact the ACLU if we are detained because of faulty facial recognition?
John Smith
01:15:18
If any group had a 34% error rate, it really raises the question of what type of testing was done on this software! Who approved the test plan? These error rates should be clearly documented for any user to know.
Phil (he/him) - ACLU
01:16:04
Absolutely. If you believe you were wrongfully detained by FR technology (or know someone who has), please have them fill out an intake form on our website. https://intake.aclumich.org/
John Smith
01:17:11
If the cops want to talk to you, always tell them you can only answer their questions through an attorney!
Ryan Henyard - he/him
01:18:05
The Detroit Digital Justice Coalition released a report a couple years ago on the issues with Project Greenlight, if you're interested in diving more into that surveillance tech use in that city - https://detroitcommunitytech.org/?q=content/critical-summary-detroit%E2%80%99s-project-green-light-and-its-greater-context
Ryan Henyard - he/him
01:23:44
these aren't only municipally owned surveillance cameras; many law enforcement agencies are taking video from consumer devices like Ring cameras and running it through facial recognition systems
Ryan Henyard - he/him
01:24:04
and the cloud footage from those cameras can be obtained via subpoena, no strings attached
Lisa Jackson she/her
01:25:28
The confirmation bias is particularly insidious because most people are unaware that they are being influenced by this heuristic.
Mike (cryptoparty A2)
01:26:00
companies will sometimes cooperate with law enforcement even without orders like subpoenas
John Smith
01:26:08
What if you cut off the funding? This technology is only available through a vendor.
Kwame Hooker
01:26:29
I'm sure owners of private prisons are investing heavily in facial recognition technology in municipalities.
Ryan Henyard - he/him
01:26:41
@John Smith - it's honestly not enough to cut off funding, because law enforcement can acquire federal grants to cover those purchases
Ryan Henyard - he/him
01:27:14
@John Smith also, as a technologist I've regularly seen vendors offer free tech opportunities as a loss leader to help make the sale
John Smith
01:27:25
@Ryan Henyard, you prohibit them from spending any money on this technology.
Molly Kleinman (she/her)
01:27:51
@John Smith, let’s focus on the conversation the speakers are having right now.
John Smith
01:27:59
The loss leader is only for a period of time because the vendor thinks you will pay later.
Ryan Henyard - he/him
01:30:34
without a ban on its explicit use, the tech finds its way in; the Defense Department 1033 program, which we often hear about being used to acquire Humvees and weaponry, can also be used to acquire military surveillance tech for free through the federal program
Ryan Henyard - he/him
01:30:48
(more info at https://www.eff.org/deeplinks/2021/01/end-two-federal-programs-fund-police-surveillance-tech , from January)
John Smith
01:38:05
If you go to a protest, don't take your phone!
John Smith
01:41:01
I have to be honest, I feel like Facial Recognition tech is only the tip of the iceberg! It is important to oppose it in its own right but it is not enough.
Ryan Henyard - he/him
01:41:34
we've got an opportunity to do this before it's too embedded
John Smith
01:42:35
Agree!
John Smith
01:45:45
Informed consent would shut down Google?
Ryan Henyard - he/him
01:46:05
you can read about the $650M settlement facebook paid this year to Illinois residents for violating the Illinois biometric privacy law https://apnews.com/article/technology-business-san-francisco-chicago-lawsuits-af6b42212e43be1b63b5c290eb5bfd85
John Smith
01:46:06
I like it if true!
CM Travis Radina (he|him|his)
01:46:14
Thank you, Nash. Incredibly helpful!
nash (he|ze) EFF
01:47:21
My pleasure
Molly Kleinman (she/her)
01:50:22
Sign up for action alerts at https://banthescana2.com/.
nash (he|ze) EFF
01:52:24
You can find EFF’s model legislation here: https://www.eff.org/aboutface/toolkit
John (he/him) BLM Detroit
01:53:37
thanks all for the invite. Check out Detroit Safety Team at www.redefinesafety.org and BLM Detroit at www.blmdetroit.com
beth parker
01:53:42
1
Adam Oxner (he/him)
01:54:56
Put a 1 in your name to be placed in a room with Molly about Ann Arbor, or put a “2” in your name to be put in a room about the region at large
1. erica briggs
01:54:59
Thank you to all of the amazing presenters and organizers!
beth parker
01:55:05
1 beth parker
Ryan Henyard - he/him - 2
01:55:09
thank you, this was great!
nash (he|ze) EFF
01:55:12
Thank you, all!
Phil (he/him) - ACLU
01:55:25
Thanks for the invite. So glad to be here with y'all.
Mike (cryptoparty A2) 2
01:55:31
Huge thanks to the panelists!
Katherine Griswold
01:55:32
very informative. thank you