
40:36
Hi Ryan, Aidan, Michelle, Molly, Travis and everyone!

41:53
What is necessary to make sure everyone, not just some, feels safe?

42:29
Government Propaganda!

42:29
lack of accountability

43:46
Being Gaslighted

43:52
Lack of lighting at night

44:00
Feeling unseen

44:16
Economic issues: housing, food insecurity

44:55
Almost all of them.

44:55
feeling like you can’t call on anyone if there is a real emergency, because the folks who are supposed to make you “safe” might harm you or your loved ones

45:03
Unlawful policing

45:33
it's a whole other world

46:05
Lack of health care

47:36
Internet Data Collection by Private Companies

58:08
https://www.redefinesafety.org/

58:18
^Detroit Safety Team website

59:55
True!

01:04:08
Psychological safety vs. physical safety and the threat of future trauma due to both.

01:07:19
If he had said he could only speak to the police through an attorney when he was called, would that have made a big difference?

01:07:26
Nash mentioned the need to protect against both governmental AND CORPORATE surveillance. As we explore a local ban, the model ordinance we’ve seen from the ACLU focuses mostly on government use of facial recognition technology. Have any of you seen municipal examples that go further and prohibit the use of this technology by other entities in the city (apartment buildings, businesses, etc.)?

01:08:15
I believe anybody can submit a picture to Google and have it scan for a name.

01:08:16
Portland takes an interesting approach.

01:09:30
The empirical evidence is stark. A groundbreaking study published in December by the US-based National Institute of Standards and Technology, which analysed 189 software algorithms from 99 developers – the majority of the industry – saw higher rates of inaccuracy for Asian and African-American faces relative to images of Caucasians, often by a factor of ten to one hundred times.

01:09:32
major AI vendors also broke many user agreements on websites like Flickr to include users' photos in scans

01:09:45
@johnsmith and Amazon lets anyone use their Rekognition FRT platform as well.

01:09:55
https://www.raconteur.net/technology/biometrics-ethics-bias/

01:09:58
https://support.google.com/photos/answer/6128838?co=GENIE.Platform%3DAndroid&hl=en

01:10:07
Search by people, things & places in your photos

01:12:43
Louise Seamster has a concept called Predatory Inclusion - training with more marginalized folks will technically be inclusion, but towards predatory means

01:13:03
https://www.ajl.org/
FYI on the Algorithmic Justice League, the org connected to the film "Coded Bias."

01:13:10
^^^to Ryan's comment

01:15:04
Should BIPOC contact the ACLU if we are detained because of faulty facial recognition?

01:15:18
If any group has a 34% error rate, that really raises the question of what type of testing was done on this software! Who approved the test plan? These error rates should be clearly documented for every user to see.

01:16:04
Absolutely. If you believe you were wrongfully detained by FR technology (or know someone who has), please have them fill out an intake form on our website. https://intake.aclumich.org/

01:17:11
If the cops want to talk to you, always tell them you can only answer their questions through an attorney!

01:18:05
The Detroit Digital Justice Coalition released a report a couple years ago on the issues with Project Greenlight, if you're interested in diving deeper into surveillance tech use in that city - https://detroitcommunitytech.org/?q=content/critical-summary-detroit%E2%80%99s-project-green-light-and-its-greater-context

01:23:44
these aren't only municipally owned surveillance cameras; many law enforcement agencies are taking video from consumer devices like Ring cameras and running it through facial recognition systems

01:24:04
and the cloud footage from those cameras can be obtained via subpoena, no strings attached

01:25:28
Confirmation bias is particularly insidious because most people are unaware that they are being influenced by this heuristic.

01:26:00
companies will sometimes cooperate with law enforcement even without orders like subpoenas

01:26:08
What if you cut off the funding? Isn't this technology only available through a vendor?

01:26:29
I'm sure owners of private prisons are investing heavily in facial recognition technology in municipalities.

01:26:41
@John Smith - it's honestly not enough to cut off funding, because law enforcement can acquire federal grants to cover those purchases

01:27:14
@John Smith also, as a technologist I've regularly seen vendors offer free tech opportunities as a loss leader to help make the sale

01:27:25
@Ryan Henyard, you prohibit them from spending any money on this technology.

01:27:51
@John Smith, let’s focus on the conversation the speakers are having right now.

01:27:59
The loss leader is only for a period of time because the vendor thinks you will pay later.

01:30:34
without a ban on explicit use, the tech finds its way in; the Defense Department 1033 program, which we often hear about being used to acquire Humvees and weaponry, can also be used to acquire military surveillance tech for free through the federal program

01:30:48
(more info at https://www.eff.org/deeplinks/2021/01/end-two-federal-programs-fund-police-surveillance-tech , from January)

01:38:05
If you go to a protest, don't take your phone!

01:41:01
I have to be honest, I feel like Facial Recognition tech is only the tip of the iceberg! It is important to oppose it in its own right but it is not enough.

01:41:34
we've got an opportunity to do this before it's too embedded

01:42:35
Agree!

01:45:45
Informed consent would shut down Google?

01:46:05
you can read about the $650M settlement Facebook paid this year to Illinois residents for violating the Illinois biometric privacy law https://apnews.com/article/technology-business-san-francisco-chicago-lawsuits-af6b42212e43be1b63b5c290eb5bfd85

01:46:06
I like it if true!

01:46:14
Thank you, Nash. Incredibly helpful!

01:47:21
My pleasure

01:50:22
Sign up for action alerts at https://banthescana2.com/.

01:52:24
You can find EFF’s model legislation here: https://www.eff.org/aboutface/toolkit

01:53:37
thanks all for the invite. Check out Detroit Safety Team at www.redefinesafety.org and BLM Detroit at www.blmdetroit.com

01:53:42
1

01:54:56
Put a “1” in your name to be placed in a room with Molly about Ann Arbor, or put a “2” in your name to be put in a room about the region at large

01:54:59
Thank you to all of the amazing presenters and organizers!

01:55:05
1 beth parker

01:55:09
thank you, this was great!

01:55:12
Thank you, all!

01:55:25
Thanks for the invite. So glad to be here with y'all.

01:55:31
Huge thanks to the panelists!

01:55:32
very informative. thank you