My thoughts on the Flock ALPR camera system: The reasons for criminal activity are many, but the most unfortunate causes are those driven by generational inequities and social disparities. What can we do to keep our communities safe while we strive to address these disparities as a society? Investing in technology seems like a cost-effective strategy to complement our efforts to address crime.

Does technology come with some inherent challenges? Yes.

Does that mean we should not be considering it? Absolutely not.

The world would be a very different place if we decided to stop the use of technology as opposed to refining and maturing it over time. This process can be scary, but cars, planes, medicines/vaccines, and cellphones are all examples of products that have matured over time. We often forget the journey and take their current versions for granted in our lives.

I see ALPR as a technology that will evolve over time. It gives us certain capabilities that can be used and abused. Does that mean we shouldn't use it, and learn from that usage, because we fear the abuse? No. We try to manage and mitigate the possibility of abuse through good governance, policies, and procedures. My reading, and the expert comments from our Chief Reynolds, suggest that it is an effective investigative tool that helps (and has already helped) us close criminal cases by locating and apprehending the responsible individuals quickly. Of the 29 incidents of carjacking in 2021, our detectives have recovered 28 vehicles and made 16 arrests utilizing Chicago's ALPR technology. While these cameras are not intended to deter crime, effective investigations and arrests resulting from their installation will act as a deterrent over time.

Two concerns have come up in my conversations with community members on this topic, and they require careful consideration.

Data, its privacy & sharing:

Is data collected from the cameras shared? Who is it shared with? How will they use it? Will we become a surveillance state?

Flock collects only public domain information and does not collect private or Personally Identifiable Information (PII). In addition, Flock cannot legally sell or share the data generated by the cameras we install in Oak Park without our permission. However, they can enable our police department’s sharing of this data with other municipalities for investigative purposes. Flock could also use the data to learn and refine their algorithms and technology. 

Not entirely within our control is how our data might be used by the other municipalities with whom we share it. Would that usage be consistent with our values? This is where good privacy policy, governance, and oversight come in. The privacy policy outlined by Flock seems sufficient for already public information, and we will get to governance in a bit.

A municipality with surveillance as its goal wouldn't need Flock cameras. Already, our digital footprint (e.g., credit cards, smart devices, social media, Google searches, cellphone usage, etc.) can be stitched together to get a pretty accurate picture of our day-to-day life. In fact, nothing in stored ALPR data even comes close to what is possible with the data that is already out there, which includes our PII. Our society seems unaware of — or at least accepting of — that level of monitoring of our daily life.

Technology errors & bias:

Should we expect some false positives with this technology? Yes. Does that mean the system's error rate is greater than the current human error rate? No.

We are talking about using technology and human capabilities in conjunction to bring down the cumulative error rate and systemic bias. Taking one or the other in isolation, calculating its error rate, and using that to make the case against the technology is not accurate. This technology should be used to funnel down the leads, which then require secondary human validation, bringing down the error rate for the entire process. I truly believe that we have an opportunity to reduce both error and bias. Here again, the key is good governance.
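To make the "funnel" argument concrete, here is a minimal back-of-the-envelope sketch. The numbers are purely illustrative assumptions, not Oak Park or Flock figures: the point is only that when a human must confirm every camera alert before action is taken, a false alert has to slip past both stages.

```python
# Hypothetical two-stage review: camera alert, then human confirmation.
# Both rates below are assumed for illustration only.

alpr_false_positive_rate = 0.05  # assumed: 5% of camera alerts are misreads
human_miss_rate = 0.10           # assumed: reviewer fails to catch 10% of misreads

# A wrongful stop requires the misread to survive BOTH checks,
# so (assuming independent errors) the rates multiply.
combined_error_rate = alpr_false_positive_rate * human_miss_rate

print(f"camera alone: {alpr_false_positive_rate:.1%}")
print(f"with human validation: {combined_error_rate:.1%}")
```

Under these assumed inputs, the process error rate falls from 5% to 0.5% — an order of magnitude — which is the intuition behind pairing the technology with human review rather than judging either stage alone. The independence assumption is itself something governance should verify in practice.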

Oversight & governance:

So, in my assessment, the fear of the technology and its possible abuse can only be addressed through a robust governance mechanism — a mechanism guided by transparency, accountability, and learning based on measures and metrics. Will it be perfect when we start? No. But given a chance, we have the ability to address community safety more effectively and at a lower cost for the community.

I urge the community to give technology a chance, work through its challenges, and thus live up to our progressive values by supporting progress.

Ravi Parrakat is an Oak Park village trustee.
