“Staying ahead of the curve” – Data-driven policing tools to combat crime and terrorism

In this blog, our partner Trilateral Research explores the ethical, legal and societal aspects of data-driven policing tools and outlines COPKIT’s integrated data protection and ethical impact assessment (E+PIA), designed to achieve responsible use and innovation of the project’s technologies.


Terrorism and organised crime are evolving phenomena with high societal impacts. Increasingly, criminal organisations use new technologies to strengthen their capabilities and support their activities, for instance by using online services (marketplaces) to acquire or distribute goods and services.

As emphasised by Europol,

for almost all types of organised crime, criminals are deploying and adapting technology with ever greater skill and to ever greater effect. This is now, perhaps, the greatest challenge facing law enforcement authorities around the world, including in the EU (Europol 2017)

Law Enforcement Agencies (LEAs) have to “stay ahead of the curve” of the use of new technology by organised crime and terrorist groups.

The COPKIT project is developing data-driven solutions that help explain how crime is evolving, identify “weak signals” and trends, and raise alerts about new risks. The COPKIT approach therefore focuses on preparedness, mitigation, prevention and security policies.

Nevertheless, the use of technology by LEAs raises several ethical, legal and societal questions with serious implications for the relationship between LEAs and citizens.

COPKIT’s Ethical, Legal and Privacy (ELP) team (involving the Law and Internet Foundation from Bulgaria and the University of Granada from Spain, and led by Trilateral Research from Ireland) works on addressing various ethical, legal and societal challenges related to the envisioned use of the COPKIT technologies in law enforcement investigations, namely:

1. Data

What types of data are available to the police, and how ought LEAs to use them? Who has access to these data within and outside the police? Which data sources are used? To what extent is our digital footprint, such as our activity on social media, private, and can it be used unconditionally? What are the legal limits of profiling citizens?

2. Transparency

How do predictive algorithms reach their conclusions? How do the analyses work? How are they used? How effective are they in reducing uncertainty?

3. Inaccuracy, automation bias and discriminatory results

To what extent may data-driven policing tools contribute to the stigmatisation of people (vulnerable and minority groups and individuals) and places (“hot spots”)? Could information such as our postal code, age, sex, race, employment status, or social and family situation be used as a proxy for criminality? How can such tools be designed to minimise victimisation? How can we reduce automation bias, where human decision-makers defer to computers and accept recommendations that may be incorrect or biased? (A simple illustration of a proxy-and-disparity check appears after this list.)

4. Misuse of research

Although research is usually carried out with benign intentions, it has the potential to harm individuals or society. Thus, we investigate how to avoid misuse of data-driven policing tools, i.e. how to prevent them from being used for unethical ends.
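To make the proxy-variable and automation-bias concern in point 3 more concrete, the sketch below shows one simple audit a developer or analyst could run: comparing how often a model flags people across groups defined by a potentially sensitive attribute (here a hypothetical postcode district). This is not part of the COPKIT toolset; the data, district codes and threshold are invented for illustration.

```python
# Illustrative audit sketch (not a COPKIT component): does a model flag people
# in some postcode districts far more often than in others?
# All data and thresholds below are hypothetical.
from collections import defaultdict

# Hypothetical model outputs: (postcode_district, flagged_by_model)
records = [
    ("D01", True), ("D01", True), ("D01", False), ("D01", True),
    ("D02", False), ("D02", False), ("D02", True), ("D02", False),
    ("D03", False), ("D03", True), ("D03", False), ("D03", False),
]

# Count flags per district
flags = defaultdict(lambda: [0, 0])  # district -> [flagged_count, total_count]
for district, flagged in records:
    flags[district][1] += 1
    if flagged:
        flags[district][0] += 1

rates = {d: flagged / total for d, (flagged, total) in flags.items()}
for district, rate in sorted(rates.items()):
    print(f"{district}: flag rate {rate:.0%}")

# "Four-fifths" style disparity ratio: lowest rate divided by highest rate.
# A low ratio is a prompt for human review, not proof of discrimination.
ratio = min(rates.values()) / max(rates.values())
print(f"Disparity ratio: {ratio:.2f} (values well below 0.8 warrant scrutiny)")
```

Such quantitative checks can only prompt further scrutiny; they complement, rather than replace, the expert-led assessment described below.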

Our integrated data protection and ethical impact assessment 

The potential for enormous benefits of data-driven policing tools is coupled with considerable risks.

Like all tools that collect and process potentially personal data, COPKIT technologies may interact with the rights and freedoms of individuals – in this case, cybercrime victims, perpetrators or other Internet users.

Our ELP team proposes an integrated data protection and ethical impact assessment (E+PIA) of the COPKIT technologies to achieve responsible use and innovation. Moreover, the COPKIT technologies require a privacy-by-design approach during technology development, together with a consideration of data ethics, in order to create proportionate tools for law enforcement activities.

An integrated E+PIA spans the lifecycle of a project, from the early design stage to the deployment of the product or service. By taking an integrated and interdisciplinary approach to the E+PIA in the COPKIT project, the ELP team provides its expertise and support at every stage of the project, following ethics-by-design principles.

Potential ethical, legal and societal considerations are being addressed from the design and development activities onwards, and this work will continue during the piloting of the COPKIT tools. We work in close cooperation with COPKIT partners and other stakeholders to understand and respond to their needs.

To that end, the ELP team proposes that this foresight work be conducted at two levels:

(1) an E+PIA conducted by experts in law, ethics and societal impacts, to ensure a high standard of analysis;

(2) a self-assessment tool for LEAs and technology developers.
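As a purely illustrative sketch of what a self-assessment at level (2) might encode, the snippet below walks through a short checklist and flags items that still need attention. The questions, structure and scoring are invented for this blog; they do not describe the actual COPKIT self-assessment tool.

```python
# Purely illustrative sketch (not the COPKIT tool): a minimal self-assessment
# checklist for LEAs and technology developers. The questions are invented examples.
CHECKLIST = [
    "Is the purpose of the data processing documented and limited?",
    "Are data sources and access rights recorded?",
    "Has the model been checked for proxy variables and biased outputs?",
    "Is there a human review step before any operational decision?",
]


def run_self_assessment(answers):
    """Print which checklist items are covered and which still need attention."""
    for question, answered_yes in zip(CHECKLIST, answers):
        status = "OK" if answered_yes else "NEEDS ATTENTION"
        print(f"[{status}] {question}")


# Example: the third item has not yet been addressed.
run_self_assessment([True, True, False, True])
```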

Preliminary results

So far, as part of the E+PIA, we have been analysing ethical, legal and societal requirements to ensure that the developed ecosystem respects ethical principles, including privacy, as well as EU legal and societal requirements.

We have identified three major European acts that set forth numerous provisions which need to be respected, such as fundamental rights including dignity, non-discrimination, privacy and data protection, and which should therefore guide the development of the COPKIT tools:

(1) the Charter of Fundamental Rights of the EU,

(2) the General Data Protection Regulation (GDPR),

(3) the Law Enforcement Directive.

Moreover, the COPKIT project should follow the six privacy principles established by the GDPR and the Law Enforcement Directive, namely: lawfulness, fairness and transparency (of personal data processing); purpose limitation; data minimisation; accuracy; storage limitation; and integrity and confidentiality.
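To illustrate what two of these principles can look like in code, the sketch below applies data minimisation (keeping only the fields needed for a stated purpose) and storage limitation (attaching an explicit deletion date) to a hypothetical case record. The field names, retention period and purpose are invented for illustration and are not COPKIT design decisions.

```python
# Illustrative sketch only (not COPKIT code): applying data minimisation and
# storage limitation to a hypothetical case record before it is stored.
from datetime import datetime, timedelta, timezone

# Fields actually needed for the stated (hypothetical) purpose; everything else
# is dropped (data minimisation). Field names and retention period are invented.
FIELDS_NEEDED_FOR_PURPOSE = {"case_id", "offence_category", "report_date"}
RETENTION_DAYS = 365  # hypothetical storage-limitation policy


def minimise_and_tag(raw_record):
    """Keep only the fields required for the purpose and attach a deletion date."""
    minimised = {k: v for k, v in raw_record.items() if k in FIELDS_NEEDED_FOR_PURPOSE}
    delete_after = datetime.now(timezone.utc) + timedelta(days=RETENTION_DAYS)
    minimised["delete_after"] = delete_after.date().isoformat()
    return minimised


raw = {
    "case_id": "2019-0042",
    "offence_category": "online fraud",
    "report_date": "2019-05-03",
    "suspect_social_media_handle": "@example_handle",  # not needed for this purpose
    "suspect_home_address": "1 Example Street",        # not needed for this purpose
}
print(minimise_and_tag(raw))
```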

Since data-driven policing tools increasingly use artificial intelligence (AI), they should also consider ethical guidelines regarding the use of AI. To date, there are no specific principles for the use of AI in policing. Nevertheless, such tools could follow the guidelines drafted by the High-Level Expert Group on Artificial Intelligence (AI HLEG) established by the European Commission.


For more information and updates, follow us on Twitter, LinkedIn and Facebook, and feel free to contact our team at copkit@copkit.eu.