EU puts human rights at the heart of tougher tech export rules


EU-based technology vendors must now apply for government licences to export certain ‘dual-use’ products, and consider whether the use of their products in any deal poses a risk to human rights.

New rules established by EU lawmakers and the European Council beef up the export criteria for so-called ‘dual-use’ technologies, meaning vendors will have to clear a much higher bar when striking licensing deals.

These technologies include high-performance computing (HPC), drones, and software such as facial recognition and spyware, spanning systems with civilian applications that can be repurposed for nefarious ends. They also include certain chemicals.

The update to existing controls adds new criteria for granting or rejecting export licences on certain items, including whether or not the technologies could be used in human rights abuses.

The regulation includes guidance for EU nations to “consider the risk of use in connection with internal repression or the commission of serious violations of international human rights and international humanitarian law”. Member states must also be more transparent by publicly disclosing details about the export licences they grant.

The rules can be swiftly changed to cover emerging technologies, and an EU-wide agreement has also been reached on controlling cyber surveillance products not listed as ‘dual-use’, in order to better safeguard human rights.

“Parliament’s perseverance and assertiveness against a blockade by some member states has paid off: respect for human rights will become an export standard,” said German MEP Bernd Lange. “The revised regulation updates European export controls and adapts to technological progress, new security risks and information on human rights violations.

“This new regulation, in addition to the one on conflict minerals and a future supply chain law, shows that we can shape globalisation according to a clear set of values and binding rules to protect human and labour rights and the environment. This must be the blueprint for future rule-based trade policy.”

Technologies such as facial recognition have attracted major opposition over the way they’re being used by law enforcement agencies. In the wake of the Black Lives Matter protests, for instance, a swathe of technology vendors announced they would curb their own facial recognition projects in response to public backlash.

Keumars Afifi-Sabet
Contributor

Keumars Afifi-Sabet is a writer and editor who specialises in the public sector, cyber security, and cloud computing. He first joined ITPro as a staff writer in April 2018 and eventually became its Features Editor. A regular contributor to other tech sites in the past, these days he can be found at LiveScience, where he runs its Technology section.