In Denmark, caseworkers can demand Facebook log-ins to verify asylum-seekers’ identities, while in the UK, refugees’ social media accounts can be legally checked if they claim to be fleeing persecution for being gay. The US already collects public social media data from visa applicants and is extending this to refugees; valid visa holders have been denied entry due to posts made by friends.
A booming mobile forensics industry allows authorities to bypass passwords on digital devices, enabling them to download, analyse and visualise personal data – even data the user believes they have deleted. Immigration officials in Germany, the UK, Norway and Austria routinely mine asylum-seekers’ phones for travel routes, messages and other private data.
Decision by algorithm
Having a computer program determine whether people can enter brings yet more bias and opacity to immigration systems. Canada has been using ‘augmented decision-making’ since 2014 to determine whether marriages are genuine or whether a would-be citizen ‘poses a risk’, while in the UK in 2018 an algorithm led to the wrongful deportation of over 7,000 foreign students.
Moscow reportedly has 5,000 cameras equipped with automated facial-recognition software, which cross-references faces in real time against police and passport databases, as well as images on the Russian social network VKontakte. Amazon already sells its Rekognition software to police and has reportedly pitched it to ICE, the US immigration-enforcement agency.
Data collection, storage and sharing
Governments are now able to create and analyse huge databases on shareable cloud-based storage. In the US, authorities use this data to hunt down undocumented people living and working inside the country. In the UK, sophisticated data sharing and matching is used to block access to public services and confirm migrants’ whereabouts for deportation orders.
In Jordan, the World Food Programme is using retina scans to distribute food to refugees. Developed in the name of efficiency, the mass processing of the biometric data of a highly vulnerable group of people raises two questions: what if this data falls into the wrong hands? And how can someone ‘consent’ to their data being taken when it is tied to survival?