After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are now reviving these systems. From lie detection tools tested at the border to systems for verifying documents and transcribing asylum interviews, a wide range of technologies is being applied to asylum applications. This article explores how these technologies have reshaped the way asylum procedures are conducted. It shows how asylum seekers are turned into coerced, hindered techno-users: they are required to comply with a series of techno-bureaucratic steps and to keep up with unstable, minor changes in criteria and deadlines. This obstructs their capacity to navigate these systems and to pursue their right to protection.

It also illustrates how these technologies are embedded in refugee governance: they facilitate the ‘circuits of financial-humanitarianism’ that operate through a proliferation of dispersed technological requirements. These requirements increase asylum seekers’ socio-legal precarity by hindering their access to protection programmes. The article further argues that analyses of securitization and victimization should be complemented with attention to the disciplinary mechanisms of these technologies, whereby migrants are turned into data-generating subjects who are disciplined by their reliance on technology.

Drawing on Foucault’s notion of power/knowledge and a territorial understanding of control, the article argues that these technologies have an inherent obstructiveness. They have a double effect: while they help expedite the asylum process, they also make it difficult for refugees to navigate these systems. Refugees are placed in a ‘knowledge deficit’ that leaves them vulnerable to illegitimate decisions made by non-governmental actors, as well as to ill-informed and unreliable narratives about their cases. Moreover, these technologies pose new risks of ‘machine mistakes’ that may result in erroneous or discriminatory outcomes.