Solutions and Asylum Procedures


After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are now reviving these systems. From lie-detection tools tested at the border to software that verifies documents and transcribes asylum interviews, a wide range of technologies is being deployed in asylum procedures. This article explores how these technologies have reshaped the way asylum procedures are conducted. It shows how asylum seekers are turned into forced, hindered techno-users: they are required to follow a series of techno-bureaucratic steps and to keep up with unstable, minor changes in criteria and deadlines. This obstructs their ability to navigate these systems and to pursue their right to protection.

It also illustrates how these technologies are embedded in refugee governance: they facilitate the ‘circuits of financial-humanitarianism’ that operate through a whirlwind of dispersed technological requirements. These requirements increase asylum seekers’ socio-legal precarity by hindering their access to the channels of protection. The article further argues that analyses of securitization and victimization should be coupled with an insight into the disciplinary mechanisms of these technologies, through which migrants are turned into data-generating subjects who are disciplined by their reliance on technology.

Drawing on Foucault’s notion of power/knowledge and territorial understanding, the article argues that these technologies have an inherent obstructiveness. They have a double effect: while they help to expedite the asylum process, they also make it difficult for refugees to navigate these systems. Refugees are placed in a ‘knowledge deficit’ that leaves them vulnerable to illegitimate decisions made by non-governmental actors and to ill-informed and unreliable narratives about their circumstances. Moreover, these technologies pose new risks of ‘machine mistakes’ that may result in inaccurate or discriminatory outcomes.