‘Robo-Debt’: Guilt, Responsibility and the Dehumanisation of Welfare Compliance

This paper explores how digital techniques of policy implementation can themselves propel, shape and/or disrupt processes of welfare reform. It does so via a close analysis of the implementation of the Online Compliance Intervention, popularly known as ‘robo-debt’, by the Australian Department of Human Services (DHS). It argues that this automated debt recovery system has transformed the governance of welfare compliance in Australia. Robo-debt has shifted responsibility for proving the existence of welfare non-compliance from the welfare state to the individual welfare recipient and has automated and ‘dehumanised’ the infliction of informal punishments, including debt recovery processes. Ultimately, robo-debt has enabled and contributed to broader processes of punitive welfare reform. The case of robo-debt prompts wider questions about the relationship between law and its technical implementation, and highlights the productive possibilities of the technologies of welfare administration.

Past, Future, and Present: The New Temporality of Decision-Making Software

Administrative decisions are increasingly co-produced by public officials and specialized software. Drawing on “the past,” as represented by government databases, this software assembles data points to generate decisions about an individual’s access to public benefits. While such decisions are, in theory, reviewed by a human before taking effect, the technical processes by which they are produced depart from notions of temporality common to procedural fairness principles.

This paper explores this temporal shift in administrative decisions. It shows how software reaches back in time, uses the past to predict the future, and influences the present via a benefits decision. To compare the notions of time underlying procedural fairness doctrines and decision-making software, this paper draws on recent interdisciplinary studies of legal temporality, media, and governance. It closes by theorizing how legal principles might respond to the temporality of decision-making software.

The Constitution of Airbnb

On 1 November 2016, Airbnb introduced a “strengthened and more detailed” nondiscrimination policy, the terms of which “are stronger than what is required by law.” “The Airbnb community,” the binding policy declares, “is committed to building a world where people from every background feel welcome and respected, no matter how far they have traveled from home.” “[A]ll of us, Airbnb employees, hosts and guests alike,” are “committed to doing everything we can to help eliminate all forms of unlawful bias, discrimination, and intolerance from our platform,” and agree to act in accordance with the policy.

This paper explores the auto-constitutionalization of Airbnb’s domain, analyzing the digital and legal technologies employed to “codify” its foundational principles of “inclusion” and “respect.” Using this investigation as a platform for general reflection, it examines the technical elements of governance, the encoding of constitutions, and the constitution of programmed bearers of rights.