Robo-Debt and the Digital Welfare State

Written by Fernanda Dahlstrom

Fernanda Dahlstrom holds a Bachelor of Laws, a Bachelor of Arts and a Master of Arts. She also completed a Graduate Diploma in Legal Practice at the College of Law in Victoria. Fernanda practised law for eight years, working in criminal defence, child protection and domestic violence law in the Northern Territory. She also practised in family law after moving to Brisbane in 2016. Fernanda has strong interests in Indigenous and refugee law, human rights and law reform.

A report released on 11 October 2019 by the United Nations Special Rapporteur on extreme poverty and human rights has warned of the dangers of the digital welfare state. A digital welfare state is one, like Australia, in which the social security system is administered and automated by computer technologies with little or no involvement from caseworkers or other human decision-makers. Such technologies have many potential benefits, but they have also been known to make systemic errors that affect large numbers of people.

The report, written by Professor Philip Alston, identifies a number of risks and drawbacks associated with the digital welfare state, warning that we risk “stumbling zombie-like into a digital welfare dystopia.”

Robo-debt

The report cites the Australian government’s ‘Robo-debt’ system as an example of the lack of attention to legality in digital welfare systems. Robo-debt automatically matches income information provided to Centrelink against information held by the Australian Taxation Office. If there is a discrepancy, the system generates a notice asking the welfare recipient to explain the difference. If the discrepancy is not resolved, a debt notice is issued.

The Robo-debt system has been found to have a very high error rate, miscalculating debts and creating debts where none exist. The report labelled the system a ‘fiasco’.
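How such errors can arise can be shown with a simplified example. The Python sketch below is not the actual Centrelink or ATO implementation; the income-averaging step and the figures are assumptions used for illustration only, showing how spreading a person’s annual income evenly across the year can flag ‘discrepancies’ even when every payment was reported accurately.

FORTNIGHTS_PER_YEAR = 26

def average_fortnightly_income(annual_ato_income: float) -> float:
    # Spread the annual income evenly across the year - the simplifying
    # assumption that misrepresents people whose earnings were uneven.
    return annual_ato_income / FORTNIGHTS_PER_YEAR

def flag_discrepancies(annual_ato_income: float, reported_fortnights: list) -> list:
    # Flag every fortnight where the amount reported to the welfare agency
    # falls below the averaged figure, treating it as a possible overpayment.
    assumed = average_fortnightly_income(annual_ato_income)
    return [i for i, reported in enumerate(reported_fortnights) if reported < assumed]

# Example: a person earned $13,000, all in the first half of the year,
# and reported each payment accurately. Averaging makes the quiet second
# half of the year look like under-reporting, so 13 fortnights are flagged.
reported = [1000.0] * 13 + [0.0] * 13
print(flag_discrepancies(13000.0, reported))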

“While the lack of a legal basis is deeply problematic per se, this gap also means that opportunities for legislative debate and for public inputs into shaping the relevant systems is also lacking. This has major potentially negative implications for transparency, design, legitimacy and the likelihood of acceptance,”

Professor Alston said.

Australia’s Robo-debt scheme is currently the subject of a Senate inquiry and a class action.

The report also raises numerous other issues that are pertinent to Australia’s digital welfare system.

Electronic debit cards

The use of debit cards instead of cash welfare payments has begun in a number of countries, including Australia, New Zealand and South Africa. These cards have been identified as causing a number of problems: they allow the surveillance of welfare recipients’ consumer behaviour, and because they are identifiable as welfare-related, they can cause feelings of shame and self-consciousness.

Users are also often charged card fees, and the cards are associated with negative stereotypes of welfare recipients as untrustworthy.

Fraud prevention

Many digital welfare systems have been designed with a focus on preventing and detecting fraud. However, the Special Rapporteur’s investigations indicated that the emphasis on fraud is out of proportion to the scale of the problem, and that the risk of fraud threatens to become a pretext for increased surveillance and intrusion.

Communication

In digital welfare states, many interactions that previously occurred face to face or over the phone now take place through online applications. This poses problems for those who lack internet access or digital skills. Online applications can also make legal decisions unclear, leaving claimants unsure of their rights.

In particular, the report highlights Australia’s Targeted Compliance Framework, which requires users to report their compliance with mandatory activities via a digital dashboard. If a mutual obligation is not met, the payment can be automatically suspended and penalties imposed without the involvement of a human.

This system has been criticised for failing to take into account factors like lack of internet access or digital literacy.
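The concern about fully automated decision-making can be made concrete with a brief sketch. The Python code below is a hypothetical illustration only, not the actual Targeted Compliance Framework; the names and the suspension rule are assumptions, included to show how a payment can be suspended purely because nothing was logged on a dashboard, with no human review of the reasons.

from dataclasses import dataclass

@dataclass
class Jobseeker:
    name: str
    reported_activity: bool      # whether a mutual obligation was logged online
    payment_suspended: bool = False

def apply_automated_compliance(person: Jobseeker) -> Jobseeker:
    # Suspend the payment automatically if nothing was reported, without
    # asking why - lack of internet access, low digital literacy and other
    # reasonable excuses are invisible to this rule.
    if not person.reported_activity:
        person.payment_suspended = True
    return person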

Human rights and digital equality

The report warns against using new technology at the expense of human rights, noting that the technology sector is essentially a human-rights-free zone, with major players working to keep it that way.

Practical issues raised by digital welfare systems that threaten digital equality include:

  • Limited or no access to the internet
  • Out-of-date equipment and unreliable connections
  • Lack of access to documents or inability to upload them
  • Fingerprints that are unreadable after a lifetime of manual labour

Conclusion

The report concludes that states using digital welfare systems need to alter their course to avoid a future of unrestricted data matching, widespread surveillance and punitive sanctions. Governments using such systems need to ensure that they take into account the concerns of humanity, and not only those of the well-off.

If you require legal advice or representation in a Centrelink matter or in any other legal matter, please contact Go To Court.
