News

The problem of fairness in an automated world

By News Room
Last updated: 2025/02/04 at 12:16 AM


What does it mean for a machine’s decision to be “fair”? So far, the public debate has focused mostly on the issue of bias and discrimination. That is understandable: most people would expect machines to be less biased than humans (indeed, this is often given as the rationale for using them in processes such as recruitment), so it is right to pay attention to evidence that they can be biased, too.

But the word “fair” has a lot of interpretations, and “unbiased” is only one of them. I found myself on the receiving end of an automated decision recently which made me think about what it really means to feel that you have been treated justly, and how hard it might be to hold on to those principles in an increasingly automated world.

I have a personal Gmail account which I use for correspondence about a book project I am working on. I woke up one morning in November to discover that I could no longer access it. A message from Google said my access had been “restricted globally” because “it looks as though Gmail has been used to send unwanted content. Spamming is a violation of Google’s policies.” The note said the decision had been made by “automatic processing” and that if I thought it was a mistake, I could submit an appeal.

I had not sent any spam and couldn’t imagine why Google’s algorithm thought that I had. That made it hard to know what to write in the “appeal” text box, other than a panicked version of something like, “I didn’t do it (whatever it is)!” and, “Please help, I really need access to my email and my files”. (To my relief, I realised later that I hadn’t lost access to my Google Drive.)

Two days later, I heard back: “After reviewing your appeal, your account’s access remains restricted for this service.” I wasn’t given any more information on what I had supposedly done or why the appeal had been rejected, but was told that “if you disagree with this decision, you can submit another appeal.” I tried again and was rejected again. I did this a few more times — curious, at this point, about how long this doom loop could continue. A glance at Reddit suggested other people had been through similar things. Eventually, I gave up. (Google declined to comment on the record.)

Among regulators, one popular answer to the question of how to make automated decisions more “fair” is to insist that people can request a human to review them. But how effective is this remedy? For one thing, humans are prone to “automation complacency” — a tendency to trust the machine too much. In the case of the UK’s Post Office scandal, for example, where sub-postmasters were wrongly accused of theft because of a faulty computer system called Horizon, a judge in 2019 concluded that people at the Post Office displayed “a simple institutional obstinacy or refusal to consider any possible alternatives to their view of Horizon”.

Ben Green, an expert on algorithmic fairness at the University of Michigan, says there can be practical problems in some organisations, too. “Often times the human overseers are on a tight schedule — they have many cases to review,” he told me. “A lot of the cases I’ve looked at are instances where the decision is based on some sort of statistical prediction,” he said, but “people are not very good at making those predictions, so why would they be good at evaluating them?”

Once my impotent rage about my email had simmered down, I found I had a certain amount of sympathy with Google. With so many customers, an automated system is the only practical way to detect breaches of its policies. And while it felt deeply unfair to have to plead my case without knowing what had triggered the system or being given any explanation of pitfalls to avoid in an appeal, I could also see that the more detail Google offered about the way the system worked, the easier it would be for bad actors to get around it.

But this is the point. In increasingly automated systems, the goal of procedural justice — that people feel the process has been fair to them — often comes into conflict with other goals, such as the need for efficiency, privacy or security. There is no easy way to make those trade-offs disappear.

As for my email account, when I decided to write about my experience for this column, I emailed Google’s press office with the details to see if I could discuss the issue. By the end of the day, my access to my email account had been restored. I was pleased, of course, but I don’t think many people would see that as particularly fair either.

[email protected]
