If the past few years have taught us anything, it's that algorithms should not be blindly trusted.

The latest math-induced headache comes from Australia, where an automated compliance system appears to be issuing incorrect notices to some of Australia's most vulnerable people, asking them to prove they were entitled to past welfare benefits.

Politicians and community advocates have cried foul over the system, rolled out by Australia's social services provider, Centrelink.


Launched in July, the system was intended to streamline the detection of overpayments made to welfare recipients and automatically issue notices of any discrepancies.


News outlets and Reddit threads have since been inundated with complaints from people who say they are being accused of being "welfare cheats" without cause, thanks to faulty data.

The trouble lies with the algorithm's apparent difficulty in accurately matching tax office data with Centrelink records, according to the Guardian, although department spokesperson Hank Jongen told Mashable it remains "confident" in the system.

"People have 21 days from the date of their letter to go online and update their information," he said. "The department is determined to ensure that people get what they are entitled to, nothing more, nothing less."

Independent politician Andrew Wilkie accused the "heavy-handed" system of terrifying the community.


"My office is still being inundated with calls and emails from all around the country telling stories of how people have been deemed guilty until proven innocent and sent to the debt collectors immediately," he said in a statement in early December.


The situation is upsetting albeit unsurprising. The siren call of big data has proved irresistible to governments globally, provoking a rush to automate and digitise.

What these politicians seem to like, above all, is that such algorithms promise speed and fewer man-hours.

Alan Tudge, the minister for human services, proudly announced that Centrelink's system was issuing 20,000 "compliance interventions" a week in December, up from around 20,000 per year when the process was manual. Such a jump, roughly fifty-fold, seems incredible, and perhaps dangerous.

As data scientist Cathy O'Neil lays out in her recent book Weapons of Math Destruction, the judgments made by algorithms governing everything from our credit scores to our pension payments can easily be wrong -- they were created by humans, after all.

As she writes: "The math-powered applications powering the data economy were based on choices made by fallible human beings. Some of these choices were no doubt made with the best intentions. Nevertheless, many of these models encoded human prejudice, misunderstanding and bias into the software systems that increasingly managed our lives. Like gods, these mathematical models were opaque, their workings invisible to all but the highest priests in their domain: mathematicians and computer scientists."

These murky systems can inflict the greatest punishment on the most vulnerable.

Take, for example, a ProPublica report that found an algorithm used in American criminal sentencing to predict the accused's likelihood of committing a future crime was biased against black people. The corporation that produced the program, Northpointe, disputed the finding.

O'Neil also details in her book how predictive policing software can create "a pernicious feedback loop" in low-income neighbourhoods. These programs may recommend that areas be patrolled to counter low-impact crimes like vagrancy, generating more arrests, and so creating the very data that gets those neighbourhoods patrolled still more.
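A toy simulation makes the loop concrete. In the hypothetical sketch below, where every number is invented, patrols are sent wherever recorded arrests are highest, and patrols in turn generate arrest records. Two districts with identical underlying crime then diverge permanently because one happened to start with a single extra arrest in the data.

```python
# Hypothetical toy model of a predictive-policing feedback loop.
# The "predictor" sends every patrol to the district with the most
# recorded arrests, and patrols generate arrest records wherever they
# go. All numbers are invented; both districts have identical crime.

TOTAL_PATROLS = 100
ARRESTS_PER_PATROL = 0.2  # minor offences (vagrancy etc.) logged per patrol

# District A starts with one extra arrest on record: noise, not crime.
recorded_arrests = {"A": 11.0, "B": 10.0}

for year in range(1, 6):
    # Greedy, data-driven allocation: patrol the apparent "hotspot".
    hotspot = max(recorded_arrests, key=recorded_arrests.get)
    for district in recorded_arrests:
        patrols = TOTAL_PATROLS if district == hotspot else 0
        # More patrols mean more recorded arrests, whatever the true crime rate.
        recorded_arrests[district] += patrols * ARRESTS_PER_PATROL
    print(f"year {year}: records {recorded_arrests}, patrolled district {hotspot}")
```

Nothing in the loop ever measures actual crime; after the first year, the model is consuming data it produced itself.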

Even Google doesn't get it right. Troublingly, in 2015, a web developer spotted the company's algorithms automatically tagging a photo of two black people as "gorillas."

Former Kickstarter data scientist Fred Benenson has coined a good term for this rose-tinted view of what numbers can do: "mathwashing."

"Mathwashing can be thought of using math terms (algorithm, model, etc.) to paper over a more subjective reality," he told Technical.lyin an interview. As he goes on to to describe, we often believe computer programs are able to achieve an objective truth out of reach for us humans -- we are wrong.

"Algorithm and data driven products will always reflect the design choices of the humans who built them, and it's irresponsible to assume otherwise," he said.

The point is, algorithms are only as good as we are. And we're not that good.

