Algorithm Accountability 

Tuesday, November 12th, 2019

The recent word is that Apple’s new credit card has been giving women substantially lower credit limits than men. Once a single viral Twitter thread raised the issue, breathless “Apple is sexist!” pieces quickly popped up across the web. With precious little data available, however, it’s not yet clear whether this specific concern is even valid.

The problem is that it could be true, and it wouldn’t even require malicious intent. A credit card issuer could be “sexist” (or “racist”, or biased in countless other ways) without ever intending to be. That’s a symptom of a much broader issue: the black-box nature of how too much of society now operates.
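To see how bias can creep in without intent, consider a toy sketch. Everything below is invented for illustration (the formula, the feature names, the numbers have nothing to do with Apple’s or Goldman Sachs’s actual model): a scoring rule that never looks at gender can still produce gender-skewed limits if one of its inputs happens to correlate with gender in the real world.

```python
def credit_limit(income, years_in_workforce):
    """A naive, made-up scoring rule. Note: gender is not an input."""
    return 1000 + 50 * income + 200 * years_in_workforce

# Two invented applicants with identical incomes. Applicant B took
# career breaks -- a pattern that, in the real world, correlates
# with gender. The model never sees gender, yet the outcomes differ.
applicant_a = {"income": 80, "years_in_workforce": 20}
applicant_b = {"income": 80, "years_in_workforce": 12}

print(credit_limit(**applicant_a))  # 9000
print(credit_limit(**applicant_b))  # 7400
```

The point isn’t that this particular formula is what any issuer uses; it’s that any opaque model trained on historical data can absorb a proxy like this, and without transparency into its inputs, no one outside can tell.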

In the past decade or two, secret, unaccountable algorithms have taken control of far too many decisions that affect our lives. Mathematician and writer Cathy O’Neil discussed this broader problem with Slate, in a piece well worth reading. Perhaps a story about the failures of Apple (and Goldman Sachs) is what finally pushes us toward greater transparency in how algorithms are used. Here’s hoping.
