The Plague-Ground – How baked-in bias cooks chaos

Back in 2018, Barack Obama claimed publicly that “President Trump is a total and complete dipshit.”

No, he did. Really. Watch it here.

That was one of the very first deepfake videos in the world. But in the past two years, the technology that can put any words into anyone’s mouth has advanced astoundingly. Here’s Vladimir Putin addressing the American people last fall. Notice how much more real this fake is.

By next year, it will likely be impossible for people like us to tell if Justin Trudeau is bullying Julie Payette, or if Alexandria Ocasio-Cortez is calling Amanda Gorman a Proud Girl. Experts may still be able to tell, but the rest of us likely won’t, at least until an even newer technology flags these fakes as we watch them – the same way that Twitter and Facebook flagged Donald Trump’s tweets for “potentially misleading claims.”

That’s the problem. It took years, and last month’s Washington insurrection, to get the social media companies to finally act on Trump’s false claims by cancelling him.

So I hope we don’t wait until it’s too late to regulate fake videos, the way we’ve let social media get away without regulation for years.

And speaking of too late, yes, we are in the very early days of artificial intelligence, but the consequences of it running amok will be much worse than the damage inflicted by social media running wild and free.

I wrote last month about how AI-powered facial recognition technology is being used to guess whether you are gay just by looking at your face.

But AI is bigger than that, so its consequences can be more devastating. Today, AI is a novelty. Soon, it will be as pervasive as electricity, powering everything without our giving it a second thought. Five years from now, we will only pay attention when it fails us, as we do with WiFi today.

In fact, AI is already creeping into more of our work and lives, often in a very creepy way – like fake videos, which may have been amusing a few years ago but are growing seditious today.

On Monday, the MIT Technology Review published a piece headlined: An AI saw a cropped photo of AOC. It autocompleted her wearing a bikini.

Now I know that “fake” anything is simply an extreme form of “biased” anything, whether it’s videos, algorithms, or the artificial intelligence fuelled by those algorithms.

As the MIT article noted: “Researchers have now demonstrated that the same can be true for image-generation algorithms. Feed one a photo of a man cropped right below his neck, and 43% of the time, it will autocomplete him wearing a suit. Feed the same one a cropped photo of a woman, even a famous woman like US Representative Alexandria Ocasio-Cortez, and 53% of the time, it will autocomplete her wearing a low-cut top or bikini.”

For the past year, we’ve seen dozens of examples of often well-intentioned groups like hospitals and universities producing algorithms that are biased – racist, sexist, and ageist. Hardly any of them are deliberately so. Nearly all simply mirror the unconscious biases of their creators, or use historical data from a time when racism was far more blatant and out in the open.

But this kind of sexist default is more serious, and its consequences much worse, simply because we’re all spending far more time watching moving images than reading or hearing words. As Cisco reported last month, by 2022 online videos will make up more than 82% of all consumer internet traffic — 15 times higher than in 2017.

With online video viewing growing at the rate of 32% a year, it won’t be long until we spend most of our time in front of a computer watching moving images.

It would be nice by then if we had some way of telling if they were ‘true’ or not.

