
The big problem with artificial intelligence right now isn’t that it’s taking over; it’s that it’s being entrusted with serious tasks with real-world consequences before it works properly. It’s the equivalent of letting self-driving cars operate in a city without lane markings.

A viral post published recently on Medium by artist James Bridle is the latest case in point. Bridle took a deep dive into a below-the-radar industry: children’s content on YouTube.

Anyone who has ever handed an iPad to a small child knows the kind of thing kids find on YouTube before they can type: toy-unboxing and nursery-rhyme videos, official and pirated cartoons featuring popular characters like Peppa Pig. It's up to parents, of course, whether they are OK with their child getting engrossed in these (we took the iPad away from our 4-year-old after noticing that the content made her reluctant to learn to read and irritable whenever the tablet wasn't within reach). But the stuff Bridle found was arguably worse than anything I'd seen before my wife and I made that decision.

These are videos cheaply assembled from 3D animation libraries, or even produced algorithmically, with names that are little more than strings of tags ("Finger Family Song Nursery Rhymes Animation Education Learning Video"). A parent runs an initial search for educational videos, then leaves a child who can't even read yet alone with the YouTube search results, which include weird, sometimes violent, often disturbing clips. The clips, of course, carry ads; that's the point of running a commercial YouTube account. The factories that produce this kind of thing aim to put together as many clips as possible and make sure they surface in the maximum possible number of searches. Then they harvest the ad revenue while the kids "surf" and the parents are happy that they're quiet.

Bridle wrote:

“What we’re talking about is very young children, effectively from birth, being deliberately targeted with content which will traumatize and disturb them, via networks which are extremely vulnerable to exactly this form of abuse. It’s not about trolls, but about a kind of violence inherent in the combination of digital systems and capitalist incentives. It’s down to that level of the metal. This, I think, is my point: The system is complicit in the abuse.”

That, of course, is not just about children’s videos. The Macedonian fake news industry wouldn’t have been possible without Google’s programmatic advertising technology and Facebook’s propensity to tolerate fake accounts. The former is the basis of the business model — both for Google and for the fake news producers — and the latter ensured distribution and ad impressions. Both Google and Facebook — whose estimate that up to 3 percent of accounts are fake and up to 10 percent of its accounts are duplicates can’t be independently verified — are, in Bridle’s logic, complicit. I suspect that logic is correct: Both companies depend on selling enormous audiences to advertisers, and their platforms are designed for that goal more or less regardless of how it’s attained. But even if one gives these companies the benefit of the doubt, they are guilty of over-reliance on poor technology.

Both the creators of disturbing kids’ videos and fake news writers game the platforms. The tag-filled names of the videos are designed to exploit YouTube’s search algorithms, and that clearly works since the channels that run the content keep proliferating. The catchy headlines of the fake stories continue fooling Facebook’s supposedly sophisticated clickbait detection algorithms. During the recent congressional hearing on Russian meddling in the 2016 election, the platforms’ representatives were asked about fake accounts but couldn’t come up with any convincing answers about their efforts to purge them.

At least the tech platforms are beginning to recognize that, if they don't want to be gamed as easily and as often as they are today, they need more human eyes and human hands. But the hype they created by boasting about their intelligent algorithms has taken on a life of its own. I wouldn't be surprised if a company testing autonomous vehicles took seriously a recent paper by a group of Massachusetts Institute of Technology and Carnegie Mellon University scientists describing something called the Moral Machine. The idea is to automate the ethical decisions a human driver makes on the fly, even the toughest ones, such as whether to hit a wall and kill the car's passengers, including a young girl, or run over an athlete and his dog crossing the street against a red light. The researchers used a website to ask people about such moral choices. The next step is to aggregate the data and have an AI-based algorithm produce decisions that correspond to the crowdsourced wisdom.

“The implementation of our algorithm on the Moral Machine dataset has yielded a system which, arguably, can make credible decisions on ethical dilemmas in the autonomous vehicle domain (when all other options have failed),” the researchers wrote. “But this paper is clearly not the end-all solution.”

Guess which parts of that statement a tech company would throw away if it decided to implement the algorithm. My bet is on "arguably" and "clearly not the end-all solution." It has been easy for tech firms to claim their algorithmic solutions work because they've gotten a free pass in the name of progress. Only rarely do alarm bells ring, as in an example Bridle used in his post: A T-shirt maker selling through Amazon automated the creation of slogans and ended up offering shirts that read, "Stay Calm and Hit Her."

That story is from 2013. Algorithms may have improved, but it’ll be a long time before they can perform tasks that require human judgment without some human figuring out how to game them. These algorithms belong in the lab, where academics apply all the necessary caveats and exercise reasonable doubt about their results. Unleashing them on the human world the way platforms do is molding children’s minds and aggravating political divisions. As Europe’s competition commissioner Margrethe Vestager put it in a speech at the Web Summit conference in Lisbon on Monday, “We have to take our democracy back. We cannot leave it neither to Facebook, nor Snapchat, nor anyone else. We have to take democracy back and renew it because society is about people, not technology.”

A step back to assert human control — even if it cuts into the tech companies’ wide profit margins — is overdue indeed.

Email Bloomberg View columnist Leonid Bershidsky at [email protected]. Follow him on Twitter: @Bershidsky

