
Are you afraid of Artificial Intelligence?

“Hey Siri… Start dictation.” Whether you think of digital assistants, Deep Blue or something more sinister, Alan Coates, 44’s Head of Digital Projects and Products, explores the landscape of Artificial Intelligence (AI) to find out whether we really should be afraid of the machines taking over…

This year’s Consumer Electronics Show (CES) was the biggest in the event’s history. CES 2018 was held in Las Vegas in January, where some 4,000 exhibitors showcased the brightest, smartest and newest gadgets to more than 180,000 attendees. If CES 2017 was the year of Virtual Reality, this year it was all about AI. Home speakers with Google Assistant built in. TVs that adjust their picture settings to suit the room they are placed in. All very intelligent. But why should we care?

A game of chess

The term Artificial Intelligence was coined in 1956 by Dartmouth professor John McCarthy. He invited a small group of researchers to come together and start thinking about how they could get machines to do things like understand and use language. The group’s early efforts fell short, but others took up the challenge, such as teaching machines to play checkers.

Fast forward to 1997, where chess Grandmaster Garry Kasparov wins the first game against IBM’s Deep Blue but loses the second, and ultimately the six-game match. Deep Blue was programmed to search through millions of possible positions every second, checking candidate moves against a huge database of openings, endgames and grandmaster games, and playing the response with the best odds of winning. No human has beaten a top chess computer in a match since. Artificial, yes. Intelligent, yes. But there’s more to AI than this.
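For the curious, here is a toy sketch of the idea at the heart of that approach: look ahead at every legal reply, score the outcomes, and play the move with the best guaranteed result. Chess is far too big to fit here, so this hypothetical Python example plays the much simpler game of Nim (take one to three sticks; whoever takes the last stick wins). Deep Blue’s real recipe layered custom hardware, a hand-tuned evaluation function and those vast databases on top of this kind of search.

```python
# An illustrative look-ahead search in the spirit of Deep Blue,
# applied to the tiny game of Nim. Not IBM's actual code: the
# game, names and numbers here are stand-ins for the idea.

def minimax(sticks, my_turn):
    """Score a pile: +1 if the computer can force a win, -1 if not."""
    if sticks == 0:
        # Whoever took the last stick has already won.
        return -1 if my_turn else 1
    scores = [
        minimax(sticks - take, not my_turn)
        for take in (1, 2, 3) if take <= sticks
    ]
    # On our turn, pick the best outcome; on theirs, assume the worst.
    return max(scores) if my_turn else min(scores)

def best_move(sticks):
    """Choose how many sticks to take from the current pile."""
    moves = [take for take in (1, 2, 3) if take <= sticks]
    return max(moves, key=lambda take: minimax(sticks - take, my_turn=False))

print(best_move(10))  # prints 2: leaving 8 sticks is a losing pile
```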

Facing the truth

Deep Blue relied on raw search power rather than ‘Machine Learning’, the approach in which computers use huge data sets to predict outcomes and deliver the best odds. More recently a technique called ‘Deep Learning’ has changed how we think about AI. Not only does Deep Learning draw its intelligence from huge data sets, it passes the information through layered networks of calculations that are tuned as the system learns, loosely inspired by how humans think.
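To make that less abstract, here is a minimal, self-contained sketch of such a network of calculations learning a simple pattern. Everything in it is illustrative rather than any real product’s code: the tiny network below learns XOR (output 1 when exactly one input is 1), while real Deep Learning systems apply the same idea with millions of weights and many more layers.

```python
# A toy "network of calculations" learning XOR. Illustrative
# only: real Deep Learning systems use the same idea at a
# vastly larger scale.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR of the two inputs

# One hidden layer of 8 units and a single output unit.
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(10_000):
    # Forward pass: information flows through the layers.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)
    # Backward pass: measure each weight's share of the error
    # and nudge it slightly in the direction that reduces it.
    d_out = (output - y) * output * (1 - output)
    d_hidden = (d_out @ W2.T) * hidden * (1 - hidden)
    W2 -= hidden.T @ d_out
    b2 -= d_out.sum(axis=0)
    W1 -= X.T @ d_hidden
    b1 -= d_hidden.sum(axis=0)

print(output.round(2).ravel())  # should be close to [0, 1, 1, 0]
```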

The practical upshots of computers that ‘learn’ are seemingly endless, but one rather insidious application has emerged in just the past few weeks. Experimenting with open source AI software, a hacker created a program called ‘FakeApp’ and shared it on the popular social news site Reddit. The software uses Deep Learning to filter through hundreds of images of a chosen celebrity, then maps those images onto explicit videos, superimposing one face over the other.

In 2017, two similar experiments were carried out at the University of Washington. The first used AI to create a fake audio recording of Barack Obama, after learning his speech patterns. The second overlaid that fake speech onto a real Obama video, so it looks as though he is reading out a speech he never gave. In the wrong hands, the political impact of this could be huge.

This echoes the initial fears around ‘Photoshopping’, when retouched photographs became the beauty industry’s standard way to sell cosmetics and dietary supplements. Since then, the world has become aware of the technology, and suspicious of the images we see on the internet as a result. Unfortunately, that is how we will have to think about video from now on too.

Should we be worried?

Perhaps the thing that worries IC professionals most is the effect AI will have on the workforce and on jobs.

Economist Kenneth Rogoff argues that “since the dawn of the industrial age, a recurrent fear has been that technological change will spawn mass unemployment. Neoclassical economists predicted that this would not happen, because people would find other jobs, albeit possibly after a long period of painful adjustment. By and large, that prediction has proven to be correct.”

Supporting this theory, a study by Deloitte analysed 140 years of census data and found that technology has actually created more jobs than it has replaced. Where technology has made jobs like typists, secretaries and weavers (to use the study’s examples) obsolete, there has been a rise in more ‘caring’ jobs, such as teaching and educational support work.

So Artificial Intelligence is on the rise, and we will be seeing more mention of it in the coming months and years. But we should remember that when one browser window closes, another one opens.

Hey Siri… Save.

Hey Siri… Save article.

Hey Siri… Don’t worry, I’ll do it myself.
