Game of Thrones for the Self-Driving Car Market: How the Battle Lines Are Being Drawn - June 9, 2018
New ‘Wonder Drug’ Made From Poo Represents Great Hope For Medicine - December 11, 2019
Naspers Ups Just Eat Offer To £5 Billion In Move To Gazump Takeaway.com - December 10, 2019
Ecommerce social media marketing strategies - December 9, 2019
Social shopping websites making customer journeys short and seamless - December 9, 2019
Fortnight Developer Epic Ploughing Profits Into Games Development Platform - December 4, 2019
Singapore’s Sovereign Wealth Fund Eyes UK Biotech Star Oxford Nanopore - December 3, 2019
On-Demand Wages Advance Start-Up Secures £8 Million Funding - December 3, 2019
Second Hand Fashion Platform Achieves Unicorn Status - December 1, 2019
Alibaba Fintech Arm Raising $1 Billion Investment Fund - December 1, 2019
RBS Takes First Step Into Digital-Only World As It Debuts New Bank Bó - November 29, 2019
AI Learns To Hide Information From Its Creators
CycleGAN, a kind of advanced AI known as a ‘neural network’ developed by leading experts from Google and Stanford University has been reported as making the intriguing, if somewhat disturbing, development of hiding data from its creators to ‘cheat’ at a task assigned to it. The sneaky AI is a reminder just how clever neural networks are becoming and the need for careful checks and balances to be put in place now before it becomes ever more sophisticated.
Tasked with teaching itself how to convert aerial satellite images into street maps such as those used by Google Maps, and then back into aerial images again, found a short cut to make its job easier. Details CycleGAN chose to omit when making the initial conversion suddenly reappeared when the street map images were converted back. Theoretically there should have been no connection between the original aerial images and those reverse engineered from the street maps. The two tasks should have been separate.
In its coverage of the spooky development, online technology media TechCrunch reported that details such as skylights on buildings not included in the street map image reappeared in the images converted back. It ‘hid’ the extra data inside the street map files in a ‘nearly imperceptible, high-frequency signal’. This helped it fulfil a command given to it for cyclical consistency but represents a slight of hand the AI came up with all on its own.
The AI taught itself to become a master of stenography – the skill of encoding data in images in a way imperceptible to the human eye. It was a short cut which allowed the neural network to achieve the results it was told to while avoiding actually learning how to perform the task in the way it was meant to, speeding up the process.
Artificial neural networks (ANN) try to simulate the way our own brains assimilate information and learn from it. ANNs pick out patterns in data and come to conclusions based on them. The data is processed through different levels, like contexts in real life, with the ‘learning’ a result of how the data fits into and reacts to these different contexts. A further more recent development has been Adversarial Neural Networks which consist of two competing AIs which learn from each other, further refining their final output.