Facebook Latest FAANG To Enter Race For Super-Powered AI Chips

Facebook has reportedly joined fellow FAANG members Alphabet and Amazon in ploughing its own resources into the in-house research and development of new, more powerful chips able to meet the demands that AI is placing on processing power.

Moore's law, the observation that the number of transistors on a microchip, and with it roughly its processing power, doubles about every two years, held firm for decades but is now slowing. The tech industry's ambitions, however, have not slowed with it. The result is that the processing power of microchips is beginning to lag behind what the most advanced sections of the tech sector need. Specifically, the next breakthroughs in AI require more powerful microchips.

Recognising this new bottleneck, the tech giants have decided not to rely solely on the chip specialists and are instead developing their own processors fast enough to bring their AI ambitions to life. Next-generation digital assistants that demonstrate 'common sense' and can hold authentic conversations on any subject are a key target for Facebook, Google (Alphabet) and Amazon.

In Facebook's case, the social media giant also hopes that more advanced AI algorithms will help it police content posted by users more effectively. It has come under a great deal of criticism for allowing extremist and other harmful or manipulative content to be disseminated via Facebook. However, the astounding volume of content posted on Facebook every day makes doing so manually, or even with the help of current AI algorithms, an almost impossible task. Google faces a similar challenge with its YouTube video-streaming platform.

Facebook has announced a joint project with Intel but is also pursuing its own in-house R&D. Specialist AI chips are designed to optimise the performance and speed of a specific task, a break from the general-purpose chips on which the tech industry, and computing more generally, has been built. The R&D focus for these specialised chips is new silicon and new hardware architectures.

Facebook isn't, however, approaching the project with the ambition of becoming a chip manufacturer or of developing intellectual property that will give it a processing-power edge over rivals. As with past hardware projects, it plans to develop its AI chip designs to a certain point and then open-source them.

Ways big tech is looking to make chips both more powerful and more power-efficient include neural network architectures that adapt to the particular kind of data passing through them, or that only 'fire' the elements required to solve a particular problem, much as the neurons in our brains do. Future chip designs are expected to be heavily influenced by the design of the neural networks they will power.
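The 'fire only what's needed' idea is easiest to see in software. The toy mixture-of-experts layer below is an illustrative NumPy sketch, not anything Facebook or its partners have published: a gate routes each input through just one of several small sub-networks, so the unused elements are never computed, which is the kind of sparsity that specialised hardware could exploit for power savings.

```python
# Illustrative sketch of conditional computation: only the expert sub-network
# selected by the gate is evaluated for a given input; the rest stay idle.
import numpy as np

rng = np.random.default_rng(0)

class ToyMoELayer:
    def __init__(self, dim, num_experts=4, top_k=1):
        # One small weight matrix per expert, plus a gating matrix.
        self.experts = [rng.standard_normal((dim, dim)) * 0.1 for _ in range(num_experts)]
        self.gate = rng.standard_normal((dim, num_experts)) * 0.1
        self.top_k = top_k

    def forward(self, x):
        # Gate scores decide which experts this particular input activates.
        scores = x @ self.gate
        chosen = np.argsort(scores)[-self.top_k:]   # indices of the top-k experts
        weights = np.exp(scores[chosen])
        weights /= weights.sum()                    # softmax over the chosen experts only
        # Only the chosen experts are computed; skipping the others is where
        # the potential power saving comes from.
        return sum(w * np.tanh(x @ self.experts[i]) for w, i in zip(weights, chosen))

layer = ToyMoELayer(dim=8, num_experts=4, top_k=1)
x = rng.standard_normal(8)
print(layer.forward(x))   # output computed using just one of the four experts
```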
