Startup creates a giant computer chip to power artificial intelligence (AI)


Computer chips are generally small. The processor that powers recent iPads and iPhones is tinier than a fingernail, and even the devices used in cloud servers are smaller than a postage stamp. But a new chip from a startup called Cerebras is bigger than an iPad. The silicon monster measures almost 22 centimeters (about 9 inches) on a side, making it the biggest computer chip ever built and a monument to the tech industry's hopes for AI. Cerebras expects to supply it soon to tech companies attempting to build smarter AI. The chipmaker believes it can be used in large data centers and can help accelerate the progress of AI, from self-driving cars to digital assistants like Amazon's Alexa. Many firms are developing new chips for AI, including traditional chipmakers like Intel Corporation and Qualcomm Incorporated as well as startups in the USA, Britain, and China.

The boom in all things AI is driven by a technology called deep learning. AI systems built on it are developed using a method called training, in which algorithms optimize themselves for a task by analyzing example data. The training data might be medical scans annotated to mark tumors, or a bot's repeated attempts to win a video game. Such software generally becomes more powerful when it has more data to learn from, or when the learning system itself is larger and more complex. Eugenio Culurciello, who has worked on chip designs for AI, calls the scale and ambition of Cerebras' chip "crazy." Large-scale AI projects such as self-driving cars and virtual assistants demand intense computing power; the chip will be expensive, but some customers will probably use it.
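The idea of "training" described above can be sketched in a few lines. This is a deliberately tiny toy, assuming a one-parameter linear model fit by gradient descent; real deep-learning systems work on the same principle but with millions of parameters, which is why they demand so much computing power.

```python
# Toy illustration of "training": an algorithm tunes itself to a task
# by examining example data. A one-weight model y = w * x is fit by
# gradient descent; this is a sketch of the principle, not a real
# deep-learning system.

def train(examples, steps=200, lr=0.01):
    """Adjust weight w so that w * x approximates y for each (x, y) pair."""
    w = 0.0
    for _ in range(steps):
        # Gradient of mean squared error with respect to w.
        grad = sum(2 * (w * x - y) * x for x, y in examples) / len(examples)
        w -= lr * grad  # step downhill; more steps/data give a better fit
    return w

examples = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]  # hidden rule: y = 3 * x
w = train(examples)  # w converges toward 3.0
```

The same loop scales up: more example data and a bigger model mean more arithmetic per step, which is the workload chips like Cerebras' are built to accelerate.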

According to recent research on the energy consumption of deep-learning training, it can cost US$350,000 in computing power to produce a single piece of language-processing software. The for-profit AI lab OpenAI has estimated that between 2012 and 2018, the amount of computing power used in the largest published AI experiments doubled roughly every three and a half months. Training deep-learning software on tasks such as image recognition typically relies on clusters of many GPUs wired together. To build the bot that took on the video game Dota 2 last year, OpenAI ran hundreds of GPUs for weeks.
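The scale implied by that doubling rate is easy to work out. The sketch below computes the cumulative growth factor from a steady doubling every 3.5 months; the exact span OpenAI measured may differ from the round six years assumed here, so this is illustrative arithmetic rather than their published figure.

```python
# Implied compute growth if usage doubles every 3.5 months.
# The 3.5-month doubling time comes from the article; the 72-month
# span is an assumed round figure for 2012-2018.

def growth_factor(months, doubling_months=3.5):
    """Total multiplication of compute after `months` of steady doubling."""
    return 2 ** (months / doubling_months)

one_doubling = growth_factor(3.5)   # 2.0
two_doublings = growth_factor(7.0)  # 4.0
six_years = growth_factor(72)       # roughly a million-fold increase
```

Sustained over years, a 3.5-month doubling time compounds into a million-fold growth in compute, which is why labs turn to GPU clusters and specialized hardware.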