Artificial Intelligence: Explosive Growth, Power Crisis and Role of Photonics
Intelligence is a mechanism for solving problems, above all the problem of staying alive, which primarily means finding food and shelter. It also includes the ability to gather knowledge, to learn, to be creative, to form strategies and to think critically, and it manifests itself in a huge variety of behavior. All of these are aspects of real intelligence. When human endeavor recreates these features of the human brain in machines, the result is known as "artificial intelligence" (AI). Once machines mimic humans and can make their own decisions, think and learn, they are said to be artificially intelligent.
AI is anticipated to improve medical diagnosis
AI is anticipated to improve medical diagnosis, identify potential national security threats more quickly, and help solve crimes. Generative AI systems such as ChatGPT and Bard have the potential to provide new capabilities that can be applied in fields such as education, government, law and entertainment. Advanced chatbots, virtual assistants and language translation tools are examples of generative AI systems in widespread use.
AI is mitigating human efforts
AI is easing human effort on the global issues that grow with the population: food, safety, security, surveillance, water and global warming. AI also plays a crucial role in entertainment and media services such as Netflix, Spotify and Alexa.
The idea of artificial intelligence is very ancient. It traces back to the word "automaton", which comes from ancient Greek and means "acting of one's own will". Groundwork for AI began in the early 1900s, when the idea of "robots" with artificial brains first took hold.
In 1929 Japanese professor Makoto Nishimura built the first Japanese robot
In 1929, the Japanese professor Makoto Nishimura built the first Japanese robot, named "Gakutensoku". However, the real interest of the scientific fraternity in AI came during 1950-1956. Alan Turing published "Computing Machinery and Intelligence" in 1950.
John McCarthy coined the term "artificial intelligence" in 1955, in his proposal for the Dartmouth workshop. The field matured during 1957-1979: AI-based computer programs came into the picture and began replicating the thinking and decision-making abilities of human experts. Mathematicians and computer scientists around the world contributed to the maturation of the field, and in 1979 the Association for the Advancement of Artificial Intelligence (AAAI) was founded.
The boom in AI came during 1980-1987, driven both by major research breakthroughs and by additional government funding for researchers. Techniques that allowed computers to learn from their mistakes and make independent decisions, forerunners of today's "deep learning", became popular. In 1981 the Japanese government allocated $850 million to the Fifth Generation Computer Project based on AI.
The major concern is a huge power requirement
Since the inception of the concept, AI has come a long way. The major concern now is its huge power requirement. The power required, together with the slow transmission of electronic data between processor and memory, limits complex operations at high accuracy. As AI algorithms grow, power consumption in data centers is going through the roof. Training GPT-3 took something like 300,000 exaflops of computing power; on an ordinary high-end PC, that training could take something like 200-300 years. Running an AI algorithm at this scale is incredibly expensive.
It takes a huge amount of computational resources, and then one has to train the model, which means putting it through its paces until it actually gives accurate answers to queries. Power consumption for high-performance computing is going sky high: a compound annual growth rate of around 8.5% is predicted, with AI workloads accounting for about 80% of that power.
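To make the scale concrete, here is a rough back-of-the-envelope sketch of the training-time claim above. The total compute figure comes from the article; the PC's sustained throughput (about 30 TFLOP/s) is an assumption chosen only for illustration.

```python
# Back-of-the-envelope estimate of GPT-3 training time on a single PC.
# The total compute (~300,000 exaflops = 3e23 FLOPs) is the article's
# figure; the PC throughput of ~30 TFLOP/s sustained is an assumption.

TOTAL_FLOPS = 3e23          # ~300,000 exaflops of training compute
PC_FLOPS_PER_SEC = 3e13     # ~30 TFLOP/s sustained on a high-end PC

seconds = TOTAL_FLOPS / PC_FLOPS_PER_SEC
years = seconds / (60 * 60 * 24 * 365)
print(f"~{years:,.0f} years on one PC")  # roughly 300 years
```

Under these assumptions the answer lands at roughly 300 years, consistent with the 200-300 year range quoted above.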
Photonics in AI is being explored to reduce the hardware pain points in computing
In this regard, photonics is anticipated to play a major role. Photonics is being explored to relieve the hardware pain points of AI computing. Compared with electronics, optical technology has the distinct advantages of lower power and lower latency, two elements of particular significance to AI. Industry leaders such as IBM, HP and Intel are investing billions of dollars in the transition to photonics for AI because of its high bandwidth, low power consumption and efficient data movement. Photonic AI chips are on the way: computers that literally think with light instead of electrons.
Startups like Lightelligence are developing optical chips to power the next generation of high-performance computing tasks. By processing information with light, these chips offer ultrahigh speed, low latency and low power consumption that have not been possible in traditional electronic architectures. Researchers at Monash University in Melbourne have developed a single fingernail-sized chip that can transmit 39 terabits per second.
Future of AI is photonics
In 2020, researchers at George Washington University demonstrated that different wavelengths of light can be used to run multiple signals over a single optical fiber. The future of AI, therefore, is photonics: computers that use laser light to run AI tasks at blazing speed with minimal energy. Photons flow through waveguides and tunable lasers to perform matrix multiplications in parallel, reducing heat and latency. Experts say these chips could equip smartphones, edge sensors and data centers with on-device processing that protects privacy. Recent demos show real-time image and voice tasks achieving accuracy close to GPUs, but with far lower power draw.
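To see why matrix multiplication maps so naturally onto light, here is a minimal simulation sketch. It illustrates the general optical crossbar idea, not any particular vendor's chip: each input value rides on its own wavelength, each weight is a tunable transmission coefficient, and a photodetector on each output row sums the arriving light, producing one dot product per row in a single pass.

```python
import numpy as np

# Minimal sketch (an illustration, not an actual chip design) of how a
# photonic crossbar computes y = W @ x in one pass of light: each input
# x_j modulates the intensity of its own wavelength, each weight W_ij is
# a tunable transmission in [0, 1], and the photodetector on row i sums
# the arriving optical power, i.e. sum_j W_ij * x_j.

rng = np.random.default_rng(0)
W = rng.uniform(0, 1, size=(3, 4))   # transmission coefficients (weights)
x = rng.uniform(0, 1, size=4)        # input intensities, one per wavelength

# Electronically this sum is a loop of multiply-accumulates; optically,
# every product W_ij * x_j happens simultaneously as the light propagates.
y = np.array([np.sum(W[i] * x) for i in range(W.shape[0])])

assert np.allclose(y, W @ x)         # same result as a standard matmul
print(y)
```

The key point is that the "computation" is just light passing through the device, which is where the speed and energy advantages over clocked electronic multiply-accumulate units come from.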
The way we can capture light, put it into a chip, manipulate it and transfer information with it is therefore quite amazing. The processors in our phones and computers work with electrons; compared with electrons, photons are much faster. Moore's law is still going strong. Moore's law is a prediction made by the American engineer Gordon Moore in 1965 that the number of transistors per silicon chip will double every year.
Process nodes are getting smaller every year, and manufacturers are packing more transistors into every CPU made today. But there is also a companion law called "Dennard scaling", which states that as transistors get smaller, their power density stays constant. At small geometries, leakage current becomes dominant and Dennard scaling no longer holds. So it is now really hard for chip companies to raise performance without burning a ton of power.
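A toy calculation makes the trend concrete. The baseline used here (the Intel 4004's roughly 2,300 transistors in 1971) and the two-year doubling period are well-known figures chosen for illustration; the article itself quotes Moore's original one-year prediction.

```python
# Toy illustration of Moore's law: transistors per chip doubling on a
# fixed cadence. Baseline: Intel 4004 (1971, ~2,300 transistors), with
# the commonly cited two-year doubling period (Moore's original 1965
# prediction was one year). Both figures are assumptions for illustration.

def transistors(year, base_year=1971, base_count=2_300, years_to_double=2.0):
    """Predicted transistor count per chip in a given year."""
    return base_count * 2 ** ((year - base_year) / years_to_double)

print(f"2020 prediction: {transistors(2020):,.0f} transistors")  # ~50 billion

# Under Dennard scaling, shrinking transistors kept power density constant,
# so the extra transistors came "for free". Once leakage made Dennard
# scaling break down, more transistors began to mean more power.
```

The ~50 billion figure is in the right ballpark for the largest chips of that era, which is why the power problem, not the transistor count, is now the binding constraint.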
There are many band-aid solutions out there, and more are coming; each is just more tape on a pipe that is going to burst. The idea of photonic computing has been a bit of a holy grail for a long time. It is unlike anything built today: it has lasers that need fibres connecting to the chip, so we have to build not just the chip but also the boards the chip goes onto. Photonics is therefore anticipated to play a crucial role in future generations of AI.
Author
The author, Dr. Satchi Singh, is a researcher in the field of optics and photonics.