The rapid advancement of AI has brought significant changes to the world and continues to do so. While the challenges that accompany these advances, such as privacy invasion and socioeconomic inequality, are widely recognized, energy and resource strain remains an under-discussed risk. The growing use of AI is sharply increasing the workload of data centers worldwide, leading to greater environmental damage.
Workloads for video, artificial intelligence, machine learning, data analytics, and more, running in the cloud and at the edge and all dependent on CMOS processors, are skyrocketing beyond capacity. The challenges that arose during the pandemic accelerated the usage of these processors at an exponential rate, placing data centers under a tremendous burden. Dr. Avi Messica, co-founder of Neologic, notes this impact, stating that power consumption is a hidden cost of AI use that may affect our lives more immediately than the others.
The high cost of operating data centers must be addressed immediately, Messica notes. Continued data center usage drives up power consumption, which in turn increases CO2 emissions. High power consumption also raises operational costs and erodes the profitability of data center providers.
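To make the link between power consumption, operating cost, and emissions concrete, here is a back-of-envelope sketch. Every figure in it (the 10 MW IT load, the PUE of 1.5, the electricity price, and the grid carbon intensity) is an illustrative assumption, not a number from Neologic or any data center provider.

```python
# Rough estimate of a data center's annual energy cost and CO2 emissions
# from its average IT power draw. All parameter values are illustrative
# assumptions.

HOURS_PER_YEAR = 24 * 365  # 8,760

def annual_cost_and_co2(it_power_mw, pue=1.5,
                        usd_per_kwh=0.10, kg_co2_per_kwh=0.4):
    """Return (annual energy cost in USD, annual CO2 in metric tons).

    it_power_mw    -- average IT load in megawatts (assumed)
    pue            -- power usage effectiveness, facility/IT power (assumed)
    usd_per_kwh    -- electricity price (assumed)
    kg_co2_per_kwh -- grid carbon intensity (assumed)
    """
    facility_kwh = it_power_mw * 1000 * pue * HOURS_PER_YEAR
    cost_usd = facility_kwh * usd_per_kwh
    co2_tons = facility_kwh * kg_co2_per_kwh / 1000
    return cost_usd, co2_tons

cost, co2 = annual_cost_and_co2(it_power_mw=10)
print(f"~${cost / 1e6:.1f}M per year, ~{co2:,.0f} t CO2")
# A hypothetical 10 MW facility lands around $13M and 50,000+ tons of
# CO2 per year, which is why even double-digit percentage savings at the
# processor level matter.
```

Under these assumptions, a double-digit percentage cut in processor power propagates directly into millions of dollars and tens of thousands of tons of CO2 per facility per year.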
To tackle this issue, the industry must step up and deliver innovative solutions that improve power efficiency while meeting the growing demands of the global market. Neologic's founders have developed a new approach to microprocessor design that reduces the complexity of digital circuits, allowing for more efficient and compact processors than those built with current CMOS technology.
Neologic’s chip design technology delivers double-digit percentage improvements in processor power consumption or area. This reduces chip costs, and it also extends the lifespan of IT hardware and brings other advantages, including an ESG contribution, lower raw-material consumption, and less waste both at the foundry and when hardware reaches end of life.
Dr. Messica and his partner Ziv Leshem are confident that their technology provides a processor-level solution that data center server providers will find valuable, especially in the GPT-4 era. GPT-4 reportedly has a staggering 100 trillion parameters that require massive, lengthy training; 400 terabytes of data are loaded into memory and computed over. The cost of training such models over periods of weeks can easily skyrocket to tens or even hundreds of millions of dollars.
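The "tens to hundreds of millions of dollars" figure can be sanity-checked with simple arithmetic. The sketch below multiplies accelerator count, run length, and an hourly rate; the 10,000 GPUs, 8 weeks, and $2/GPU-hour are assumptions chosen for illustration, not reported figures for any specific model.

```python
# Back-of-envelope estimate of a large-model training run's accelerator
# cost. All inputs are illustrative assumptions.

def training_cost_usd(num_gpus, weeks, usd_per_gpu_hour):
    """Total accelerator cost: GPUs x hours x hourly rate (all assumed)."""
    hours = weeks * 7 * 24
    return num_gpus * hours * usd_per_gpu_hour

# Hypothetical run: 10,000 accelerators for 8 weeks at $2/GPU-hour.
cost = training_cost_usd(num_gpus=10_000, weeks=8, usd_per_gpu_hour=2.0)
print(f"~${cost / 1e6:.0f}M")  # lands in the tens of millions
```

Even with these modest assumed rates the total reaches roughly $27M, and larger clusters, longer runs, or repeated training cycles push it toward the hundreds of millions the article describes, which is the cost pressure Neologic's efficiency gains target.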
Currently, Neologic offers manufacturing-ready chip designs (soft and hard IP), and it plans to introduce its own data center processors. Overall, the company's solution goes beyond the current state of the art, offering the industry a more cost-effective path to power efficiency.