SC23
Highlights from the International Conference for High Performance Computing, Networking, Storage, and Analysis.
Great Partner Demonstrations of AI Inference With & Without CPUs
We shared much more exciting news at SC23, including partnerships with valued Deep Learning Accelerator (DLA) leaders such as Qualcomm, AMD, and IBM, who demonstrated the NR1-S™ with their own DLAs. In other words, a CPU-free server pairing only DLAs with the NR1-S to achieve ideal AI-centric inferencing in the data center.
Lenovo and Supermicro were also at our side as the first OEM adopters of the NR1-M for integrated inferencing, combining CPU, DLA, and NAPU to offload the host hardware for 10x performance, lower latency, and higher energy efficiency. That's a triple win for deploying trained AI models: good for linear scalability, good for your wallet, and good for the environment.
Thinking Differently: The Future of AI-Centric Data Centers
LIVE SESSION
During the conference, CEO Moshe Tanach gave a live talk at the main exhibit hall emphasizing the problems in today's CPU-centric data centers, which were never designed for AI. He asserted that the world's data centers currently run only 30 percent of their CPU silicon at full capacity, leaving a significant 70 percent as waste. NeuReality addresses this issue with an exceptionally efficient AI system architecture that takes silicon to full performance: agile NR1 NAPUs now handle the tasks that CPUs with GPU add-ons attempt to manage but that inadvertently create bottlenecks, and very costly ones at that.
He asked the AI tech community to think differently about the kind of data centers the world needs to support power-hungry AI pipelines and applications of the future.
He made the case for far more efficient AI Inference at scale and highlighted the first movers taking bold steps with NeuReality: AMD, IBM, Qualcomm, Lenovo, and Supermicro. It takes an industry to make revolutionary change, so he invited technologists, engineers, and software developers to join NeuReality in achieving the full democratization of AI: affordable, accessible, and available to all.
Missed Us at SC23? Connect With Us Now!
Whether you’re handling Computer Vision, Recommendation Engines, Natural Language Processing, Fraud Detection, or Financial Risk Modeling today, or getting ready for big, intensive Generative AI and Large Language Models in the future, let’s talk. We’ll show you different ways to take advantage of AI Inference as it was meant to be.
With NeuReality’s performance promise, you can create more and better customer AI experiences to drive revenue while reducing daily operating costs.
Find out how. Contact us today.