The hardware and infrastructure that enable AI, machine learning, and inference have improved steadily year after year. As high-performance processing, GPU acceleration, and data storage have advanced, data-center performance has become powerful enough to support machine learning and training workloads. Still, data-bottleneck challenges persist over WANs when teams look to implement AI workloads in real-world, production environments.

With the advent of 5G wireless networks, deploying AI at the edge – and managing the movement of crucial data between the edge and the data center – is becoming more practical.

Source: https://www.hpcwire.com/2021/08/18/leveraging-5g-to-support-inference-at-the-edge/

