Edgecore Networks Expands AI Data Center Solutions with New 32-Port 800G Switch at OCP Summit
Edgecore Networks, a leader in innovative networking solutions for enterprises, data centers, and service providers, has announced the addition of a new 32-port switch to its 800G AIS series. These AI switches are designed to meet the high demands of AI and machine learning (ML) workloads, providing low latency, high radix, and dynamic load balancing to minimize network congestion.
The switches are optimized for lossless networking, incorporating key technologies such as RoCEv2, DCQCN, and Dynamic Load Balancing (DLB), and run the open-source SONiC network operating system, which enables the seamless scaling and optimization critical for AI/ML traffic patterns and latency-sensitive applications. Edgecore’s comprehensive solution also includes pluggable optical modules, offering easy integration with modern data centers.
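To make these lossless-networking building blocks concrete, the snippet below is a minimal, illustrative sketch of what enabling PFC and ECN marking for RoCEv2 traffic can look like in a SONiC-style config_db fragment, generated here with Python. It is not Edgecore’s shipped configuration: the port names, priorities, profile name, and thresholds are assumptions for illustration, and a production deployment would also define buffer pools and queue maps.

```python
import json

# Illustrative SONiC-style config_db fragment for lossless (RoCEv2) traffic.
# All names and thresholds below are assumptions, not Edgecore-specific defaults.

LOSSLESS_PRIORITIES = "3,4"          # traffic classes commonly reserved for RoCEv2
PORTS = ["Ethernet0", "Ethernet8"]   # hypothetical front-panel ports

config_db = {
    # ECN-marking WRED profile: DCQCN uses these marks as its congestion signal.
    "WRED_PROFILE": {
        "LOSSLESS_ECN": {
            "ecn": "ecn_all",
            "green_min_threshold": "250000",
            "green_max_threshold": "1000000",
            "green_drop_probability": "5",
        }
    },
    # Enable PFC on the lossless priorities for each port.
    "PORT_QOS_MAP": {
        port: {"pfc_enable": LOSSLESS_PRIORITIES} for port in PORTS
    },
    # Attach the ECN-marking profile to the matching egress queues.
    "QUEUE": {
        f"{port}|{queue}": {"wred_profile": "LOSSLESS_ECN"}
        for port in PORTS
        for queue in LOSSLESS_PRIORITIES.split(",")
    },
}

if __name__ == "__main__":
    # Emit the fragment; on a real switch it would be merged into config_db.
    print(json.dumps(config_db, indent=2))
```

Together, PFC prevents drops on the lossless priorities while ECN marks give DCQCN the feedback it needs to throttle senders before buffers fill.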
Partnerships with SONiC software ecosystem collaborators such as Aviz Networks, BE Networks, Dorado, and Netris enhance the solution with greater visibility, performance assurance, and deployment efficiency. This vendor-agnostic approach ensures compatibility with popular server NICs, giving customers the flexibility to adapt to evolving AI data center needs.
Nanda Ravindran, Vice President of Product Management at Edgecore Networks, emphasized the importance of this innovation: “We are thrilled to showcase our latest AI/ML data center 800G platforms at the OCP Summit. Through our open-source community and innovations, we offer our customers the flexibility they need to scale and optimize their AI/ML fabric solutions.”
Key highlights of Edgecore’s AI data center solutions:
- 64-Port 800G Switch: Supports 128-256 nodes, featuring NCCL PXN to reduce latency in cross-switch communication, ensuring scalability for AI workloads.
- Lossless Ethernet: Optimized for high-performance AI applications, offering maximum throughput and minimal latency.
- DCQCN with PFC/ECN Support: Advanced traffic management ensures smooth AI fabric operation with minimal congestion.
- ECMP Eligible Mode: Breaks large flows into smaller flowlets and spreads them across ECMP links, reducing congestion and improving load distribution (a conceptual sketch follows this list).
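As a rough illustration of the flowlet idea behind this mode, the sketch below shows the general technique, not Edgecore’s hardware implementation: packets of a flow stay pinned to one ECMP member while they arrive back to back, but after an idle gap the flow may be re-hashed onto a different member. The member names, the gap threshold, and the load-agnostic hash are simplifying assumptions.

```python
import time
import zlib

ECMP_MEMBERS = ["port1", "port2", "port3", "port4"]  # hypothetical next hops
FLOWLET_GAP_S = 0.000050  # 50 us idle gap; real switches use hardware timers

# Per-flow state: (chosen member index, timestamp of last packet seen).
_flow_state = {}

def pick_member(five_tuple, now=None):
    """Flowlet-aware ECMP member selection (conceptual sketch only)."""
    now = time.monotonic() if now is None else now
    state = _flow_state.get(five_tuple)

    if state is not None and (now - state[1]) < FLOWLET_GAP_S:
        member_idx = state[0]  # same flowlet: keep the current path
    else:
        # New flowlet: re-pick a member. A real DLB implementation would
        # consult per-member load or queue depth; a hash stands in for that here.
        seed = repr((five_tuple, now)).encode()
        member_idx = zlib.crc32(seed) % len(ECMP_MEMBERS)

    _flow_state[five_tuple] = (member_idx, now)
    return ECMP_MEMBERS[member_idx]

# Example: packets within a burst always take the same member, while a new
# burst after an idle gap may be steered to a different one.
flow = ("10.0.0.1", "10.0.0.2", 4791, 4791, "UDP")  # RoCEv2 uses UDP port 4791
print(pick_member(flow), pick_member(flow))  # same member (same flowlet)
time.sleep(0.001)
print(pick_member(flow))                     # may differ (new flowlet)
```

The key property is that re-hashing happens only after the flow has been idle longer than the gap threshold, so successive flowlets can take different paths without reordering packets that are still in flight.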
Edgecore will present its end-to-end AI networking solutions, including demonstrations of advanced open-loop and immersion cooling technologies for AI infrastructure, at the OCP Global Summit, taking place at the San Jose Convention Center (Booth #A4) from October 15-17.
About Edgecore Networks
Edgecore Networks Corporation, a subsidiary of Accton Technology Corporation, provides wired and wireless networking products to customers across AI/ML, cloud data centers, service providers, and enterprises. As a leader in open networking, Edgecore offers a wide range of products, including Wi-Fi access points, packet transponders, virtual PON OLTs, and 800G switches with open-source or commercial NOS and SDN software. For more information, visit www.edge-core.com.