DeepL to Deploy Nvidia DGX SuperPOD: A New Era in Language AI Computation

DeepL is set to deploy an Nvidia DGX SuperPOD with DGX GB200 systems by mid-2025, expanding the computational power behind its Language AI research and development. The deployment will exceed the capabilities of the company's current supercomputer, DeepL Mercury, and support DeepL in delivering instant, high-quality translations to its global customer base, which includes many Fortune 500 companies.

In an announcement from Cologne, Germany, DeepL said it will carry out the first European deployment of an Nvidia DGX SuperPOD featuring DGX GB200 systems. Slated to become operational by mid-2025, the system will expand DeepL's research capacity and allow the company to push the boundaries of its Language AI platform, which helps enterprises and professionals work across languages. Jarek Kutylowski, CEO and Founder of DeepL, said: “DeepL has always been a research-led company, which has enabled us to develop Language AI for translation that continues to outperform other solutions on the market.”

The investment in Nvidia's accelerated computing is intended to give DeepL's engineering teams the capacity to introduce new features and refine the models its customers rely on. The DGX SuperPOD with DGX GB200 systems uses liquid cooling and a rack-scale architecture that can scale to tens of thousands of GPUs, giving DeepL the resources to train and run the high-performance AI models central to its generative AI plans. The new deployment will also surpass DeepL Mercury, the company's current supercomputer and an entrant on the Top500 list.

Charlie Boyle, Nvidia's vice president for the DGX platform, emphasized the need for speed and efficiency in AI applications: “Customers using Language AI applications expect nearly instant responses…” The comment underscores why DeepL's infrastructure upgrade matters for delivering AI solutions that improve cross-cultural communication.

DeepL, founded in 2017, has made significant strides in language processing and translation technologies. The move to deploy the Nvidia DGX SuperPOD reflects the company's strategy of expanding its computational capabilities. DeepL aims to use the new systems to serve its customer base of more than 100,000 organizations, including major Fortune 500 players, with higher-quality and more responsive AI-driven language tools. The announcement follows a series of recent milestones, including the opening of a New York tech hub and the launch of a next-generation large language model (LLM) that the company says outperforms competitors such as GPT-4 on its benchmarks.

DeepL's deployment of the Nvidia DGX SuperPOD marks both a major increase in computing power and a deeper commitment to refining its Language AI technologies. The move is intended to help businesses worldwide communicate more effectively across languages and cultures. As DeepL continues to grow, including its recent addition to the Forbes Cloud 100 list, its role in breaking down language barriers becomes more pronounced.

Original Source: www.hpcwire.com
