Highlights
Docker’s container solution was a game changer
The ZEISS Research Microscopy Solutions (RMS) team faced challenges in its artificial intelligence (AI) and machine learning (ML) journey. The team crafted a strategy centered on the powerful combination of cloud platforms and Docker’s container technology, marking the beginning of a transformative era. Working with Docker, the ZEISS team took advantage of rapid deployment and scaling, GPU accessibility, cross-platform portability, resource efficiency, version control and reusability, isolation and security, developer familiarity with Docker solutions, and the vast Docker ecosystem.
Introduction
International optics leader leverages containerized AI
For more than 175 years, ZEISS has been shaping technological progress, advancing the world of optics with solutions from its four segments — Semiconductor Manufacturing Technology, Industrial Quality & Research, Medical Technology, and Consumer Markets — and meeting its customers’ needs.
The ZEISS Research Microscopy Solutions (RMS) team uses containers to deploy AI models and code across various platforms. For example, ZEISS leverages its cloud-based arivis AI platform to annotate multi-dimensional microscopy image data and train AI models for segmentation. Those models — including all the code required to run them — are deployed via containers with application-specific interfaces for different clients.
Company profile
About ZEISS
ZEISS is an internationally leading technology enterprise in the optics and optoelectronics industries. The ZEISS Group has more than 48,000 employees across six continents and generates more than €10 billion in annual revenue.
Challenges
The ZEISS Research Microscopy Solutions (RMS) need for computer vision across platforms
As a leading manufacturer of microscopes, ZEISS offers solutions and services for life sciences and materials research, teaching, and clinical routine. Reliable ZEISS systems are used worldwide for manufacturing and assembly in high-tech industries, for the exploration and processing of raw materials, and for life science and basic research. Because modern microscopy is all about “actionable information” extracted from large, multi-dimensional datasets, powerful computer vision and processing methods are an integral component of the RMS software portfolio. In particular, the need for robust and powerful segmentation methods using state-of-the-art AI models is a common use case.
The R&D team at the Product Center for Software inside RMS faced challenges in its artificial intelligence (AI) and machine learning (ML) journey. They needed to execute AI models trained on their cloud platform in local Windows-based clients such as ZEN, ZEN core, and arivis Pro, with GPU support. They wanted an efficient way to distribute these AI models and update them independently of the client code. The AI models needed to produce consistent results across different platforms, regardless of where they ran.
The RMS product teams also wanted to shield clients from the complexities of the model internals. These application-specific internals included various ML technologies, data handling methods, pre- and post-processing routines, and APIs. Their primary goal was straightforward: run AI models and code consistently and distribute code with diverse dependencies smoothly.
Integrating AI functionality into the ZEISS team’s client systems presented a challenge. To achieve the same results when running AI models on a client, the local environment needs to be identical to the environment used during the training on the cloud platform. In other words, the challenge is to keep both worlds in sync. Containers are the solution to this challenge.
Additionally, the containerized AI algorithms require access to GPU resources from a Linux-based container on a Windows host system, which adds another layer of complexity to their challenge.
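To make that requirement concrete, a containerized inference entry point typically has to detect whether the host has passed a GPU through to the Linux container (for example via Docker Desktop’s WSL 2 backend on Windows) and fall back to the CPU otherwise. The following minimal sketch assumes a PyTorch-based model and is illustrative only, not ZEISS’s actual code:

```python
import torch


def select_device() -> torch.device:
    """Use the GPU if the host passed one through to the container,
    otherwise fall back to the CPU."""
    return torch.device("cuda" if torch.cuda.is_available() else "cpu")


if __name__ == "__main__":
    print(f"Running inference on: {select_device()}")
```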
Solution
Docker helps put a lens on ZEISS AI solutions
Containers have become a mature and standard deployment technology. Always at innovation’s cutting edge, the ZEISS Research Microscopy Solutions team saw the clear benefits of containers. These containers deploy seamlessly in any environment, are simple to manage, scale, and patch, and are the preferred technology for their machine-learning distribution needs. Their compatibility with a wide range of tools and frameworks boosts their attractiveness. The team’s hands-on work with their training data platform’s cloud modules, which use containers, gave them a deep understanding of the technology’s potential.
Facing these challenges, the ZEISS team crafted a strategy centered on the powerful combination of cloud platforms and container technology. They centralized annotations and training on their cloud platform. The ZEISS team used AzureML to execute and oversee this training, simplifying the training process and automatically creating and registering the AiModelContainer. They designed this container to house the model, all vital libraries, and the necessary code for data I/O, pre- and post-processing, tiling, and model inference.
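The internals of the AiModelContainer are not public; the sketch below only illustrates, with assumed names and a placeholder model, how such a self-contained pipeline of data I/O, tiling, pre- and post-processing, and inference might be organized in Python:

```python
import numpy as np

TILE = 512  # assumed tile edge length; the real value depends on the model


def preprocess(tile: np.ndarray) -> np.ndarray:
    # Normalize intensities to [0, 1]; real pre-processing is model specific.
    return tile.astype(np.float32) / max(float(tile.max()), 1.0)


def run_model(tile: np.ndarray) -> np.ndarray:
    # Placeholder for the trained segmentation model shipped inside the container.
    return (tile > 0.5).astype(np.uint8)


def postprocess(mask: np.ndarray) -> np.ndarray:
    # Placeholder for label cleanup, hole filling, and similar steps.
    return mask


def segment(image: np.ndarray) -> np.ndarray:
    """Tile a large 2-D image, run inference per tile, and stitch the result."""
    height, width = image.shape
    result = np.zeros((height, width), dtype=np.uint8)
    for y in range(0, height, TILE):
        for x in range(0, width, TILE):
            tile = image[y:y + TILE, x:x + TILE]
            result[y:y + TILE, x:x + TILE] = postprocess(run_model(preprocess(tile)))
    return result
```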
The team set up a secure connection between the cloud and local environments. This connection lets the client download the needed container image directly to the local system, preserving the data’s integrity and security. ZEISS entrusted the ContainerApps library with managing these container images. This library took on tasks from starting and stopping containers to mounting the GPU, ensuring peak performance.
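ZEISS’s ContainerApps library itself is not publicly documented, but the kind of lifecycle work it takes on (pulling the registered model image, starting the container with the GPU attached, and stopping it again) can be sketched with the Docker SDK for Python. The image name and port below are placeholders:

```python
import docker
from docker.types import DeviceRequest

IMAGE = "registry.example.com/ai-model-container:1.2.0"  # placeholder image reference

client = docker.from_env()

# Pull the model image produced and registered by the cloud training pipeline.
client.images.pull(IMAGE)

# Start the container with all available GPUs attached and the REST port published.
container = client.containers.run(
    IMAGE,
    detach=True,
    device_requests=[DeviceRequest(count=-1, capabilities=[["gpu"]])],
    ports={"8080/tcp": 8080},
)

# ... the client application talks to the container's REST interface here ...

container.stop()
container.remove()
```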
Communication between the AiModelContainer and local clients became a cornerstone of the solution. The ZEISS team rolled out a well-structured and versioned REST API, guaranteeing smooth, efficient, and error-free communication. This focused strategy tackled the immediate challenges and set the stage for scalable and lasting AI integrations. Using containers with a well-defined interface decouples the development of new AI methods from the release cycle of the main software platform (separation of concerns). The AI code can evolve without impacting existing client applications, thereby reducing cross-dependencies between components.
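ZEISS has not published the interface itself; the snippet below is a minimal sketch of what a versioned segmentation endpoint inside such a container could look like, here using Flask with hypothetical route and field names:

```python
import io

import numpy as np
from flask import Flask, jsonify, request
from PIL import Image

app = Flask(__name__)


@app.route("/v1/segment", methods=["POST"])  # version prefix keeps clients decoupled
def segment():
    # The client uploads raw image bytes; the container owns all pre/post-processing.
    image = np.asarray(Image.open(io.BytesIO(request.data)).convert("L"))
    mask = (image > image.mean()).astype(np.uint8)  # stand-in for real model inference
    return jsonify({"shape": list(mask.shape), "foreground_pixels": int(mask.sum())})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```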
Key benefits
Speed, security, and choice
Working with Docker yielded numerous benefits for ZEISS: rapid deployment and scaling, GPU accessibility, cross-platform portability, resource efficiency, version control and reusability, isolation and security, developer familiarity with Docker solutions, and the vast Docker ecosystem.
Together, these factors solidified Docker as the optimal solution for the ZEISS team, empowering them to amplify performance, guarantee reproducibility, and streamline their software architecture while working securely.
Results
Docker helps ZEISS see a clear solution in containerization for AI/ML
The ZEISS Research Microscopy Solutions team adopted Docker’s solution, marking the beginning of a transformative era. This adoption improved consistency in AI model outcomes and streamlined the deployment process, harmonizing the relationship between models and their dependencies. Using the “containers and code” approach for deployment lets the team get the same results in all client applications because they all use identical environments and tools to run the models inside the containers. This greatly reduced the need for code duplication, letting the development teams focus on creating new features instead of keeping code in sync across platforms. The ZEISS team didn’t just address the challenges of AI model deployment and execution — they also boosted their operational efficiency and value delivery.
As the ZEISS team embarked on this journey, they faced a clear set of challenges. Docker’s container technology offered a compelling solution to these challenges. With a clear vision and the evident benefits of Docker, ZEISS easily gained support and buy-in from key stakeholders.
They shifted to packaging AI models with their dependencies, renewing their focus on the interface design between containers and clients. This strategic decision refined the internal architecture and clarified team roles. A significant benefit of this strategy was the autonomy it provided to teams. Teams could now innovate on AI methods and integrate them into clients independently. This clear division of responsibilities streamlined workflows, reduced alignment efforts, and paved the way for greater efficiency and innovation at ZEISS.