Course Overview
Introduction to AI Inference with Docker
- Understanding AI inference workloads
- Benefits of containerized inference
- Deployment scenarios and constraints
Building AI Inference Containers
- Selecting base images and frameworks
- Packaging pretrained models
- Structuring inference code for container execution
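The packaging steps above can be sketched as a minimal Dockerfile. The base image, file layout, and entrypoint are illustrative assumptions, not prescribed by the course:

```dockerfile
# Sketch: package a pretrained model with its inference code.
# Image tag, paths, and module name are illustrative assumptions.
FROM python:3.11-slim

WORKDIR /app

# Install only the inference dependencies, pinned for reproducibility
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the pretrained model weights and the inference service code
COPY models/model.pt ./models/
COPY app/ ./app/

# Expose the API port and start the HTTP inference server
EXPOSE 8000
CMD ["python", "-m", "app.server"]
```

Copying pinned dependencies before the application code lets Docker cache the dependency layer, so rebuilds after code changes stay fast.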
Securing Containerized AI Services
- Minimizing container attack surface
- Managing secrets and sensitive files
- Safe networking and API exposure strategies
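One common pattern for the secrets topic above is reading credentials from a file mounted by Docker (Compose and Swarm mount secrets under `/run/secrets`) instead of baking them into the image. This is a minimal sketch; the secret name and the environment-variable fallback for local development are assumptions:

```python
import os
from pathlib import Path

def load_secret(name: str, secrets_dir: str = "/run/secrets") -> str:
    """Read a secret from a Docker-mounted file, falling back to an
    environment variable for local development."""
    secret_file = Path(secrets_dir) / name
    if secret_file.is_file():
        # Docker mounts each secret as a file under /run/secrets
        return secret_file.read_text().strip()
    value = os.environ.get(name.upper())
    if value is None:
        raise RuntimeError(f"secret {name!r} not found")
    return value
```

File-mounted secrets never appear in `docker inspect` output or the image layers, unlike environment variables passed at build time.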
Portable Deployment Techniques
- Optimizing images for portability
- Ensuring predictable runtime environments
- Managing dependencies across platforms
Local Deployment and Testing
- Running services locally with Docker
- Debugging inference containers
- Testing performance and reliability
Deploying on Servers and Cloud VMs
- Adapting containers for remote environments
- Configuring secure server access
- Deploying inference APIs on cloud VMs
Using Docker Compose for Multi-Service AI Systems
- Orchestrating inference with supporting components
- Managing environment variables and configs
- Scaling microservices with Compose
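The orchestration topics above might look like the following Compose file, pairing an inference API with a supporting cache. Service names, images, ports, and the environment variable are illustrative assumptions:

```yaml
# Sketch of a Compose file wiring an inference API to a supporting service.
services:
  inference:
    build: .
    ports:
      - "8000:8000"
    environment:
      - MODEL_PATH=/models/model.pt
    depends_on:
      - redis
  redis:
    image: redis:7-alpine
```

Scaling with `docker compose up -d --scale inference=3` would require dropping the fixed host port (or using a port range), since multiple replicas cannot all bind host port 8000.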
Monitoring and Maintenance of AI Inference Services
- Logging and observability approaches
- Detecting failures in inference pipelines
- Updating and versioning models in production
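One way to support the versioning and observability topics above is to fingerprint the model artifact at load time and attach that version to every response and log line. This is a sketch under assumptions: the function names and response shape are hypothetical, not part of the course material:

```python
import hashlib
import json
from pathlib import Path

def model_fingerprint(path: str) -> str:
    """Hash the model file so logs and responses identify exactly
    which artifact is serving traffic."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    return digest[:12]  # short prefix is enough for traceability

def annotate_response(prediction: dict, version: str) -> str:
    """Attach the model version to every API response."""
    return json.dumps({"prediction": prediction, "model_version": version})
```

With the fingerprint in every response, a failure report can be traced back to the exact model artifact that produced it, even across rolling updates.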
Summary and Next Steps
Requirements
- An understanding of basic machine learning concepts
- Experience with Python or backend development
- Familiarity with foundational container concepts
Audience
- Developers
- Backend engineers
- Teams deploying AI services
Testimonials (5)
OC is new to us and we learnt a lot, and the labs were excellent.
sharkey dollie
Course - OpenShift 4 for Administrators
Very informative and to the point. Hands-on practice.
Gil Matias - FINEOS
Course - Introduction to Docker
Labs and technical discussions.
Dinesh Panchal - AXA XL
Course - Advanced Docker
It gave a good grounding for Docker and Kubernetes.
Stephen Dowdeswell - Global Knowledge Networks UK
Course - Docker (introducing Kubernetes)
I mostly enjoyed the knowledge of the trainer.