# Docker Interview Pack
This folder contains Docker-focused interview material grounded in the MangaAssist project.
## What This Covers
- ECS Fargate as the steady-state container platform
- ECR, multi-stage Docker builds, and image minimization
- SageMaker inference containers for custom model serving
- vLLM container optimizations for throughput and cost
- Container startup, warmup, and cold-start mitigation
- Dockerized integration testing with LocalStack and TestContainers
- Container supply-chain security with scans, SBOMs, signatures, and rollback policy
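To make the multi-stage build and image-minimization topics concrete, here is a minimal sketch of the pattern. The stage names, base images, and file paths (`mangaassist` service layout, `requirements.txt`, `src/`) are illustrative assumptions, not taken from the project files.

```dockerfile
# Hypothetical multi-stage build: heavy builder stage, slim runtime stage.
FROM python:3.11 AS builder
WORKDIR /app
COPY requirements.txt .
# Install dependencies into an isolated prefix so only resolved
# packages are copied into the final image
RUN pip install --prefix=/install -r requirements.txt

FROM python:3.11-slim AS runtime
WORKDIR /app
# Copy only installed dependencies and source; compilers and build
# caches stay behind in the builder stage
COPY --from=builder /install /usr/local
COPY src/ ./src/
# Run as a non-root user to shrink the attack surface
RUN useradd --create-home appuser
USER appuser
CMD ["python", "-m", "src.main"]
```

The payoff is a smaller runtime image (faster ECR pulls and Fargate task startup) and a reduced CVE surface, since build tooling never ships to production.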
## Primary Study File
## Companion LLD
## Individual Scenario Deep Dives
Each scenario is broken out into its own file, covering a full story grounded in MangaAssist, deep-dive Q&A, decision tables, the tradeoffs involved, how the design scales, and the intuition behind each choice.
| # | File | Topic |
|---|---|---|
| 1 | scenario-01-multistage-fargate.md | Multi-stage builds + ECS Fargate hybrid compute |
| 2 | scenario-02-coldstart-elimination.md | SageMaker inference container cold-start elimination |
| 3 | scenario-03-vllm-serving.md | vLLM serving container — throughput, cost, operability |
| 4 | scenario-04-gpu-oom-containment.md | GPU OOM → container restart containment |
| 5 | scenario-05-dockerized-ci-testing.md | Dockerized CI integration testing (LocalStack + TestContainers + WireMock) |
| 6 | scenario-06-supply-chain-security.md | Container supply-chain security and release gating |
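For the supply-chain security and release-gating scenario (row 6), a CI gate typically chains scanning, SBOM generation, and signing. The sketch below is a hypothetical GitHub Actions fragment; the specific tools (Trivy, Syft, Cosign) and the `IMAGE`/`DIGEST` variables are assumptions for illustration, and the scenario file may use different scanners or signers.

```yaml
# Hypothetical release gate: block promotion unless the image scans
# clean, has an SBOM artifact, and is signed for later verification.
jobs:
  release-gate:
    runs-on: ubuntu-latest
    steps:
      - name: Scan image for known CVEs; fail on HIGH/CRITICAL findings
        run: trivy image --exit-code 1 --severity HIGH,CRITICAL "$IMAGE"
      - name: Generate an SPDX SBOM to attach to the release
        run: syft "$IMAGE" -o spdx-json > sbom.spdx.json
      - name: Sign the image digest so deploy-time policy can verify provenance
        run: cosign sign --yes "$IMAGE@$DIGEST"
```

Gating on the digest rather than a mutable tag is what makes rollback policy workable: a previously signed, scanned digest can be redeployed with its provenance intact.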
## Grounding Documents
- ../04-architecture-hld.md
- ../04b-architecture-lld.md
- ../11-scalability-reliability.md
- ../Tech-Stack/01-detailed-tech-stack.md
- ../Tech-Stack/02-open-source-libraries.md
- ../Model-Inference/01-inference-pipeline-challenges.md
- ../Model-Inference/06-gpu-architecture-challenges.md
- ../API-Design-and-Testing/02-api-testing-strategy.md
- ../Security-Privacy-Guardrails/07-third-party-supply-chain-risk.md