This design makes full use of the PyTorch installation built into the base Docker image that we use for the runtime.
```mermaid
graph TD
A[RunPod Request] --> B[src/handler.py]
B --> C[RemoteExecutor]
C --> D[Environment Detection]
D --> E{Docker?}
E -->|Yes| F[System UV Install]
E -->|No| G[Local UV Install]
F --> H[Function Execution]
G --> H
J[DependencyInstaller] --> C
K[FunctionExecutor] --> C
```
```mermaid
flowchart LR
A[Dependencies Required] --> B{Environment Check}
B -->|Docker| C[uv pip install --python sys.executable]
B -->|Local| D[uv pip install --python python]
C --> E[Direct System Installation]
D --> F[Managed Environment Installation]
```
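A minimal sketch of the environment check in the flowchart above (the `/.dockerenv` heuristic and the function names here are illustrative assumptions, not the worker's actual code):

```python
import os
import sys


def running_in_docker() -> bool:
    # Common heuristic: Docker creates /.dockerenv at the container root.
    # (Illustrative assumption; the real worker may detect its environment differently.)
    return os.path.exists("/.dockerenv")


def build_install_command(packages: list[str]) -> list[str]:
    if running_in_docker():
        # Install directly into the system interpreter's site-packages,
        # alongside the pre-installed torch/CUDA libraries.
        return ["uv", "pip", "install", "--python", sys.executable,
                "--break-system-packages", *packages]
    # Locally, target whatever `python` resolves to (a managed environment).
    return ["uv", "pip", "install", "--python", "python", *packages]


print(build_install_command(["numpy"]))
```

Either branch produces a single `uv pip install` invocation; only the `--python` target (and the `--break-system-packages` flag) differ between the two environments.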
```mermaid
graph TB
A[handler.py] --> B[RemoteExecutor]
B --> D[DependencyInstaller]
B --> E[FunctionExecutor]
B --> F[ClassExecutor]
G[subprocess_utils] --> D
G --> E
G --> F
I[serialization_utils] --> E
I --> F
```
- Environment detection handles Docker vs. local contexts
- Centralized subprocess handling through `run_logged_subprocess`
- Consistent error handling via the `FunctionResponse` pattern
- Faster cold starts without venv initialization
- Reduced container size from simplified builds
- Direct package access eliminates re-downloading torch and other pre-installed libraries
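A hedged sketch of what a centralized subprocess helper like `run_logged_subprocess` could look like; the exact signature and the shape of `FunctionResponse` are assumptions based on the names above, not the worker's actual implementation:

```python
import logging
import subprocess
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("worker")


@dataclass
class FunctionResponse:
    # Assumed shape: a success flag plus captured output / error text.
    success: bool
    stdout: str = ""
    error: str = ""


def run_logged_subprocess(command: list[str], timeout: int = 600) -> FunctionResponse:
    """Run a command, log it, and normalize the outcome into a FunctionResponse."""
    log.info("running: %s", " ".join(command))
    try:
        proc = subprocess.run(command, capture_output=True, text=True, timeout=timeout)
    except subprocess.TimeoutExpired:
        return FunctionResponse(success=False, error=f"timed out after {timeout}s")
    if proc.returncode != 0:
        return FunctionResponse(success=False, stdout=proc.stdout, error=proc.stderr)
    return FunctionResponse(success=True, stdout=proc.stdout)


resp = run_logged_subprocess(["echo", "hello"])
print(resp.success, resp.stdout.strip())
```

Routing every subprocess call through one helper like this is what lets the installer and both executors share logging, timeouts, and error normalization.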
```python
# Docker environment
import sys
command = ["uv", "pip", "install", "--python", sys.executable, "--break-system-packages"] + packages

# Local environment
command = ["uv", "pip", "install", "--python", "python"] + packages
```

This architecture refactor addresses the core PyTorch installation issues while maintaining API compatibility and improving operational simplicity.
The GPU base image installs Python interpreters 3.9 through 3.14 via the deadsnakes PPA. These multiple interpreters exist to support interactive pod and notebook use cases where users may need a specific Python version. However, for serverless workers, only Python 3.12 is usable because:
- torch and CUDA packages are installed only for 3.12. The base image runs `pip install torch` under 3.12, so the compiled `.so` files and ABI tags target `cp312`. No other interpreter has these packages available.
- C-extension wheels must match the interpreter ABI. Packages like numpy, Pillow, and tokenizers ship pre-compiled wheels tagged to a specific CPython version (e.g., `cp312-cp312-linux_x86_64`). Installing them under a different interpreter requires recompilation or downloading a different wheel.
- The `python` symlink points to 3.12. The base image runs `ln -sf /usr/bin/python3.12 /usr/local/bin/python`, so all unqualified `python` invocations use 3.12.
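The ABI coupling described above can be inspected directly from the interpreter: CPython exposes the tag that compiled-extension filenames and wheel tags like `cp312-cp312-linux_x86_64` must match. A quick check:

```python
import sys
import sysconfig

# The SOABI tag is baked into compiled-extension filenames
# (e.g. '_mymod.cpython-312-x86_64-linux-gnu.so'); a wheel built
# for cp312 only loads under a 3.12 interpreter.
print(sys.version_info[:2])
print(sysconfig.get_config_var("SOABI"))       # e.g. 'cpython-312-x86_64-linux-gnu'
print(sysconfig.get_config_var("EXT_SUFFIX"))  # suffix appended to extension modules
```

Running this under each of the deadsnakes interpreters in the base image would print a different tag, which is exactly why torch's `cp312` artifacts are usable only from Python 3.12.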
```
/usr/local/bin/python -> /usr/bin/python3.12
/usr/local/bin/python3 -> /usr/bin/python3.12
```
When the worker runs `uv pip install --python sys.executable --break-system-packages`, packages install into the 3.12 site-packages directory, coexisting with the pre-installed torch and CUDA libraries.
CPU images use `python:X.Y-slim` (Debian-based), where only a single Python version is present. The `PYTHON_VERSION` build arg selects which base image to use (3.10, 3.11, or 3.12). There is no ambiguity about which interpreter is active.