This tutorial was a great intro to GPT-2's architecture, thanks! I'm able to run inference (slowly) on my CPU.
Bug report
I wasn't able to get it running on my Apple GPU: the model fails to compile with the current nightly build of MAX.
```
File "modular/max-llm-book/.pixi/envs/default/lib/python3.14/site-packages/max/engine/api.py", line 424, in load
  _model = self._impl.compile_from_object(
      model._module._CAPIPtr,  # type: ignore
      custom_extensions_final,
      model.name,
  )
ValueError: Failed to resolve module path for `MOGGKernelAPI`

The above exception was the direct cause of the following exception:
...
RuntimeError: Failed to compile the model. Please file an issue, all models should be correct by construction and this error should have been caught during construction.
For more detailed failure information run with the environment variable `MODULAR_MAX_DEBUG=True`.
```
I have confirmed identical error output when running `pixi run main`; see the attached log below.
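For anyone reproducing this, a sketch of the invocation I used to capture the attached log, assuming the repo's `main` task and the `MODULAR_MAX_DEBUG` variable mentioned in the error message:

```shell
# Re-run the repo's main task with MAX debug output enabled,
# capturing both stdout and stderr to a log file for attachment.
MODULAR_MAX_DEBUG=True pixi run main 2>&1 | tee max-compile-failure.log
```

The `max-compile-failure.log` filename is just an example; the debug flag produces more detailed failure information per the error message above.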
Environment
| MAX version | 26.2.0.dev2026021505 |
|---|---|
| Hardware | Apple M2 Max, 38-core GPU |
| OS | macOS 26.2 (25C56) |
| Metal Version | Metal 4 |
| Python Version | 3.14.3 |