Feature/tirex integration #3038

Open
lukfischer wants to merge 54 commits into unit8co:master from NX-AI:feature/tirex-integration

Conversation

@lukfischer

Checklist before merging this PR:

  • Mentioned all issues that this PR fixes or addresses.
  • Summarized the updates of this PR under Summary.
  • Added an entry under Unreleased in the Changelog.

Fixes #3016.
Addresses NX-AI/tirex#29.

Summary

This PR adds and hardens the TiRexModel integration in Darts.

It includes the TiRex forecasting wrapper, model registration in darts.models, targeted tests, and example notebook/docs integration. It also fixes a few integration issues discovered during review:

  • align TiRexModel.fit() with the standard FoundationModel/Darts API
  • expose the correct capability flags for covariates and multivariate support
  • keep strict validation for unsupported covariates and multivariate inputs
  • decouple top-level foundation model imports so TiRex is not disabled by unrelated optional dependency failures
  • add TiRex to the examples page and model capability table
  • clean up TiRex notebook and docstrings so they pass Sphinx/doc generation

The integration also requires explicit license acknowledgement. Users must pass accept_license=True when constructing TiRexModel to confirm acceptance of the NXAI Community License.
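The license gate is a simple constructor-time check. The sketch below is purely illustrative (the class name, attribute, and error message are hypothetical stand-ins, not Darts' actual implementation):

```python
class TiRexModelSketch:
    """Illustrative stand-in showing the accept_license constructor gate."""

    def __init__(self, accept_license: bool = False):
        # Refuse to construct the model unless the user has explicitly
        # acknowledged the NXAI Community License.
        if not accept_license:
            raise ValueError(
                "TiRex is released under the NXAI Community License. "
                "Pass accept_license=True to confirm acceptance."
            )
        self.accept_license = accept_license
```

Constructing with `accept_license=True` succeeds; omitting it raises immediately, so the acknowledgement cannot be skipped accidentally.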

Other Information

Validation performed:

  • pytest -q darts/tests/models/forecasting/test_tirex.py
  • pre-commit run --all-files
  • make build-all-docs SPHINXOPTS="-W"

The changelog entry under Unreleased still needs to be added before merge.

- Implement TiRexModel based on Darts FoundationModel API
- Add optional tirex-ts integration via load_model() + forecast()
- Require accept_license=True to acknowledge NXAI Community License
- Support deterministic + quantile probabilistic forecasting
- Add unit tests with stub TiRex pipeline (no external dependency required)
- updated .gitignore (added .python-version)
- added detailed docstrings to tirex_model.py
- set fixed chunk lengths
- added examples notebook
- there is still something wrong with the forecast; needs investigation
- align TiRexModel.fit with FoundationModel API (incl. val/trainer kwargs)
- expose correct covariate capability flags (no past/future covariates)
- keep strict univariate/covariate validation for train + validation inputs
- decouple Chronos2/TimesFM2p5/TiRex imports in darts.models.__init__
- extend TiRex tests for API compatibility and capability flags
- add TiRex notebook to docs examples and TiRex row to model support table
- clean TiRex example notebook wording and TiRex docstring examples
- fix TiRex docstrings to be Sphinx/numpydoc compatible
@lukfischer lukfischer requested a review from dennisbader as a code owner March 17, 2026 06:47

@daidahao
Contributor

@lukfischer

Thank you for the PR! The integration is in a very good shape already.

I scanned the code and noticed some small issues, like duplicated validation logic and class signature. Since I don't know how busy you might be, I wonder if it would be easier for you to let me modify the code directly (I wrote the initial FoundationModel class and am familiar with Darts' torch infrastructure) to build a tighter integration? Alternatively, I could leave some comments here for your reference in the coming days. Let me know what you prefer!

@lukfischer
Author

Hi @daidahao!
Thanks for the quick code scan and your suggestions! Right now we are indeed under quite a heavy workload, so it would be great if you could just make the edits directly. And of course comments are also fine; I will try to incorporate those as fast as possible! Thanks for your support!

@daidahao
Contributor

@lukfischer
Glad that I could help! Because I am not the owner of this repo, I can apply the edits only if you add me as a collaborator on your fork, i.e., NX-AI/darts.

If you could do that, I could then apply the necessary edits to your code in the coming days. There are also a few optional edits you might consider, like adding fidelity tests; I will leave some comments for those and we can discuss them.

@lukfischer
Author

@daidahao
I added you as a collaborator on our fork. Looking forward to integrating this together ;)

- Update `example.rst` as well

Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
- Use Darts default docstring where necessary
- Format notes and warnings

Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
- Add back TorchForecastingModel argument docstring
- Remove any that are training-specific

Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
- Remove `device` argument to model.
- Use `PLForecastingModule.device` as the device for initiating TiRex
  model.

This prevents moving tensor across devices when inference.

Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
- Fold multivariate target components into batch dimension so
  multivariate forecasting can be supported.
- Remove duplicated covariate validation logic.

Note that all torch forecasting models in Darts support multivariate
forecasting.

Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
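The folding trick described in this commit can be sketched in plain Python (shapes and function names below are hypothetical; the real implementation operates on torch tensors inside the Darts wrapper):

```python
def fold_components(batch):
    """Fold a batch of shape (batch, time, components) into
    (batch * components, time, 1), so each target component is handled
    as its own univariate series."""
    folded = []
    for series in batch:                      # series: (time, components)
        n_components = len(series[0])
        for c in range(n_components):
            folded.append([[step[c]] for step in series])
    return folded


def unfold_components(folded, n_components):
    """Inverse: regroup consecutive component series back into one
    multivariate series of shape (batch, time, components)."""
    out = []
    for i in range(0, len(folded), n_components):
        group = folded[i:i + n_components]
        time_len = len(group[0])
        out.append([
            [group[c][t][0] for c in range(n_components)]
            for t in range(time_len)
        ])
    return out
```

Round-tripping a batch through `fold_components` and `unfold_components` recovers the original multivariate layout, which is what allows a univariate backbone to serve multivariate forecasts.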
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
According to docstring, when `enable_finetuning=None`, foundation model
is not fine-tuned. However, the `model_params` was not being set
correctly.

Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
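The `enable_finetuning=None` fix mentioned in the commit above can be illustrated with a minimal sketch (the function and parameter names here are hypothetical, not the actual Darts internals):

```python
def resolve_model_params(enable_finetuning, train_params=None):
    """Return model params consistent with the documented behaviour:
    when enable_finetuning is None (or False), the foundation model is
    used zero-shot and no training-specific params are carried over."""
    if enable_finetuning in (None, False):
        # Zero-shot: normalize to None and drop training-only settings.
        return {"enable_finetuning": None}
    params = {"enable_finetuning": enable_finetuning}
    params.update(train_params or {})
    return params
```

The point of the fix is that the zero-shot branch must actually be taken when `enable_finetuning` is `None`, rather than falling through with stale training parameters.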
@daidahao
Contributor

I think the fidelity tests would still fail on GitHub CI, but not locally. I investigated the tirex codebase and could not find randomness or sampling that could explain the numerical differences. Any ideas why that happened? @lukfischer @martinloretzzz

Other than that, I think this PR is review-ready. @dennisbader

@martinloretzzz

We observe the same thing in our repositories; there's a small dependence on the actual hardware it is run on (TiRex is a recurrent model, so floating point rounding errors can get amplified over the sequence length; we also use BF16).

The only thing we did about it was to increase the error tolerances so that the tests still pass on all hardware we test on.
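The tolerance-based approach can be sketched as follows (the tolerance values are illustrative, not the ones used in the TiRex test suite):

```python
import math

def forecasts_match(expected, actual, rtol=1e-2, atol=1e-3):
    """Compare forecasts element-wise with relaxed tolerances, so small
    hardware-dependent floating point drift (amplified over a recurrent
    model's sequence length, especially in BF16) does not fail the test."""
    return all(
        math.isclose(e, a, rel_tol=rtol, abs_tol=atol)
        for e, a in zip(expected, actual)
    )
```

Exact equality checks are the wrong tool for cross-hardware fidelity tests; the tolerances just need to be wide enough to absorb accumulated rounding drift while still catching real regressions.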

Collaborator

@dennisbader dennisbader left a comment


Thanks a lot for this great PR and the contribution @lukfischer, @martinloretzzz and @daidahao 🚀 TiRex will be a really nice addition to Darts :)

The implementation is solid! I added a couple of suggestions in the comments which we should address before merging. The main one is that we should enable fine-tuning for TiRex so it is aligned with the general torch (and foundation) model support.

pyproject.toml Outdated
"ray>=2.53.0",
"plotly>=6.5.2",
"neuralforecast>=3.0.0",
"tirex-ts>=1.4.0; python_version >= '3.11'",
Collaborator


As @daidahao mentioned, it would be great if we could support Python 3.10 as well

logger,
)

if kwargs.get("enable_finetuning", False) not in (None, False):
Collaborator


see comment above

@daidahao
Contributor

daidahao commented Apr 7, 2026

@dennisbader Thanks for the review here.

Besides the fine-tuning support, which the TiRex team is perhaps best positioned to address, most of the comments are minor, so I will try to address them in the coming days.

@lukfischer
Author

@dennisbader thanks a lot for the thorough review!
As already mentioned by @daidahao, we unfortunately cannot publicly provide fine-tuning code for TiRex; this is our main value proposition. Maybe this will change in the future when TiRex 2 is released, but for now let's please disregard fine-tuning. If we later also talk about classification, this is of course different: the training code for the classification heads is publicly available. I hope this is fine for you!
@daidahao thx for taking the minor changes task up!

@dennisbader
Collaborator

Thanks a lot for the answer @lukfischer :) I think we're talking about two different things.
I don't mean that you would publish your training / fine-tuning code. I understand that this is your service / expertise, and should of course stay on your side.

I'm referring to our (Darts) own existing ability to fine-tune (or train) the foundation models on your own data using the enable_finetuning parameter at model creation (simple "full" or "partial" model training on sequential datasets extracted from the input time series).

This fine-tuning logic is already in place for all our torch-based models including the existing foundation models. There is no magic behind our default training scenario, so it would still be up to the user / your services to perform robust / quality training. The same thing would apply if a user downloads the model from Hugging Face / the tirex library.

I do hope that we can enable this, as our idea was that all foundation models in Darts would support it.

Let me know what you think.

daidahao added 8 commits April 9, 2026 22:40
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
@daidahao
Contributor

daidahao commented Apr 9, 2026

Hi everyone, all the minor issues should be addressed with the new commits.

There might still be line coverage gaps but I will try to address them once I have the codecov report.

@martinloretzzz

We've relaxed the Python requirements to include 3.10, and the package is published to PyPI as v1.4.1: https://github.com/NX-AI/tirex/releases/tag/v1.4.1

@lukfischer
Author

> Thanks a lot for the answer @lukfischer :) I think we're talking about two different things. I don't mean that you would publish your training / fine-tuning code. I understand that this is your service / expertise, and should of course stay on your side.
>
> I'm referring to our (Darts) own existing ability to fine-tune (or train) the foundation models on your own data using the enable_finetuning parameter at model creation (simple "full" or "partial" model training on sequential datasets extracted from the input time series).
>
> This fine-tuning logic is already in place for all our torch-based models including the existing foundation models. There is no magic behind our default training scenario, so it would still be up to the user / your services to perform robust / quality training. The same thing would apply if a user downloads the model from hugging face / the tirex library.
>
> I do hope that we can enable this, as our idea was that all foundation models in Darts would support it.
>
> Let me know what you think.

Oh ok, sorry, my bad! I get your point, and after an initial discussion with the team we tend to agree that we should indeed enable this. Give me some more time to double-check this with our management, but I'm positive that we will be able to enable it.

Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
@daidahao
Contributor

> We've relaxed the python requirements to include 3.10, package is published to pypi as v1.4.1: https://github.com/NX-AI/tirex/releases/tag/v1.4.1

@martinloretzzz That's great to hear! I've also removed the Python version bound for tirex-ts in pyproject.toml. Thank you for the update.

> Oh ok, sorry, my bad! I get you point and after initial discussion with the team, we tend to agree that we indeed should enable this. Give me some more time to double check this with our management. But I'm positive, that we will be able to enable this.

@lukfischer Great to hear that the TiRex team is supportive of the initiative!

@dennisbader Could you please start the CI workflow for testing?

Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
@dennisbader
Collaborator

@martinloretzzz and @lukfischer, thanks a lot for adding support for Python 3.10 and the re-consideration of the "fine-tuning" 🚀

Co-authored-by: Zhihao Dai <zhihao.dai@eng.ox.ac.uk>
@dennisbader
Collaborator

@daidahao , I just merged the import time improvement PR #3066, which introduced some merge conflicts. Should probably be an easy fix by updating the import in darts/models/__init__.py.

@codecov

codecov bot commented Apr 12, 2026

Codecov Report

❌ Patch coverage is 93.84615% with 4 lines in your changes missing coverage. Please review.
✅ Project coverage is 96.21%. Comparing base (adb418e) to head (50f935c).

Files with missing lines Patch % Lines
darts/models/forecasting/tirex_model.py 93.75% 4 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #3038      +/-   ##
==========================================
- Coverage   96.28%   96.21%   -0.08%     
==========================================
  Files         160      161       +1     
  Lines       17207    17271      +64     
==========================================
+ Hits        16568    16617      +49     
- Misses        639      654      +15     



Development

Successfully merging this pull request may close these issues.

[New Model] TiRex Foundation Model
