Merged
2 changes: 1 addition & 1 deletion .dockerignore
100755 → 100644
Original file line number Diff line number Diff line change
@@ -1,4 +1,4 @@
configs
data
experiments
venv
venv
1 change: 1 addition & 0 deletions .env
@@ -0,0 +1 @@
PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION='python'
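The variable set in the new `.env` file above can also be exported directly in Python before protobuf is first imported; a minimal sketch (not part of the repo):

```python
import os

# Mirror what the .env file sets; protobuf selects its pure-python
# backend based on this variable at import time.
os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"
```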
78 changes: 78 additions & 0 deletions .github/workflows/docker-image.yml
@@ -0,0 +1,78 @@
name: Create and publish a Docker image

# Configures this workflow to run every time a change is pushed to the `main` or `dev` branches.
on:
push:
branches: [ "main", "dev" ]

# Defines two custom environment variables for the workflow. These are used for the Container registry domain, and a name for the Docker image that this workflow builds.
env:
REGISTRY: ghcr.io
IMAGE_NAME: ${{ github.repository }}

# There is a single job in this workflow. It's configured to run on the latest available version of Ubuntu.
jobs:
build-and-push-image:
runs-on: ubuntu-latest
# Sets the permissions granted to the `GITHUB_TOKEN` for the actions in this job.
permissions:
contents: read
packages: write
attestations: write
id-token: write

steps:
# Check out branch
- name: Checkout repository
uses: actions/checkout@v4

# js: read the tool version from pyproject.toml
- name: set VER
run: echo "VER=$(awk -F \" '/version/ {print $2}' pyproject.toml)" >> $GITHUB_ENV

# Uses the `docker/login-action` action to log in to the Container registry using the account and password that will publish the packages. Once published, the packages are scoped to the account defined here.
- name: Log in to the Container registry
uses: docker/login-action@v3
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}

# This step uses [docker/metadata-action](https://github.com/docker/metadata-action#about) to extract tags and labels that will be applied to the specified image. The `id` "meta" allows the output of this step to be referenced in a subsequent step. The `images` value provides the base name for the tags and labels.
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@v5
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
tags: |
type=raw,value=latest,enable=${{ github.ref_name == 'main' }}
type=semver,pattern={{version}},value=${{ env.VER }},enable=${{ github.ref_name == 'main' }}
type=raw,value=dev,enable=${{ (github.ref_name == 'dev') }}

# js: added in for multi arch building
- name: Set up QEMU
uses: docker/setup-qemu-action@v3

- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3

# This step uses the `docker/build-push-action` action to build the image, based on your repository's `Dockerfile`. If the build succeeds, it pushes the image to GitHub Packages.
# It uses the `context` parameter to define the build's context as the set of files located in the specified path. For more information, see [Usage](https://github.com/docker/build-push-action#usage) in the README of the `docker/build-push-action` repository.
# It uses the `tags` and `labels` parameters to tag and label the image with the output from the "meta" step.
- name: Build and push Docker image
id: push
uses: docker/build-push-action@v6
with:
platforms: linux/amd64,linux/arm64 #js: added for multi arch building
context: .
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}

# This step generates an artifact attestation for the image, which is an unforgeable statement about where and how it was built. It increases supply chain security for people who consume the image. For more information, see [Using artifact attestations to establish provenance for builds](/actions/security-guides/using-artifact-attestations-to-establish-provenance-for-builds).
- name: Generate artifact attestation
uses: actions/attest-build-provenance@v2
with:
subject-name: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
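The `set VER` step above shells out to `awk` to pull the version string from `pyproject.toml`; the same extraction can be tried locally. A sketch against a hypothetical sample file:

```shell
# Create a sample pyproject.toml-style file (hypothetical content)
printf 'name = "demo"\nversion = "1.3.0"\n' > /tmp/pyproject_demo.toml

# Same awk invocation as the workflow step: split fields on double quotes
# and print the second field of the line containing "version"
VER=$(awk -F '"' '/version/ {print $2}' /tmp/pyproject_demo.toml)
echo "$VER"  # prints 1.3.0
```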
25 changes: 25 additions & 0 deletions .github/workflows/release-on-main.yml
@@ -0,0 +1,25 @@
on:
push:
branches: ["main"]

name: Create Release

jobs:
build:
name: Create Release
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v2
- name: set VER
run: echo "VER=$(awk -F \" '/version/ {print $2}' pyproject.toml)" >> $GITHUB_ENV
- name: Create Release
id: create_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # This token is provided by Actions; you do not need to create your own token
with:
tag_name: ${{ env.VER }}
release_name: Release ${{ env.VER }}
draft: false
prerelease: false
4 changes: 2 additions & 2 deletions .gitignore
100755 → 100644
@@ -12,8 +12,8 @@ configs
experiments
data
.vscode/settings.json
playground.ipynb
playground*.ipynb
coverage.xml
.coverage
ao-env
dev_scripts
dev_scripts
52 changes: 44 additions & 8 deletions .pre-commit-config.yaml
100755 → 100644
@@ -1,6 +1,22 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
hooks:
- id: check-ast
- id: check-docstring-first
- id: check-executables-have-shebangs
- id: check-json
- id: check-yaml
- id: check-toml
- id: end-of-file-fixer
- id: trailing-whitespace
- id: mixed-line-ending
- id: name-tests-test
args: [--pytest-test-first]
- id: pretty-format-json
args: [--autofix]
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.0.289 # Ruff version.
rev: v0.12.7
hooks:
- id: ruff
args:
@@ -12,19 +28,39 @@ repos:
--ignore=E501,
--ignore=F401,
]
- repo: https://github.com/psf/black
rev: de65741b8d49d78fa2675ef79b799cd35e92e7c1
- repo: https://github.com/shellcheck-py/shellcheck-py
rev: v0.10.0.1
hooks:
- id: shellcheck
- repo: https://github.com/pycqa/pydocstyle
rev: 6.3.0
hooks:
- id: black
language_version: python3.9
args: [--line-length=120]

- id: pydocstyle
args:
- --ignore=D203,D213,D401,D413
# - repo: local
# hooks:
# - id: pdoc
# name: pdoc
# language: system
# pass_filenames: false
# entry: poetry run pdoc --html -f autoxai4omics -o docs
# TODO: Include when/if hook is extended to check .toml files
# - repo: https://github.com/pivotal/LicenseFinder
# rev: v7.1.0
# hooks:
# - id: license-finder
# - repo: https://github.com/PyCQA/bandit
# rev: "1.7.9"
# hooks:
# - id: bandit
# args: ["--exclude", "tests"]
- repo: https://github.com/ibm/detect-secrets
# If you desire to use a specific version of detect-secrets, you can replace `master` with other git revisions such as branch, tag or commit sha.
# You are encouraged to use static refs such as tags, instead of branch names
#
# Running "pre-commit autoupdate" automatically updates rev to latest tag
rev: 0.13.1+ibm.61.dss
rev: 0.13.1+ibm.62.dss
hooks:
- id: detect-secrets # pragma: whitelist secret
# Add options for detect-secrets-hook binary. You can run `detect-secrets-hook --help` to list out all possible options.
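Among the hooks added above, `end-of-file-fixer` normalizes files to end with exactly one newline; its effect can be sketched in Python (a simplified model, not the hook's actual code):

```python
def fix_end_of_file(text: str) -> str:
    """Return text ending with exactly one newline; leave empty input alone."""
    if not text:
        return text
    return text.rstrip("\n") + "\n"


# Multiple trailing newlines collapse to one; a missing newline is added
assert fix_end_of_file("foo\n\n\n") == "foo\n"
assert fix_end_of_file("foo") == "foo\n"
```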
Empty file modified .travis.yml
100755 → 100644
Empty file.
36 changes: 32 additions & 4 deletions CHANGELOG.md
100755 → 100644
@@ -1,12 +1,12 @@
<!--
Copyright 2024 IBM Corp.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
@@ -18,7 +18,34 @@

Change log for the codebase. Initialised from the developments following version `V0.11.3`

## [unreleased] - 2025-05-09
## [v1.3.0] - 2025-08-01

### Added

- Added: test case for variance threshold
- Added: `.env` file to set some required env vars
- Added: extra and previously missing CI/CD packages

### Changed

- Changed: Dockerfile to only install main dependencies
- Changed: streamlined imports
- Changed: updated as many packages as possible
- Changed: updated pre-commit config

### Fixed

- Fixed: omic path parsing bug
- Fixed: plotting bug arising from an API change
- Fixed: corrected file permissions
- Fixed: CI/CD to build the image and push it to `ghcr.io`
- Fixed: shuffle bug

### Security

- Security: updated packages to resolve Dependabot alerts

## [v1.2.0] - 2025-05-09

### Changed

@@ -186,6 +213,7 @@ Change log for the codebase. Initialised from the developments following version
- Detect secrets added
- Upgraded python base image from `3.9.14` to `3.9.18` for additional security fixes

[V1.2.0]: https://github.com/IBM/AutoXAI4Omics/releases/tag/V1.2.0
[V1.1.1]: https://github.com/IBM/AutoXAI4Omics/releases/tag/V1.1.1
[V1.1.0]: https://github.com/IBM/AutoXAI4Omics/releases/tag/V1.1.0
[V1.0.1]: https://github.com/IBM/AutoXAI4Omics/releases/tag/V1.0.1
8 changes: 4 additions & 4 deletions DEV_MANUAL.md
100755 → 100644
@@ -1,12 +1,12 @@
<!--
Copyright 2024 IBM Corp.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
@@ -30,7 +30,7 @@ Please use black & ruff to format any code contributions; we have a pre-commit

To create the virtual environment for AutoXAI4Omics, use an environment manager of your choice (conda, for example) with `python3.9` as your starting point. Then proceed to install the contents of `pyproject.toml` for both the main and dev dependencies. Note that you may also need to set `PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python` within your environment.

*Note:* Since May '25 we have switched to using [Poetry](https://python-poetry.org/) for managing our env and dependencies which we also recommend to use. There is also an associated `poetry.lock` file to ensure consistent versions between developers.
*Note:* Since May '25 we have switched to using [Poetry](https://python-poetry.org/) for managing our env and dependencies, which we also recommend using. There is also an associated `poetry.lock` file to ensure consistent versions between developers. If you also use the poetry shell extension, it will automatically load the `.env` file, setting the env var mentioned above.
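The `.env` loading mentioned above can be mimicked without the Poetry shell extension; a minimal sketch with a hypothetical `parse_env` helper:

```python
import os


def parse_env(text: str) -> dict:
    """Parse simple KEY='value' lines as found in a .env file."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        # Skip blanks, comments, and lines without an assignment
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip("'\"")
    return env


# Same line as the repo's .env file
os.environ.update(parse_env("PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION='python'"))
```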

## Testing

12 changes: 6 additions & 6 deletions Dockerfile
100755 → 100644
@@ -1,11 +1,11 @@
# Copyright 2024 IBM Corp.
#
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#
# http://www.apache.org/licenses/LICENSE-2.0
#
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
@@ -35,7 +35,7 @@ COPY --chown=omicsuser:0 poetry.lock pyproject.toml ./
COPY --chown=omicsuser:0 autoxai4omics .

RUN poetry env use system
RUN poetry install --no-root
USER omicsuser
RUN poetry install --no-root --only main
USER omicsuser

CMD ["$@"]
CMD ["$@"]
12 changes: 7 additions & 5 deletions README.md
100755 → 100644
@@ -1,12 +1,12 @@
<!--
Copyright 2024 IBM Corp.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
@@ -49,9 +49,11 @@ AutoXAI4Omics is a command line automated explainable AI tool that easily enable
3. Make the experiments folder accessible by running the following in the directory where the experiments directory exists:

```shell
chmod 777 -R experiments
chmod 777 -R experiments
```

*Note:* If you don't wish to build the image, you can pull it from the [GitHub container registry, found here](https://github.com/IBM/AutoXAI4Omics/pkgs/container/autoxai4omics)

## Citation

For citation of this tool, please reference this article:
@@ -101,7 +103,7 @@ Data to be used by AutoXAI4Omics needs to be stored in the `AutoXAI4Omics/data`
* `./autoxai4omics.sh -m bash -r`

* AutoXAI4Omics has a config duplication function (for when you wish to run the same config over multiple datasets). To use this you need to build the image and then run it in `bash` mode. Once there you can then run:

```shell
python mode_config_duplicate.py -c SUBPATH_TO_TEMPLATE_CONFIG -d DATA_SUBDIR
```
2 changes: 1 addition & 1 deletion _version.py
@@ -13,4 +13,4 @@
# limitations under the License.

# current version of the tool
__version__ = "1.2.0"
__version__ = "1.3.0"
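The bump above takes `__version__` from `1.2.0` to `1.3.0`. Dotted version strings compare correctly as integer tuples rather than as strings, as this sketch (a hypothetical helper, not in the repo) shows:

```python
def parse_version(version: str) -> tuple:
    """Split a dotted version string like '1.3.0' into a tuple of ints."""
    return tuple(int(part) for part in version.split("."))


# Plain string comparison would get "1.10.0" < "1.3.0" wrong; tuples do not
assert parse_version("1.3.0") > parse_version("1.2.0")
assert parse_version("1.10.0") > parse_version("1.3.0")
```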
8 changes: 4 additions & 4 deletions autoxai4omics/models/tabauto/keras_model.py
@@ -19,7 +19,7 @@
from tensorflow.keras.layers import Dense, Flatten, Conv1D
from tensorflow.keras import losses

import tensorflow.keras.optimizers.legacy
from tensorflow.keras.optimizers.legacy import Adam

from tensorflow.keras.callbacks import (
LearningRateScheduler,
@@ -95,7 +95,7 @@ def __init_ak__(self, input_dim, output_dim, dataset_type):
objective=["val_loss"],
tuner=self.tuner,
seed=self.random_state,
optimizer=tensorflow.keras.optimizers.legacy.Adam(),
optimizer=Adam(),
)

else:
@@ -117,7 +117,7 @@ def __init_ak__(self, input_dim, output_dim, dataset_type):
objective=["accuracy"],
tuner=self.tuner,
seed=self.random_state,
optimizer=tensorflow.keras.optimizers.legacy.Adam(),
optimizer=Adam(),
)

self.model = model
@@ -242,7 +242,7 @@ def __init_fx__(self, input_dim, output_dim, dataset_type):
)

# choose optimizer
opt = tensorflow.keras.optimizers.legacy.Adam()
opt = Adam()

# choose loss function
if self.dataset_type == REGRESSION: