- Run the notebook ai_automated_loop.ipynb
-
Define in the first cell:
- debug mode (True/False)
- live mode (if True, Shimin's code will be executed)
-
Define the parameters:
- the IPTS (ex: 37493)
- sample name (ex: 'this is my sample')
- experiment conditions (ex: 'T10C')
- rotation stage number (ex: 1)
- number of OBs (open beams) requested (ex: 3)
- proton charge (in coulombs) for each run (ex: 0.1)
- number of TIFF files each run will produce (ex: 2628)
- first run number (ex: 7864)
- [optional] list of initial angles (in degrees) (ex: [45, 75, 90])
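For reference, these parameters map naturally onto a single structure. A minimal sketch with the example values above (the key names are illustrative assumptions, not the notebook's actual identifiers):

```python
# Illustrative grouping of the first-cell parameters; key names are
# assumptions for this sketch, not the notebook's real identifiers.
parameters = {
    "ipts": 37493,
    "sample_name": "this is my sample",
    "experiment_conditions": "T10C",
    "rotation_stage_number": 1,
    "number_of_obs": 3,
    "proton_charge_per_run_c": 0.1,              # coulombs
    "number_of_tiff_for_each_run": 2628,
    "first_run_number": 7864,
    "list_of_initial_angles_deg": [45, 75, 90],  # optional
}
```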
-
Run the next cell (launches acquisition of the open beams and the 0 and 180 degree projections). This is the pre-processing step (described in detail below).
-
The next cell (to be run on a regular basis) checks the state of the pre-processing step and reports when all the OBs and the 0 and 180 degree projections have been found and moved to their final location.
-
Run the next cell to calculate the center of rotation using the 0 and 180 degree projections.
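A common way to estimate the center of rotation from a 0/180 degree pair is to mirror the 180 degree projection and find the shift that best aligns it with the 0 degree one. A minimal sketch for a single detector row, assuming parallel-beam geometry (the notebook's actual method may differ, e.g. use sub-pixel registration):

```python
import numpy as np

def find_center_of_rotation(p0, p180):
    """Estimate the rotation-axis column from a 0/180 degree projection pair.

    Sketch only: assumes parallel-beam geometry, where the 180-degree
    projection is the mirror image of the 0-degree projection about the
    rotation axis.  Brute-force integer search; production code would
    use sub-pixel registration instead.
    """
    w = p0.shape[-1]
    q = p180[..., ::-1]                    # mirror the 180-degree projection
    best_t, best_err = 0, np.inf
    for t in range(-(w // 2), w // 2):     # q shifted right by t should match p0
        err = np.mean((np.roll(q, t, axis=-1) - p0) ** 2)
        if err < best_err:
            best_t, best_err = t, err
    # q(x) = p0(x + 2c - (w - 1))  =>  axis column c = ((w - 1) + t) / 2
    return ((w - 1) + best_t) / 2.0
```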
-
Launch the AI loop by running the last cell
Use this section as a quick operator runbook.
- Update configs/config.yaml:
- EIC token (ask Ray) and IPTS
- Confirm pre-processing input folder exists:
/SNS/VENUS/IPTS-<ipts>/shared/autoreduce/images/tpx1/raw/ct
- Confirm write access to:
/data/VENUS/IPTS-<ipts>
- Edit cron:
- crontab -e
- Enable this line:
- * * * * * /SNS/VENUS/shared/software/git/hype_scripts/scripts/cron_job_script_pre_processing.sh > /dev/null
- Save and exit.
- Append timestamp to logs/cron_jobs.txt.
- Run:
- /home/j35/.pixi/bin/pixi run --manifest-path /SNS/VENUS/shared/software/git/hype_scripts python /SNS/VENUS/shared/software/git/hype_scripts/scripts/ai_processing_loop.py -p
- ai_processing_loop.py (pre_processing):
- exits early if ai_pre_process_running is false
- waits for Run_<run_number_expected>
- waits until TIFF count reaches number_of_tiff_for_each_run
- copies run folder to DataPath and chmod -R 777
- appends copied path to:
- ob_local_path (for OB runs)
- 0_and_180_local_path (for last 2 runs)
- increments run_number_expected by 1
When the last expected pre-processing run is collected:
- ai_pre_process_running is set to false.
- working_with_first_processing_angles is set to false.
- run_number_expected is incremented.
- config is copied to /data/VENUS/shared and ~/.
- /data/VENUS/shared/software/hype_scripts/hype_loop/hyperct_toolkit_depoly/hyperct_loop_autogen/ini_exp_hype.sh is launched.
- Cron heartbeat:
- tail -n 20 /SNS/VENUS/shared/software/git/hype_scripts/logs/cron_jobs.txt
- Main pre-processing log:
tail -n 200 /data/VENUS/IPTS-<ipts>/logs/ai_processing_loop.log
- Current config state:
- grep -E "ai_pre_process_running|run_number_expected|starting_run_number|working_with_first_processing_angles" /SNS/VENUS/shared/software/git/hype_scripts/configs/config.yaml
- No new runs processed:
- confirm cron line is enabled
- confirm Run_<run_number_expected> exists in MCP folder
- confirm TIFF count reached expected threshold
- Repeatedly waiting on same run:
- verify run_number_expected in config.yaml
- verify detector finished writing files
- Permission/copy errors:
verify write access under /data/VENUS/IPTS-<ipts>
- Stop loop safely:
- set ai_pre_process_running: false in config.yaml
Use this section as a quick operator runbook for the main CT acquisition loop.
- Confirm pre-processing is complete:
- ai_pre_process_running: false in configs/config.yaml
- ob_local_path and 0_and_180_local_path are populated
- Confirm configs/config.yaml fields are set correctly:
- ai_process_running: true
- run_number_expected = first CT run number to collect
- working_with_first_processing_angles: true (first batch uses num_ini_ang files)
- num_ini_ang = number of initial angles
- step = number of angles per subsequent batch
- number_of_tiff_for_each_run = expected TIFF count per run
- DataPath = target output folder on hype
- Confirm the AI reconstruction script is accessible:
/data/VENUS/shared/software/run/run_ai_loop.sh
- Edit cron:
crontab -e
- Enable this line (and disable the pre-processing one):
* * * * * /SNS/VENUS/shared/software/git/hype_scripts/scripts/cron_job_script_full_loop.sh > /dev/null
- Save and exit.
- Runs ai_processing_loop.py (no -p flag) → processing().
- Exits early if ai_process_running is false.
- Reads run_number_expected and determines batch size:
- First pass: num_ini_ang runs
- Subsequent passes: step runs
- Checks that all Run_XXXX folders in the batch exist under the MCP folder.
- Checks that all TIFF files are present in each folder.
- For each new run: copies folder to DataPath with chmod -R 777.
- Launches run_ai_loop.sh (Shimin's AI reconstruction).
- Increments run_number_expected by batch size and saves config.
- Cron heartbeat:
tail -n 20 /SNS/VENUS/shared/software/git/hype_scripts/logs/cron_jobs.txt
- Main processing log:
tail -n 200 /data/VENUS/IPTS-<ipts>/logs/ai_processing_loop.log
- Current config state:
grep -E "ai_process_running|run_number_expected|working_with_first_processing_angles" /SNS/VENUS/shared/software/git/hype_scripts/configs/config.yaml
- Check copied runs in DataPath:
ls /data/VENUS/IPTS-<ipts>/
- No new runs processed:
- Confirm ai_process_running: true in config.yaml
- Confirm Run_<run_number_expected> exists in MCP folder
- Confirm TIFF count reached number_of_tiff_for_each_run
- Stuck on same run number:
- Check detector is done writing files
- Verify run_number_expected value in config.yaml
- Shimin script not launching:
- Confirm run_ai_loop.sh is executable and path is correct
- Stop loop safely:
- Set ai_process_running: false in config.yaml
This workflow describes what happens each time the cron pre-processing trigger runs scripts/ai_processing_loop.py with the -p flag.
- Cron executes scripts/cron_job_script_pre_processing.sh.
- The shell script appends a timestamp to logs/cron_jobs.txt.
- The shell script runs: /home/j35/.pixi/bin/pixi run --manifest-path /SNS/VENUS/shared/software/git/hype_scripts python /SNS/VENUS/shared/software/git/hype_scripts/scripts/ai_processing_loop.py -p
- In ai_processing_loop.py, -p dispatches execution to pre_processing().
- ai_processing_loop.log is trimmed to the last 1000 lines.
- configs/config.yaml is loaded.
- If ai_pre_process_running is false, the routine exits immediately.
- Expected run list is built from: starting_run_number ... starting_run_number + number_of_obs + 1 where:
- OB runs = all except last 2
- last 2 runs = 0 and 180 degree runs
- DataPath is created if missing, then permissions are recursively opened under /data/VENUS/IPTS-<ipts>.
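The expected-run-list construction described above can be sketched as (example values taken from the parameter section: first run 7864, 3 OBs):

```python
# Sketch of the expected-run-list construction; the variable names
# mirror the config keys, the values are the document's examples.
starting_run_number = 7864
number_of_obs = 3

# starting_run_number .. starting_run_number + number_of_obs + 1, inclusive
expected_runs = list(range(starting_run_number,
                           starting_run_number + number_of_obs + 2))
ob_runs = expected_runs[:-2]             # all except the last 2
zero_and_180_runs = expected_runs[-2:]   # 0 and 180 degree runs
```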
For current run_number_expected:
- Check for MCP folder Run_<run_number_expected>.
- Ensure all TIFF files exist (must reach number_of_tiff_for_each_run).
- Build short destination name from first TIFF filename.
- If destination folder was not already copied, copy run folder to DataPath and chmod -R 777.
- Update config lists:
- If run is in OB range, append destination path to ob_local_path.
- If run is in last two runs, append destination path to 0_and_180_local_path.
If run_number_expected equals the last expected run:
- Set ai_pre_process_running = false.
- Set DataPath to OUTPUT_FOLDER_ON_HYPE.
- Increment run_number_expected by 1.
- Set working_with_first_processing_angles = false.
- Save config.
- Copy config to /data/VENUS/shared and ~/.
- Launch /data/VENUS/shared/software/auto_gen_run_scrs/ini_exp_hype.sh.
Otherwise:
- Increment run_number_expected by 1.
- Save config.
- Exit, waiting for next cron cycle.
This workflow describes what happens each time the cron main-loop trigger runs scripts/ai_processing_loop.py without the -p flag.
- Cron executes scripts/cron_job_script_full_loop.sh.
- The shell script appends a timestamp to logs/cron_jobs.txt.
- The shell script runs: /home/j35/.pixi/bin/pixi run --manifest-path /SNS/VENUS/shared/software/git/hype_scripts python /SNS/VENUS/shared/software/git/hype_scripts/scripts/ai_processing_loop.py
- In ai_processing_loop.py, no -p flag dispatches execution to processing().
- configs/config.yaml is loaded.
- If ai_process_running is false, the routine exits immediately.
- run_number_expected is read.
- Batch size is determined:
- If working_with_first_processing_angles is true: batch = num_ini_ang
- Otherwise: batch = step
- list_of_runs_expected is built: run_number_expected, run_number_expected+1, ..., run_number_expected+N-1
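The batch-size and run-list logic above, in code form (a sketch; the example values are illustrative):

```python
# Sketch of the batch-size and run-list computation described above;
# the variable names mirror the config keys, the values are examples.
working_with_first_processing_angles = True
num_ini_ang, step = 3, 2
run_number_expected = 7869

batch = num_ini_ang if working_with_first_processing_angles else step
list_of_runs_expected = list(range(run_number_expected,
                                   run_number_expected + batch))
```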
For all runs in list_of_runs_expected:
- Check that every Run_XXXX folder exists under MCP_FOLDER.
- If any is missing, exit immediately.
- Check that every run folder has all TIFF files written (count >= number_of_tiff_for_each_run).
- If any is incomplete, exit immediately.
- For each run, derive the short destination name from the first TIFF filename.
- If the destination folder does not already exist in DataPath:
- Copy the run folder to DataPath.
- chmod -R 777 on the copied folder.
- If the destination already exists, skip that run.
Once all runs in the batch are confirmed and copied:
- Launch /data/VENUS/shared/software/run/run_ai_loop.sh (Shimin AI reconstruction).
- Increment run_number_expected by the batch size.
- Save configs/config.yaml.
- Exit, waiting for next cron cycle.
- config.yaml is copied to /data/VENUS/shared/software/
- run "/data/VENUS/shared/software/hyperct_toolkit_depoly/hyperct_loop_autogen/ini_exp_hype.sh" to automatically generate the run scripts
- Cron job starts
- Cron job runs: "/data/VENUS/shared/software/run_ai_loop.sh"
- define parameters: sample name, user_condition, number of initial angles, acquisition type (pCharge/time), sample position file paths
- remove experiment title
/data/VENUS/shared/software
├─ hyperct_toolkit_depoly
│ ├─ ainct_lib
│ ├─ ctqa
│ ├─ hyperct_loop_autogen
│ ├─ pixi.lock
│ └─ pixi.toml
├─ logs
├─ run
│ ├─ ang_prop.sh
│ ├─ eva.sh
│ ├─ load_1.sh
│ └─ rec_1.sh
├─ scrs
│ ├─ ai_loop.py
│ ├─ AIRobo.py
│ └─ sync_data.py
└─ run_ai_loop.sh
- AIRobo.py: four modes: 1) Load, 2) Recon, 3) Evaluate, 4) Angle Propose
- ai_loop.py: the script that calls AIRobo to run the different modes on different nodes
- sync_data.py: moves the results back to /SNS/VENUS/IPTS-XXXX/shared/hyperct_output

