Author: Ian Trimble
Version: 1.0.0
Created: April 28, 2025
The analyze_with_ollama_picker_analysis.sh script is a command-line tool designed to analyze script files using Ollama models and generate detailed Markdown documentation. It supports various script types, including shell and AppleScript, and provides extensive benchmarking to track performance metrics.
- Operating System: macOS
- Required Tools:
  - `jq`: For JSON processing.
  - `bc`: For mathematical calculations.
  - `curl`: For making API requests.
  - `ollama`: For running local language models.
- Ollama Models: At least one model must be installed (e.g., `qwen2.5:1.5b`).
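Before running the analyzer, you can confirm the prerequisites are present. A minimal sketch, assuming the tools should be on your `PATH`:

```sh
# Verify that each required tool is installed before running the analyzer.
missing=""
for tool in jq bc curl ollama; do
  if ! command -v "$tool" >/dev/null 2>&1; then
    missing="$missing $tool"
  fi
done

if [ -n "$missing" ]; then
  echo "Missing tools:$missing"
  echo "Install them with: brew install$missing"
else
  echo "All prerequisites found."
fi
```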
- Install Dependencies: Use Homebrew to install the required tools: `brew install jq bc curl ollama`
- Download the Script: Save the script as `analyze_with_ollama_picker_analysis.sh` in your desired directory.
- Make the Script Executable: Run `chmod +x analyze_with_ollama_picker_analysis.sh`
- Start the Ollama Server: Ensure the Ollama server is running: `ollama serve &`
- Pull a Model: Download at least one Ollama model: `ollama pull qwen2.5:1.5b`
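The installation steps above can be collected into a dry-run sketch that prints each command for review before you execute it (the model name is the example from this guide, not a requirement):

```sh
# Dry-run setup sketch: prints each setup command instead of running it,
# so the plan can be reviewed first (run the printed lines to apply them).
run() { printf '%s\n' "$*"; }

run brew install jq bc curl ollama
run chmod +x analyze_with_ollama_picker_analysis.sh
run "ollama serve &"
run ollama pull qwen2.5:1.5b
```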
To analyze a script, run: `./analyze_with_ollama_picker_analysis.sh <input_file> [model]`

- `<input_file>`: Path to the script file (e.g., `.sh`, `.scpt`).
- `[model]`: Optional. Specify an Ollama model (e.g., `qwen2.5:1.5b`).

If no model is specified, the script will prompt for selection or default to the first available model.
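That fallback behavior can be approximated as follows. This is only a sketch of the idea, not the script's actual selection logic; it assumes `ollama list` prints a header line followed by model names in the first column:

```sh
# Pick a model: use the argument if given, otherwise fall back to the
# first model reported by `ollama list` (skipping its header line).
pick_model() {
  if [ -n "$1" ]; then
    printf '%s\n' "$1"
  else
    ollama list 2>/dev/null | awk 'NR == 2 { print $1 }'
  fi
}

model=$(pick_model "qwen2.5:1.5b")
echo "Using model: $model"
```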
Example: `./analyze_with_ollama_picker_analysis.sh my_script.sh qwen2.5:1.5b`

- Benchmark Directory: `${HOME}/ollama_benchmarks` stores benchmark logs and metrics.
- README File: `README.md` in the current directory; the generated documentation is appended to it.
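To confirm where results will land before a run, you can pre-create the benchmark directory and check for an existing README (both paths are the defaults described above):

```sh
# Ensure the benchmark directory exists and report the README status.
BENCHMARK_DIR="${HOME}/ollama_benchmarks"
mkdir -p "$BENCHMARK_DIR"
echo "Benchmark directory: $BENCHMARK_DIR"
if [ -f README.md ]; then
  echo "README.md exists; new analysis sections will be appended."
else
  echo "No README.md in $(pwd) yet."
fi
```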
The script generates:
- `README.md`: Appends a new section with the analysis results.
- Benchmark Logs:
  - `benchmark_log.csv`: Detailed performance metrics.
  - `metrics_<session_id>.json`: Session-specific metrics.
  - `response_<session_id>.md`: Raw Ollama response.
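Once a few runs have accumulated, the CSV can be summarized with standard tools. A sketch, assuming a header row and comma-separated fields; the column names below are illustrative samples, not the script's actual schema:

```sh
# Summarize a benchmark CSV: count data rows and average a numeric column.
# A small sample file is generated so the example is self-contained.
csv=$(mktemp)
cat > "$csv" <<'EOF'
session_id,model,duration_s
a1,qwen2.5:1.5b,12.5
a2,qwen2.5:1.5b,7.5
EOF

rows=$(awk 'NR > 1' "$csv" | wc -l | tr -d ' ')
avg=$(awk -F, 'NR > 1 { sum += $3; n++ } END { if (n) printf "%.1f", sum / n }' "$csv")
echo "runs: $rows, mean duration: ${avg}s"
rm -f "$csv"
```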
- Model Not Found: Ensure the model is installed: `ollama pull <model>`
- Dependency Errors: Install missing tools: `brew install jq bc curl ollama`
- API Connection Issues: Start the Ollama server: `ollama serve &`
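For API connection issues, you can probe the server's HTTP endpoint directly. Ollama listens on port 11434 by default, and `/api/tags` is its list-models endpoint:

```sh
# Check whether the Ollama server is reachable on its default port.
check_ollama() {
  if curl -sf --max-time 2 http://localhost:11434/api/tags >/dev/null 2>&1; then
    echo "Ollama server is up."
  else
    echo "Ollama server not reachable; start it with: ollama serve &"
  fi
}

check_ollama
```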
This script is provided under the MIT License. See the generated README.md for full license details.