mirror of https://github.com/flakestorm/flakestorm.git
synced 2026-04-25 00:36:54 +02:00
Revise installation instructions in README.md, CONTRIBUTING.md, and USAGE_GUIDE.md to clarify the installation order for Ollama and flakestorm. Add detailed platform-specific installation steps for Ollama and emphasize the need for a virtual environment for Python packages. Include troubleshooting tips for common installation issues.
This commit is contained in:
parent cb034b41ff
commit dbbdac9d43
3 changed files with 173 additions and 22 deletions
70 README.md

@@ -46,7 +46,61 @@ Instead of running one test case, Flakestorm takes a single "Golden Prompt", gen
## Quick Start

### Installation

### Installation Order

1. **Install Ollama first** (system-level service)
2. **Create virtual environment** (for Python packages)
3. **Install flakestorm** (Python package)
4. **Start Ollama and pull model** (required for mutations)
### Step 1: Install Ollama (System-Level)

FlakeStorm uses [Ollama](https://ollama.ai) for local model inference. Install this first:

**macOS Installation:**

```bash
# Option 1: Homebrew (recommended)
brew install ollama

# If you get permission errors, fix permissions first:
sudo chown -R $(whoami) ~/Library/Logs/Homebrew
sudo chown -R $(whoami) /usr/local/Cellar
sudo chown -R $(whoami) /usr/local/Homebrew
brew install ollama

# Option 2: Official Installer
# Visit https://ollama.ai/download and download the macOS installer (.dmg)
```
**Windows Installation:**

1. Visit https://ollama.com/download/windows
2. Download `OllamaSetup.exe`
3. Run the installer and follow the wizard
4. Ollama will be installed and start automatically

**Linux Installation:**

```bash
# Using the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Or using a distribution package where one is available (Ubuntu/Debian example):
sudo apt install ollama
```
**After installation, start Ollama and pull the model:**

```bash
# Start Ollama (macOS/Linux - Windows starts automatically)
ollama serve

# In another terminal, pull the model
ollama pull qwen3:8b
```
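As a quick sanity check, you can probe the server before running anything else (a sketch; it assumes Ollama's default port 11434 and its `/api/tags` endpoint, which lists pulled models):

```bash
# Succeeds only if the Ollama server answers on its default port.
if curl -fsS --max-time 2 http://localhost:11434/api/tags >/dev/null 2>&1; then
    echo "Ollama is running"
else
    echo "Ollama is not reachable - start it with: ollama serve" >&2
fi
```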
### Step 2: Install flakestorm (Python Package)

**Using a virtual environment (recommended):**

@@ -65,19 +119,7 @@ pip install flakestorm
pipx install flakestorm
```
**Note:** Requires Python 3.10 or higher. On macOS, Python environments are externally managed, so using a virtual environment is required.
### Prerequisites

Flakestorm uses [Ollama](https://ollama.ai) for local model inference:

```bash
# Install Ollama (macOS/Linux)
curl -fsSL https://ollama.ai/install.sh | sh

# Pull the default model
ollama pull qwen3:8b
```
**Note:** Requires Python 3.10 or higher. On macOS, Python environments are externally managed, so using a virtual environment is required. Ollama runs independently and doesn't need to be in your virtual environment.
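To confirm the virtual environment is actually active before installing, one hedged check (venv activation scripts export `VIRTUAL_ENV`, so an empty variable means pip would target the system Python):

```bash
# Prints the venv path when one is active; warns otherwise.
if [ -n "${VIRTUAL_ENV:-}" ]; then
    echo "virtual environment active: $VIRTUAL_ENV"
else
    echo "no virtual environment active - run: source venv/bin/activate" >&2
fi
```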
### Initialize Configuration
@@ -29,8 +29,27 @@ Please be respectful and constructive in all interactions. We welcome contributo
```

3. **Install Ollama** (for mutation generation)

**macOS:**
```bash
# Option 1: Homebrew (recommended)
brew install ollama

# Option 2: Official installer
# Visit https://ollama.ai/download and download the .dmg file
```

**Windows:**
- Visit https://ollama.com/download/windows
- Download and run `OllamaSetup.exe`

**Linux:**
```bash
curl -fsSL https://ollama.com/install.sh | sh
```

**Then pull the model:**
```bash
ollama pull qwen3:8b
```
@@ -90,16 +90,78 @@ flakestorm is an **adversarial testing framework** for AI agents. It applies cha
- **Ollama** (for local LLM mutation generation)
- **Rust** (optional, for performance optimization)

### Step 1: Install Ollama

### Installation Order

**Important:** Install Ollama first (it's a system-level service), then set up your Python virtual environment:

1. **Install Ollama** (system-level, runs independently)
2. **Create virtual environment** (for Python packages)
3. **Install flakestorm** (Python package)
4. **Start Ollama service** (if not already running)
5. **Pull the model** (required for mutation generation)
### Step 1: Install Ollama (System-Level)

**macOS Installation:**

```bash
# Option 1: Homebrew (recommended)
brew install ollama

# If you get permission errors, fix permissions first:
sudo chown -R $(whoami) ~/Library/Logs/Homebrew
sudo chown -R $(whoami) /usr/local/Cellar
sudo chown -R $(whoami) /usr/local/Homebrew
brew install ollama

# Option 2: Official Installer (if Homebrew doesn't work)
# Visit https://ollama.ai/download and download the macOS installer
# Double-click the .dmg file and follow the installation wizard
```
**Windows Installation:**

1. **Download the Installer:**
   - Visit https://ollama.com/download/windows
   - Download `OllamaSetup.exe`

2. **Run the Installer:**
   - Double-click `OllamaSetup.exe`
   - Follow the installation wizard
   - Ollama will be installed and added to your PATH automatically

3. **Verify Installation:**

   ```powershell
   ollama --version
   ```
**Linux Installation:**

```bash
# Install using the official script
curl -fsSL https://ollama.com/install.sh | sh

# Or using distribution packages where available:
# Ubuntu/Debian
sudo apt install ollama

# Fedora/RHEL
sudo dnf install ollama

# Arch Linux
sudo pacman -S ollama
```
**Start Ollama Service:**

After installation, start Ollama:

```bash
# macOS/Linux - Start the service
ollama serve

# Windows - Ollama runs as a service automatically after installation
# You can also start it manually from the Start menu or run:
ollama serve
```

@@ -113,12 +175,12 @@ ollama pull qwen2.5-coder:7b
ollama run qwen2.5-coder:7b "Hello, world!"
```
### Step 3: Install flakestorm

### Step 3: Create Virtual Environment and Install flakestorm

**Important:** On macOS (and some Linux distributions), Python environments are externally managed. You should use a virtual environment:

**Important:** On macOS (and some Linux distributions), Python environments are externally managed. You must use a virtual environment:

```bash
# Create a virtual environment
# Create a virtual environment (do this AFTER installing Ollama)
python3 -m venv venv

# Activate it (macOS/Linux)

@@ -137,6 +199,8 @@ cd flakestorm
pip install -e ".[dev]"
```

**Note:** Ollama is installed at the system level and doesn't need to be in your virtual environment. The virtual environment is only for Python packages (flakestorm and its dependencies).

**Alternative: Using pipx (for CLI applications)**

If you only want to use flakestorm as a CLI tool (not develop it), you can use `pipx`:

@@ -758,6 +822,32 @@ agent:
3. Improve your agent's handling of those cases
4. Re-run tests

#### "Homebrew permission errors when installing Ollama"

If you get `Operation not permitted` errors when running `brew install ollama`:

```bash
# Fix Homebrew permissions
sudo chown -R $(whoami) ~/Library/Logs/Homebrew
sudo chown -R $(whoami) /usr/local/Cellar
sudo chown -R $(whoami) /usr/local/Homebrew

# Then try again
brew install ollama

# Or use the official installer from https://ollama.ai/download instead
```
#### "Ollama binary contains HTML or syntax errors"

If you downloaded a file that contains HTML instead of the binary:

1. **macOS:** Use Homebrew or download the official `.dmg` installer from https://ollama.ai/download
2. **Windows:** Download `OllamaSetup.exe` from https://ollama.com/download/windows
3. **Linux:** Use the official install script: `curl -fsSL https://ollama.com/install.sh | sh`

Never download binaries directly with curl from the download page; always use the official installers or package managers.
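One way to spot this failure mode (a sketch; `./ollama` is a hypothetical path to the downloaded file): real binaries start with a Mach-O or ELF magic number, while a bad download starts with markup.

```bash
# Inspect the first bytes of the download; HTML markup there means the
# download page was saved instead of the binary.
if head -c 512 ./ollama | grep -qiE '<html|<!doctype'; then
    echo "This is an HTML page, not a binary - re-download with the official installer" >&2
else
    echo "File does not look like an HTML page"
fi
```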
### Debug Mode

```bash