Docker Installation Guide for BambooAI
Cross-platform instructions for Windows, Mac, and Linux
BambooAI can be easily deployed using Docker, providing a consistent and isolated environment that simplifies installation across different operating systems. Docker containerization offers several advantages:
- Simplified Deployment: Run BambooAI without complex setup or dependency management
- Environment Isolation: Keep BambooAI and its dependencies separate from your system
- Security: Execute LLM-generated code within a sandboxed container for enhanced safety
- Consistency: Ensure BambooAI runs the same way across different machines
- Easy Updates: Quickly update to newer versions without reinstallation
This guide presents two options for deploying BambooAI with Docker, depending on your needs.
Option 1: Pre-built Docker Hub Image

Quick setup using the pre-built Docker Hub image. Ideal for users who want a simple installation without building from source: you download the configuration files, create a few directories, and run the container with minimal technical steps. Like all Docker-based setups, it sandboxes code execution inside the container for enhanced security.
Prerequisites:
- Docker installed on your system
- Docker Compose installed on your system
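If you are not sure whether both are installed, you can check from a terminal; these standard version commands work on Windows, Mac, and Linux:

```bash
# Each command prints a version string if the tool is installed
docker --version
docker-compose --version
```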
1. Create a new directory for BambooAI:

Mac/Linux:

```bash
mkdir bambooai
cd bambooai
```

Windows (PowerShell):

```powershell
mkdir bambooai
cd bambooai
```
2. Create the necessary subdirectories:

Mac/Linux:

```bash
mkdir -p web_app/storage/favourites web_app/storage/threads web_app/temp web_app/logs
```

Windows (PowerShell):

```powershell
mkdir web_app\storage\favourites, web_app\storage\threads, web_app\temp, web_app\logs
```
3. Download the example configuration files:

Mac/Linux:

```bash
# Download the .env example
curl -o web_app/.env https://raw.githubusercontent.com/pgalko/BambooAI/main/.env.example
# Edit .env with your settings

# Download the LLM_CONFIG example
curl -o web_app/LLM_CONFIG.json https://raw.githubusercontent.com/pgalko/BambooAI/main/LLM_CONFIG_sample.json
# Edit LLM_CONFIG.json with your settings
```

Windows (PowerShell):

```powershell
# Download the .env example
Invoke-WebRequest -Uri https://raw.githubusercontent.com/pgalko/BambooAI/main/.env.example -OutFile web_app\.env
# Edit .env with your settings

# Download the LLM_CONFIG example
Invoke-WebRequest -Uri https://raw.githubusercontent.com/pgalko/BambooAI/main/LLM_CONFIG_sample.json -OutFile web_app\LLM_CONFIG.json
# Edit LLM_CONFIG.json with your settings
```

Edit these files with your settings after downloading.
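Which keys you actually need depends on the agents and models you enable in LLM_CONFIG.json. As a purely illustrative sketch (the variable names below are assumptions, not a definitive list; the downloaded .env.example is the authoritative reference), the file might contain entries along these lines:

```bash
# Hypothetical example only -- use the variable names from .env.example
OPENAI_API_KEY=sk-your-openai-key
ANTHROPIC_API_KEY=sk-ant-your-anthropic-key
```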
4. Create a docker-compose.yml file:

Mac/Linux:

```bash
touch docker-compose.yml
```

Windows (PowerShell):

```powershell
New-Item -Path . -Name "docker-compose.yml" -ItemType "file"
```

Add the following content to docker-compose.yml:

```yaml
services:
  bambooai-webapp:
    image: pgalko/bambooai:latest
    container_name: bambooai
    volumes:
      # Mount configuration files
      - ./web_app/.env:/app/web_app/.env
      - ./web_app/LLM_CONFIG.json:/app/web_app/LLM_CONFIG.json
      # Mount persistent storage directories
      - ./web_app/storage:/app/web_app/storage
      - ./web_app/temp:/app/web_app/temp
      - ./web_app/logs:/app/web_app/logs
    ports:
      - "5000:5000"
    restart: unless-stopped
```
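Optionally, you can have Docker Compose parse the file before starting anything; a YAML syntax error will be reported here rather than at startup:

```bash
# Parses docker-compose.yml and prints the resolved configuration
docker-compose config
```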
5. Run the Docker container:

```bash
docker-compose up -d
```

Access the web interface at http://localhost:5000.

Note: The Docker setup preserves all configuration and data between container restarts. If you encounter any issues, check the container logs with docker-compose logs -f.
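Because this setup runs the pgalko/bambooai:latest image from Docker Hub, updating to a newer release is usually just a matter of pulling the image again and recreating the container:

```bash
# Fetch the newest published image and recreate the container with it
docker-compose pull
docker-compose up -d
```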
Option 2: Build from the Cloned Repository

Build the Docker image yourself from the cloned repository. This approach gives you visibility into the codebase and configuration before building: clone the repository, configure your environment, and build the image locally. It provides the same containerized sandboxing benefits while letting you inspect the code and make modifications if needed.
Prerequisites:
- Docker installed on your system
- Docker Compose installed on your system
1. Clone the repository:

```bash
git clone https://github.com/pgalko/BambooAI.git
cd BambooAI
```
2. Configure the environment:

Mac/Linux:

```bash
cp .env.example web_app/.env
# Edit .env with your settings
```

Windows (PowerShell):

```powershell
Copy-Item -Path .env.example -Destination web_app\.env
# Edit .env with your settings
```
3. Configure LLM agents, models and parameters:

Mac/Linux:

```bash
cp LLM_CONFIG_sample.json web_app/LLM_CONFIG.json
# Edit LLM_CONFIG.json with your settings
```

Windows (PowerShell):

```powershell
Copy-Item -Path LLM_CONFIG_sample.json -Destination web_app\LLM_CONFIG.json
# Edit LLM_CONFIG.json with your settings
```

- Edit web_app/LLM_CONFIG.json with your desired combination of agents and models
- Ensure all necessary API keys are present in your .env file
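After editing, a quick way to confirm the file is still valid JSON (assuming Python is available on your machine) is Python's built-in json.tool module:

```bash
# Pretty-prints the file if it parses; reports the error location if it does not
python -m json.tool web_app/LLM_CONFIG.json
```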
4. Build and run the Docker container using the docker-compose file from the repo:

```bash
docker-compose up -d
```

Access the web interface at http://localhost:5000.

Note: The Docker setup preserves all configuration and data between container restarts. If you encounter any issues, check the container logs with docker-compose logs -f.
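If you later modify the code or configuration that gets baked into the image, rebuild before restarting; with Docker Compose this is typically:

```bash
# Rebuild the image from the local sources and restart the container
docker-compose up -d --build
```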