# System Requirements
System requirements for installing AIRGAP Studio
## Hardware Requirements

### Minimum & Recommended Specs
| Component | Minimum | Recommended |
|---|---|---|
| CPU | 64-bit x86 processor | Intel Core i7 / AMD Ryzen 7 or higher |
| RAM | 16GB | 32GB |
| Disk Space | 10GB | 20GB SSD |
| Display | 1280x720 | 1920x1080 or higher |
### GPU (Optional)
A GPU significantly improves AI model inference speed.
| GPU Type | VRAM | Performance Level |
|---|---|---|
| NVIDIA (Vulkan) | 8GB+ | Optimal (30-45 tokens/s) |
| AMD Radeon (Vulkan) | 8GB+ | Good (25-40 tokens/s) |
| NVIDIA/AMD (Vulkan) | 6GB | Functional (with some limitations) |
| Intel iGPU (Vulkan) | — | Basic (15-25 tokens/s) |
| CPU only | — | Basic (8-15 tokens/s) |
AIRGAP Studio works fine without a GPU. In CPU mode, AI response generation will take longer.
### GPU Compatibility
AIRGAP Studio's AI engine (llama-server) uses the Vulkan backend. GPU acceleration is available for any NVIDIA, AMD, or Intel GPU that supports Vulkan.
```shell
# Check NVIDIA GPU information
nvidia-smi
```
To ensure Vulkan support, install the latest graphics drivers for your GPU; current NVIDIA, AMD, and Intel drivers all ship with a Vulkan runtime.
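If the Vulkan SDK tools are installed, you can also confirm that a Vulkan-capable device is actually visible to the system. The `vulkaninfo` utility and its `--summary` flag come from the LunarG Vulkan SDK, not from AIRGAP Studio itself:

```shell
# List Vulkan-capable GPUs, API versions, and driver versions
# (requires the Vulkan SDK tools to be installed)
vulkaninfo --summary
```

If your GPU appears in the device list, the Vulkan backend should be able to use it.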
## Operating System
| OS | Support | Notes |
|---|---|---|
| Windows 11 64-bit | Supported | Recommended |
| Windows 10 64-bit (1903+) | Supported | |
| Windows 10 32-bit | Not supported | |
| Windows 8.1 or earlier | Not supported | |
| macOS / Linux | Not supported | Windows only |
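One way to confirm which Windows edition and build you are running is from a Command Prompt (this uses the standard `systeminfo` and `findstr` utilities bundled with Windows):

```shell
:: Print the Windows edition and version (run in Command Prompt)
systeminfo | findstr /B /C:"OS Name" /C:"OS Version"
```

Windows 10 installs must report version 1903 (build 18362) or later.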
## Software Requirements

### Required
| Software | Purpose | Notes |
|---|---|---|
| Ollama | Local LLM server | Included in installer |
| Qwen3:8b model | AI inference | Installed separately via model pack |
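After installing Ollama and the model pack, you can verify from a terminal that the model is available. These are standard Ollama CLI commands; the exact model tag may differ depending on your model pack:

```shell
# List locally installed models; qwen3:8b should appear once the model pack is installed
ollama list

# Optionally open an interactive session to confirm inference works end to end
ollama run qwen3:8b
```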
### Optional
| Software | Purpose | Notes |
|---|---|---|
| Git | Checkpoint feature | Portable Git supported |
| Python 3.8+ | Jupyter notebooks, scripts | For data analysis tasks |
| Node.js 18+ | MCP server execution | When using MCP servers |
| Figma Desktop | Bridge prototyping feature | Requires MCP plugin |
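A quick way to check whether the optional tools are on your PATH and meet the version requirements is to print their versions (a "not found" line means the tool is not installed or not on PATH):

```shell
# Print installed versions of the optional tooling
git --version || echo "git: not found"
python --version || echo "python: not found"
node --version || echo "node: not found"
```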
## Disk Space Breakdown
| Component | Size |
|---|---|
| AIRGAP Studio (IDE + extensions) | ~1GB |
| Ollama runtime | ~200MB |
| Qwen3:8b model | ~5GB |
| Workspace (cache, logs, etc.) | ~2-5GB |
| Total | ~8-12GB |
Reserve additional disk space for project files and extra model installations.
## Network (Optional)
AIRGAP Studio is designed for offline environments, so a network connection is not required for core functionality.
| Feature | Network Required |
|---|---|
| Core IDE features | No |
| Local AI (Ollama) | No |
| External providers (Anthropic, OpenAI, etc.) | Yes |
| Remote MCP servers | Yes |
| Extension installation (marketplace) | Yes |
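To confirm that the local AI path really works without a network, you can query the Ollama API over loopback. Port 11434 is Ollama's default; no internet connection is involved:

```shell
# List models served by the local Ollama instance (loopback only, no internet needed)
curl http://localhost:11434/api/tags
```

A JSON response listing your installed models confirms the local server is up.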
## Performance Optimization Tips
- RAM: With 32GB or more, the IDE and Ollama run smoothly at the same time.
- SSD: Model loading speeds are significantly faster compared to HDD.
- GPU VRAM: With a Vulkan-capable GPU (NVIDIA/AMD) with 8GB+ VRAM, the Qwen3:8b model loads entirely into GPU memory.
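On NVIDIA hardware you can check whether your card has the 8GB of VRAM needed to hold Qwen3:8b entirely in GPU memory. These `nvidia-smi` query flags are standard; AMD and Intel users would use their vendor's tools instead:

```shell
# Report GPU name plus total and used GPU memory in MiB (NVIDIA only)
nvidia-smi --query-gpu=name,memory.total,memory.used --format=csv
```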
## Related Documentation
- Getting Started - Installation and initial setup
- Provider Configuration - AI model configuration