Complete cloud-based UE3 development environment with AI assistance for under $130/month (4hrs/day usage)
A cloud-based Unreal Engine 3 (UE3) remote development environment with AI-assisted tools. It combines cost efficiency, automation, and modern cloud technologies to create a budget-friendly yet powerful game development platform. Here's my breakdown and thoughts:
| Specs | Purpose | Cost |
|---|---|---|
| 1vCPU/1GB RAM | Control website | €6.99/month |
| 16GB RAM/1TB HDD | JIRA + Bitbucket + RAG + Email | €15.99/month |
# Example Azure CLI command triggered by UI
az vm create \
--name UE3-Dev-$(date +%s) \
--image win10-ue3-preloaded \
--size Standard_NV6ads_A10_v5 \
--custom-data sunshine-autoconfig.yaml
Cost: $0.467/hr (~$56.04 for 4hrs/day)
# sunshine.conf
fps=144
bitrate=100
encoder=nvenc
resolution=2560x1440
Cost: $0.379/hr (~$45.48 for 4hrs/day)
def generate_unrealscript(prompt):
return llm(
f"""You are an UnrealScript expert:
{prompt}
// Rules:
// 1. State replication patterns
// 2. Optimize NetUpdateFrequency"""
)
# 3ds Max Python integration
def on_audio_import(audio_file):
response = requests.post(
"https://rag.yourgamedev.com/process",
files={'wav': open(audio_file,'rb')}
)
create_morph_targets(response.json())

| Component | Monthly Cost (4h/day) | Optimization Technique |
|---|---|---|
| Azure GPU VM | $56.04 | Auto-shutdown + Spot |
| Vast.ai LLM | $45.48 | 4-bit quantization |
| Scaleway Control | ~$25 | Micro-servers |
| Total | ~$126.50 | 75% cheaper than dedicated |
# On Scaleway VPS
git clone https://github.com/yourgame/ue3-cloud-control
docker-compose -f docker-compose.prod.yml up -d
certbot --nginx -d yourgamedev.com
# Developer machine
moonlight stream yourgamedev.com # Connect to workstation
# Tools auto-connect to:
# - Audio2Face: rag.yourgamedev.com
# - DeepSeek: rag.yourgamedev.com/llm
| Issue | Solution |
|---|---|
| High Streaming Latency | sudo sysctl -w net.ipv4.tcp_congestion_control=bbr |
| VRAM Exhaustion | docker update --cpuset-gpus 0 audio2face |
| RAG Stale Results | curl -X POST https://bitbucket.yourgamedev.com/webhook/update |
Total cost of ~$126/month compares favorably to AWS/GCP ($300+) or local RTX 4090 workstation ($2500+)
| Model | CPU | RAM | Storage | Network | Price | Scaleway |
|---|---|---|---|---|---|---|
| START-2-M-SATA | 1x Intel® C2750 (Avoton) 8C/8T - 2.4 GHz |
16 GB | 1 x 1 TB HDD | 250 Mbps | €15.99 /MONTH | Order |
| START-1-L | 1x Intel® Xeon E3 1220v2 4C/4T - 3.1 GHz |
16 GB | 2 x 1 TB HDD | 200 Mbps | €19.99 /MONTH | Order |
| START-3-L | 1x Intel® Xeon® D-1531 6C/12T - 2.2 GHz |
32 GB | 2 x 500 GB SSD | 300 Mbps | €34.99 /MONTH | Order |
Ports: 7990, 7999
Containerized with Docker
Git LFS enabled
Port: 8080
Containerized with Docker
Port: 5432
Shared database for both services
pgvector enabled for RAG
Port: 8051
Semantic search integration
Connects to Bitbucket/Jira APIs
Ubuntu 22.04 LTS
All services running on one host
Nginx reverse proxy
Create a new instance with Ubuntu 22.04 LTS
Open required ports in firewall:
# Update system
sudo apt update && sudo apt upgrade -y
# Install Docker and Docker Compose
sudo apt install -y docker.io docker-compose git git-lfs
# Start and enable Docker
sudo systemctl enable docker && sudo systemctl start docker
# Install Nginx and Certbot for SSL
sudo apt install -y nginx certbot python3-certbot-nginx
# Run PostgreSQL in Docker
docker run --name postgres \
-e POSTGRES_PASSWORD=securepassword \
-e POSTGRES_USER=atlassian \
-e POSTGRES_DB=bitbucket_db \
-p 5432:5432 \
-v /opt/postgresql/data:/var/lib/postgresql/data \
-d postgres:13
# Create databases and users
docker exec -it postgres psql -U atlassian -c "CREATE DATABASE jira_db;"
docker exec -it postgres psql -U atlassian -c "CREATE USER bitbucket_user WITH PASSWORD 'StrongBitbucketPassword';"
docker exec -it postgres psql -U atlassian -c "CREATE USER jira_user WITH PASSWORD 'StrongJiraPassword';"
docker exec -it postgres psql -U atlassian -c "GRANT ALL PRIVILEGES ON DATABASE bitbucket_db TO bitbucket_user;"
docker exec -it postgres psql -U atlassian -c "GRANT ALL PRIVILEGES ON DATABASE jira_db TO jira_user;"
# Install pgvector extension
docker exec -it postgres psql -U atlassian -c "CREATE EXTENSION IF NOT EXISTS vector;"
# Create directories
sudo mkdir -p /opt/atlassian/bitbucket
sudo chown -R 2001:2001 /opt/atlassian/bitbucket
# Create docker-compose.yml
cat << 'EOF' > /opt/atlassian/bitbucket/docker-compose.yml
version: '3'
services:
bitbucket:
image: atlassian/bitbucket:latest
ports:
- "7990:7990"
- "7999:7999"
environment:
- 'JDBC_URL=jdbc:postgresql://localhost:5432/bitbucket_db'
- 'JDBC_USER=bitbucket_user'
- 'JDBC_PASSWORD=StrongBitbucketPassword'
- 'BITBUCKET_LFS_ENABLED=true'
volumes:
- /opt/atlassian/bitbucket:/var/atlassian/application-data/bitbucket
restart: unless-stopped
EOF
# Start Bitbucket
cd /opt/atlassian/bitbucket
docker-compose up -d
# Create directories
sudo mkdir -p /opt/atlassian/jira
sudo chown -R 2002:2002 /opt/atlassian/jira
# Create docker-compose.yml
cat << 'EOF' > /opt/atlassian/jira/docker-compose.yml
version: '3'
services:
jira:
image: atlassian/jira-software:latest
ports:
- "8080:8080"
environment:
- 'JDBC_URL=jdbc:postgresql://localhost:5432/jira_db'
- 'JDBC_USER=jira_user'
- 'JDBC_PASSWORD=StrongJiraPassword'
volumes:
- /opt/atlassian/jira:/var/atlassian/application-data/jira
restart: unless-stopped
EOF
# Start Jira
cd /opt/atlassian/jira
docker-compose up -d
# Configure Nginx for Bitbucket
sudo nano /etc/nginx/sites-available/bitbucket.conf
# Add this configuration (replace domain names):
server {
listen 80;
server_name bitbucket.yourdomain.com;
location / {
proxy_pass http://localhost:7990;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header Host $host;
}
}
# Configure Nginx for Jira
sudo nano /etc/nginx/sites-available/jira.conf
server {
listen 80;
server_name jira.yourdomain.com;
location / {
proxy_pass http://localhost:8080;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header Host $host;
}
}
# Enable sites
sudo ln -s /etc/nginx/sites-available/bitbucket.conf /etc/nginx/sites-enabled/
sudo ln -s /etc/nginx/sites-available/jira.conf /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl restart nginx
# Get SSL certificates
sudo certbot --nginx -d bitbucket.yourdomain.com
sudo certbot --nginx -d jira.yourdomain.com
# Enable auto-renewal
sudo systemctl enable certbot.timer
# Clone the repository
git clone https://github.com/coleam00/mcp-crawl4ai-rag.git
cd mcp-crawl4ai-rag
# Create .env file
cat << 'EOF' > .env
HOST=0.0.0.0
PORT=8051
TRANSPORT=sse
OPENAI_API_KEY=your_openai_key
SUPABASE_URL=postgresql://atlassian:securepassword@localhost:5432/bitbucket_db
SUPABASE_SERVICE_KEY=your_supabase_service_key
EOF
# Build and run
docker build -t mcp-rag .
docker run -d --name mcp-rag -p 8051:8051 --env-file .env mcp-rag
# Initialize database schema
docker exec -it postgres psql -U atlassian -d bitbucket_db -f crawled_pages.sql
# On developer machines, initialize Git LFS
git lfs install
# Create .gitattributes file with UE3 patterns
cat << 'EOF' > .gitattributes
# Track common UE3 file types
*.uasset filter=lfs diff=lfs merge=lfs -text
*.umap filter=lfs diff=lfs merge=lfs -text
*.upk filter=lfs diff=lfs merge=lfs -text
*.psa filter=lfs diff=lfs merge=lfs -text
*.psk filter=lfs diff=lfs merge=lfs -text
*.fbx filter=lfs diff=lfs merge=lfs -text
*.png filter=lfs diff=lfs merge=lfs -text
*.tga filter=lfs diff=lfs merge=lfs -text
*.dds filter=lfs diff=lfs merge=lfs -text
*.wav filter=lfs diff=lfs merge=lfs -text
*.bik filter=lfs diff=lfs merge=lfs -text
*.swf filter=lfs diff=lfs merge=lfs -text
*.dll filter=lfs diff=lfs merge=lfs -text
*.exe filter=lfs diff=lfs merge=lfs -text
*.pak filter=lfs diff=lfs merge=lfs -text
EOF
# Commit and push
git add .gitattributes
git commit -m "Enable Git LFS tracking for UE3 assets"
git push origin main
| Component | Original Multi-Machine Cost | Single Machine Cost | Savings |
|---|---|---|---|
| Bitbucket VM | €120/month | €15.99/month | €489.01/month (96% savings) |
| Jira VM | €100/month | ||
| PostgreSQL DB | €60/month | ||
| Other Services | €210/month | ||
| Total | €490/month | €15.99/month | €474.01/month |
# Check running containers
docker ps
# View logs for a service
docker logs <container_name>
# Check resource usage
sudo apt install htop
htop
# Backup PostgreSQL database
docker exec -it postgres pg_dumpall -U atlassian > /backups/postgres_backup_$(date +%Y-%m-%d).sql
# Set up automatic backups (add to crontab)
0 2 * * * docker exec postgres pg_dumpall -U atlassian > /backups/postgres_backup_$(date +\%Y-\%m-\%d).sql
0 3 * * * tar -czvf /backups/bitbucket_$(date +\%Y-\%m-\%d).tar.gz /opt/atlassian/bitbucket
0 4 * * * tar -czvf /backups/jira_$(date +\%Y-\%m-\%d).tar.gz /opt/atlassian/jira
This guide walks you through creating an automated system to process UnrealScript files, analyze them with a Large Language Model (LLM), and store enriched results in a vector database using the mcp-crawl4ai-rag MCP server. This will allow rapid AI-assisted retrieval through a Retrieval-Augmented Generation (RAG) pipeline.
Use Python to extract UnrealScript functions and metadata from source files:
import re
import os
import json
def extract_uscript_data(file_path):
with open(file_path, 'r') as f:
code = f.read()
functions = re.findall(r'function\s+(\w+)\s*\(([^)]*)\)', code)
parsed_functions = []
for name, params in functions:
parsed_functions.append({
"function_name": name,
"parameters": params.strip()
})
return parsed_functions
all_functions = []
for root, dirs, files in os.walk("path/to/uscript"):
for file in files:
if file.endswith(".uc"):
path = os.path.join(root, file)
all_functions.extend(extract_uscript_data(path))
with open("uscript_data.json", "w") as f:
json.dump(all_functions, f, indent=2)
Use an LLM (e.g. GPT-4) to generate explanations, comments, or modernizations of UnrealScript functions.
# Example enrichment prompt
prompt = f"Explain this UnrealScript function:\n\nfunction {name}({params})"
response = openai.ChatCompletion.create(...) # Fetch LLM output
Store the enriched data separately or merge with your original function data for advanced semantic search later.
Convert function objects to markdown or text documents for ingestion by the RAG pipeline.
### Function: Jump
**Parameters**: float Velocity
**Description**: Makes the player character jump with a given velocity.
Save each function as its own `.md` file or group related ones into sections.
Using Docker (Recommended):
git clone https://github.com/coleam00/mcp-crawl4ai-rag.git
cd mcp-crawl4ai-rag
docker build -t mcp/crawl4ai-rag --build-arg PORT=8051 .
docker run --env-file .env -p 8051:8051 mcp/crawl4ai-rag
.env Configuration:
# MCP Server
HOST=0.0.0.0
PORT=8051
TRANSPORT=sse
# OpenAI
OPENAI_API_KEY=your_key
# Supabase
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_KEY=your_service_role_key
Use the provided tools in the repo:
crawl_single_page: for individual .md filessmart_crawl_url: to crawl a local server or hosted doc siteOr build a new ingestion tool to feed your enriched UnrealScript content directly.
Use the RAG endpoint (perform_rag_query) to search your indexed UnrealScript database semantically.
curl -X POST http://localhost:8051/tool/perform_rag_query \
-H "Content-Type: application/json" \
-d '{"query": "How does the Jump function work?"}'
Extend your knowledge base with PDF books, CHM files, and YouTube tutorial transcripts.
pip install pymupdf
import fitz # PyMuPDF
from pathlib import Path
def extract_pdf_text(pdf_path):
doc = fitz.open(pdf_path)
return "\n\n".join(page.get_text() for page in doc)
for path in Path("books/").glob("*.pdf"):
text = extract_pdf_text(path)
with open(f"markdown/{path.stem}.md", "w") as f:
f.write(f"# {path.stem}\n\n{text}")
pip install chmtools
from chmtools.chm import CHMFile
chm = CHMFile.open("help.chm")
for topic in chm.topics():
with open(f"markdown/{topic.title[:50]}.md", "w") as f:
f.write(f"# {topic.title}\n\n{topic.plain_text}")
pip install youtube-transcript-api
from youtube_transcript_api import YouTubeTranscriptApi
video_id = "abc123"
transcript = YouTubeTranscriptApi.get_transcript(video_id)
text = "\n".join([x["text"] for x in transcript])
with open(f"markdown/{video_id}.md", "w") as f:
f.write(f"# Transcript for {video_id}\n\n{text}")
Integrate your pipeline with Jira and Bitbucket to ingest tickets, commit logs, and code reviews for richer project context.
pip install requests
import requests
from requests.auth import HTTPBasicAuth
JIRA_BASE = "https://yourdomain.atlassian.net"
EMAIL = "your-email@example.com"
API_TOKEN = "your-api-token"
PROJECT_KEY = "UE3"
response = requests.get(
f"{JIRA_BASE}/rest/api/3/search?jql=project={PROJECT_KEY}",
auth=HTTPBasicAuth(EMAIL, API_TOKEN),
headers={"Accept": "application/json"}
)
issues = response.json()["issues"]
for issue in issues:
key = issue["key"]
summary = issue["fields"]["summary"]
description = issue["fields"].get("description", "")
with open(f"markdown/jira_{key}.md", "w") as f:
f.write(f"# {key}: {summary}\n\n{description}")
BITBUCKET_USER = "your-username"
REPO_SLUG = "your-repo"
TOKEN = "your-app-password"
resp = requests.get(
f"https://api.bitbucket.org/2.0/repositories/{BITBUCKET_USER}/{REPO_SLUG}/commits",
auth=(BITBUCKET_USER, TOKEN)
)
for commit in resp.json()["values"]:
hash = commit["hash"]
message = commit["message"]
with open(f"markdown/commit_{hash[:7]}.md", "w") as f:
f.write(f"# Commit {hash[:7]}\n\n{message}")
This pipeline stands out for its modularity, real-world utility, and seamless integration of modern AI tools with legacy codebases. It effectively combines:
The Crawl4AI RAG pipeline allows you to index and retrieve enriched UnrealScript data for AI-assisted queries. It seamlessly integrates various sources of knowledge into a vector database.
import requests
res = requests.post("http://mcp-rag-server:8051/perform_rag_query", json={
"query": "Which PR added PlayExplosionEffect()?",
"filters": { "source": "bitbucket" }
})
print(res.json())
When hosting OpenAI models on Azure, be mindful of the costs associated with tokens:
| Model | Input Cost (1M Tokens) | Output Cost (1M Tokens) |
|---|---|---|
| GPT-4 (8k context) | $30.00 | $30.00 |
| GPT-4 (32k context) | $60.00 | $60.00 |
| GPT-3.5 (Turbo) | $2.00 | $2.00 |
pg_dump -h your-db-host -U your-db-user -d your-db-name -F c -b -v -f backup_file.dump
pg_restore -h your-db-host -U your-db-user -d new-db-name -v backup_file.dump
pg_dump -h your-db-host -U your-db-user -d your-db-name -F p -v > backup_file.sql
psql -h your-db-host -U your-db-user -d new-db-name -f backup_file.sql
🌐 Main Site: https://gamedev.app
Frontend user interface for managing virtual machines and quotas:
/welcomeOverview of the platform with CTAs to register or log in.
/loginUser login (email/password or Azure authentication).
/dashboardOverview of system status (connected Azure account, quotas, VM status).
/quotaManage quotas for NVA10 instances (initially 0).
/request-quota/imagesList of available pre-configured virtual images for game development.
/clone-vm/vmsList of running/destroyed VMs with details (VM ID, status, creation time, end time).
/create-vm/view-vm/manage-vm/notificationsSystem notifications for VM creation or destruction via email.
/azureConnect your Azure account after login.
/azure-status/logoutLog out of the platform.
POST api.platform.app/vm/create_rtx
POST api.platform.app/vm/create_rtx_azure
GET api.platform.app/vm/create_rtx_azure_progress
POST api.platform.app/vm/destroy_rtx_azure
GET api.platform.app/vm/auto_check_rtx_status
GET api.platform.app/vm/check_rtx_status
POST api.platform.app/vm/clone_rtx_azure_vm
POST api.platform.app/vm/clone_rtx_azure_vm_delete
GET api.platform.app/system/health
GET api.platform.app/system/resource-utilization
GET api.platform.app/system/alerts
GET api.platform.app/system/reports
POST api.platform.app/sns/forgetpass
POST api.platform.app/sns/verifyemail
POST api.platform.app/sns/welcome
POST api.platform.app/sns/passwordreset
POST api.platform.app/sns/subscriptionconfirmation
POST api.platform.app/sns/invoice
POST api.platform.app/sns/paymentconfirmation
POST api.platform.app/sns/transactionalert
POST api.platform.app/sns/deactivationnotice
POST api.platform.app/sns/activationnotice
POST api.platform.app/sns/vmready
POST api.platform.app/sns/vmdestroyed
POST api.platform.app/sns/weeklynews
GET api.platform.app/sns/templates/list
POST api.platform.app/sns/templates/create
PUT api.platform.app/sns/templates/edit/{template_id}
DELETE api.platform.app/sns/templates/delete/{template_id}
GET api.platform.app/sns/queue/status
POST api.platform.app/sns/queue/clear
POST api.platform.app/sns/queue/resend/{email_id}
API Endpoints
1. VM Management
POST api.platform.app/vm/create_rtx
POST api.platform.app/vm/create_rtx_azure
GET api.platform.app/vm/create_rtx_azure_progress
POST api.platform.app/vm/destroy_rtx_azure
GET api.platform.app/vm/auto_check_rtx_status
GET api.platform.app/vm/check_rtx_status
POST api.platform.app/vm/clone_rtx_azure_vm
POST api.platform.app/vm/clone_rtx_azure_vm_delete
2. System Health & Monitoring
GET api.platform.app/system/health
GET api.platform.app/system/resource-utilization
GET api.platform.app/system/alerts
GET api.platform.app/system/reports
Email API
1. Send Email
POST api.platform.app/sns/forgetpass
POST api.platform.app/sns/verifyemail
POST api.platform.app/sns/welcome
POST api.platform.app/sns/passwordreset
POST api.platform.app/sns/subscriptionconfirmation
POST api.platform.app/sns/invoice
POST api.platform.app/sns/paymentconfirmation
POST api.platform.app/sns/transactionalert
POST api.platform.app/sns/deactivationnotice
POST api.platform.app/sns/activationnotice
POST api.platform.app/sns/vmready
POST api.platform.app/sns/vmdestroyed
POST api.platform.app/sns/weeklynews
2. Email Templates
GET api.platform.app/sns/templates/list
POST api.platform.app/sns/templates/create
PUT api.platform.app/sns/templates/edit/{template_id}
DELETE api.platform.app/sns/templates/delete/{template_id}
3. Email Queue Management
GET api.platform.app/sns/queue/status
POST api.platform.app/sns/queue/clear
POST api.platform.app/sns/queue/resend/{email_id}
-- 1. Users Table
CREATE TABLE users (
user_id CHAR(36) PRIMARY KEY,
email VARCHAR(255) NOT NULL,
role ENUM('admin') NOT NULL,
password_hash VARCHAR(255),
azure_account_connected BOOLEAN DEFAULT FALSE,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);
-- 2. VM Instances Table
CREATE TABLE vm_instances (
vm_id CHAR(36) PRIMARY KEY,
user_id CHAR(36),
vm_ip VARCHAR(15) NOT NULL,
vm_username VARCHAR(100) NOT NULL,
vm_password VARCHAR(100) NOT NULL,
vm_status ENUM('pending', 'running', 'destroyed') NOT NULL,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
destroyed_at TIMESTAMP,
last_updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
FOREIGN KEY (user_id) REFERENCES users(user_id) ON DELETE CASCADE
);
-- 3. Snapshots/Images Table
CREATE TABLE snapshots (
snapshot_id CHAR(36) PRIMARY KEY,
image_id VARCHAR(100),
image_status ENUM('pending', 'available') NOT NULL,
snapshot_url VARCHAR(255),
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
-- 4. Logs/Events Table
CREATE TABLE logs (
event_id CHAR(36) PRIMARY KEY,
user_id CHAR(36),
event_type VARCHAR(100),
event_details TEXT,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (user_id) REFERENCES users(user_id) ON DELETE CASCADE
);
-- 5. System Health Monitoring Table
CREATE TABLE system_health (
health_id CHAR(36) PRIMARY KEY,
cpu_usage DECIMAL(5, 2),
ram_usage DECIMAL(5, 2),
status ENUM('normal', 'warning', 'critical') DEFAULT 'normal',
check_time TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
-- 6. Game Logs Table
CREATE TABLE game_logs (
event_id CHAR(36) PRIMARY KEY,
user_id CHAR(36),
game_id CHAR(36),
event_type VARCHAR(100),
event_details TEXT,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
FOREIGN KEY (user_id) REFERENCES users(user_id) ON DELETE CASCADE,
FOREIGN KEY (game_id) REFERENCES games(game_id) ON DELETE CASCADE
);
-- 7. Quotas Table
CREATE TABLE quotas (
quota_id CHAR(36) PRIMARY KEY,
user_id CHAR(36),
current_quota INT DEFAULT 0,
requested_quota INT DEFAULT 0,
status ENUM('approved', 'pending', 'denied') DEFAULT 'pending',
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
FOREIGN KEY (user_id) REFERENCES users(user_id) ON DELETE CASCADE
);
-- 8. Email Templates Table
CREATE TABLE email_templates (
template_id CHAR(36) PRIMARY KEY,
template_name VARCHAR(100) NOT NULL,
subject VARCHAR(255) NOT NULL,
body TEXT,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
-- 9. Email Queue Table
CREATE TABLE email_queue (
email_id CHAR(36) PRIMARY KEY,
user_id CHAR(36),
template_id CHAR(36),
email_status ENUM('queued', 'sent', 'failed') DEFAULT 'queued',
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
sent_at TIMESTAMP,
FOREIGN KEY (user_id) REFERENCES users(user_id) ON DELETE CASCADE,
FOREIGN KEY (template_id) REFERENCES email_templates(template_id) ON DELETE CASCADE
);
-- 1. Users Table
CREATE TABLE users (
user_id CHAR(36) PRIMARY KEY, -- UUID for unique identification (stored as a CHAR(36) for simplicity)
email VARCHAR(255) NOT NULL, -- User's email address, set to not null
role ENUM('admin') NOT NULL, -- Enum for user role (only admin)
password_hash VARCHAR(255), -- Hashed password (for authentication)
azure_account_connected BOOLEAN DEFAULT FALSE, -- Track if the Azure account is connected
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP, -- Automatically set to the current timestamp
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP -- Automatically updates whenever the record is modified
);
-- 2. VM Instances Table
CREATE TABLE vm_instances (
vm_id CHAR(36) PRIMARY KEY, -- UUID for unique identification (stored as CHAR(36))
user_id CHAR(36), -- Foreign key linking to the admin user
vm_ip VARCHAR(15) NOT NULL, -- IP address of the VM
vm_username VARCHAR(100) NOT NULL, -- Username for accessing the VM
vm_password VARCHAR(100) NOT NULL, -- Password for accessing the VM
vm_status ENUM('pending', 'running', 'destroyed') NOT NULL, -- Status of the VM (can be one of 'pending', 'running', 'destroyed')
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP, -- When the VM was created
destroyed_at TIMESTAMP, -- When the VM was destroyed (nullable)
last_updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP, -- When the VM status was last updated
FOREIGN KEY (user_id) REFERENCES users(user_id) ON DELETE CASCADE -- Link to the `users` table; cascade delete if the user is removed
);
-- 3. Snapshots/Images Table
CREATE TABLE snapshots (
snapshot_id CHAR(36) PRIMARY KEY, -- UUID for unique identification (stored as CHAR(36))
image_id VARCHAR(100), -- Azure Image ID associated with the snapshot
image_status ENUM('pending', 'available') NOT NULL, -- Status of the image (can be 'pending' or 'available')
snapshot_url VARCHAR(255), -- URL to the snapshot (if exported as VHD)
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP -- When the snapshot was created
);
-- 4. Logs/Events Table
CREATE TABLE logs (
event_id CHAR(36) PRIMARY KEY, -- UUID for unique identification (stored as CHAR(36))
user_id CHAR(36), -- Foreign key linking to the admin user who triggered the action
event_type VARCHAR(100), -- Type of event (e.g., 'vm_created', 'vm_destroyed', 'snapshot_created', etc.)
event_details TEXT, -- Additional details about the event
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP, -- When the event occurred
FOREIGN KEY (user_id) REFERENCES users(user_id) ON DELETE CASCADE -- Link to the `users` table; cascade delete if the user is removed
);
-- 5. System Health Monitoring Table (For Admin)
CREATE TABLE system_health (
health_id CHAR(36) PRIMARY KEY, -- UUID for unique identification (stored as CHAR(36))
cpu_usage DECIMAL(5, 2), -- CPU usage percentage
ram_usage DECIMAL(5, 2), -- RAM usage percentage
status ENUM('normal', 'warning', 'critical') DEFAULT 'normal', -- Status of the system health
check_time TIMESTAMP DEFAULT CURRENT_TIMESTAMP -- When the check was made
);
-- 6. Game Logs Table
CREATE TABLE game_logs (
event_id CHAR(36) PRIMARY KEY, -- UUID for unique identification (stored as CHAR(36))
user_id CHAR(36), -- Foreign key linking to the user who triggered the action
game_id CHAR(36), -- Foreign key linking to the affected game
event_type VARCHAR(100), -- Type of event (e.g., 'vm_created', 'vm_destroyed', 'snapshot_created', etc.)
event_details TEXT, -- Additional details about the event
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP, -- When the event occurred
FOREIGN KEY (user_id) REFERENCES users(user_id) ON DELETE CASCADE, -- Link to the `users` table; cascade delete if the user is removed
FOREIGN KEY (game_id) REFERENCES games(game_id) ON DELETE CASCADE -- Link to the `games` table; cascade delete if the game is removed
);
-- 7. Quotas Table (For managing NVA10 instance quotas)
CREATE TABLE quotas (
quota_id CHAR(36) PRIMARY KEY, -- UUID for unique identification (stored as CHAR(36))
user_id CHAR(36), -- Foreign key linking to the user
current_quota INT DEFAULT 0, -- Current quota for NVA10 instances (initially 0)
requested_quota INT DEFAULT 0, -- Requested quota for NVA10 instances (user's request to increase quota)
status ENUM('approved', 'pending', 'denied') DEFAULT 'pending', -- Status of the request
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP, -- When the quota was requested
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP, -- When the quota status was last updated
FOREIGN KEY (user_id) REFERENCES users(user_id) ON DELETE CASCADE -- Link to the `users` table; cascade delete if the user is removed
);
-- 8. Email Templates Table
CREATE TABLE email_templates (
template_id CHAR(36) PRIMARY KEY, -- UUID for unique identification (stored as CHAR(36))
template_name VARCHAR(100) NOT NULL, -- Name of the email template (e.g., "vm_ready", "password_reset")
subject VARCHAR(255) NOT NULL, -- Subject of the email template
body TEXT, -- Body of the email template (supports HTML content)
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP -- When the email template was created
);
-- 9. Email Queue Table
CREATE TABLE email_queue (
email_id CHAR(36) PRIMARY KEY, -- UUID for unique identification (stored as CHAR(36))
user_id CHAR(36), -- Foreign key linking to the user who will receive the email
template_id CHAR(36), -- Foreign key linking to the email template used
email_status ENUM('queued', 'sent', 'failed') DEFAULT 'queued', -- Status of the email in the queue
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP, -- When the email was queued
sent_at TIMESTAMP, -- When the email was actually sent (nullable)
FOREIGN KEY (user_id) REFERENCES users(user_id) ON DELETE CASCADE, -- Link to the `users` table; cascade delete if the user is removed
FOREIGN KEY (template_id) REFERENCES email_templates(template_id) ON DELETE CASCADE -- Link to the `email_templates` table; cascade delete if the template is removed
);
The answer is "Yes, but with caveats."
1. RAG is a "Shortcut" to Knowledge (No Training Needed)
2. Fine-Tuning vs. RAG: Different Purposes
| Approach | Best For | Effort Required | Cost |
|---|---|---|---|
| RAG | Quick answers, fact-based queries (e.g., "How does reliable replication work?") | Low (just ingest docs) | 0–0–50/month (vector DB) |
| Fine-Tuning | Changing how the LLM thinks (e.g., "Always write UnrealScript with repnotify by default") | High (needs GPU time + datasets) | 100–100–1000+ |
3. When RAG Isn't Enough
If you later decide to fine-tune DeepSeek-7B, your RAG system already did half the work:
1. Your RAG Database = Ready-Made Training Data
2. Identify Knowledge Gaps
3. Automate Data Labeling
if rag_confidence > 0.8:
add_to_finetuning_dataset(query, retrieved_answer)
Without Fine-Tuning (RAG Only)
With Fine-Tuning + RAG
| Method | Setup Time | Cost/Month | Best For |
|---|---|---|---|
| RAG Only | 1–2 days | ~$20 (vector DB) | Quick docs access |
| RAG + Occasional Fine-Tuning | 1 week | ~$200 (A100 GPU x 10hrs) | Team workflows |
| Full Fine-Tuning | 2+ weeks | $500+ | Studio-grade customization |
For indie/small teams? RAG is 90% of the benefit for 10% of the work. 🚀
To fully leverage your RAG setup and address the limitations of a quantized 7B model, we can introduce a dedicated debugging assistant—a hybrid system combining:
Diagram
Code
Mermaid rendering failed.
| Part | Tech | Purpose |
|---|---|---|
| RAG Core | ChromaDB + UnrealScript docs | Retrieve exact error solutions |
| Debugger LLM | Fine-tuned DeepSeek-7B | Parse stack traces, suggest fixes |
| Runtime Hook | UE3 Script Profiler | Live variable inspection |
Scrape critical UE3 resources:
wget --mirror https://web.archive.org/web/2010/unreal.epicgames.com/docs
Index with embeddings (e.g., BAAI/bge-small):
from sentence_transformers import SentenceTransformer
model = SentenceTransformer('BAAI/bge-small-en-v1.5')
doc_embeddings = model.encode(["Replication variables need repnotify..."])
Dataset: 500+ UE3 error logs + fixes (from forums/Jira)
{
"error": "Accessed None: PlayerController",
"fix": "Add `if (PC != None)` before access"
}
LoRA Fine-Tuning (1x A100, ~$20):
trainer = SFTTrainer(
model=model,
train_dataset=ue3_debug_dataset,
peft_config=LoraConfig(task_type="CAUSAL_LM")
)
Editor Script (Python for UnrealEd):
def on_compile_error(error):
rag_results = query_rag(error)
llm_suggestion = debug_llm.generate(rag_results)
editor.show_annotation(llm_suggestion)
| Task | Vanilla 7B | RAG + Fine-Tuned 7B |
|---|---|---|
| Fix RPC errors | 30% accuracy | 85% accuracy |
| Explain state code | Generic text | UE3-specific examples |
| Latency | 2s | 0.5s (cached RAG) |
This turns your 7B model into a UE3 specialist without expensive hardware! 🚀