Install & Integrate
Get the local evaluation demo running in minutes: download → unzip → docker compose up → open the UI → run the smoke test.
iRNDOM sits between your app and external vendors, LLMs, and SaaS tools so raw identifiers never leave your trust boundary.
Download Install Bundle
View Live Demo
Demo UI: http://localhost:7071/demo
Demo API: http://localhost:7070
What you download
✓ Docker launcher + docs
A small bundle with docker-compose.yml, smoke tests, and step-by-step instructions.
✓ Prebuilt images
The demo runs from container images so teams can validate behavior without receiving source.
✓ Policy + audit proof
Exportable artifacts to show “what rules were enforced” and “what happened”.
Expected file name: irndom_install_package.zip inside irondom/static/downloads/.
Quick start (6 steps)
1. Download the bundle
Click “Download Install Bundle” above.
2. Unzip to a folder
Example: C:\irndom_install (Windows) or ~/irndom_install (Mac/Linux).
3. Start Docker
Make sure Docker Desktop (or Docker Engine) is running.
4. Run the demo
In the unzipped folder, run docker compose up -d.
5. Open the UI
Go to http://localhost:7071/demo.
6. Run the smoke test
Use the included script to confirm endpoints + policy export work.
Want the “tell me it’s working” check? After docker compose up -d, docker compose ps should show the services as Up.
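As a concrete version of that check, here is a small probe against the demo UI URL from the docs above. This is a sketch for a Unix-like shell (Git Bash on Windows); on plain PowerShell, Invoke-WebRequest does the same job.

```shell
#!/bin/sh
# Probe the demo UI after `docker compose up -d`.
# The URL is the one documented above; the 3-second timeout is arbitrary.
if curl -fsS --max-time 3 http://localhost:7071/demo >/dev/null 2>&1; then
  status="UI reachable at http://localhost:7071/demo"
else
  status="UI not reachable yet (is the stack up?)"
fi
echo "$status"
```

If the second message appears, give the containers a few seconds to start and re-run docker compose ps.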
Copy/paste commands
These commands must be run inside your unzipped install folder (customer install folder), not in the d22 repo folder.
# Docker (recommended eval)

# 1) unzip the downloaded bundle somewhere, e.g. C:\irndom_install
cd "C:\path\to\irndom_install"

# 2) start iRNDOM demo (UI + API)
docker compose pull
docker compose up -d
docker compose ps

# 3) open the local demo UI
start http://localhost:7071/demo

# 4) run smoke tests
powershell -ExecutionPolicy Bypass -File .\smoke_test.ps1

# 5) optional: enterprise proof run (customer POV)
# Runs protect -> vendor LLM -> internal restore + attacker blocks,
# prints success rate + latency + audit event counts (screenshot this output).
# Requires internet + LLM_API_KEY (any OpenAI-compatible endpoint).
# Only Safe Labels™ are sent to the vendor LLM.

# Vendor LLM key (any OpenAI-compatible endpoint)
$env:LLM_API_KEY = "YOUR_LLM_KEY_HERE"
# Optional:
# $env:LLM_BASE_URL = "https://api.openai.com/v1/chat/completions"
# $env:LLM_MODEL = "gpt-4o-mini"
powershell -ExecutionPolicy Bypass -File .\pressproof.ps1 -N 5

# stop when done
docker compose down
Ports used by the local demo:
7071 (site/UI), 7070 (demo API), 7001 (sidecar), 8080 (upstream).
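If one of these host ports is already taken, a Compose override file can remap it without editing the bundle. The sketch below uses a hypothetical service name (demo-ui) and internal port; the real names and ports are in the bundle's docker-compose.yml.

```yaml
# docker-compose.override.yml (placed next to docker-compose.yml)
# "demo-ui" and the container-side port 7071 are assumptions — check the
# bundle's docker-compose.yml for the actual service name and port.
services:
  demo-ui:
    ports:
      - "7081:7071"   # serve the UI on host port 7081 instead of 7071
```

Docker Compose merges an override file named docker-compose.override.yml automatically on the next docker compose up -d.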
Troubleshooting (common fixes)
• Docker isn’t running
Start Docker Desktop, then retry docker compose up -d.
• Port already in use
Stop conflicting services or change ports in docker-compose.yml, then restart.
• UI loads but actions fail
Run the smoke test script and check docker compose logs -f for errors.
• Policy export returns 401
Paste your X-API-Key below and click “Test key”.
If you want, share the output of docker compose ps plus the last 30 lines of docker compose logs and we can diagnose quickly.
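Those two outputs can be collected into a single file to share. This is a sketch for a Unix-like shell (Git Bash on Windows); the file name diag.txt is arbitrary.

```shell
#!/bin/sh
# Gather the diagnostics mentioned above into one file.
# Run this from the unzipped install folder so Compose finds the project.
out=diag.txt
{
  docker compose ps
  docker compose logs --tail=30
} > "$out" 2>&1 || echo "docker compose failed or is not installed" >> "$out"
echo "wrote $out"
```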
License / evaluation note
This bundle is intended for evaluation. Use synthetic or redacted data. Do not redistribute outside your organization.
✓ No source code included
Only a launcher + documentation. The app runs as containers.
✓ Confidential evaluation
Do not publish or share screenshots/logs containing secrets or customer data.
✓ Artifacts are proof
Policy export + audit trail are the “evidence package” you can show in demos.
See: LICENSE and README.md at the project root (and included in the install bundle).
Set your API key
Used for exporting policy artifacts from the sidecar.
Saved locally in your browser (localStorage). Not sent anywhere until you click download.
Download policy artifacts
Exports from the sidecar on https://irndom-sidecar-v7fm5k7qva-uw.a.run.app.
If these downloads fail, confirm the sidecar is reachable and your API key is correct.
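The same export can be exercised from a script. The sketch below assumes a /export/policy route, which is NOT confirmed by this page: check the bundle docs for the real path, and fill in your own key.

```shell
#!/bin/sh
# Sketch only: fetch a policy export with the X-API-Key header.
# The /export/policy path is an assumption; the sidecar host is the one
# documented above. Replace YOUR_API_KEY_HERE with your key.
SIDECAR="https://irndom-sidecar-v7fm5k7qva-uw.a.run.app"
API_KEY="YOUR_API_KEY_HERE"
if curl -fsS --max-time 10 -H "X-API-Key: $API_KEY" \
     "$SIDECAR/export/policy" -o policy_export.json 2>/dev/null; then
  result="saved policy_export.json"
else
  result="export failed: check sidecar reachability and API key"
fi
echo "$result"
```

A failure here with a valid key usually means the route differs or the sidecar is unreachable; the "Test key" button above is the quickest cross-check.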