Stop Polluting Your Local Machine

Your laptop is for thinking. Builds, tests, Docker, and experiments belong on a disposable server.

Your local machine is a mess. Not because you're careless — because you use it.

Three versions of Node. A Postgres container running since February. Orphaned node_modules from projects you forgot about. Docker Desktop eating 4 GB of RAM while you're in a meeting. Half-finished experiments in ~/tmp that you'll "clean up later."

Every project you touch leaves residue. And that residue causes bugs.
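The residue is measurable. A quick audit with standard `find` and `du` lists every `node_modules` still sitting on disk, largest first (`~/code` is an assumption; point it at wherever your projects actually live):

```shell
# List every node_modules directory under your projects root, largest first.
# ~/code is a placeholder; adjust to your setup.
find ~/code -type d -name node_modules -prune -print 2>/dev/null \
  | xargs -r du -sh 2>/dev/null \
  | sort -rh
```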

The "works on my machine" problem is a pollution problem

When a test passes locally but fails in CI, it's almost always because your environment has something CI doesn't:

  • A cached dependency that masks a missing declaration in package.json
  • An environment variable set months ago that you forgot about
  • A database with test data from a previous run
  • A global tool at a version that doesn't match the project's expectation

Your machine has state. Lots of it. Some of it helps. Some of it lies.
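The stale-environment-variable case is easy to demonstrate with nothing but a shell. `env -i` starts a child process with an empty environment, which is a rough approximation of what a fresh server or CI sees:

```shell
# A variable exported months ago quietly follows you everywhere.
export NODE_ENV=development

# Inherited by every child process you run:
sh -c 'echo "local: NODE_ENV=$NODE_ENV"'
# → local: NODE_ENV=development

# A clean environment, like CI or a fresh server:
env -i sh -c 'echo "clean: NODE_ENV=${NODE_ENV:-unset}"'
# → clean: NODE_ENV=unset
```

Any code path that branches on `NODE_ENV` behaves differently in those two worlds, and only the second one matches CI.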

The fix: don't run it locally

gibil create --name my-task --repo github.com/you/project --ttl 30
gibil run my-task "cd /root/project && pnpm install && pnpm test" --json
gibil destroy my-task

The test ran on a fresh Ubuntu server. No cached deps. No stale env vars. No leftover state. If it passes here, it passes in CI. If it fails, the failure is real.

Your laptop stays clean. The server does the dirty work and disappears.

What belongs on a remote server

Builds. Especially heavy ones. A Rust cargo build --release pegs your CPU for 10 minutes and makes your fan scream. On a cpx41 (8 vCPU), it runs faster and your laptop stays cool.

gibil create --name build --server-type cpx41 --repo github.com/you/rust-project --ttl 30
gibil run build "cd /root/project && cargo build --release" --json

Tests. The full suite. You run 3-4 tests locally while iterating, then run the full 2,000-test suite on a fresh server before pushing.

Docker services. Postgres, Redis, Elasticsearch — they run on the server, not your laptop. Define them in .gibil.yml and they start automatically on boot.

# .gibil.yml
image: node:20
services:
  - name: postgres
    image: postgres:16
    port: 5432
    env:
      POSTGRES_PASSWORD: dev

Experiments. Trying a new framework? Forge a disposable server. If it breaks, burn it. Your laptop never knew it happened.

Agent work. Let your AI agent work on a remote server via MCP. Any MCP-compatible agent gets root access, Docker, the full repo — and it physically cannot break your local environment.

What stays local

Your IDE. You think locally. Browse code, read diffs, write prompts.

Quick iterations. Change a line, run one test, check the output. The edit-run cycle should be instant.

Git operations. Commit, branch, rebase. Fast and local by nature.

The split is natural: thinking on your machine, execution on a server.

The agent angle

This split matters more with AI agents. When your agent runs pnpm test on your machine, it's using your CPU, your memory, your environment. If the test needs Docker, the agent needs Docker on your laptop.

On a gibil server via MCP:

{
  "mcpServers": {
    "gibil": {
      "command": "gibil",
      "args": ["mcp", "my-task"]
    }
  }
}

Your agent gets its own server. Root access, Docker, SSH, a public IP. It can install weird packages, run destructive commands, mess up the filesystem. The server is disposable. Your laptop is not.

The lifecycle

gibil create --name task --repo github.com/you/project --ttl 30
# ... work happens on the server ...
gibil destroy task

The server exists for exactly as long as the task. No orphaned servers, no forgotten containers, no "what's using port 5432?" TTL burns it automatically if you forget.
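The same lifecycle can be wrapped in a guard so the server is destroyed on every exit path, even when the command fails. A sketch, with `gibil` stubbed as a shell function purely so the cleanup flow is visible on its own; in real use, delete the stub and let the actual CLI from the examples above do the work:

```shell
#!/bin/sh
set -eu

# Stub for illustration only; remove this line to use the real gibil CLI.
gibil() { echo "gibil $*"; }

NAME="task-$$"
gibil create --name "$NAME" --repo github.com/you/project --ttl 30

# Destroy on every exit path, pass or fail; the 30-minute TTL is the
# backstop if this shell never gets the chance.
trap 'gibil destroy "$NAME"' EXIT

gibil run "$NAME" "cd /root/project && pnpm install && pnpm test" --json
```

The `trap … EXIT` is the point: the destroy runs whether the tests pass, fail, or the script is interrupted, and the TTL catches anything the trap can't.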

Your local machine stays what it should be: a clean, fast thinking environment. Everything else happens somewhere disposable.