# Phase 7 — Backfill customer accounts
If you are bringing an existing CPO operation onto Polaris Express, your customers, their idTags, and their billing state already exist somewhere — a spreadsheet, a legacy CRM, a former billing provider. Phase 7 brings those records into Polaris Express and reconciles them with Lago so that the first charging session after cutover bills correctly.
This phase runs after the platform is healthy (Phases 1–6) and before you point real ChargeBoxes at production.
## Prerequisites

- Phases 1–6 complete: the database, web, SteVe, email worker, and Lago containers are all up and reachable from the `web` container.
- Lago has at least one plan configured that matches the tariff you intend to bill on.
- A CSV export of your existing customers with, at minimum: email, display name, external customer ID (your prior system's stable key), and the idTag string they use at the ChargeBox.
- Admin console access with the `superadmin` role.
### 1. Prepare the input CSV

Place the export at `web/data/backfill/customers.csv` on the host running the `web` container. The expected columns are:

```
email,display_name,external_id,id_tag,plan_code,start_date
```

- `email` becomes the login identity (used by BetterAuth and the magic link flow).
- `external_id` is preserved as the Lago `external_customer_id`. Keep it stable; it is the join key for everything downstream.
- `plan_code` must match an existing Lago plan code exactly.
- `start_date` is the subscription anchor; back-date it if you want pro-rated billing to reflect the original signup.
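For reference, a row with these columns might look like the following. The values are purely illustrative (the plan code and idTag below are made up, not defaults of any system):

```csv
email,display_name,external_id,id_tag,plan_code,start_date
jane@example.com,Jane Driver,crm-10042,04A1B2C3D4E5,standard_kwh,2023-06-01
```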
### 2. Run a dry-run

The backfill command supports `--dry-run`, which validates every row, checks for collisions against the current database and Lago, and prints a summary without writing anything.

```sh
docker compose exec web pnpm backfill:customers \
  --input /data/backfill/customers.csv \
  --dry-run
```

Read the summary carefully. The exit code is non-zero if any row would fail. Common dry-run failures:
- **Email already exists.** The customer was partially imported in a prior attempt, or someone signed up directly. Resolve by either removing the row or re-running with `--skip-existing`.
- **Plan code not found.** Fix the CSV or create the plan in Lago.
- **Duplicate idTag.** Two rows claim the same idTag; Polaris Express enforces idTag uniqueness.
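Duplicates inside the file itself can be caught before the dry-run ever runs. A minimal pre-flight sketch, assuming the CSV header shown above (the `find_duplicates` helper is illustrative, not part of the backfill tooling):

```python
import csv
from collections import Counter

def find_duplicates(path: str) -> dict[str, list[str]]:
    """Return email and id_tag values that appear on more than one row."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    dupes: dict[str, list[str]] = {}
    for col in ("email", "id_tag"):
        counts = Counter(
            # emails compare case-insensitively; idTags are kept exact
            row[col].strip().lower() if col == "email" else row[col].strip()
            for row in rows
        )
        repeated = [value for value, n in counts.items() if n > 1]
        if repeated:
            dupes[col] = repeated
    return dupes
```

Running this over `customers.csv` before the dry-run turns a late, mid-summary failure into an immediate, local one.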
### 3. Execute the backfill

When the dry-run is clean:

```sh
docker compose exec web pnpm backfill:customers \
  --input /data/backfill/customers.csv \
  --skip-existing
```

The command runs in batches and writes a row-by-row log to `web/data/backfill/run-<timestamp>.log`. For each row it:

- Creates the user record and a default user mapping for the idTag.
- Creates the Lago customer using `external_id` as `external_customer_id`.
- Creates a Lago subscription against `plan_code` with `subscription_at = start_date`.
- Emits an audit log entry tagged `backfill:<run-id>`.
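The two Lago writes in that list correspond to Lago's REST API (`POST /api/v1/customers` and `POST /api/v1/subscriptions`). A sketch of the request bodies one CSV row produces, assuming the column names listed earlier; the `build_lago_payloads` helper is illustrative, not the actual backfill implementation:

```python
def build_lago_payloads(row: dict) -> tuple[dict, dict]:
    """Build Lago customer and subscription request bodies for one CSV row."""
    customer = {
        "customer": {
            "external_id": row["external_id"],  # becomes external_customer_id downstream
            "name": row["display_name"],
            "email": row["email"],
        }
    }
    subscription = {
        "subscription": {
            "external_customer_id": row["external_id"],  # the stable join key
            "plan_code": row["plan_code"],
            "subscription_at": row["start_date"],  # back-dated anchor for pro-rating
        }
    }
    return customer, subscription
```

This is why `external_id` must stay stable: it is the only field linking the user record, the idTag mapping, and both Lago objects.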
### 4. Reconcile

After the run completes, reconcile counts:

```sh
docker compose exec web pnpm backfill:reconcile \
  --run-id <run-id>
```

This compares CSV row count, users created, mappings created, Lago customers created, and Lago subscriptions active. Any divergence is flagged with the offending `external_id`.
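The comparison itself is simple: every per-system count should equal the number of rows actually imported. A sketch of that check (hypothetical helper, not the real reconcile command; note that rows skipped via `--skip-existing` reduce the expected count):

```python
def reconcile_counts(expected_rows: int, observed: dict[str, int]) -> list[str]:
    """Return the names of metrics that diverge from the expected row count."""
    return [name for name, count in observed.items() if count != expected_rows]
```

An empty result means every system agrees; anything else names the metric to investigate row by row.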
### 5. Send welcome emails (optional)

When you are ready to invite customers to log in:

```sh
docker compose exec web pnpm backfill:welcome \
  --run-id <run-id> \
  --batch-size 50 \
  --delay-ms 2000
```

This sends a magic link to each imported email, rate-limited so you do not trip your SMTP provider's burst limit.
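The batching behaviour the `--batch-size` and `--delay-ms` flags describe can be sketched as follows (a minimal illustration of the rate-limiting pattern, not the actual welcome script):

```python
import time
from typing import Callable, Iterable

def send_in_batches(
    emails: Iterable[str],
    send: Callable[[str], None],
    batch_size: int = 50,
    delay_ms: int = 2000,
) -> int:
    """Send to every address, pausing delay_ms between batches. Returns count sent."""
    sent = 0
    batch: list[str] = []
    for email in emails:
        batch.append(email)
        if len(batch) == batch_size:
            for addr in batch:
                send(addr)
            sent += len(batch)
            batch = []
            time.sleep(delay_ms / 1000)  # pause between batches, not per email
    for addr in batch:  # flush the trailing partial batch
        send(addr)
    sent += len(batch)
    return sent
```

With the defaults shown in the command above, 50 emails go out roughly every two seconds, which stays under most SMTP burst limits.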
## Configure environment variables

The backfill scripts read the same environment as the `web` container. The relevant variables:

| Name | Default | Required | Source |
|---|---|---|---|
| `DATABASE_URL` | — | yes | `web/.env` |
| `LAGO_API_URL` | `http://lago-api:3000` | yes | `web/.env` |
| `LAGO_API_KEY` | — | yes | `web/.env` |
| `BACKFILL_BATCH_SIZE` | `25` | no | `web/.env` |
| `BACKFILL_LOG_DIR` | `/data/backfill` | no | `web/.env` |
| `EMAIL_FROM` | — | only for welcome step | `web/.env` |
If `BACKFILL_BATCH_SIZE` is omitted, the script uses 25 rows per transaction, which is conservative and safe to leave as-is for runs up to ~10,000 customers.
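For a larger run, you can raise the batch size in `web/.env` before executing the backfill (the value below is an example, not a recommendation for every deployment):

```ini
# web/.env — larger transactions for a big import
BACKFILL_BATCH_SIZE=100
```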
## Verify

After the backfill, verify three things:

1. **User count in the admin console.** Navigate to Admin → Customers and confirm the total matches the CSV row count (minus any `--skip-existing` rows).
2. **Lago customer count.** From the Lago admin UI, the active customer count should match. Or via API:

   ```sh
   curl -s -H "Authorization: Bearer $LAGO_API_KEY" \
     "$LAGO_API_URL/api/v1/customers?per_page=1" | jq '.meta.total_count'
   ```

3. **End-to-end with a test idTag.** Pick one imported customer, send a StartTransaction to SteVe with their idTag, and confirm:
   - The transaction appears under that user in the admin console.
   - A meter value event reaches Lago against the correct Lago customer.
## If something goes wrong

**Partial run, want to resume.** Re-run the same command with `--skip-existing`. Rows already imported are detected by `external_id` and skipped without error.

**A Lago subscription was created with the wrong plan.** Don't try to fix it from the backfill script. Cancel the subscription in Lago, then re-run the row using a single-row CSV.

**You need to undo the whole run.** Each backfill run is tagged in the audit log. To reverse it:

```sh
docker compose exec web pnpm backfill:revert --run-id <run-id>
```

The revert is itself audit-logged and produces a manifest at `web/data/backfill/revert-<run-id>.log`.

**Welcome emails bouncing.** Check the email worker logs (Phase 4) and confirm `EMAIL_FROM` is a verified sender at your SMTP provider. The welcome step is safe to re-run; it skips users who have already logged in at least once.
## Next phase

Once backfill is verified and reconciled, proceed to Phase 8 — Cutover and monitoring to point production ChargeBoxes at the new SteVe endpoint.