Logs
Build and deploy logs, runtime logs, and Worker Node (server) logs in dFlow.
Written By Zoro
Last updated 3 days ago
dFlow exposes different log streams depending on whether you are debugging a Deployment, a running Service, or the infrastructure around the Worker Node.
Build and deploy logs
Purpose: Trace image builds, release steps, and failures during Deploy or Redeploy.
When to use
- Builds fail (dependencies, Dockerfile, buildpack or Railpack errors).
- A Deployment row never reaches a success status.
Access
- Open the Service → Deployments tab.
- Pick a deployment row.
- Choose View Logs to open the terminal view for that attempt.
The Deployments heading copy reads: deployment history and status.
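When a build fails, the quickest triage is to copy the output from the View Logs terminal and scan it for the first error. A minimal sketch with standard shell tools; the log content below is an invented example, not real dFlow output:

```shell
# Simulated build log; in practice, paste the text copied from the
# "View Logs" terminal of the failed deployment (hypothetical content).
cat > build.log <<'EOF'
Step 3/7 : RUN npm ci
npm ERR! code ERESOLVE
npm ERR! ERESOLVE unable to resolve dependency tree
Step 7/7 : CMD ["node", "server.js"]
EOF

# Show the first error line plus one line of context after it.
grep -m1 -A1 'ERR!' build.log
```

The first error in a build log is usually the root cause; later errors tend to be cascade failures from the same missing dependency or broken step.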
Runtime logs
Purpose: Stream application output after a Service is running (requests, errors, background work).
When to use
- Diagnose 5xx or application exceptions in production.
- Inspect workers, queues, or long-running jobs.
Access
- Open the Service → Logs tab.
- Watch the live stream in the embedded terminal. The heading reads Logs, with the description: live logs from this service.
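For the 5xx case above, it is often faster to filter a captured slice of the stream than to eyeball it. A sketch using generic access-log lines (invented sample data; real lines depend on what your application logs):

```shell
# Simulated runtime log lines (hypothetical; real output comes from the
# Logs tab, captured into a file for offline filtering).
cat > runtime.log <<'EOF'
10:01:02 GET /health 200 3ms
10:01:05 GET /api/orders 500 120ms
10:01:06 POST /api/orders 502 95ms
10:01:09 GET /health 200 2ms
EOF

# Show the 5xx responses, then count them.
grep -E ' 5[0-9]{2} ' runtime.log
grep -cE ' 5[0-9]{2} ' runtime.log
```

The same pattern works for worker or queue output: capture a window of the stream, then grep for the exception class or job ID you are chasing.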
Worker Node (server) logs
Purpose: Trace infrastructure actions on the node, such as installs, plugin changes, or platform-driven operations outside a single app process.
When to use
- Verify Dokku or monitoring installs completed.
- Audit destructive actions at the host layer.
Access
Host-level streams are exposed wherever your organisation surfaces Worker Node or Environment diagnostics (labels vary by release). If you cannot find them, check the Deployments and Logs tabs on the affected Service first, then see Compute and worker node issues under Troubleshooting in the sidebar.
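If host-level streams are not surfaced in the UI and you have SSH access to the Worker Node, standard Linux and Dokku tooling covers the same ground. These are generic systemd/Dokku commands, not dFlow-specific ones, and assume a Dokku-backed node:

```shell
# Recent Docker daemon activity from the systemd journal.
journalctl -u docker --since "1 hour ago" --no-pager

# Dokku's own event trail, if event logging has been enabled
# (requires `dokku events:on` to have been run once on the node).
dokku events

# Kernel and system messages around an install failure or an OOM kill.
dmesg | tail -n 50
```

Treat these as read-only diagnostics: changing node state by hand can drift it away from what the platform expects.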
Screenshots
Recommended captures (per the documentation screenshot policy):
- Deployments list with View Logs on a row.
- Logs tab showing streaming runtime output.