Session Export & Import
Session export lets you get your captured traffic out of Ghost and into other tools — for sharing with teammates, archiving test sessions, importing into Postman for API testing, or generating standalone reports. Ghost supports five export formats, two import formats, drag-and-drop importing, and per-flow code generation in five formats (cURL, fetch, Python, HTTPie, and Markdown).
Think of it like saving a recording: you captured a testing session with dozens or hundreds of network requests, and now you want to share it, replay it in another tool, or compare it against a future session. Export packages everything into a single file, and import brings external captures into Ghost for analysis.
Export Formats
Ghost can export an entire session in five formats, each designed for a different workflow.
HAR 1.2 (HTTP Archive)
The industry-standard format for recording HTTP traffic. HAR files are supported by virtually every web development tool — Chrome DevTools, Firefox, Postman, Charles, Fiddler, and dozens of others. When you export as HAR, Ghost produces a file that any of these tools can open.
What’s inside a HAR file:
| Section | What It Contains | Why It Matters |
|---|---|---|
| Entries | One entry per HTTP request/response pair — every flow in the session | Each entry is a complete snapshot of what was sent and what came back |
| Timings | DNS lookup, TCP connection, TLS handshake, time to first byte (TTFB), and data transfer — all in milliseconds | Shows exactly where time was spent. If DNS took 200ms, that’s a DNS problem, not a server problem. |
| Headers | All request and response headers, flattened (multi-value headers become separate entries) | Lets you inspect authentication tokens, content types, caching directives, CORS headers — everything |
| Bodies | Text bodies as-is, binary content (images, protobuf, etc.) as base64-encoded strings | You can see the exact JSON payloads, HTML responses, or binary data that was transferred |
| Cookies | Parsed from Cookie and Set-Cookie headers into name/value pairs | Shows authentication cookies, tracking cookies, session identifiers |
| Query strings | Parsed from the URL into individual parameters | Shows ?page=2&sort=name as separate entries you can read easily |
| Redirect URL | The Location header value when the server redirected | Shows where 301/302 redirects point to |
Technical details:
- HAR version `1.2`, creator `Ghost 1.0`
- When Ghost has timing data: all six timing phases are included (DNS, connect, SSL, send, wait/TTFB, receive/transfer), converted from microseconds to milliseconds
- When timing data is unavailable (e.g., imported flows): the `wait` field contains the total duration and all others are set to `-1` (the HAR standard for “not available”)
- Bodies are encoded using Ghost’s text content detection — any recognized text type (`text/*`, `application/json`, `application/xml`, `application/javascript`, `application/graphql`, types containing `+json` or `+xml`, and others) is stored as plain text. Everything else is base64-encoded.
- Responses without a status code get `Status: 0` with `StatusText: "No Response"` (or the error message if the request failed)
- All array fields default to empty arrays, never `null` — this ensures compatibility with strict HAR parsers
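The text-vs-binary decision above can be sketched as a small predicate. This is a hypothetical `isTextContentType` helper written against the type list in the text, not Ghost's actual code:

```go
package main

import "strings"

// isTextContentType sketches the detection rule described above.
// Hypothetical helper: Ghost's real implementation may cover more types.
func isTextContentType(mime string) bool {
	// Drop parameters like "; charset=utf-8" and normalize case.
	mime = strings.ToLower(strings.TrimSpace(strings.SplitN(mime, ";", 2)[0]))
	if strings.HasPrefix(mime, "text/") {
		return true
	}
	switch mime {
	case "application/json", "application/xml",
		"application/javascript", "application/graphql":
		return true
	}
	// Structured-syntax suffixes such as application/vnd.api+json.
	return strings.Contains(mime, "+json") || strings.Contains(mime, "+xml")
}
```

Anything this predicate rejects would be stored base64-encoded in the HAR.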
JSON (Ghost Native)
Exports the raw Ghost flow array exactly as it exists in the database, serialized with `json.MarshalIndent` (human-readable, 2-space indentation). This is the only format that preserves all Ghost-specific fields — tags, metadata, source, annotations, device information, and everything else that the standardized formats strip out.
When to use JSON over HAR: When you plan to re-import the data into Ghost later (for comparison or archival), or when you need Ghost-specific metadata like tags, flow source (proxy, replay, import, script), or device attribution.
CSV (Spreadsheet)
A flat tabular format designed for analysis in Excel, Google Sheets, or any data tool. Each flow becomes one row with 12 columns:
| Column | Example | Description |
|---|---|---|
| `id` | `01HWRFQP3K5M9TGWQX7Z` | The flow’s unique ULID identifier |
| `method` | `POST` | HTTP method (GET, POST, PUT, etc.) |
| `url` | `https://api.example.com/v1/cart` | Full request URL |
| `host` | `api.example.com` | The hostname extracted from the URL |
| `path` | `/v1/cart` | The URL path without query parameters |
| `status` | `200` | HTTP response status code (0 if no response) |
| `duration_ms` | `142.3` | Request duration in milliseconds (1 decimal place) |
| `request_size` | `1024` | Request body size in bytes |
| `response_size` | `8192` | Response body size in bytes |
| `content_type` | `application/json` | Response Content-Type header |
| `tags` | `api;authenticated;slow` | All tags joined with semicolons |
| `error` | `connection refused` | Error message (empty if successful) |
Flows with no request data are skipped entirely (they have nothing meaningful to show in a table).
CSV injection protection: Spreadsheet applications like Excel can execute formulas embedded in CSV data — a value starting with `=SUM(...)` would be executed as a formula. This is a known security risk called CSV injection. Ghost’s `csvQuote()` function prevents this by checking whether a cell value starts with `=`, `+`, `-`, `@`, tab (`\t`), or carriage return (`\r`), and prepending a single-quote character (`'`) that tells the spreadsheet to treat the value as plain text. Additionally, values containing commas, quotes, or newlines are wrapped in double quotes with internal quotes escaped.
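A minimal re-creation of that two-layer guard (the function name matches the text, but the body is a sketch, not Ghost's exact implementation):

```go
package main

import "strings"

// csvQuote neutralizes formula prefixes, then applies standard CSV quoting.
// Sketch of the documented behavior; the real function may differ in detail.
func csvQuote(v string) string {
	// A leading =, +, -, @, tab, or CR could be executed as a formula by
	// Excel or Sheets; prefix with ' so it is treated as plain text.
	if len(v) > 0 && strings.ContainsRune("=+-@\t\r", rune(v[0])) {
		v = "'" + v
	}
	// Standard CSV escaping: wrap in quotes, double any embedded quotes.
	if strings.ContainsAny(v, ",\"\n\r") {
		v = "\"" + strings.ReplaceAll(v, "\"", "\"\"") + "\""
	}
	return v
}
```

Note that the formula check runs first, so a quoted value like `"'=SUM(...)"` survives the second layer intact.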
Postman Collection (v2.1.0)
Exports as a Postman Collection that you can import directly into Postman for API testing. Each flow becomes a Postman request item.
What’s included:
- Collection name matches the session name
- Each item is named `"{method} {path}"` (e.g., “POST /api/v1/cart/add”)
- Request method and full URL (as a raw string)
- Headers — excluding `host`, `content-length`, and `connection` (Postman generates these automatically)
- Request body as raw text (only if the body is non-empty and valid UTF-8 — binary bodies are silently omitted because Postman’s raw body mode can’t handle them)
Schema: https://schema.getpostman.com/json/collection/v2.1.0/collection.json — this is the current Postman collection format, compatible with both Postman desktop and Postman web.
Note: The URL is stored as a raw string only — not decomposed into protocol, host, path, and query components. Postman will parse the URL automatically on import.
HTML Report
A self-contained HTML report with embedded CSS and JavaScript — no external dependencies, works offline, can be emailed or shared as a single file.
What the report contains:
- Statistics summary: total flows, total data transferred, error count, method distribution (how many GET, POST, etc.), status code distribution (2xx/3xx/4xx/5xx)
- Searchable traffic table: columns for Time, Method, Status, Host, Path, Duration, Size, and Tags — with a search filter and sortable columns
- Styled consistently: dark theme with Ghost’s Space Grotesk and JetBrains Mono fonts, color-coded method badges (GET=cyan, POST=purple, PUT=warning, DELETE=red, PATCH=pink)
- Path truncation: paths longer than 60 characters are truncated (UTF-8 safe) with an ellipsis
- All values are HTML-escaped to prevent XSS
The report export fetches up to 10,000 flows (compared to 100,000 for other formats) because each flow generates a full HTML table row, which would make very large reports unwieldy.
How Export Works
The complete export process, step by step:
- You initiate an export from the UI — either from the File menu, the session context menu, or the settings panel
- Ghost’s API fetches all flows from the database in batches of 5,000 — this prevents loading hundreds of thousands of flows into memory at once, which could crash the application
- There’s a hard cap of 100,000 flows per export — if the session has more flows than this, Ghost logs a warning and exports the first 100K. This cap exists because generating a 100K-flow HAR file already produces a large JSON document
- Between each batch, Ghost checks whether the request was cancelled (you navigated away or closed the dialog) — if so, it stops fetching and returns what it has
- The raw flows are transformed into the requested format (HAR entries, CSV rows, Postman items, etc.)
- If the resulting file is 20 MB or smaller, Ghost automatically saves a copy as an “artifact” in the database — so you can download it again later from the artifacts panel without re-generating it. The artifact is tagged with the session name, flow count, and human-readable file size
- The file is streamed to the frontend with appropriate `Content-Disposition` and `Content-Type` headers
Desktop vs. Browser Export
On the desktop app (Tauri), Ghost opens the operating system’s native save dialog — you choose where to save the file, and Ghost writes it using Tauri’s file system plugin. The default filename follows the pattern `{sessionName}_{timestamp}.{ext}`. For example: `API_Tests_2025-03-08-143022.har` or `Mobile_QA_2025-03-08-143022_postman_collection.json` (the `_postman_collection` suffix is added for Postman exports).
Special characters in the session name are replaced with underscores for filesystem safety (`/`, `\`, `:`, `*`, etc. all become `_`). If the name consists entirely of special characters, it falls back to `session`.
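That cleanup can be sketched as follows. The exact character set Ghost replaces is an assumption beyond the examples given in the text:

```go
package main

import (
	"regexp"
	"strings"
)

// unsafeChars covers the examples mentioned above plus whitespace;
// the full set Ghost actually replaces is an assumption.
var unsafeChars = regexp.MustCompile(`[/\\:*?"<>|\s]`)

// sanitizeSessionName replaces unsafe characters with underscores,
// falling back to "session" when nothing usable remains.
func sanitizeSessionName(name string) string {
	s := unsafeChars.ReplaceAllString(name, "_")
	if strings.Trim(s, "_") == "" {
		return "session" // name was entirely special characters
	}
	return s
}
```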
In a browser context (no Tauri), Ghost falls back to a standard blob download — creates an object URL, simulates a link click, and cleans up the URL.
Import
Ghost can import external traffic captures into any session, letting you analyze captures from other tools or compare them against live traffic.
HAR Import
`POST /api/v1/sessions/{id}/import/har`
Accepts a HAR 1.2 file (either as a raw JSON body or as a multipart file upload with field name "file") and converts each HAR entry into a Ghost flow.
Limits and safety:
- Maximum file size: 256 MB — enforced by reading at most 256 MB + 1 byte and rejecting if data remains
- Session must exist (404 if not found)
- Empty data returns 400
- Zero entries returns success with `imported: 0`
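The size check works by reading one byte past the limit: if that extra byte arrives, the input is too big, and nothing beyond the cap is ever buffered. A sketch with a parameterized limit (helper names are illustrative):

```go
package main

import (
	"bytes"
	"errors"
	"io"
)

// readCapped reads at most limit bytes from r, rejecting input that
// exceeds the limit. Sketch of the documented check.
func readCapped(r io.Reader, limit int64) ([]byte, error) {
	// Read limit+1 bytes: if we actually get more than limit,
	// the input was too large.
	data, err := io.ReadAll(io.LimitReader(r, limit+1))
	if err != nil {
		return nil, err
	}
	if int64(len(data)) > limit {
		return nil, errors.New("file exceeds size limit")
	}
	return data, nil
}

// readCappedBytes is a convenience wrapper for in-memory data.
func readCappedBytes(data []byte, limit int64) ([]byte, error) {
	return readCapped(bytes.NewReader(data), limit)
}
```

In Ghost the limit would be 256 MB; the parameter just makes the sketch easy to exercise.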
How each entry is converted:
| HAR Field | Ghost Flow Field | Conversion Details |
|---|---|---|
| `startedDateTime` | `StartedAt` | Tries three date formats in order: RFC3339Nano, RFC3339, ISO 8601 with milliseconds (`2006-01-02T15:04:05.000Z`). Falls back to the current time if none parse. |
| `time` | `Duration` | Milliseconds to Go `time.Duration` |
| `request.method` | `Request.Method` | As-is. Falls back to `"GET"` if missing. |
| `request.url` | `Request.URL` + `Host` + `Path` | URL parsed to extract host and path |
| `request.headers` | `Request.Headers` | Converted to Go’s `http.Header` map |
| `request.postData.text` | `Request.Body` | If the MIME type is non-text, tries base64 decode first; falls back to raw text |
| `response.status` | `Response.StatusCode` | Response only created if status > 0 |
| `response.content.text` | `Response.Body` | If encoding is `"base64"` (case-insensitive), decoded; falls back to raw text on decode error |
| `timings` | `Timings` | DNS, Connect, SSL, Wait (TTFB), Receive (Transfer) — milliseconds to `Duration`. Returns nil if all values ≤ 0. |
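The timestamp fallback chain can be sketched as a small helper (the name `parseHARTime` is hypothetical):

```go
package main

import "time"

// parseHARTime tries the three layouts described above, in order,
// falling back to the current time. Sketch of the documented behavior.
func parseHARTime(s string) time.Time {
	for _, layout := range []string{
		time.RFC3339Nano,
		time.RFC3339,
		"2006-01-02T15:04:05.000Z", // ISO 8601 with milliseconds
	} {
		if t, err := time.Parse(layout, s); err == nil {
			return t
		}
	}
	return time.Now() // unparseable: stamp with the import time
}
```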
Every imported flow gets:
- A fresh ULID identifier (the original IDs from the HAR file are not preserved)
- Source set to `"import"` (so you can filter imported flows with `source:import` in GQL)
- Metadata `{"import_source": "har"}`
- HTTP version defaults to `"HTTP/1.1"` if not specified in the HAR
Error handling: Each entry is processed independently. If one entry fails to parse or insert into the database, it’s counted as “skipped” and the error message is recorded — but processing continues. Invalid entries don’t fail the entire import.
Real-time updates: Each successfully imported flow triggers a flow.created WebSocket event, so the traffic list updates live as the import progresses.
Response:
```json
{ "imported": 142, "skipped": 3, "errors": ["entry 17: invalid URL", "entry 44: body decode failed", "entry 89: missing method"] }
```

JSON Import
`POST /api/v1/sessions/{id}/import/json`
Accepts a JSON array of Ghost flow objects — the same format produced by Ghost’s JSON export. Same multipart or raw body upload support as HAR import.
Limits:
- Maximum file size: 256 MB
- Maximum flows per file: 50,000 (to prevent memory exhaustion from massive imports)
- Maximum error messages recorded: 100 (to prevent the error log itself from growing unbounded)
Format detection: Before parsing, Ghost probes the first byte of the JSON:
- `{` (JSON object): checks the first 4 KB for HAR signatures (a `"log"` key with a non-null value). If it looks like HAR, returns a helpful error: “this looks like a HAR file — use the /import/har endpoint instead”. If it’s a non-HAR object, returns “invalid JSON format — expected an array of flow objects (starts with [)”
- `[` (JSON array): proceeds with import
- Anything else: 400 error
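A simplified sketch of the probe. Note it only checks for the presence of a `"log"` key, while the real detector also verifies the value is non-null:

```go
package main

import (
	"bytes"
	"errors"
)

// detectImportFormat sketches the first-byte probe described above.
// Simplified: it looks only for a "log" key, not its value.
func detectImportFormat(data []byte) (string, error) {
	trimmed := bytes.TrimLeft(data, " \t\r\n")
	if len(trimmed) == 0 {
		return "", errors.New("empty data")
	}
	switch trimmed[0] {
	case '[':
		return "flows", nil // Ghost flow array: proceed with import
	case '{':
		head := trimmed
		if len(head) > 4096 {
			head = head[:4096] // probe only the first 4 KB
		}
		if bytes.Contains(head, []byte(`"log"`)) {
			return "", errors.New("this looks like a HAR file — use the /import/har endpoint instead")
		}
		return "", errors.New("invalid JSON format — expected an array of flow objects (starts with [)")
	default:
		return "", errors.New("invalid JSON")
	}
}
```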
Per-flow validation:
- Must have a non-nil `Request` with non-empty `Method` and `URL` — flows missing any of these are skipped
- Missing `Host` or `Path`: auto-parsed from the URL
- Zero `StartedAt`: defaults to current time
- Nil `Tags`: defaults to empty array
- Nil `Metadata`: defaults to empty map
Every flow gets a fresh ULID, the session ID is overwritten to match the target session, source is set to "import", and metadata includes {"import_source": "json"}.
Context cancellation: Every 100 flows, Ghost checks whether the request was cancelled — this allows you to abort a long-running import without waiting for all 50K flows to process.
Same response format as HAR import: { imported, skipped, errors }.
Drag-and-Drop Import
Drop a .har or .json file anywhere in the Ghost window to import it instantly. This is the fastest way to load an external capture.
How it works, step by step:
- Drop detection: The entire Ghost window is a drop zone. When you drag a file over it, a semi-transparent overlay appears with a dashed cyan border, an upload icon, and the text “Drop to import session” / “Supports .har and .json files”
- Extension check: Only `.har` and `.json` files are accepted. Other file types show a toast error.
- Session creation: Ghost automatically creates a new session named `"{filename} (imported)"` with description `"Imported from {filename}"` — so your current session isn’t modified
- Format probing: Ghost reads the first 1 KB of the file and examines the first non-whitespace character:
  - Starts with `{`: checks for HAR signatures using regex — looks for both `"log":` and `"entries":` patterns. If found, imports as HAR.
  - Starts with `[`: imports as JSON (Ghost flow array format)
  - Anything else: error — “Invalid file format — expected JSON array or HAR object”
- Import: Calls the appropriate import endpoint with the file
- Success: Shows a toast notification with the count of imported flows and a “Compare” action button — clicking it opens session comparison between your active session and the imported session. This is particularly useful for comparing a baseline capture against the current state.
- Failure cleanup: If the import fails for any reason, Ghost automatically deletes the session it created — so you don’t end up with empty orphaned sessions
Note the difference: Drag-and-drop always creates a new session. The File menu’s “Import HAR” option imports into the active session. Choose based on whether you want the imported flows mixed with existing ones or kept separate.
Per-Flow Code Generation
Individual flows can be exported as executable code snippets — useful for reproducing a specific request outside of Ghost.
Available Formats
| Format | Access From | Output |
|---|---|---|
| cURL | Context menu + Inspector | Shell command with headers, body, line continuation |
| fetch() | Context menu + Inspector | JavaScript fetch() snippet with headers and body |
| Python | Inspector only | Python requests library snippet |
| HTTPie | Inspector only | HTTPie CLI command |
| Markdown | Context menu (“Share”) | Markdown summary table with masked sensitive headers |
Context menu = right-click a flow in the traffic list (offers cURL, fetch, Markdown). Inspector = the copy dropdown in the flow detail hero bar (offers all four code formats, plus a separate “Share” button for Markdown).
Format Details
cURL:
- Omits `-X GET` (GET is cURL’s default)
- All URLs and header values are shell-quoted (single quotes, with `'` escaped to `'\''`)
- Binary request bodies: outputs a comment `# Binary request body (base64-encoded, not included)` instead of the body data
- Text bodies: included via `--data-raw`
- Long commands use backslash line continuation (`\`) for readability
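The quoting rule is the classic POSIX single-quote escape: close the quoted string, emit an escaped quote, reopen. A sketch:

```go
package main

import "strings"

// shellQuote wraps a value in single quotes, escaping each embedded
// single quote as '\'' (close quote, escaped quote, reopen quote).
func shellQuote(s string) string {
	return "'" + strings.ReplaceAll(s, "'", `'\''`) + "'"
}
```

Single quotes are the safest choice here because the shell performs no expansion inside them, so `$`, backticks, and `!` in URLs or header values pass through literally.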
fetch():
- Omits `method: 'GET'` (GET is fetch’s default)
- JSON bodies: pretty-printed via `JSON.parse` → `JSON.stringify(parsed, null, 2)`, wrapped in `JSON.stringify()` for the fetch call
- Non-JSON bodies: wrapped in `JSON.stringify()`
- Binary bodies: silently omitted
- Multi-value headers: only the first value is used
Python:
- Standard methods (get, post, put, patch, delete, head, options): uses `requests.{method}(url, ...)`
- Non-standard methods: uses `requests.request("{METHOD}", url, ...)`
- JSON content type: uses the `json=` parameter with a parsed JSON object (falls back to `data=` if JSON parsing fails)
- Non-JSON bodies: uses the `data=` parameter
- Includes `print(response.status_code)` and `print(response.text)` at the end
HTTPie:
- Format: `http METHOD URL Name:Value ...`
- Headers use HTTPie’s `Name:Value` syntax (shell-quoted if the value contains spaces)
- JSON bodies: parsed into HTTPie’s `key:=value` pairs for direct key-value passing
- Non-parseable JSON: silently skipped
Markdown:
- Produces a summary table with request and response details
- Sensitive header masking: the headers `authorization`, `cookie`, `set-cookie`, `x-api-key`, and `x-auth-token` are masked — `Bearer` tokens show `Bearer ***`, all others show `***`
- Pipe characters (`|`) in header values are escaped (since `|` is the Markdown table delimiter)
- Request and response bodies are truncated at 2,000 characters with a `... (truncated)` suffix
- Code fences use language hints (`json`, `xml`, `html`) based on content type for syntax highlighting
- Binary response bodies show `*Binary response body*`
- Footer: `*Captured by Ghost at {timestamp}*`
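The sensitive-header masking rule can be sketched as a small helper (names are illustrative):

```go
package main

import "strings"

// sensitiveHeaders lists the header names masked in Markdown output,
// matching the set described above.
var sensitiveHeaders = map[string]bool{
	"authorization": true,
	"cookie":        true,
	"set-cookie":    true,
	"x-api-key":     true,
	"x-auth-token":  true,
}

// maskHeaderValue sketches the documented masking: Bearer tokens keep
// their scheme, all other sensitive values become ***.
func maskHeaderValue(name, value string) string {
	if !sensitiveHeaders[strings.ToLower(name)] {
		return value // non-sensitive headers pass through unchanged
	}
	if strings.HasPrefix(value, "Bearer ") {
		return "Bearer ***"
	}
	return "***"
}
```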
Skipped Headers
All code generators skip these headers (which are connection-level, not meaningful for replay):
`host`, `content-length`, `connection`, `keep-alive`, `transfer-encoding`, `proxy-authorization`, `proxy-authenticate`, `proxy-connection`, `te`, `trailers`, `upgrade`
Additional Export Endpoints
Beyond the standard session export formats, Ghost provides two specialized export endpoints for AI-generated content:
| Endpoint | What It Returns |
|---|---|
| `GET /api/v1/sessions/{id}/report` | The AI agent’s generated security/analysis report (`report.md`) from the session workspace. Supports content negotiation: `Accept: text/html` returns a wrapped HTML page, otherwise returns raw Markdown as an attachment. Max report size: 10 MB. |
| `GET /api/v1/sessions/{id}/poc` | A zip archive of all proof-of-concept scripts from the session’s `poc/` workspace directory. Only regular files are included (no directories or symlinks), each capped at 10 MB individually. |
These endpoints return 404 if the agent hasn’t generated a report or PoC scripts for the session.
Artifact Persistence
When an export file is 20 MB or smaller, Ghost automatically saves a copy as an “artifact” in the database. This means you can re-download previous exports without regenerating them.
Each artifact records:
- Type: `session_export`
- Title: `"{sessionName} — {FORMAT} Export"` (e.g., “API Tests — HAR Export”)
- Summary: `"{flowCount} flows, {humanSize}"` (e.g., “1,247 flows, 3.2 MB”)
- Metadata: `flow_count` and `session_name`
An `artifact.created` WebSocket event is broadcast when an artifact is saved.
Artifact API:
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/v1/artifacts` | List all artifacts (filterable by `type`, `session_id`; supports `limit` and `offset`) |
| GET | `/api/v1/artifacts/{id}` | Get artifact content as a string |
| DELETE | `/api/v1/artifacts/{id}` | Delete an artifact |
| GET | `/api/v1/artifacts/{id}/download` | Download artifact as a file (with correct Content-Type and Content-Disposition based on format: json/har → `application/json`, csv → `text/csv`, html → `text/html`, markdown → `text/markdown`) |
Where to Find Export/Import in the UI
Ghost provides multiple ways to access export and import functionality:
| Location | What’s Available |
|---|---|
| File menu → Export Session | HAR, JSON, CSV, Postman Collection (disabled when no active session) |
| File menu → Import HAR | Opens a file picker for .har files — imports into the active session |
| Settings → Storage | Same export buttons (HAR, JSON, CSV, Postman) + Import HAR button |
| Session context menu | Export options for the specific session |
| Drag-and-drop | Drop .har or .json anywhere — creates a new session |
| Flow context menu (right-click) | Copy as cURL, Copy as fetch(), Copy as Markdown |
| Inspector hero bar | Copy as dropdown (cURL, fetch, Python, HTTPie) + Share button (Markdown) |
| Native macOS menu | Same File menu items, handled by Tauri’s native menu system |
API Reference
Session Export
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/v1/sessions/{id}/export/har` | Export as HAR 1.2 (JSON file) |
| GET | `/api/v1/sessions/{id}/export/json` | Export as Ghost JSON (preserves all metadata) |
| GET | `/api/v1/sessions/{id}/export/csv` | Export as CSV (12 columns, injection-protected) |
| GET | `/api/v1/sessions/{id}/export/postman` | Export as Postman Collection v2.1.0 |
| GET | `/api/v1/sessions/{id}/export/report` | Export as self-contained HTML report (10K flow cap) |
Session Import
| Method | Endpoint | Description |
|---|---|---|
| POST | `/api/v1/sessions/{id}/import/har` | Import HAR file (raw JSON body or multipart upload, 256 MB limit) |
| POST | `/api/v1/sessions/{id}/import/json` | Import Ghost JSON file (raw JSON body or multipart upload, 256 MB limit, 50K flow cap) |
AI-Generated Content
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/v1/sessions/{id}/report` | Download agent-generated report (Markdown or HTML via content negotiation) |
| GET | `/api/v1/sessions/{id}/poc` | Download proof-of-concept scripts as a zip archive |
Use Cases
Share a bug reproduction — You’ve captured the exact sequence of requests that triggers a bug. Export as HAR and attach it to your bug report. The developer imports it into their own Ghost or browser DevTools and sees exactly what you saw — headers, timing, response bodies, everything.
Compare before and after — Export a session before a code change. After the change, drag-and-drop the old HAR file into Ghost. Click “Compare” in the toast. Ghost shows you exactly what changed — new endpoints, different response times, status code changes.
Feed to Postman — Export as Postman Collection and import into Postman. Every API call from your testing session becomes a ready-to-use Postman request. You can organize them into folders, add assertions, and build automated test suites from real traffic.
Analyze in a spreadsheet — Export as CSV. Open in Excel or Google Sheets. Sort by duration to find the slowest requests. Filter by status code to count errors. Pivot by host to see which services are called most. Create charts of response time distribution.
Quick reproduction — Right-click a failing request → Copy as cURL. Paste into your terminal. The exact request is reproduced — same headers, same body, same URL. Modify one parameter and run again to isolate the issue.
Document an API — Right-click a flow → Copy as Markdown. Paste into your documentation, wiki, or Slack message. Sensitive headers (authorization tokens, cookies, API keys) are automatically masked with *** so you don’t accidentally share credentials.
Generate a standalone report — Export as HTML Report for stakeholders who don’t have Ghost installed. The report is a single file with embedded styles and JavaScript — search, sort, color-coded methods — works offline, can be emailed, printed, or embedded in a wiki.