Phase 2: Stream responses to client via StreamingBody #585

Draft

aram356 wants to merge 5 commits into feature/streaming-pipeline-phase1 from
Conversation
Replace #[fastly::main] with an undecorated main() that calls Request::from_client() and explicitly sends responses via send_to_client(). This is required for Phase 2's stream_to_client() support — #[fastly::main] auto-calls send_to_client() on the returned Response, which is incompatible with streaming. The program still compiles to wasm32-wasip1 and runs on Fastly Compute — #[fastly::main] was just syntactic sugar. Also simplifies route_request to return Response directly instead of Result<Response, Error>, since it already converts all errors to HTTP responses internally.
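A stdlib-only sketch of the shape of this change. Request and Response here are local stand-ins for the fastly crate's types (send_to_client() just reports the status instead of writing to a real client); route_request is the repo's function, everything else is illustrative.

```rust
struct Request;

struct Response {
    status: u16,
}

impl Response {
    // Stand-in for fastly::Response::send_to_client(), which #[fastly::main]
    // used to call automatically on the returned Response.
    fn send_to_client(self) -> u16 {
        self.status
    }
}

// Converts all errors to HTTP responses internally, so it can return
// Response directly instead of Result<Response, Error>.
fn route_request(_req: Request) -> Response {
    Response { status: 200 }
}

fn main() {
    // Undecorated main(): fetch the request and send explicitly. This keeps
    // the door open for stream_to_client(), which commits headers mid-handler
    // and is incompatible with an auto-send on return.
    let req = Request; // stand-in for fastly::Request::from_client()
    let resp = route_request(req);
    let status = resp.send_to_client();
    assert_eq!(status, 200);
}
```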
Change signature from returning Body (with internal Vec<u8>) to writing into a generic &mut W: Write parameter. This enables Task 8 to pass StreamingBody directly as the output sink. The call site in handle_publisher_request passes &mut Vec<u8> for now, preserving the buffered behavior until the streaming path is wired up.
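A minimal sketch of the signature change, using a pass-through body for illustration (the real function does content processing, not a plain copy):

```rust
use std::io::{self, Write};

// Before (sketch): fn process_response(body: &[u8]) -> Body  // buffered Vec<u8>
// After: write into any sink. A &mut Vec<u8> caller keeps the buffered
// behavior; Task 8 can hand in StreamingBody for true streaming.
fn process_response_streaming<W: Write>(body: &[u8], out: &mut W) -> io::Result<()> {
    for chunk in body.chunks(8192) {
        // Stand-in for the real per-chunk processing.
        out.write_all(chunk)?;
    }
    out.flush()
}

fn main() -> io::Result<()> {
    // Call site as in handle_publisher_request today: buffer into Vec<u8>.
    let mut buf: Vec<u8> = Vec::new();
    process_response_streaming(b"<html>hello</html>", &mut buf)?;
    assert_eq!(buf, b"<html>hello</html>");
    Ok(())
}
```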
Split handle_publisher_request into streaming and buffered paths based on the streaming gate:

- Streaming: 2xx + processable content + no HTML post-processors
- Buffered: post-processors registered (Next.js) or non-processable

The streaming path returns PublisherResponse::Stream with the origin body and processing params. The adapter calls finalize_response() to set all headers, then stream_to_client() to commit them, and pipes the body through stream_publisher_body() into StreamingBody. Synthetic ID/cookie headers are set before body processing (they are body-independent), so they are included in the streamed headers. Mid-stream errors log and drop the StreamingBody; the client sees a truncated response, which is standard proxy behavior.
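The gate logic above can be sketched with plain types; the field and function names here are illustrative stand-ins, not the repo's actual identifiers:

```rust
// Hypothetical inputs to the streaming gate.
struct Gate {
    is_2xx: bool,
    should_process: bool,
    request_host: String,
    is_html: bool,
    has_post_processors: bool,
}

#[derive(Debug, PartialEq)]
enum PublisherResponse {
    Stream,   // real variant carries the origin body + processing params
    Buffered, // real variant carries the fully processed body
}

// Streaming gate as described above: 2xx + processable content + no HTML
// post-processors (e.g. Next.js) registered.
fn choose_path(g: &Gate) -> PublisherResponse {
    let stream = g.is_2xx
        && g.should_process
        && !g.request_host.is_empty()
        && (!g.is_html || !g.has_post_processors);
    if stream {
        PublisherResponse::Stream
    } else {
        PublisherResponse::Buffered
    }
}

fn main() {
    let mut g = Gate {
        is_2xx: true,
        should_process: true,
        request_host: "example.com".into(),
        is_html: true,
        has_post_processors: false,
    };
    assert_eq!(choose_path(&g), PublisherResponse::Stream);
    g.has_post_processors = true; // e.g. Next.js registers HTML post-processors
    assert_eq!(choose_path(&g), PublisherResponse::Buffered);
}
```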
- Replace streaming_body.finish().expect() with log::error on failure (expect panics in WASM, and headers are already committed anyway)
- Restore explanatory comments for cookie parsing, SSC capture, synthetic ID generation, and consent extraction ordering
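A stdlib-only sketch of the error-handling change; StreamingBody and finish() are stand-ins for the fastly types, and eprintln! stands in for log::error!:

```rust
struct StreamingBody {
    fail: bool, // test knob: force a finish() error
}

impl StreamingBody {
    fn finish(self) -> Result<(), String> {
        if self.fail {
            Err("stream aborted".into())
        } else {
            Ok(())
        }
    }
}

// After the change: log instead of panicking. By the time finish() runs,
// headers (and part of the body) are already committed to the client, so
// panicking can't undo anything; in WASM it would just trap the instance.
fn finish_quietly(body: StreamingBody) {
    if let Err(e) = body.finish() {
        eprintln!("failed to finish streaming body: {e}");
    }
}

fn main() {
    finish_quietly(StreamingBody { fail: false });
    finish_quietly(StreamingBody { fail: true }); // logs instead of panicking
}
```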
Hoist the non-processable early return above the streaming gate so content_encoding extraction happens once. The streaming gate condition is also simplified since should_process and request_host are already guaranteed at that point.
Summary
Stream HTTP responses directly to the client via Fastly's StreamingBody API when Next.js is disabled. This eliminates full-body buffering on the proxy path, reducing peak memory from ~4x response size to constant and improving TTFB.

Closes #573, closes #574, closes #575, closes #576.
Part of epic #563. Depends on Phase 1 (#583).
What changed
Entry point migration (main.rs):

- Replaced #[fastly::main] with an undecorated main() using Request::from_client()
- route_request returns Option<Response>: None when the streaming path has already sent the response via stream_to_client(); otherwise the caller sends it via send_to_client()

process_response_streaming now generic over W: Write (publisher.rs):

- Changed from returning Body (internal Vec<u8>) to writing into &mut W
- Enables passing StreamingBody directly as the output sink
- Hoisted PipelineConfig creation and content-encoding extraction

Streaming path via PublisherResponse enum (publisher.rs):

- handle_publisher_request returns PublisherResponse::Stream or PublisherResponse::Buffered
- Streaming gate: should_process && !request_host.is_empty() && (!is_html || !has_post_processors)
- Content-Length removed before streaming (chunked transfer)
- stream_publisher_body() public API bridges core ↔ adapter
- Errors before streaming → send_to_client() with status; mid-stream → log and drop(streaming_body) (abort)

Files changed

- main.rs
- publisher.rs: PublisherResponse enum, W: Write refactor, streaming gate

Task 10 (Chrome DevTools metrics): deferred
Requires a running publisher origin to measure TTFB/TTLB. Local dev uses localhost:9090, which has no mock server. Metrics capture deferred to staging deployment. See #577.

Verification
- cargo test --workspace: 754 passed, 0 failed
- cargo clippy --workspace --all-targets --all-features -- -D warnings: clean
- cargo fmt --all -- --check: clean
- npx vitest run: 282 passed
- cargo build --release --target wasm32-wasip1: success

Test plan