
Streaming HTTP response in PHP - turn long-running process into realtime UI


For a single, not-too-long server-side request, what if you could let the user know how the request is being handled at the server's end, in real time, without making them wait for the entire process to finish?

without any package, WebSockets, or anything else on trend


I was working on something

Recently I was working on a large multi-vendor application where the requirement was to build multi-currency payment processing, facilitating transactions from different source currencies into a target currency.

The entire flow to make a transaction, with parameters like source and target currency, amount, recipient, etc., is quite lengthy. It consists of several stages: quote creation, recipient handling, and transfer creation, followed by funding. There are four API requests in total, each dependent on the previous one's response, so the entire process takes ⏱ approx. 10 seconds at the server's end.

I couldn't bear a dumb loader holding the user's attention for that long 🙄, so I started thinking about Laravel Echo or something equivalent that uses WebSockets. That could help me push events at the different stages of processing; I could then inform the user accordingly and the app would feel more responsive.

Then the idea of streaming came to mind. This was not the first time I had used the technique in this kind of situation: the previous year I worked on an e-commerce application that needed to sync its product database via an API.

Response Streaming demo application Laravel

made this with streaming in Laravel back then

Concept Overview


Streaming is not a new concept. It is a data-transfer technique that allows a web server to continuously send data to a client over a single HTTP connection that remains open. With streaming, the response arrives in chunks rather than all at once. In the traditional HTTP request/response cycle, the response is not transferred to the browser until it is fully prepared, which makes users wait.

Output Buffering

Output buffering is a mechanism that stores PHP's output in memory (a buffer) instead of transmitting it immediately: rather than sending the response as it is produced, PHP holds it in the buffer so it can be sent all at once when the whole content is ready.

Each time we use echo, we are basically telling PHP to send output to the browser, but since PHP has output buffering enabled by default, that content gets buffered rather than sent straight to the client.

A simple experiment

echo "Hi";
sleep(1); // pause, so we could notice if "Hi" arrived on its own
echo "There !";
sleep(1);
Running this code will print Hi There ! all together, after a wait of 2 seconds in total. This is because of output buffering, which is on by default in PHP.


Instead of the response being sent to the browser when the first echo executes, its contents are buffered.

💡 Buffered content is sent to the browser either when the buffer gets full or when code execution ends, so the two echoed strings arrive merged, served by the server all at once.

Since Hi There ! is nowhere near enough to occupy 4 KB (the default output buffer size in PHP), the content is sent only when code execution ends.

However, this is not the case for the PHP CLI, where the output_buffering setting is always off.

Let's try some streaming

Route::get('/mock', function () { // defining a route in Laravel
    set_time_limit(0);    // lift the maximum execution time
    ob_implicit_flush(1); // send content to the browser immediately after every statement that produces output
    ob_end_flush();       // delete the topmost output buffer and output all of its contents

    echo json_encode(['data' => 'test 1']);
    sleep(2); // simulating a time-consuming step

    echo json_encode(['data' => 'test 2']);
    sleep(2); // simulating another time-consuming step

    echo json_encode(['data' => 'test 3']);
});

Run the example in a browser and you'll see the response arrive in parts, one after another, according to the sleep calls we sprinkled in, which in the real world would be our time-consuming data processing, like API calls or multiple heavy SQL queries.

Response Streaming example code result

This way we can send content in chunks within a standard HTTP request/response cycle.


Output buffers catch output given by the program. Each new output buffer is placed on the top of a stack of output buffers, and any output it provides will be caught by the buffer below it. The output control functions handle only the topmost buffer, so the topmost buffer must be removed in order to control the buffers below it.

✔ ob_implicit_flush(1) enables implicit flushing, which sends output to the browser as soon as it is produced.

✔ If you need more fine-grained control, use the flush functions: to send data even when the buffers are not full and PHP execution has not finished, call ob_flush() followed by flush(). The flush() function asks the server to send its currently buffered output to the browser.


How to receive and process the response in JavaScript

I found a way to do so with a traditional XHR (XMLHttpRequest) request,

by listening to the `onprogress` event.
function testXHR() {
    let lastResponseLength = false;

    let xhr = new XMLHttpRequest();
    xhr.open("GET", "/mock", true);

    xhr.setRequestHeader("Content-Type", "application/json");
    xhr.setRequestHeader("Accept", "application/json");
    xhr.setRequestHeader('X-CSRF-Token', document.querySelector('meta[name="csrf-token"]').content);

    xhr.onprogress = function (e) {
        let progressResponse;
        let response = e.currentTarget.response;

        // parse only the part received since the last onprogress event
        progressResponse = lastResponseLength
            ? response.substring(lastResponseLength)
            : response;

        lastResponseLength = response.length;
        let parsedResponse = JSON.parse(progressResponse);

        if (parsedResponse.hasOwnProperty('success')) {
            // handle process success
        }
    };

    xhr.onreadystatechange = function () {
        if (xhr.readyState == 4 && xhr.status == 200) {
            console.log("Complete = " + xhr.responseText);
        }
    };

    xhr.send();
}
xhr.onprogress is the function called periodically with progress information until the XMLHttpRequest completely finishes.

Here is how our AJAX request works:

Response Streaming example how it works

chrome devtools console

A few points to note

✅ We are sending a JSON-encoded response from the server, and in xhr's onprogress handler every new response part arrives merged with the previously received parts.

✅ It is possible to process the chunks one at a time, since the server's response is a series of JSON objects, one after another. We can do it by subtracting the previous response string's length and parsing the remainder with JSON.parse:

xhr.onprogress = function (e) {
    let progressResponse;
    let response = e.currentTarget.response;

    progressResponse = lastResponseLength
        ? response.substring(lastResponseLength)
        : response;
    // ...
};

If we don't properly subtract the previous response string's length, we'll get an error while parsing the response as JSON.

Response Streaming raw unprocessed result returned by the server

here is the raw unprocessed result returned by the server

What about the progress bar ?

It looks very sophisticated, but let me tell you, it's way simpler than it looks 😀.

There were four API calls in total in my case, each dependent on the previous one (one response feeds the next API's query), so I pre-assigned a progress percentage to each stage of execution and made the server respond with it.

You can be a little more creative by generating the progress numbers randomly, without crossing a pre-defined range for each step, which will make it feel more real 😂

Route::get('/expensive-process', function () {
    // ... same unbuffering setup as in /mock, then one echo per completed stage

    echo json_encode(['progress' => 5, 'data' => $api_response]);
    // ... next dependent API call ...
    echo json_encode(['progress' => 25, 'data' => $response]);
    // ... remaining API calls ...
    echo json_encode(['progress' => 100, 'success' => 1, 'data' => $response]);
});

Response Streaming server response with progress percentage

take a closer look at the response
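The randomized-range idea mentioned above could be sketched like this (in JavaScript for brevity; in the real app the value would be generated server-side, and the step names and ranges here are invented):

```javascript
// Hypothetical per-step progress ranges: each stage of the pipeline reports
// a random value inside its own range, so the bar advances but never jumps
// past the next stage's territory.
const stepRanges = {
    quote:     [1, 20],
    recipient: [21, 45],
    transfer:  [46, 75],
    funding:   [76, 99],   // only the final response reports 100
};

function progressFor(step) {
    const [min, max] = stepRanges[step];
    return min + Math.floor(Math.random() * (max - min + 1));
}

console.log(progressFor("quote"));    // some value between 1 and 20
console.log(progressFor("transfer")); // some value between 46 and 75
```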

What if an error or exception occurs? How do we react to that?

That's easy: catch the exception and respond with a status that the front-end JS can react to.

try {
    $response = $this->expensiveProcessing();
} catch (\Exception $e) {
    // Handle the exception
    echo json_encode([
        'success' => false,
        'message' => $e->getCode() . ' - ' . $e->getMessage(),
        'progress' => 100,
    ]);
}
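On the client, the onprogress handler can then branch on that payload. A minimal sketch (field names follow the JSON shapes used in this article; the handler itself is hypothetical):

```javascript
// Hypothetical reducer for parsed chunks: progress chunks carry `progress`,
// terminal chunks also carry `success` (1 on completion, false on error)
// and possibly a `message`.
function handleChunk(parsed) {
    if (parsed.success === false) {
        // stop animating the bar and surface the server's error message
        return { state: "error", message: parsed.message };
    }
    if (parsed.success) {
        return { state: "done", value: 100 };
    }
    return { state: "running", value: parsed.progress };
}

console.log(handleChunk({ progress: 25 }).state);               // "running"
console.log(handleChunk({ progress: 100, success: 1 }).state);  // "done"
console.log(handleChunk({ success: false, message: "401 - Unauthorized", progress: 100 }).state); // "error"
```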

Configuration for Nginx

You need to make a few tweaks to your Nginx configuration before streaming will work, since Nginx buffers FastCGI and proxy responses by default:

fastcgi_buffering off;
proxy_buffering off;
gzip off;
If for whatever reason you don't have access to the Nginx server configuration, you can achieve the same result from PHP via an HTTP response header:
header('X-Accel-Buffering: no');

Wrapping Up

PHP Response Streaming - server response with error and react to UI

here is an example how I handled error response in the UI

PHP Response Streaming full application

here is the complete app that I built

I hope you are as excited as I was when I first figured this out; I couldn't stop playing with it until I had built something real and useful. That is the best thing about programming: you get to apply something as soon as you learn it.

Thank you for reading 😇. Signing off. 🖖