Description
Node Version: 4.4.4
http-proxy version: 1.13.2
When proxying a browser request to a RESTful JSON endpoint that responds with gzip content-encoding and chunked transfer-encoding, the browser receives invalid bytes back from the proxy. This causes a browser error: ERR_CONTENT_DECODING_FAILED.
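For reference, this is roughly how the proxy is set up (the target URL and port below are placeholders, not my actual values):

var http = require("http");
var httpProxy = require("http-proxy");

// Upstream returns gzipped, chunked JSON
var proxy = httpProxy.createProxyServer({
  target: "http://api.example.com"
});

http.createServer(function(req, res) {
  proxy.web(req, res);
}).listen(8000);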
It looks like the server is returning valid bytes to the proxy, but when http-proxy pipes the data from the proxy response to the browser response, the bytes are being UTF-8 encoded. To work around this, I added an onProxyRes handler that hijacks the browser response "write" and "end" methods to force "binary" encoding whenever the "content-encoding" header contains gzip.
This feels like a really ugly solution:
function handleGzip(proxyRes, req, res) {
  var gzipped = /gzip/.test(proxyRes.headers["content-encoding"]);
  if (gzipped) {
    // Force binary encoding on every write so the gzipped bytes are not
    // re-encoded as UTF-8 on their way to the browser.
    res.write = (function(override) {
      return function(chunk, encoding, callback) {
        return override.call(res, chunk, "binary", callback);
      };
    })(res.write);
    res.end = (function(override) {
      return function(chunk, encoding, callback) {
        return override.call(res, chunk, "binary", callback);
      };
    })(res.end);
  }
}
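The handler is registered on the proxyRes event (if you're using http-proxy-middleware, the equivalent would be its onProxyRes option):

proxy.on("proxyRes", handleGzip);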
Oddly enough, this is only an issue when the Accept request header contains "text/html" or "application/xhtml+xml". When no Accept request header is specified, the response is proxied correctly.
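Here is a rough way to see the difference from Node itself (the port and path are placeholders); with the Accept header set, the proxied body no longer gunzips:

var http = require("http");
var zlib = require("zlib");

function check(headers, label) {
  http.get({ host: "localhost", port: 8000, path: "/api/data", headers: headers }, function(res) {
    var chunks = [];
    res.on("data", function(chunk) { chunks.push(chunk); });
    res.on("end", function() {
      // If the proxy re-encoded the gzipped bytes, gunzip fails here,
      // which matches the browser's ERR_CONTENT_DECODING_FAILED.
      zlib.gunzip(Buffer.concat(chunks), function(err) {
        console.log(label, err ? "corrupted (" + err.message + ")" : "decodes fine");
      });
    });
  });
}

check({ Accept: "text/html,application/xhtml+xml" }, "with Accept:");
check({}, "without Accept:");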
I was hoping to use ServerResponse#setDefaultEncoding("binary"), but it appears ServerResponse doesn't implement that Writable method.
I'm not sure whether this is an issue with http-proxy or the underlying Node http implementation, but please help.