Add support for caching #219

Closed

kul opened this issue Mar 29, 2012 · 13 comments

kul commented Mar 29, 2012

Support for caching would be an awesome addition. Something along the lines of nginx:

location ~* \.(jpg|png|gif|jpeg|css|js|mp3|wav|swf|mov|doc|pdf|xls|ppt|docx|pptx|xlsx)$ {
    proxy_buffering on;
    proxy_cache_valid 200 120m;
    expires 864000;
}

Marak commented Mar 29, 2012

I think this functionality would be best suited to a separate library that uses http-proxy as a dependency.

Marak closed this as completed Mar 29, 2012

vvo commented Mar 29, 2012

This should be easy to do as a middleware, or by using a Connect middleware like https://github.com/tdebarochez/connect-cache, no?

kul commented Mar 29, 2012

Awesome suggestion, @vvo.

Thanks! I will try this tomorrow.

kul commented Mar 30, 2012

I have given up on this:

var httpproxy = require('http-proxy'),
    connect_cache = require('connect-cache');

// Cache any *.js response for 60 seconds; misses loop back to localhost:8080.
var rules = {
    rules: [
        { regex: /.*\.js/, ttl: 60000 }
    ],
    loopback: 'localhost:8080'
};

// Proxy localhost:8181 to localhost:8080 with the caching middleware in front.
httpproxy.createServer(
    connect_cache(rules),
    8080, 'localhost').listen(8181);

The request gets cached and everything works fine, but once I do Ctrl+F5 a few times, connect-cache blows out. I have to say it's a sad cache.

vvo commented Mar 30, 2012

Blows out?

Ctrl+F5 will send a 'Cache-Control: no-cache' header, which means 'refresh the cache', so there must be something in the connect-cache code to handle this.

I'm sure it will work; debug it and read the connect-cache code.

Good luck!
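
For illustration, handling that header in a connect-style middleware could look roughly like this. This is a minimal sketch with a hypothetical in-memory cache keyed by URL, not connect-cache's actual code; honourNoCache and cache are made-up names:

// Hypothetical sketch: when the client sends 'Cache-Control: no-cache',
// drop the stored entry so the request falls through to the proxy and the
// cache gets repopulated on the way back.
var cache = {};

function honourNoCache(req, res, next) {
  var cc = req.headers['cache-control'] || '';
  if (/no-cache/i.test(cc)) {
    delete cache[req.url];   // client forced a refresh: invalidate the entry
  }
  next();                    // continue to the normal cache lookup / proxy
}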

deitch commented Feb 20, 2013

@kul how did you get this to work? I can see how using the connect-cache middleware allowed the cache to handle it if it is already stored. But if it is not (cache miss), then it just passes through to the proxy, so the results of the proxy do not get cached. How did you ever populate the cache?

kul commented Feb 21, 2013

Oh, I am sorry, I deleted the fork by mistake, but the original repo has merged the pull request: tdebarochez/connect-cache@05bfdc6.
The idea is to treat it as a cache miss, which is just sent on to the server and cached for subsequent requests.
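
Roughly, that looks something like this (just a sketch of the loopback pattern, not connect-cache's actual code; the in-memory cache object and the loopbackCache name are made up):

var http = require('http');
var cache = {};

// On a hit, answer from the cache. On a miss, fetch the resource from the
// loopback host, store the body, and serve it; subsequent requests for the
// same URL are then answered from the cache.
function loopbackCache(loopbackHost, loopbackPort) {
  return function (req, res, next) {
    var hit = cache[req.url];
    if (hit) {
      res.writeHead(200, hit.headers);
      return res.end(hit.body);
    }

    http.get({ host: loopbackHost, port: loopbackPort, path: req.url }, function (upstream) {
      var chunks = [];
      upstream.on('data', function (c) { chunks.push(c); });
      upstream.on('end', function () {
        var body = Buffer.concat(chunks);
        if (upstream.statusCode === 200) {
          cache[req.url] = { headers: upstream.headers, body: body };
        }
        res.writeHead(upstream.statusCode, upstream.headers);
        res.end(body);
      });
    }).on('error', next);
  };
}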

deitch commented Feb 21, 2013

I get that it is a cache miss; what I didn't get is how you got the data in the cache in the first place.

Once you have a cache miss, and you pass it on to http-proxy, how did you intercept the response to add it to the cache?

kul commented Feb 21, 2013

I am sorry if I am missing something here, but did you try reading node-http-proxy's and connect-cache's home pages?
https://github.com/nodejitsu/node-http-proxy#middleware
https://github.com/tdebarochez/connect-cache#how-it-works
Basically, node-http-proxy provides middleware support, which is like plugins in the node.js world. Caching is taken care of by connect-cache from that point onwards.

I may be a little rusty here; it has been a long time since I used node-http-proxy. Apologies.
PS: nginx released websocket support.

👾

philjackson commented

@deitch Did you happen to work out how to intercept a response?

deitch commented Sep 10, 2013

Yeah, I used http-proxy, but I wrapped the response. In other words, I have middleware like this:

app.use(myMiddleware);

where myMiddleware is like:

var httpProxy = require('http-proxy'),
    // config points the proxy at the backend, e.g. { target: { host: 'localhost', port: 8080 } }
    proxy = new httpProxy.HttpProxy(config);

module.exports = function (req, res, next) {
  if (isInCache) {
    // send response from cache
  } else {
    wrapResponse(res);
    proxy.proxyRequest(req, res, httpProxy.buffer(req));
  }
};

wrapResponse wraps every send* call and the other relevant calls, and caches everything under a particular key.

Conceptually simple; in practice it was a beast, but it works nicely.
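
A stripped-down sketch of that wrapping, assuming a plain in-memory cache object keyed by URL (the real thing wrapped more calls and handled many more edge cases; unlike the wrapResponse(res) call above, this sketch also takes the request and the cache):

// Intercept writeHead/write/end, buffer the body, and store it in the cache
// once the upstream response finishes. Only successful (200) responses are kept.
function wrapResponse(req, res, cache) {
  var chunks = [],
      statusCode = 200,
      headers = {},
      origWriteHead = res.writeHead,
      origWrite = res.write,
      origEnd = res.end;

  res.writeHead = function (code, hdrs) {
    statusCode = code;
    if (hdrs && typeof hdrs === 'object') headers = hdrs;
    return origWriteHead.apply(res, arguments);
  };

  res.write = function (chunk) {
    if (chunk) chunks.push(chunk);
    return origWrite.apply(res, arguments);
  };

  res.end = function (chunk) {
    if (chunk && typeof chunk !== 'function') chunks.push(chunk);
    if (statusCode === 200) {
      cache[req.url] = {
        headers: headers,
        body: Buffer.concat(chunks.map(function (c) {
          return Buffer.isBuffer(c) ? c : new Buffer(c);
        }))
      };
    }
    return origEnd.apply(res, arguments);
  };
}

The cache-miss branch above would then call wrapResponse(req, res, cache) before proxy.proxyRequest(...), so the proxied response populates the cache on its way out.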

philjackson commented

Thanks, @deitch.
