All,

  I am currently working on implementing Seam on part of a relatively high-traffic site (8k+ concurrent users) and optimizing our backend and front end to handle that traffic. On the front-end side, I'm trying to make sure the right HTTP caching headers are sent out -- but our Seam Remoting JavaScript files can't be browser-cached, for a couple of reasons:

  /interface.js?componentName won't be cached because it contains a query string, and browsers are conservative about caching URLs with query strings
  /remote.js always comes back as HTTP 200 with the full file contents, even though this file should only change with new Seam releases (is that correct?)

  Also, neither of these JS files is combined, packed, or minified.

  The Seam Remoting InterfaceGenerator does keep an interface cache for each component, so at least the generation time is saved. But it still sends unchanged content back to the browser on every request. Since JavaScript downloads block other downloads in the browser, this shows up directly in page load times and makes the page feel slower.
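  For reference, here is a rough sketch of the conditional-GET handling I have in mind for remote.js, written as a plain servlet filter in front of the resource servlet. The filter class name, the one-day max-age, and the way the Last-Modified timestamp is derived are all placeholders, not a claim about how Seam does this today:

  import java.io.IOException;
  import javax.servlet.*;
  import javax.servlet.http.*;

  // Hypothetical filter: lets the browser cache remote.js and answers
  // conditional GETs with 304 Not Modified instead of resending the file.
  public class RemotingCacheFilter implements Filter {

      // Pretend remote.js only changes when the application is redeployed.
      private long lastModified;

      public void init(FilterConfig config) {
          lastModified = System.currentTimeMillis() / 1000 * 1000; // whole seconds
      }

      public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
              throws IOException, ServletException {
          HttpServletRequest request = (HttpServletRequest) req;
          HttpServletResponse response = (HttpServletResponse) res;

          long ifModifiedSince = request.getDateHeader("If-Modified-Since");
          if (ifModifiedSince >= lastModified) {
              // The browser already has the current copy -- skip the body entirely.
              response.setStatus(HttpServletResponse.SC_NOT_MODIFIED);
              return;
          }

          response.setDateHeader("Last-Modified", lastModified);
          response.setHeader("Cache-Control", "public, max-age=86400"); // one day, as an example
          chain.doFilter(req, res);
      }

      public void destroy() {}
  }

  With headers like these in place, the second and subsequent page views only pay for a conditional request (or no request at all within the max-age window) instead of re-downloading the whole script.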

  A good potential solution to the first issue (suggested by Dan Allen on Twitter) is to use the ReWrite filter. My guess is a mapping something like this:

  /interface/componentName.js --> /interface.js?componentName

  If we also send the right HTTP caching headers, that fixes the issue and browsers can cache the file. But what about pages that use multiple components? This is what I propose:

  /interface/componentName1,componentName2,componentName3.js --> /interface.js?componentName1,componentName2,componentName3

  These are still valid, "RESTful" URLs, and they can be cached.
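  I haven't decided whether this mapping belongs in the ReWrite filter's own configuration or inside Seam's resource handling, but here is a rough sketch of the idea in plain servlet terms. The class name is made up, and I'm assuming the interface generator accepts the comma-separated query string shown above:

  import java.io.IOException;
  import javax.servlet.*;
  import javax.servlet.http.HttpServletRequest;

  // Hypothetical rewrite: maps /interface/a,b,c.js onto the existing
  // /interface.js?a,b,c URL so the resource gets a cache-friendly, extension-style path.
  public class InterfaceRewriteFilter implements Filter {

      public void init(FilterConfig config) {}

      public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
              throws IOException, ServletException {
          HttpServletRequest request = (HttpServletRequest) req;
          String path = request.getRequestURI().substring(request.getContextPath().length());

          if (path.startsWith("/interface/") && path.endsWith(".js")) {
              String components = path.substring("/interface/".length(), path.length() - ".js".length());
              // Forward internally to the existing query-string URL.
              request.getRequestDispatcher("/interface.js?" + components).forward(req, res);
              return;
          }
          chain.doFilter(req, res);
      }

      public void destroy() {}
  }

  The same rule covers both the single-component and the multi-component case, since the comma-separated list is passed through untouched.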

  Is this reasonable enough to take on as my first Seam contribution? It seems small enough...

  Furthermore, in non-debug mode, Seam should send back minified JavaScript (using the Java-based YUI Compressor).
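  Something along these lines is what I have in mind for the minification step, assuming the YUI Compressor jar is on the classpath. The helper class is made up and the error handling is only stubbed out; this is a sketch of the idea, not a patch:

  import java.io.*;
  import com.yahoo.platform.yui.compressor.JavaScriptCompressor;
  import org.mozilla.javascript.ErrorReporter;
  import org.mozilla.javascript.EvaluatorException;

  // Hypothetical helper: run the generated remoting script through
  // YUI Compressor before writing it to the response.
  public class RemotingScriptMinifier {

      public static String minify(String script) throws IOException {
          JavaScriptCompressor compressor =
              new JavaScriptCompressor(new StringReader(script), new ErrorReporter() {
                  public void warning(String message, String sourceName,
                                      int line, String lineSource, int lineOffset) {
                      // ignore warnings in this sketch
                  }
                  public void error(String message, String sourceName,
                                    int line, String lineSource, int lineOffset) {
                      System.err.println("YUI Compressor error: " + message);
                  }
                  public EvaluatorException runtimeError(String message, String sourceName,
                                                         int line, String lineSource, int lineOffset) {
                      return new EvaluatorException(message);
                  }
              });

          StringWriter out = new StringWriter();
          // -1 = no forced line breaks; munge local names; quiet output;
          // allow unnecessary semicolons to be dropped; keep micro-optimizations enabled
          compressor.compress(out, -1, true, false, false, false);
          return out.toString();
      }
  }

  In debug mode the unminified output would still be served, so the remoting code stays readable in the browser.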

  Please let me know your thoughts before I get started,

Thanks,

Ashish Tonse
Kaizen Consulting