I assume you mean browser page caching. I came across this problem with an application
that required a login to see customer transaction data. Essentially you could log in, view
financial data, and then log out, after which hitting the back button would show browser-cached
pages from the previous session. Obviously this caching could potentially reveal
sensitive data to a user other than the original customer.
Adding the following meta tags to all my pages (regardless of whether they contained
forms, s:link, etc) prevented the browser from caching the pages:
<meta http-equiv="Expires" content="-1" />
| <meta http-equiv="Cache-Control" content="no-cache" />
| <meta http-equiv="Pragma" content="no-cache" />
This worked well and our application functioned normally after retrofitting the meta
tags. However, it caused a problem for our client which may or may not be an issue for
you. Our application made heavy use of commandLink for navigation, and that mechanism
issues an HTTP POST when the link is selected. The side effect is that, after adding the
meta tags, whenever the user presses the browser's back button the browser has to request
a fresh page from the server. Because the original page was generated as the result of an
HTTP POST, the browser issues a "Resend posted information?" warning (or something like
that) every time the back button is used. This is ugly, not particularly intuitive, and
was (understandably) unacceptable to our client.
We got around this by replacing all commandLinks that were used purely for navigation
with s:link. s:link generates a regular HTML link that navigates via HTTP GET, which
solves the browser warning problem. One caveat: if the navigation needs values from other
components on the page, or those values must be saved for later, you still have to use
commandLink to submit the form data.
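For illustration, a navigation-only commandLink might be swapped out like this (a minimal sketch assuming a Facelets page with the h: and s: namespaces declared; the view id and label are hypothetical):

```
<!-- Before: navigation via HTTP POST; triggers the resend warning on back -->
<h:commandLink action="viewAccount" value="View account" />

<!-- After: a plain GET link generated by Seam's s:link; no form post involved -->
<s:link view="/viewAccount.xhtml" value="View account" />
```

Since s:link renders an ordinary anchor tag, the back button simply re-requests the page with GET and no warning appears.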
Regarding performance: in our case (in fact most cases), security requirements superseded
performance requirements, so we had no choice but to turn off browser page caching. That
said, you can still get good performance by adopting one or more of the following:
1. As mentioned in previous posts, if a page is particularly expensive to render then
caching the expensive page fragments using s:cache is a simple and very effective strategy
(e.g. a table bound to a datamodel that requires a complex database query).
2. Introducing appropriate caching policies for external page resources (css, js, images
etc) can help to reduce unnecessary server requests and save bandwidth.
3. If your generated pages are large and your users' connections are slow, you could add a
compression filter to your application. I've used the following compression filter
successfully in a Seam application; it cut HTML page sizes by around 80% over the wire:
http://sourceforge.net/projects/pjl-comp-filter/
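For reference, wiring pjl-comp-filter in is a standard web.xml filter registration. This is a sketch based on the filter class name documented by that project, so double-check it against the docs for your version:

```
<!-- web.xml: register the compression filter -->
<filter>
    <filter-name>compressingFilter</filter-name>
    <filter-class>com.planetj.servlet.filter.compression.CompressingFilter</filter-class>
</filter>
<!-- Apply it to all responses; already-compressed resources such as images
     are typically not worth running through it -->
<filter-mapping>
    <filter-name>compressingFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
```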
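Going back to point 1, a sketch of what s:cache around an expensive fragment might look like (the key and region names here are hypothetical; regions map onto the underlying cache configuration, so consult the Seam reference docs for yours):

```
<!-- Cache the rendered table per customer; it is only re-rendered
     when the cached fragment is evicted -->
<s:cache key="transactions-#{customer.id}" region="pageFragments">
    <h:dataTable value="#{transactionModel}" var="txn">
        ...
    </h:dataTable>
</s:cache>
```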
Hope this helps,
Chris.
View the original post :
http://www.jboss.com/index.html?module=bb&op=viewtopic&p=4012560#...