perl clear cache

hi all

I am working on a project that generates financial reports for a set of companies, using Perl and Apache on Linux. The problem I'm facing is that the pages are cached, and therefore if I call a report for client A, the details of client B are displayed.

I have tried putting the foll:

#!/usr/bin/perl -w
use strict;
use lib '/var/www/cgi-bin';
use modules::module;

use CGI qw(:standard);
use CGI::Carp qw(fatalsToBrowser);

my $month = param('month');

print header;
print start_html('Client A Report');
print qq{<meta http-equiv="expires" content="0">\n};
print qq{<meta http-equiv="pragma" content="no-cache">\n};
print qq{<body bgcolor="#E6E7E6">\n};
print qq{<table border="1" cellpadding="1" cellspacing="1" bgcolor="white">\n};

But this is not helping me. If I hit Ctrl-F5 the correct data is displayed, but obviously I can't ask my clients to do the same.

Please help!



You could add an 'Expires' header to the response by using this:

print header( -expires => '-1d' );
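
A fuller sketch of that idea: CGI.pm's header() can send the whole set of anti-caching headers in the HTTP response itself, where browsers and proxies are far more likely to honor them than the equivalent meta tags. The exact directive values shown are an assumption about what your particular browsers and intermediate caches will respect:

```perl
#!/usr/bin/perl -w
use strict;
use CGI qw(:standard);

# Emit anti-caching directives as real HTTP headers rather than
# <meta> tags.  Any -Foo_Bar parameter to header() becomes a
# "Foo-bar:" header line in the response.
print header(
    -type          => 'text/html',
    -expires       => 'now',                                  # Expires: <current time>
    -Cache_Control => 'no-cache, no-store, must-revalidate',  # HTTP/1.1 caches
    -Pragma        => 'no-cache',                             # HTTP/1.0 caches
);
```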

But are you sure it is actually a caching problem? Client A wouldn't have client B's report in his cache unless he had requested it before.
It may be just you who is having the problem.
It's not obvious from your code, but shouldn't there be a parameter that tells you which client is making the request? How do you know which report to generate?
The problem with headers such as Expires and no-cache is that they just don't work reliably, since many browsers ignore them, and there are intermediate places, such as ISPs, that sometimes cache.

When I have this problem, I have the data displayed by a script such as:

That script also takes one additional parameter, not otherwise needed, which I call onetime. So the invocation of the script is something like:
Now, the value 106... is actually the time that the page making the invocation was created, with the process id appended to it. In this manner I can ensure that no two users ever ask for the same page.
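
A minimal sketch of that technique, assuming the script and parameter names (report.pl, month, onetime) for illustration; the onetime value is built exactly as described above, epoch time with the process id appended:

```perl
#!/usr/bin/perl -w
use strict;

# Build a value that is unique per page generation: the current epoch
# time with the process id appended.  Two requests in the same second
# still differ because they are served by different process ids.
my $onetime = time() . $$;

# Append it to the report URL so every request looks new to any cache
# along the way, and the cached copy is never reused.
my $url = "/cgi-bin/report.pl?month=jan&onetime=$onetime";

print qq{<a href="$url">View report</a>\n};
```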

Experts Exchange Solution brought to you by ConnectWise


How are your clients referencing your site? If they all use the same URL and the client identifier is in a form field, you can have problems with caching, since a cache keys only on the URL. Also, you want to use cache directives that are part of the HTTP header rather than meta tags.

The bottom line is that caches are not complicated, but you really need to understand how they work; then you can work out a relatively simple approach to using them. It took me a few trials. Check out these couple of links on caches.
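
To illustrate the point about the URL being the cache key, a sketch with an assumed parameter name (client): if the identifier arrives in the query string, "client=A" and "client=B" are distinct URLs and can never collide in a cache, unlike two POSTs to the same address:

```perl
#!/usr/bin/perl -w
use strict;
use CGI qw(:standard);

# Read the client identifier from the query string, so each client's
# report has its own URL (e.g. report.pl?client=A vs report.pl?client=B)
# and a cache cannot serve one client's page to another.
my $client = param('client') || 'A';   # illustrative parameter name

print header(-expires => 'now');
print start_html("Client $client Report");
print h1("Report for client $client");
print end_html;
```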



I do not agree with Sam; sadly, there are far too many systems that do not follow the documented rules as far as cache tags are concerned. He is right that they are not complicated; sadly, however, the rules are not observed.
"I do not agree with Sam; sadly, there are far too many systems that do not follow the documented rules as far as cache tags are concerned. He is right that they are not complicated; sadly, however, the rules are not observed."

I think we both may be saying something quite similar. I agree with you that no matter what the "book" says, you can never count on a browser/proxy to figure out that something needs refreshing and ALWAYS refresh it; sometimes it will just display the cached copy. You can, however, always count on a browser calling your script if you generate a unique URL. That was my point.

But you can always defeat the cache by adding something unique to the URL, so that the cache believes it is different.
That is exactly what I am saying.
Nothing has happened on this question in more than 8 weeks. It's time for cleanup!

My recommendation, which I will post in the Cleanup topic area, is to
split points between jhurst and elsamman.


EE Cleanup Volunteer
Question has a verified solution.
