

perl clear cache

Posted on 2003-11-05
Medium Priority
Last Modified: 2012-06-27
Hi all,

I am working on a project that generates financial reports for a set of companies, using Perl and Apache on Linux. The problem I'm facing is that the pages are cached, so if I call a report for client A, the details of client B are displayed.

I have tried adding the following:

#!/usr/bin/perl -w
use lib '/var/www/cgi-bin';
use modules::module;

use CGI qw(:standard);
use CGI::Carp qw(fatalsToBrowser);

my $month = param('month');

print header;
print start_html('Client A Report');
print '<meta http-equiv="expires" content="0">';
print '<META HTTP-EQUIV="pragma" CONTENT="no-cache">';
print '<body bgcolor="#E6E7E6">';
print '<table border="1" cellpadding="1" cellspacing="1" bgcolor="white">';

But this is not helping. If I hit Ctrl-F5 the correct data is displayed, but obviously I can't ask my clients to do the same.

Please help!


Question by:anuvc
LVL 18

Expert Comment

ID: 9685367
You could add an 'Expires' header to the response:

print header( -expires => '-1d' );

But are you sure caching is really the problem? Client A would not have client B's report in his cache unless he had requested it before; it may be just you who is seeing the stale page.
Also, it's not obvious from your code which parameter tells you which client is making the request. How do you know which report to generate?
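Expanding on the comment above, here is a minimal sketch (my own illustration, not code from the thread) of sending the anti-caching directives as real HTTP headers through CGI.pm's header() function rather than as meta tags. CGI.pm turns any extra -Name_Like parameter into an HTTP header field, converting underscores to dashes:

```perl
#!/usr/bin/perl -w
use strict;
use CGI qw(:standard);

# Send the anti-caching directives as real HTTP headers instead of
# <meta> tags. -expires => '-1d' marks the page as already expired;
# Pragma covers HTTP/1.0 caches and Cache-Control covers HTTP/1.1
# caches. CGI.pm rewrites -Cache_Control as a "Cache-control" header.
print header(
    -type          => 'text/html',
    -expires       => '-1d',
    -Pragma        => 'no-cache',
    -Cache_Control => 'no-cache, no-store, must-revalidate',
);
print start_html('Client A Report');
```

As the later comments note, intermediaries do not always honour these headers, so treat this as a first line of defence rather than a guarantee.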

Accepted Solution

jhurst earned 252 total points
ID: 9726433
The problem with headers such as Expires and no-cache is that they just don't work reliably: many browsers ignore them, and there are intermediaries, such as ISP proxies, that sometimes cache anyway.

When I have this problem I make the data be displayed by a script such as:

That script also takes one additional parameter, not otherwise needed, which I call onetime. The invocation of the script is something like:
Now, the value 106.... is actually the time that the page making the invocation was created, with the process id appended to it. In this manner I can ensure that no two users ever ask for the same page.
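The idea above can be sketched as follows (the script name and parameters are hypothetical examples, since the original invocation was not preserved): build a throwaway onetime value from the current time and the process id, and append it to every generated link, so that no two requests ever share a URL and a cache can never serve a stale copy:

```perl
#!/usr/bin/perl -w
use strict;

# Build a unique "onetime" value: epoch seconds with the process id
# appended. Two pages generated by different processes, or by the
# same process at different times, get different values, so the
# resulting URLs never collide in a cache.
my $onetime = time() . $$;

# Append it to the report URL (report.cgi and the client parameter
# are hypothetical, for illustration only).
my $client = 'A';
my $url    = "/cgi-bin/report.cgi?client=$client&onetime=$onetime";
print qq{<a href="$url">Run report for client $client</a>\n};
```

The receiving script simply ignores the onetime parameter; its only job is to make each URL unique.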

Assisted Solution

elsamman earned 248 total points
ID: 9728328

How are your clients referencing your site? If they all use the same URL and the client identifier is in a form field, you can have problems with caching, since a cache keys only on the URL. Also, you want to use cache directives that are part of the HTTP header rather than meta tags.

The bottom line is that caches are not complicated, but you really need to understand how they work; then you can work out a relatively simple approach to using them. It took me a few trials. Check out these couple of links on caches.




Expert Comment

ID: 9728356
I do not agree with Sam; sadly there are far too many systems that do not follow the documented rules as far as cache tags are concerned. He is right that they are not complicated; sadly, however, the rules are not observed.

Expert Comment

ID: 9728552
"I do not agree with Sam, sadly there are far too many systems that do not fllow the documented rules as far as cache tags are concerned.  He is right, they are not complicated, sadly however, the rules are not observed."

I think we both may be saying something quite similar.  I agree with you that no matter what the "book" says you can never count on a browser/proxy to go and figure out that something needs refreshing and have it ALWAYS refresh - sometimes it will just dispaly the cached copy.  You can, however,  always count on a browser calling your script if you generate a unqiue URL.  That was my point.


Expert Comment

ID: 9732366
But you can always prevent caching by adding something unique to the URL, so that the cache believes the request is different.

Expert Comment

ID: 9732427
That is exactly what I am saying.
LVL 20

Expert Comment

ID: 10089853
Nothing has happened on this question in more than 8 weeks. It's time for cleanup!

My recommendation, which I will post in the Cleanup topic area, is to
split points between jhurst and elsamman.


EE Cleanup Volunteer


Question has a verified solution.
