How can I copy websites for offline browsing?

I want to grab websites that have Flash content for offline browsing, but the tools I have found so far cannot grab the Flash files. I am building an app that fetches web content for offline use. I was using wget until I ran into the Flash issue, then I tried httrack and got the same result.
Is there a tool out there that I can use to grab web content, including Flash, for offline browsing?

Please help...anybody.
Spawn10 asked:
 
Spawn10 (Author, accepted answer) commented:
rstjean, httrack also has issues grabbing Flash files; I already tried it. I am back to using wget, and I think I have found a combination of parameters that gets the job done:

wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains "website's domain" --no-parent "actual website url"
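
A note on what those flags do, for anyone trying to reproduce this: --page-requisites is the one that matters for Flash, because it tells wget to fetch everything a page needs to render, including .swf files referenced through <embed> or <object> tags; --convert-links rewrites the links so the mirror browses offline; --no-parent and --domains keep the crawl on the site. One gap: if some .swf files are served from a different host (a CDN, say), the --domains filter will skip them. A rough sketch for sweeping those up afterwards, assuming the mirror was saved under ./mirror (a hypothetical directory, not part of the command above) and the SWF URLs appear as absolute URLs in the saved markup:

# Collect absolute .swf URLs from the mirrored HTML and fetch any that
# the first pass missed. "mirror/" is an assumed output directory.
grep -rhoE "https?://[^\"' <>]+\.swf" mirror/ | sort -u | wget --force-directories --directory-prefix=mirror/ --input-file=-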
 
rstjean commented:
Grab WinHTTrack. I find it grabs everything in a snap, and it organizes the downloaded files into a folder for you.

http://www.httrack.com/page/2/en/index.html
 
rstjean commented:
I need to read more of the original post before I post.

It all depends on how the Flash files are linked. Often there is an initial stub file that loads other Flash files at runtime.

Are you looking for something that downloads video embedded in Flash players, or something that downloads Flash linked through a stub?
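
For what it's worth, here is a rough way to check whether a stub loader is in play. Loader SWFs often fetch the real content at runtime, so the follow-on URLs never appear in the page HTML and crawlers like wget or httrack never see them. If the SWF is uncompressed (the file begins with the magic bytes "FWS"), you can sometimes spot the embedded URLs with strings; compressed files begin with "CWS" and are zlib-deflated after the 8-byte header, so they would need decompressing first. "loader.swf" below is a hypothetical stand-in for whatever stub the page embeds:

# List URL-looking strings inside an uncompressed SWF (hypothetical filename).
strings loader.swf | grep -Eo "https?://[^ ]+\.(swf|flv|xml)"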