
Solved

Is it possible to reuse file handle while changing the file name/location?

Posted on 2014-12-28
Medium Priority
452 Views
Last Modified: 2015-01-02
Hello,

I was wondering if it was possible to reuse a file handle in Python and change where the file handle points to?

Basically what I want to do is, instead of...
for x in range(0, 10000):
    file = open('file_' + str(x) + '.txt', 'w')
    file.write('some text\n')
    file.close()



I want to do something like...
file = open('file_0.txt', 'w')
for x in range(0, 10000):
    file.write('some text\n')
    <magic code to change file location so I don't need to create a new file object>
file.close()



I'm asking this because I'll be creating about 500k files, and creating new file objects seems to be slowing down the script.

Appreciate any help on this!
Question by: Errang Genevre
9 Comments
 
LVL 81

Accepted Solution

by: arnold (earned 1000 total points)
ID: 40522082
Your first example is the correct way to reuse a file handle: close the existing one, then open it again with the new file name.

You are not testing whether the open encounters an error; if it does, your process will lose data, i.e. it will write to a file handle that is not set.

Testing for a successful execution (opening the file) is highly recommended; otherwise you will not detect the error until the data turns out to be missing when someone who needs it requests access to it.
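A minimal Python sketch of that approach (close and reopen under each new name, and catch a failed open before writing anything) might look like this; the filename scheme is just the one from the question:

import sys

for x in range(0, 10000):
    path = 'file_' + str(x) + '.txt'
    try:
        f = open(path, 'w')  # open the next file on the same variable name
    except IOError as e:
        # detect the failure now instead of silently writing to an unset handle later
        sys.exit('could not open %s: %s' % (path, e))
    f.write('some text\n')
    f.close()

A "with open(path, 'w') as f:" block would also do the close for you and still raise if the open fails.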
 
LVL 27

Assisted Solution

by: wilcoxon (earned 1000 total points)
ID: 40522191
There is no way to do this in Perl (I'm assuming there isn't in Python either). As arnold said, you should be checking that the calls succeed.

In Perl, the equivalent (corrected) code snippet would be:
use Fatal qw(open close);
for my $x (0..10000) {
    open OUT, ">file_$x.txt";
    print OUT "some text\n";
    close OUT;
}



If you don't have a recent enough version of Perl to include Fatal, you can do:
for my $x (0..10000) {
    open OUT, ">file_$x.txt" or die "could not write file_$x.txt: $!";
    print OUT "some text\n";
    close OUT or die "could not close handle for file_$x.txt: $!";
}


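For comparison, a rough Python equivalent of that corrected loop (using a with block, which closes the file and lets a failed open raise, much as Fatal does in Perl) could look like:

for x in range(0, 10001):  # Perl's 0..10000 range includes both ends
    with open('file_%d.txt' % x, 'w') as out:  # raises if the open fails
        out.write('some text\n')  # the with block closes the file on exit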
 

Author Closing Comment

by: Errang Genevre
ID: 40522315
Alright, thanks.

Didn't expect it to be possible, but just thought I'd check.

I'll try experimenting with the shell output redirection; hopefully that'll have a smaller footprint.
 
LVL 81

Expert Comment

by: arnold
ID: 40522366
The comparative delay of closing/opening a file handle is significantly smaller than your processing of the data.

Depending on what you are doing, you could use syslog/rsyslog as the destination for your output and have it manage writing the data out into files.

Pushing control out also means pushing out your means of detecting errors in the process.
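If you try the syslog route, a minimal sketch using Python's standard logging module might look like the following; the /dev/log socket path and the local0 facility are assumptions that depend on how syslog/rsyslog is set up on your system:

import logging
import logging.handlers

logger = logging.getLogger('file_writer')
logger.setLevel(logging.INFO)

# /dev/log is the usual local syslog socket on Linux; adjust for your system
handler = logging.handlers.SysLogHandler(
    address='/dev/log',
    facility=logging.handlers.SysLogHandler.LOG_LOCAL0)
logger.addHandler(handler)

for x in range(0, 10000):
    # rsyslog would then be configured to split these messages into per-file outputs
    logger.info('file_%d some text', x)

The error-detection caveat above still applies: if rsyslog drops or misroutes messages, the script itself will not notice.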
 

Author Comment

by: Errang Genevre
ID: 40527270
That's true.

The current process runs for 1.5+ hours, but it's writing 500k files and also creating over 2-3 times that number of directories. So I'm just trying to see if it's possible to use a combination of scripts to get it done quicker.

Still haven't found a decent enough solution for the file creation part; but yeah, I still need to consider the error handling part.
 
LVL 81

Expert Comment

by: arnold
ID: 40527284
If you comment out the file creation part and time the processing, does the runtime decrease?

Multiple directories suggest there is some additional logic around directories/file handles?
 

Author Comment

by: Errang Genevre
ID: 40527294
Since there are 500k files (file names are based on a numeric range), the folders are a way to group similar files and keep Unix from screaming at us for having that many files in one location.
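A small sketch of that grouping idea, for anyone reading later; the bucket size of 1000 files per directory is just an assumed value for illustration:

import os

FILES_PER_DIR = 1000  # assumed bucket size; pick whatever keeps directories manageable

def path_for(n):
    # e.g. file 123456 lands in group_123/file_123456.txt
    group = 'group_%d' % (n // FILES_PER_DIR)
    if not os.path.isdir(group):
        os.makedirs(group)
    return os.path.join(group, 'file_%d.txt' % n)

with open(path_for(123456), 'w') as f:
    f.write('some text\n')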
 
LVL 81

Expert Comment

by: arnold
ID: 40528587
Not sure whether running a daemon charged with file creation, to which your processing app would send its output, is a worthwhile thing.

One thing to look at in your existing script is whether it is buffering the output.

I still think that if the processing of the data can be sped up, that will likely be the way to go.
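On the buffering point, one quick thing to try (the 1 MB buffer size below is an arbitrary choice, passed positionally so it also works on Python 2):

# time this against the default-buffered version to see whether buffering matters
with open('file_0.txt', 'w', 1024 * 1024) as f:  # third argument is the buffer size
    for _ in range(1000):
        f.write('some text\n')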
 

Author Comment

by: Errang Genevre
ID: 40528591
Alright, cool; thanks.