
Perl + Unix: File upload security best practices

tomaugerdotcom
Hi Everyone,

I'm a veteran Perl programmer, but fairly new to Unix security. I have been using the CGI module for a long time to handle uploads, but haven't really paid much attention to the security issues... until now.

This question deals specifically with Unix, users, groups, file permissions and directory locations. I have no problem making uploads work just fine, but I'm wondering what the best practices are.

Here is my specific question:

- I am not a privileged user on this Linux system
- My Perl scripts all run as user 'apache', which is a member of group 'apache'
- Any directories or files I create myself are created with owner 'tom' and group 'dev'
- I do not have permissions to chown or chgrp
- I always upload to a directory I have created called 'uploads'
- Uploads fail unless 'uploads' directory is chmodded to 777 <-- SECURITY RISK

Since I cannot 'chgrp apache uploads', which would at least allow me to 'chmod 775 uploads', what else can I do?

Thanks for your helpful advice in advance,

Tom
Richard Davis, Senior Web Developer

Commented:
That is a significant barrier to work around. Traditionally, it's best to have newly uploaded files scanned by the resident anti-virus/rootkit detection system. If everything checks out, the file is then moved to a secure directory and operated on via internal methods.
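For what it's worth, here is a minimal sketch of that scan-then-move step, assuming ClamAV's clamscan is available on the box; the paths and file names are hypothetical:

#!/usr/bin/perl
# Hypothetical sketch: scan a just-uploaded file with clamscan before
# moving it out of the quarantine area. Paths are placeholders.
use strict;
use warnings;
use File::Copy qw(move);

my $quarantine = '/home/tom/quarantine/upload.tmp';   # where the CGI first writes the file
my $safe_dir   = '/home/tom/uploads';                 # final, non-world-writable location

# clamscan exits 0 when the file is clean and 1 when a virus is found.
my $status = system('clamscan', '--no-summary', $quarantine);
if ($status == 0) {
    move($quarantine, "$safe_dir/upload.dat")
        or die "Could not move scanned file: $!";
} else {
    unlink $quarantine;   # discard anything that fails the scan
    die "Upload rejected by virus scan\n";
}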

Your situation definitely imposes a level of restriction that makes such practices difficult.
I rent a hosting server where I have full root access to the OS, which gives me the flexibility to manage file security at a much higher level.

My primary concern, if I were in your position, would be getting newly uploaded files scanned upon arrival. This is critical. Obviously, this applies to any OS that is exposed to public resources.

Your inability to even chmod or chown the files is also a major concern, as it leaves you with no recourse.

~A~

Author

Commented:
Thanks for the confirmation, Adrian. I also like your idea of scanning the files. However, my primary concern here is finding an alternative to chmod 777 that will allow my Perl script to still create files (and subdirectories if necessary) within an 'uploads' folder that's NOT world read-write-executable.

Thanks!

Tom
Richard Davis, Senior Web Developer

Commented:
I would also structure your Perl upload script so that it does some level of preliminary validation on the incoming file. Even if the file is binary, i.e. an image, check that the file name does not start with symbols. A good example of that would be not letting someone upload a file called "--recursive".

Guess what would happen if you later ran "rm *" in that directory to clean it up: the shell would expand that name, and rm would treat it as its --recursive option.

Additionally, you could perform checks on the content of the file to confirm its type. If it's supposed to be a JPEG, read the header and determine whether it's a genuine JPEG.
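A rough sketch of those two checks; the helper names are mine and purely illustrative:

#!/usr/bin/perl
# Hypothetical sketch of the preliminary checks described above:
# reject file names that start with a dash (option-injection risk)
# and verify a claimed JPEG really begins with the JPEG magic bytes.
use strict;
use warnings;

sub looks_safe_name {
    my ($name) = @_;
    return 0 if $name =~ /^-/;          # would be read as an option by shell tools
    return 0 if $name =~ /[^\w.\-]/;    # allow only word characters, dots and dashes
    return 1;
}

sub is_real_jpeg {
    my ($path) = @_;
    open my $fh, '<:raw', $path or return 0;
    read($fh, my $magic, 3) or return 0;
    close $fh;
    return $magic eq "\xFF\xD8\xFF";    # JPEG files start with FF D8 FF
}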

These steps can be performed at your level of access, but anything much beyond that would require a hosting account with far fewer restrictions.

The two most significant security risks you have are that your uploads directory is world-writable and world-executable, and that your Perl scripts run under the apache user and group. This leaves your system at high risk of malicious Perl execution.

Bottom line: you need greater control as a shell user. Your provider is understandably security-conscious, but this sounds far too extreme; it denies users even the most rudimentary control that would not compromise administrative-level security.

~A~

Author

Commented:
Hi Adrian. Actually, the files get compressed, and the file name is replaced with an alphanumeric hash; the only thing that's kept is the file extension, and that is run through a filter so only certain file types are allowed.
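For reference, a minimal sketch of that kind of whitelist-and-rename step (the allowed-type list and helper name here are just illustrative):

#!/usr/bin/perl
# Hypothetical sketch: keep only a whitelisted extension and replace
# the rest of the name with a hash.
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

my %allowed = map { $_ => 1 } qw(jpg jpeg gif png pdf zip);

sub hashed_name {
    my ($original_name) = @_;
    my ($ext) = $original_name =~ /\.([A-Za-z0-9]+)$/;
    $ext = lc($ext // '');
    die "File type not allowed\n" unless $allowed{$ext};
    my $hash = md5_hex($original_name . time() . $$);
    return "$hash.$ext";
}

print hashed_name("holiday photo.JPG"), "\n";   # prints a 32-character hash plus ".jpg"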

So I think that part is covered.

Adrian, I hear you on contacting the admins. Unfortunately, we're talking a community college server here so it's very unlikely that they'll make any exceptions for me.

I'm looking for suggestions on where to put the uploads folder, and how to allow Perl to write to that folder without making it world read-write-execute.

Anyone?

Thanks,

Tom
Richard DavisSenior Web Developer

Commented:
Well, my final piece of advice would be to locate the uploads folder outside of the web tree so that it's not directly accessible. Security through obscurity is not good, but it will hold the dam temporarily.
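To illustrate, a rough CGI.pm sketch that writes an upload to a directory outside the web tree; the directory path and form field name are placeholders, not something from your setup:

#!/usr/bin/perl
# Sketch of a CGI.pm upload handler writing outside the web tree.
# Adjust $upload_dir to wherever the apache user can be given write
# access without going world-writable.
use strict;
use warnings;
use CGI;

my $upload_dir = '/home/tom/private_uploads';   # outside www/html, hypothetical path

my $q  = CGI->new;
my $fh = $q->upload('userfile')
    or die "No file uploaded\n";

# Never trust the client-supplied name; write to a name you generate.
my $target = "$upload_dir/upload_" . time() . ".dat";

open my $out, '>:raw', $target or die "Cannot write $target: $!";
while (read($fh, my $buffer, 8192)) {
    print {$out} $buffer;
}
close $out;

print $q->header('text/plain'), "Upload stored outside the web tree\n";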

Good luck.

~A~
Gabriel Orozco, Solution Architect
Commented:
Given the situation you are in right now, these are my recommendations:
- Create a separate file system mounted without the executable attribute. /tmp/yourfolder could be enough. This also keeps uploads limited in size, and a separate /tmp filesystem is the best thing to have on a production Linux box anyway. If you do not have that, you can ask root to create a file and mount it as a separate filesystem with the noexec option; that will work just as well.
  In this case, if the mount point is owned by root and the group 'apache', set permissions 2775 on the directory at mount time and make sure files end up at 664 with apache:apache as owner:group.

- Ask root to create a crontab that takes these files from the temporary folder (owned apache:apache) and moves them to the final directory with the correct permissions and ownership, say wwwdata:wwwgroup, with permissions 664 for files and 775 for directories. That is enough for apache to read them, but without write permission (a rough sketch of such a mover script follows this list).

- Never, never leave the files on your web server owned by the same user that runs apache. That's why I recommend creating something like wwwdata:wwwgroup.
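A rough Perl sketch of what such a root cron job could run; the paths, user and group names simply follow the examples above and are not from a real setup:

#!/usr/bin/perl
# Hypothetical mover script meant to be run from root's crontab:
# take files apache wrote into the noexec staging area, re-own them
# to wwwdata:wwwgroup, fix permissions and move them into the web tree.
use strict;
use warnings;
use File::Copy qw(move);

my $staging = '/tmp/yourfolder';          # noexec filesystem apache writes to
my $final   = '/var/www/html/uploads';    # destination inside the web tree

my $uid = getpwnam('wwwdata');
my $gid = getgrnam('wwwgroup');
defined $uid && defined $gid or die "wwwdata:wwwgroup not found\n";

opendir my $dh, $staging or die "Cannot read $staging: $!";
for my $name (grep { -f "$staging/$_" } readdir $dh) {
    my $src = "$staging/$name";
    my $dst = "$final/$name";
    move($src, $dst)       or warn "move $name: $!" and next;
    chown $uid, $gid, $dst or warn "chown $name: $!";
    chmod 0664, $dst       or warn "chmod $name: $!";
}
closedir $dh;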

Hope this helps you. I have this kind of setup running on very high-traffic and very sensitive web sites without problems.

Author

Commented:
@Redimido: thanks for the detailed response.

So, just to make sure I understand what you're suggesting:

1. Create a directory outside the apache www/html structure
2. Set the directory to 775, owner root, group apache
3. Perl creates files and folders in this directory

Now this is the part I don't understand. Why are we using an external process, like a crontab, to then move the files to the directory within the www/html structure? Is that because the crontab operates under a different user than Perl?

And what is the danger of keeping files in the web/html structure with owner/group the same as apache?

Thanks for your answers.

Tom
Gabriel Orozco, Solution Architect

Commented:
Hi

Yes, you got it correctly.

The danger of having files owned by the same user that runs the apache server AND your CGIs is that *any* intrusion through apache/perl leaves every file writable (remember, the user running the apache/perl CGIs is the owner of all the files).

The best thing to do is to have all files owned by some other user but fully readable by the apache process. Apache should only be able to write in one place. That place must be limited in size, not executable (so nobody can upload a binary and try to execute it), and cleaned up according to some rules by a crontab process.

Author

Commented:
Gotcha.

Now, doesn't the requirement of running the crontab mean that there will be delays between the moment the file is uploaded and the moment it becomes available (for example, say we were doing a photo gallery - the user uploads a photo, and then expects that photo to be immediately displayed in his slideshow)?

And what do we do if we also want the user to be able to delete files?

Thanks for your insights so far,

Tom
Gabriel Orozco, Solution Architect

Commented:
You are correct. This approach is not the best if you are working with a live upload/delete scenario.

For the static-files scenario:
The crontab runs every minute. If that is too long, you can always use a process that polls every 5 seconds and does the same thing (sketched below).
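A minimal sketch of that kind of polling wrapper; the mover-script path is a placeholder:

#!/usr/bin/perl
# Hypothetical polling alternative to cron: run the same mover logic
# every 5 seconds instead of once a minute.
use strict;
use warnings;

while (1) {
    system('/usr/local/bin/move_uploads.pl');   # placeholder path to the mover script
    sleep 5;
}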

If you need a live scenario like a photo album, you would need to validate the files, their size, etc., keep all the data on a special filesystem, and have everything controlled by a database.