Aidy_B
asked on
Can you FTP only a set number of files?
I have a location that will have lots of files in it at any one time (all CSV files), but I only want to FTP approx. 10 files at a time (the 10 oldest files), using a wildcard as the filenames will always be different. Does anyone know if this is possible and, if so, how?
cheers,
aidy.
Ignore that last "extract001.csv". It was off my screen when I clicked submit.
If the files are not named as drawlin suggested, then there is no way to get just 10 files with a plain wildcard. You can write a small shell script (if Linux) or a small C program that can do this quite easily.
You can create a batch file (.bat file); in it, the FTP session can use the command mget *XXX*.csv to download multiple files.
In Windows you can then schedule this batch file to be executed at specific intervals.
Need more clarifications?
Regards,
Nilesh.
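As a sketch of the batch-file idea (host, login, and the *XXX*.csv wildcard below are placeholders to replace with real values), the script writes an FTP command file that the command-line ftp client can replay:

```shell
#!/bin/sh
# Write an ftp command file; host, credentials, and the *XXX*.csv
# pattern are placeholders for this example.
workdir=$(mktemp -d)
cat > "$workdir/ftp_cmds.txt" <<'EOF'
open ftp.example.com
user myuser mypassword
binary
prompt
mget *XXX*.csv
bye
EOF
# "prompt" disables per-file confirmation so mget runs unattended.
# On Windows, a .bat file would replay the commands with:
#   ftp -s:ftp_cmds.txt
# and Task Scheduler can run that .bat at the desired interval.
# ftp -n < ftp_cmds.txt   # uncomment to actually run the transfer
```

The transfer itself is left commented out so the sketch can be inspected safely before pointing it at a real server.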
ASKER
Hi.
Apologies to everyone for the lack of detail. I was in a bit of a hurry and just wanted to get the ball rolling, if you know what I mean. Here goes for an in-depth explanation:
CSV files are produced on a remote server with random numeric filenames (random to all intents and purposes, anyway), approximately 8 characters long. These files are then required by a different system, but need to be processed by that system in the date/time order in which they were produced. That system is fed the files by a scheduled FTP operation which supplies them to a PC/PLC setup. The PC/PLC setup modifies the files by cutting some of the data out of them, then combines them all into a single text file, which is pulled into the PLC by another FTP operation. As the PLC then uses the file to populate a finite number of data blocks, it is imperative that the number of CSV files used to create the txt file is not so large that the PLC runs out of data blocks to populate, which would bring the whole system into a crash-and-burn situation.
I have managed to create a setup that will FTP to the PC, modify the files, combine them, and put the finished txt file in the correct location for the PLC, with the added safety that it will not repeat this operation until it has detected that the previous txt file no longer exists (the PLC deletes the file upon successfully loading it). The bit I am trying to achieve is limiting each FTP operation (and hence each txt file) to only 10 or so of the original files, so that the PLC can be told to FTP a new file in only if it has at least 10 data blocks free to populate. But to add to the problem, these will need to be the next 10 in the sequence of the times they were created.
The system creating the files is beyond my control, and if for any reason the process the PLC runs were to stop or slow down, the number of files available to FTP the next time the PLC is ready to upload could be quite large, as the files will be produced regardless of the process's current status.
Another acceptable approach is if the limit of 10 files can be achieved on the PC, by transferring only that many out of the main folder receiving the FTP files into a different folder, which can then be used to feed the conversion process. Once again, though, it would need to be in order of date of creation.
As an end note, none of the CSV files produced in the first place have filenames that would be known in advance; they are just a build code the system generates (so basically the filenames mean nothing to anyone other than the system itself).
If anyone has managed to read down as far as this... well done, and thank you for just that effort alone, let alone if anyone actually has a solution.
thanks in advance,
aidy.
aidy
again, there is no single command you can execute to make this happen. However, a small script can do it quite easily.
The solution would be:
1. ls -altr will sort files by modification time (the closest standard Linux gets to a creation date), earliest to latest.
2. awk can then strip out just the file names:
ls -altr | awk '{ print($9) }'
Now you have all the files listed by date.
Next you take the top 10 names, move them out of the original folder, and place them into a new one.
I guess the other end will then just pick up all the files from the new folder.
I can write a small bash script that does this for you; it is fairly simple.
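A minimal bash sketch of those steps (directory names `incoming` and `staging` are made up for the example, and it simulates 15 arriving files so the move can be seen end to end):

```shell
#!/bin/bash
# Move the 10 oldest files (by modification time) out of an incoming
# folder into a staging folder. Directory names are examples only.
set -e
workdir=$(mktemp -d)
cd "$workdir"
mkdir incoming staging

# Simulate 15 CSV files arriving over time, with distinct timestamps.
for i in $(seq 1 15); do
    touch -d "2020-01-$(printf '%02d' "$i")" "incoming/file$i.csv"
done

# ls -1tr: one name per line, sorted by modification time, oldest first.
# head -n 10 keeps the 10 oldest; each is then moved to staging/.
ls -1tr incoming | head -n 10 | while read -r f; do
    mv "incoming/$f" staging/
done

echo "staged: $(ls staging | wc -l), remaining: $(ls incoming | wc -l)"
# prints: staged: 10, remaining: 5
```

Note it keys off modification time, not true creation time, and assumes filenames without embedded newlines, which holds for the 8-character build codes described above.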
ASKER
hi periferral,
I am interested in this solution, but where you mention taking the top ten files into another folder, is this done by script? I ask as the whole process runs unattended 24/7 and needs to be able to operate with no human intervention.
thanks again, aidy.
ASKER CERTIFIED SOLUTION
ASKER
I am not sure that it will be permissible for me to run this on the server, but I will find out in the morning. If not, is there any way of doing it on the local PC after I have pulled the files off the server? (The local machine runs Windows XP.)
cheers, aidy
aidy.. you can do the same thing from the local machine. However, this will not be as simple as executing it on the remote server. I mean, you will need a script, in Tcl or Perl maybe, that can do something similar: the problem is that you need to FTP to the server, get the directory listing, and then get only 10 files at a time, process them, and return. Using the FTP client and processing the information is harder, since FTP does not have all the commands that a basic Linux machine has for such processing, so you will need to code in that logic.
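One way that client-side logic could look (the listing below is simulated, and the host/login in the generated command file are placeholders): save a date-sorted directory listing from the server, take the first 10 names, and build a second ftp command file that fetches exactly those files.

```shell
#!/bin/bash
# Sketch of the client-side approach: given a remote directory listing
# sorted oldest-first (here simulated), pick the 10 oldest names and
# generate an ftp command file fetching only those files.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Simulated output of a remote "ls -tr"-style listing, oldest first,
# using made-up 8-character build-code names.
printf '%s\n' a1b2c3d4.csv e5f6a7b8.csv 9c0d1e2f.csv 3a4b5c6d.csv \
    7e8f9a0b.csv c1d2e3f4.csv 5a6b7c8d.csv 9e0f1a2b.csv \
    3c4d5e6f.csv a7b8c9d0.csv e1f2a3b4.csv > listing.txt

{
    echo "open ftp.example.com"       # placeholder host
    echo "user myuser mypassword"     # placeholder credentials
    head -n 10 listing.txt | while read -r f; do
        echo "get $f"                 # one explicit get per old file
    done
    echo "bye"
} > fetch10.txt
# ftp -n < fetch10.txt   # would perform the actual transfer
```

The 11th name in the listing is deliberately left out of the generated commands, which is the "only 10 at a time" behaviour being asked for.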
ASKER
hello again.
Apologies for the delay. This problem is twisting and turning a bit, but the solutions offered have given the necessary food for thought for now. I am going to close the question and sort out the finer details at a later date, when the project is being finalised. Thanks for all the input. Points are going to periferral, as the solutions given are of the most use at this time.
Aidy.
Basic Example:
file names are
extract001.csv
extract002.csv
extract003.csv
extract004.csv
extract005.csv
extract006.csv
extract007.csv
extract008.csv
extract009.csv
extract010.csv
extract011.csv
You could get 001 through 009 with:
ftp> mget extract00*.csv
extract001.csv