Subject: limitation on the length of the output that an "awk" script could produce
Category: Computers > Software
Asked by: harshag-ga
List Price: $3.00
Posted: 13 Jul 2006 07:05 PDT
Expires: 14 Jul 2006 07:12 PDT
Question ID: 745925
Is there a limit on the length of the output that an "awk" script can produce in one single file? I have written an awk script that reads input from 3500 files. Initially, I decided to output the same number of files (one for each input). I ran the script and everything was fine: I generated 3500 files (with the pattern matching I wanted for each input file) and everything seemed perfect. But when I try to output everything into one file, the script runs for the same amount of time as before and produces only a partial result (a result for only, say, 500 of the input files). This made me wonder whether there is any limit on the size of the output an awk script can generate when writing to one single file. (BTW, I was outputting into a .txt file.) Any advice on how to solve this problem will be greatly appreciated. Thanks
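[For reference, the two modes described in the question might look roughly like this. This is a guessed sketch, not the asker's actual script; the file names and the trivial pattern-matching program are invented for illustration.]

```shell
# Invented example inputs standing in for the 3500 real files.
printf 'match me\nskip this\n' > q745925_in1.dat
printf 'skip this\nmatch me too\n' > q745925_in2.dat

# Mode 1: one output file per input file (the approach that worked).
for f in q745925_in1.dat q745925_in2.dat; do
    awk '/match/ { print FILENAME ": " $0 }' "$f" > "$f.out"
done

# Mode 2: all inputs into a single output file (the failing case).
awk '/match/ { print FILENAME ": " $0 }' \
    q745925_in1.dat q745925_in2.dat > q745925_combined.txt
```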
|
There is no answer at this time.
|
Subject: Re: limitation on the length of the output that an "awk" script could produce
From: eiffel-ga on 13 Jul 2006 09:45 PDT
Hi harshag-ga,

It's not clear what the limitation could be here. I'm assuming that your output is not so big that it is running up against filesystem limits (in the gigabyte range); that your disk is not full; and that you're not running this on a VFAT- or FAT-formatted partition.

I'm also assuming you are generating the output for each file and using the shell's append-redirection operator (">>") to concatenate all the outputs into one file. If not, you may wish to try it this way.

You may also gain some clues by examining the output file to see whether you get output for the first few files, the last few files, the smallest files, etc.

Regards,
eiffel-ga
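[A minimal sketch of the per-file-then-append approach eiffel-ga suggests. The input file names are invented for the example; substitute the real 3500 inputs and the real awk program.]

```shell
# Invented example inputs.
printf 'match a\n' > q745925_a.dat
printf 'match b\nskip\n' > q745925_b.dat

: > q745925_all.txt     # start with an empty combined file
for f in q745925_a.dat q745925_b.dat; do
    # Run awk once per input; ">>" appends each run's result to the
    # same file, so no single awk process writes the whole output.
    awk '/match/ { print }' "$f" >> q745925_all.txt
done
```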
Subject: Re: limitation on the length of the output that an "awk" script could produce
From: bozo99-ga on 13 Jul 2006 12:52 PDT
Look into ulimit and getrlimit(). Try to make a large file from a simple awk script with a loop and see whether it fails at the same size. I'd also look at your process with truss (or the equivalent for your system). You might have a problem with the number of open files or something like that.

Also, do you really want to use awk in 2006 when you could use perl, python or some such?
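[The ulimit check and the large-file experiment bozo99-ga describes could be run like this. The output file name is made up; the loop count is arbitrary and only needs to produce a file bigger than the truncated output you are seeing.]

```shell
# Show the shell's per-process file-size limit ("unlimited" means no
# cap; most shells otherwise report the limit in 512-byte blocks).
ulimit -f

# Write a deliberately large file from a trivial awk loop. If your
# real script's output is being cut off by some limit, this file
# should stop growing at roughly the same size.
awk 'BEGIN { for (i = 1; i <= 1000000; i++) print "line", i }' \
    > q745925_bigtest.txt
wc -c q745925_bigtest.txt
```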