Hi all. I suppose this question could have gone in the Linux forum just as well, but time really is a general UNIX command, if I'm not mistaken. I've been benchmarking (or trying to) a quick little program I wrote (call it "benchmark") that takes one command-line integer. I'm still rather unfamiliar with UNIX / Linux platforms, though, so I'm not entirely sure what all of the output the time command throws back at me actually represents. I run this command (where n is some integer):
time benchmark n
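For context, the work being timed is essentially a tight CPU-bound loop over n. Here's a simplified, hypothetical sketch of that kind of loop (the real code differs, and the name busy_work is made up), in case it matters for interpreting the numbers:

```c
#include <stddef.h>

/* Stand-in for the benchmarked work: a tight CPU-bound loop
   with no I/O, so nearly all of the elapsed time should be
   spent executing the program's own instructions. */
double busy_work(long n)
{
    volatile double acc = 0.0;  /* volatile keeps the loop from being optimized away */
    for (long i = 1; i <= n * 1000000L; i++)
        acc += 1.0 / (double)i;
    return acc;
}
```

Since it's pure computation with no file or terminal I/O to speak of, I'd expect the program to spend essentially all of its time on the CPU.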
Here are some examples of output that I've gotten back:
1.676u 0.002s 0:01.67 100.0% 0+0k 0+0io 0pf+0w
1.697u 0.011s 0:01.70 100.0% 0+0k 0+0io 0pf+0w
1.704u 0.011s 0:01.71 100.0% 0+0k 0+0io 0pf+0w
1.735u 0.011s 0:01.74 100.0% 0+0k 0+0io 0pf+0w
Each run prints seven fields. I believe the first is the time it took to execute the program, in seconds (although I don't know why it's followed by a 'u'), but I don't know what the last six represent, and the man page didn't make much sense to me. What does all that mumbo-jumbo mean? Could someone please explain the remaining six output fields?