How do I reconcile the IOPS value reported by the OS with the one reported by the storage array? For example, I'm writing 8 KB blocks of data using the dd command on Linux:
dd if=/dev/zero of=/opt/rg/perf/testing123 bs=8k count=30000000
245760000000 bytes (246 GB) copied, 334.014 s, 736 MB/s
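While the dd is running, the requests actually hitting the block device can be watched with iostat from sysstat (sdX here is a placeholder for whatever device backs /opt/rg/perf):

iostat -x sdX 1
# w/s      - write requests per second actually issued to the device
# wkB/s    - write throughput in KB/s
# wareq-sz - average write request size in KB (older sysstat versions
#            show avgrq-sz, in 512-byte sectors, instead)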
So the OS-side IOPS value is 30000000 / 334 ≈ 89,820.
But on the storage array I'm only seeing around 2000 IOPS. If I understand correctly, the writes get converted to a different block size somewhere along the path, and the array calculates its IOPS from that, is that right?
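For what it's worth, a rough back-calculation from the numbers above (assuming dd's 736 MB/s and the array's ~2000 IOPS cover the same interval) suggests each I/O the array counts is far bigger than 8 KB:

awk 'BEGIN { printf "%.0f KiB per I/O\n", 245760000000 / 334 / 2000 / 1024 }'
# prints ~359, i.e. roughly 45 of my 8 KB writes merged into one array I/O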