IOPS vs. throughput

Can someone describe in a concise manner what the difference between IOPS and throughput is? All the definitions I've seen are clouded. People just parrot the same information without actually explaining what the differences are.
  • post-author-pic
    Eli K
    01-18-2019

  • post-author-pic
    Bob S
    01-18-2019

    Eli, let me see if I can help you out. 


    Throughput - the amount of data moved across a link, measured in bits or bytes per second.
    IOPS - the number of read/write operations (Input/Output) that a storage device can perform in one second.
     
    IOPS is dependent on the hardware: SSD vs. spinning disk, the number of disks in the array, and the type of RAID array being used.

    Of course, the more IOPS you have at your disposal, the more data you can throw at the device, which in turn dictates the available throughput. Once you hit the IOPS ceiling on a device, you will begin to see latency in read/write operations because the storage device cannot keep up with the amount of data being thrown at it.
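    The relationship between the two can be sketched with a little arithmetic: throughput is roughly IOPS multiplied by the size of each I/O operation. The numbers below are made up purely for illustration (a small random-I/O workload vs. a large sequential one), not specs of any real device:

    ```python
    def throughput_mb_per_s(iops, block_size_bytes):
        """Approximate throughput in MB/s from IOPS and the size of each I/O."""
        return iops * block_size_bytes / 1_000_000

    # Hypothetical disk doing 5,000 IOPS of 4 KiB random reads:
    print(throughput_mb_per_s(5000, 4096))        # → 20.48 MB/s

    # Same disk doing 200 IOPS of 1 MiB sequential reads:
    print(throughput_mb_per_s(200, 1024 * 1024))  # → ~209.7 MB/s
    ```

    This is why a device can post impressive MB/s numbers on sequential benchmarks yet still choke on small random I/O: the throughput depends on both the IOPS and the I/O size.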

    Let me give you a bit of a different explanation, as it pertains to cars. IOPS is the number of times per second the pistons can go up and down, which makes the wheels go around. Throughput is the MPH the car can go. So the faster your engine can make the wheels go around, the faster the car can go.

    Does this help? If not, let me know and we can go at it from another angle!


