The Fallacy of Using IOPS to Measure I/O Performance

Presented by

Sushant Rao, Sr. Director of Product Marketing, DataCore Software

About this talk

Surveys show that 61% of companies have experienced slow applications after server virtualization, with 77% pointing to I/O problems as the culprit. Companies are therefore eager to improve their I/O performance, which raises the question of what measure IT departments should use to evaluate storage products’ performance. The most common metric is IOPS. But is it the best one? IOPS depends on a number of factors, including the type of workload being simulated, read/write percentages, sequential/random percentages, and block size. Most importantly, IOPS says nothing about response time. So what metric should be used to measure I/O performance? In this webinar, we’ll discuss latency curves, which show the response time an application sees as the storage product handles more and more I/O (measured in IOPS). This view helps explain why some storage products can claim a high number of IOPS yet still perform poorly under peak demand.
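As a rough illustration of the idea (a sketch, not material from the talk), the snippet below uses Little's Law, where throughput equals concurrency divided by response time, with invented queue-depth and latency numbers to show how the same headline IOPS figure can hide very different response times. All names and values are hypothetical.

```python
# Hypothetical measurements: (outstanding I/Os, avg response time in ms).
# Values are invented purely to illustrate how a latency curve is read.
samples = [(1, 0.5), (8, 0.6), (32, 1.0), (128, 2.5), (512, 10.0)]

def latency_curve(points):
    """Derive (IOPS, latency_ms) pairs from queue depth and response time."""
    curve = []
    for depth, latency_ms in points:
        iops = depth / (latency_ms / 1000.0)  # Little's Law: throughput = concurrency / latency
        curve.append((round(iops), latency_ms))
    return curve

for iops, latency_ms in latency_curve(samples):
    print(f"{iops:>7} IOPS at {latency_ms:5.1f} ms response time")
```

With these made-up numbers, the system reports roughly the same peak IOPS at 2.5 ms as it does at 10 ms of latency, so a spec sheet quoting only the peak IOPS figure tells you nothing about the response time applications would actually see at that load.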

More from this channel

Organizations around the world are experiencing the benefits of increased performance, continuous business operations and lower costs through server consolidation and virtualization. Many of them are now evaluating Hyper-converged systems and Software-Defined Storage to develop their next-generation storage architectures. Join this channel to hear industry and technical experts discuss the value of Hyper-converged and Software-Defined Storage and how you can take advantage of these technologies.