Sometimes we get the chance to work on a project that starts from scratch. Initially we deal with very small data, i.e. tables with 100 to 100,000 records. With this little data, the application runs very fast. All the web pages load within a fraction of a second, and we think we have developed an efficient, highly optimized application. But once the application goes live and the data volume grows over a couple of years, the same web page may take minutes or even hours to load. At that point, tuning the application is very costly and there are a lot of dependencies to work around.
What is the solution to this problem? The answer is to predict the future data volume during the development phase. Finding the culprit code early in development is much better than leaving it hidden until the data volume grows. Make it a hard rule to never develop the application against a small set of data. Insert test data into the tables whose record counts are likely to increase over time. Around 10 million rows is a good number to reveal the culprit code of the application, though the right figure depends entirely on your prediction of future growth. A simple way to seed a table like that is sketched below.
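Here is a minimal sketch of such a seeding script, assuming a SQLite database and a hypothetical "customers" table; the table name, columns, and row generator are placeholder assumptions to be replaced with your own schema:

import sqlite3
import random
import string

# Hypothetical schema: a table we expect to grow heavily over time.
conn = sqlite3.connect("loadtest.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS customers (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        email TEXT NOT NULL
    )
""")

def random_name(rng):
    # Ten random lowercase letters stand in for realistic names.
    return "".join(rng.choices(string.ascii_lowercase, k=10))

TOTAL_ROWS = 10_000_000   # the ballpark figure suggested above
BATCH_SIZE = 50_000       # insert in batches to keep memory usage flat

rng = random.Random(42)   # fixed seed so the test data is reproducible
for start in range(0, TOTAL_ROWS, BATCH_SIZE):
    batch = [
        (random_name(rng), "user{}@example.com".format(start + i))
        for i in range(BATCH_SIZE)
    ]
    conn.executemany("INSERT INTO customers (name, email) VALUES (?, ?)", batch)
    conn.commit()          # commit per batch instead of one giant transaction

conn.close()

Committing per batch keeps each transaction small, and the fixed seed makes the run repeatable, so a slow query found today can be reproduced tomorrow.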
There are also many load testing tools available in the market, and they are very helpful for figuring out the culprit code. Even without a dedicated tool, a simple timing check against the seeded tables can expose a slow query, as in the sketch below.
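A minimal sketch of such a check, assuming the same hypothetical "customers" table from above, where a missing index on "email" plays the role of the culprit code:

import sqlite3
import time

conn = sqlite3.connect("loadtest.db")

# This lookup looks harmless on 1,000 rows, but without an index on
# "email" it forces a full scan of all 10 million rows.
query = "SELECT id, name FROM customers WHERE email = ?"

start = time.perf_counter()
rows = conn.execute(query, ("user123@example.com",)).fetchall()
elapsed = time.perf_counter() - start

print("{} row(s) in {:.3f}s".format(len(rows), elapsed))

# Adding an index turns the full scan into a fast index lookup:
#   CREATE INDEX idx_customers_email ON customers (email)
conn.close()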