Startup Idea: Software for File-Level Performance Monitoring
I worked as a software quality assurance director at a small IT shop that hosted widgets for big-time websites like Forbes.
The name of the game was speed. Time was literally money when it came to how fast a widget could grab some data and present it on the page. Some of the technologies at play were IIS, JavaScript, SQL, JSON, and PHP. Most of my time was spent testing individual front-ends, back-ends, and APIs.
Testing front-ends is mostly a subjective user-experience exercise, but it can eat up time if, say, someone put a large image (e.g., a JPG) out there at 300 dpi instead of 72 dpi. Back-ends and APIs usually involved gathering and presenting data using SQL queries, HTML/CSS, JSON, and so on.
There are a ton of test tools and custom scripts out there for load and speed testing, but I never found what I really needed: something that would tell me which front-ends, back-ends, APIs, databases, and graphics were being hit the most, file by file. I needed that information to know where to focus my attention.
I would have liked to turn an app loose on a live server and have it generate a report telling me which files and images were getting the most hits, their load times, and whatever else is measurable (e.g., time to first byte, or TTFB).
I would like to see this at a file-by-file level, as a kind of background process that monitors without using many resources itself.
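As a rough sketch of what I have in mind, the script below aggregates per-file hit counts and latency from an IIS W3C extended log. It assumes the optional `time-taken` field (in milliseconds) is enabled in IIS logging; the log file name is hypothetical, and a real tool would tail the logs continuously rather than read one after the fact.

```python
# Minimal sketch: per-file hit counts and latency from an IIS W3C log.
# Assumes the "time-taken" field (milliseconds) is enabled; field
# positions are read from the "#Fields:" header, not hard-coded.
from collections import defaultdict

def report(log_path, top_n=20):
    fields = []
    stats = defaultdict(lambda: {"hits": 0, "total_ms": 0})

    with open(log_path, encoding="utf-8") as log:
        for line in log:
            if line.startswith("#Fields:"):
                fields = line.split()[1:]   # field names, in log order
                continue
            if line.startswith("#") or not line.strip():
                continue                    # other directives / blank lines
            row = dict(zip(fields, line.split()))
            entry = stats[row.get("cs-uri-stem", "?")]
            entry["hits"] += 1
            entry["total_ms"] += int(row.get("time-taken", 0))

    # Rank by total time consumed: hits and average latency in one number.
    ranked = sorted(stats.items(), key=lambda kv: kv[1]["total_ms"], reverse=True)
    print(f"{'hits':>8} {'avg ms':>8} {'total ms':>10}  file")
    for uri, s in ranked[:top_n]:
        print(f"{s['hits']:>8} {s['total_ms'] / s['hits']:>8.1f} "
              f"{s['total_ms']:>10}  {uri}")

if __name__ == "__main__":
    report("u_ex250101.log")  # hypothetical log file name
```

Ranking by total time rather than raw hit count is the point: a file that is hit moderately often but is slow every time can cost more than the most popular file on the server.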
It would be nice if it had some intelligence and did some of the detective work itself. Ultimately, it could present graphs and pretty much grab someone and say, "Hey, take a look at this time-consuming resource monster!"
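A hedged sketch of that flagging behavior, building on the `stats` dictionary from the log-parser sketch above (the 10% share cutoff is an arbitrary assumption, not a recommendation):

```python
# Flag any file whose share of total server time crosses a threshold.
def resource_monsters(stats, share_threshold=0.10):
    grand_total = sum(s["total_ms"] for s in stats.values()) or 1
    for uri, s in sorted(stats.items(),
                         key=lambda kv: kv[1]["total_ms"], reverse=True):
        share = s["total_ms"] / grand_total
        if share >= share_threshold:
            print(f"Hey, take a look at this resource monster: {uri} "
                  f"({share:.0%} of server time across {s['hits']} hits)")

# Example with made-up numbers:
resource_monsters({
    "/widgets/feed.php": {"hits": 51200, "total_ms": 9_400_000},
    "/img/hero.jpg":     {"hits": 48000, "total_ms": 2_100_000},
    "/api/quotes":       {"hits": 900,   "total_ms":   120_000},
})
```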