Brent Simmons provides possibly the best description for laypeople that I’ve ever seen of the joy and agony of debugging software:
If it took 10 minutes to reduce memory usage by 96MB, then — were there a linear relationship — it should take under half a minute to go the rest of the way.
That’s the way things work in the real world, after all. If you have 100 bags of leaves to carry out to the curb, each bag will take about the same amount of time as the others.
Instead, it will probably take me about two hours, maybe more, to get rid of that last 4MB.
It’s almost as if you carried out 96 bags of leaves easily and quickly, then realized you can’t get those last four bags, even though they’re exactly the same as all the others, without pouring a new driveway first.
Which is crazy, right? If the real world operated like that all the time, we’d go completely nuts.
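For what it’s worth, the linear extrapolation in the quote checks out. A quick back-of-the-envelope sketch, using only the numbers Simmons gives (10 minutes for the first 96MB, 4MB left):

```python
# Back-of-the-envelope check of the "were there a linear relationship" claim.
minutes_spent = 10   # time it took to shave off the first 96 MB
mb_reduced = 96
mb_remaining = 4

# Seconds per MB at the observed rate, then the naive estimate for the rest.
seconds_per_mb = minutes_spent * 60 / mb_reduced
estimate_seconds = seconds_per_mb * mb_remaining

print(f"Linear estimate for the last {mb_remaining} MB: {estimate_seconds:.0f} seconds")
```

Which comes out to 25 seconds — comfortably under half a minute, exactly as the quote says. The whole point, of course, is that debugging effort is anything but linear.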