A recent post on The Daily WTF reminded me of something - specifically, the most basic rule of making things happen more quickly.
The particular problem is unlikely to be encountered by anyone building the vast majority of conceivable software thingies, but that doesn't matter - the example is still instructive.
For those not following the link, the basic problem is to find the lowest common multiple of the contiguous integers from 1 to X inclusive. The specific case sets X = 20.
The brute force solution (as opposed to the "I'm a moron" solution presented in the link) is to try every integer from 1 onwards. For all those thinking that such an approach is acceptable, think again.
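For concreteness, here is a minimal sketch of that brute force (in Python; this is my own code, not anything from the linked post):

```python
# A rough sketch of the brute force: walk every integer from 1 upwards until
# one divides evenly by everything in the range 1..X.
def lcm_brute_force(x):
    n = 1
    while not all(n % i == 0 for i in range(2, x + 1)):
        n += 1
    return n

print(lcm_brute_force(10))  # 2520 - for X = 20 this loop grinds for a very long time
```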
Seriously, you have two important facts in front of you:
* The answer has to be evenly divisible by the largest integer in the range presented (we'll call this value Y).
* The answer has to be evenly divisible by the largest prime in the range presented (we'll call this value Z).
Taking the linked example, the largest number is 20, and the largest prime is 19.
So the lowest possible candidate is the largest integer in the range multiplied by the largest prime in the range - 20 * 19 = 380. And the search can step by 19 - the largest prime - rather than by 1. For anyone who queries this, please feel free to leave comments - I like comments and I like maths (yes, I am weird and insane).
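Roughly what that reduced search looks like (again a sketch in Python; the function name and structure are mine, not the post's):

```python
# Start at Y * Z (the lowest possible candidate) and step by Z, the largest
# prime, instead of 1. Divisors are checked from largest to smallest so that
# non-candidates fail fast.
def lcm_reduced_search(x, largest_prime):
    y, z = x, largest_prime
    n = y * z                                    # e.g. 20 * 19 = 380
    while not all(n % i == 0 for i in range(x, 1, -1)):
        n += z                                   # step by the largest prime
    return n

print(lcm_reduced_search(20, 19))  # 232792560 - still a search, but far fewer candidates than stepping by 1
```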
From such a simple observation you can easily go further, but a large reduction in the search space has already been achieved with very little effort. In terms of return as a function of effort, a local maximum has most probably been reached.
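As one illustration of going further (this is entirely my own extension, well beyond the step described above): using lcm(a, b) = a * b / gcd(a, b), the search disappears altogether.

```python
# Folding the range with lcm(a, b) = a * b // gcd(a, b) - my own extension of
# the "go further" remark, not something described in the post.
from functools import reduce
from math import gcd

def lcm_fold(x):
    return reduce(lambda a, b: a * b // gcd(a, b), range(1, x + 1), 1)

print(lcm_fold(20))  # 232792560, with no candidate-testing loop at all
```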
The moral of the diatribe is similarly simple - the basis of all optimisation is to do less work on a net basis. The old-school world of precalculating sine and cosine values is long past, as is designing systems around binary values, but reusing values and avoiding duplicated and unneeded work is just as valid as ever.