Extremely silly math musing
Jan. 22nd, 2004 10:56 am

Question:
If the popular version of Moore's law (computing speed doubles every 18 months) were to apply continuously (which it doesn't), at what point does a process become not worth hitting start on?
That is, if I have a program that takes 80 years to run, I shouldn't start it now. I can get it done in 23 years by waiting 3 years: that's two doublings, so the program will only take 20 years, finishing 3 + 20 = 23 years from now. So at what point (expressed as current time to completion) should I hit start?
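The arithmetic in the 80-year example can be checked with a short sketch. This is just a model of the setup as stated (one machine, bought at start time, under continuous Moore's law); the function name and parameters are mine, not from the post.

```python
def total_time(runtime_now, wait, doubling=1.5):
    """Years from today until the job finishes, if we wait `wait` years
    before starting a job that would take `runtime_now` years on today's
    hardware, with speed doubling every `doubling` years."""
    return wait + runtime_now / 2 ** (wait / doubling)

# The example from the text: an 80-year job, started after a 3-year wait
# (two doublings), shrinks to 80/4 = 20 years and finishes in 3 + 20 = 23.
print(total_time(80, 3))  # 23.0
```

Plotting or minimizing this function over `wait` is one way to find the break-even point the question asks about, so it may be worth trying before clicking through.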
I have an answer, but I'd like someone to check my math. So if you're interested in trying it yourself, do so before clicking on the cut-tag.
( What I have )