<div dir="ltr"><div class="gmail_quote">On Mon, Jul 19, 2010 at 11:18 AM, Nadav Har'El <span dir="ltr"><<a href="mailto:nyh@math.technion.ac.il">nyh@math.technion.ac.il</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex;">
> On Sun, Jul 18, 2010, Elazar Leibovich wrote about "Re: New Freecell Solver gcc-4.5.0 vs. LLVM+clang Benchmark":
<div class="im">> The standard deviation can give you an estimation of the minimal running<br>
> time. (99.xxxx of the samples are within X standard deviations below the<br>
> average. Pick a high enough xxxx relative to the number of times you'll run<br>
> the software, and you'll get an estimation of the minimum running time<br>
> you'll get).<br>
><br>
> The scalar representing the minimum running time have the obnoxious<br>
> mis-feature that it doesn't tend to a certain value. So with a small<br>
</div>>...<br>
>
> Usually, the run time of a deterministic program does not behave the way
> you describe (i.e., as a gaussian distribution, with samples spread around
> an average). A deterministic program *DOES* have a clearly defined minimum
> run time, which it will theoretically achieve on every run, unless it is
> delayed by something else - other processes, daemons in the background,
> or whatever.
>
> Imagine, for example, that you run a certain program 5 times and get the
> times: 20.0, 18.0, 18.1, 27.0, 18.1
> Evidently, the first run was slower because things were not in the cache,
> and the run that took 27.0 was delayed by some other process in the
> background taking up the CPU or disk. The minimum run time, 18.0, is the
> most interesting one - it is the time a run would take every time, if
> things were perfect. If you average the above numbers, or find the
> standard deviation, etc., the numbers would not be very interesting...

I take your point for a very simple program like the freecell solver, which
is mainly CPU- and memory-bound. Thanks for the input.
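For reference, a minimal harness for this take-the-minimum approach might
look like the sketch below (Python; the fc-solve command name and board
file are illustrative placeholders, not the actual benchmark invocation):

    #!/usr/bin/env python3
    # Minimal sketch of the "take the minimum" approach: run the same
    # command several times and report the fastest run, treating anything
    # above it as noise from cold caches, daemons and other processes.
    # The command and input file are placeholders.
    import subprocess
    import time

    CMD = ["fc-solve", "board.txt"]  # placeholder command and input
    RUNS = 5

    samples = []
    for _ in range(RUNS):
        start = time.perf_counter()
        subprocess.call(CMD, stdout=subprocess.DEVNULL)
        samples.append(time.perf_counter() - start)

    print("samples:", ", ".join("%.1f" % t for t in samples))
    print("minimum: %.1f" % min(samples))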
However, I believe that any sufficiently complicated program has a fairly
large randomness factor. Consider, for example, the physical disk layout
during the current run, which is affected by the long history of disk usage
over the last year. Or the memory allocation, which is affected by the
current memory layout, which in turn depends on the allocation history of
this particular computer. I'm not even talking about GC, which is highly
non-deterministic but still needs to be taken into account (if I'm
considering software with a GC, I want to measure its performance including
GC runs, since the probability that the GC will run depends on how well I
wrote my code). These are all random (or hard-to-predict) factors which can
affect the running time of even a seemingly deterministic program.
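To make the estimate I suggested earlier concrete, here is a quick sketch
(again Python, and assuming a gaussian model which, as you point out, may
not hold) of the mean - k*sigma lower bound, applied to your example series:

    # Sketch of the mean - k*sigma estimate of the minimal running time,
    # assuming (perhaps wrongly) a gaussian distribution of run times.
    import statistics

    samples = [20.0, 18.0, 18.1, 27.0, 18.1]  # the example series above
    k = 3  # ~99.87% of a gaussian lies above mean - 3*sigma

    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    print("mean = %.2f, stddev = %.2f" % (mean, sigma))
    print("estimated minimum (mean - %d*sigma) = %.2f" % (k, mean - k * sigma))
    print("observed minimum = %.1f" % min(samples))

On that series the gaussian estimate (about 8.6) lands well below the
observed minimum of 18.0, which indeed illustrates your point for a
CPU-bound program; my claim is only that for messier programs the minimum
itself stops being a stable number.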

> --
> Nadav Har'El                        | Monday, Jul 19 2010, 8 Av 5770
> nyh@math.technion.ac.il             |-----------------------------------------
> Phone +972-523-790466, ICQ 13349191 |How long a minute depends on what side of
> http://nadav.harel.org.il           |the bathroom door you're on.