New Freecell Solver gcc-4.5.0 vs. LLVM+clang Benchmark

Nadav Har'El nyh at math.technion.ac.il
Mon Jul 19 12:18:45 IDT 2010


On Sun, Jul 18, 2010, Elazar Leibovich wrote about "Re: New Freecell Solver gcc-4.5.0 vs. LLVM+clang Benchmark":
> The standard deviation can give you an estimation of the minimal running
> time. (99.xxxx of the samples are within X standard deviations below the
> average. Pick a high enough xxxx relative to the number of times you'll run
> the software, and you'll get an estimation of the minimum running time
> you'll get).
> 
> The scalar representing the minimum running time has the obnoxious
> misfeature that it doesn't tend to a certain value. So with a small
>...

Usually, the run time of a deterministic program does not behave the way
you describe (i.e., as a Gaussian distribution with values spread around an
average). A deterministic program *DOES* have a clearly defined minimum run
time, which it would in theory achieve on every run, unless it is delayed by
something else - other processes, daemons in the background, or whatever.
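
For instance, a minimal "best of N runs" sketch in Python (the solver
command line and repeat count below are only placeholders - substitute
whatever you are actually benchmarking; Python's own timeit module follows
the same convention of reporting the minimum of its repeats):

    import subprocess
    import time

    # Placeholder command line - substitute the actual solver invocation
    # being benchmarked.
    CMD = ["./fc-solve", "some-deal.txt"]
    REPEATS = 5

    def best_time(cmd, repeats):
        # Run cmd several times and return the minimum wall-clock time.
        # The minimum is meaningful because interference (other processes,
        # cold caches) can only make a run slower, never faster.
        times = []
        for _ in range(repeats):
            start = time.perf_counter()
            subprocess.run(cmd, check=True, stdout=subprocess.DEVNULL)
            times.append(time.perf_counter() - start)
        return min(times)

    if __name__ == "__main__":
        print("best of %d runs: %.3f seconds" % (REPEATS, best_time(CMD, REPEATS)))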

Imagine, for example, that you run a certain program 5 times and get the
times: 20.0, 18.0, 18.1, 27.0, 18.1
Evidently, the first run was slower because things were not in the cache,
and the run that took 27.0 was delayed by some other process in the background
taking up the CPU or disk. The minimum run time, 18.0, is the most interesting
one - it is the time a run would take every time, if things were perfect.
If you average the above numbers, or compute their standard deviation, the
results mostly measure the interference from the two disturbed runs, not the
program itself...
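
To make this concrete, here is the same example worked out with Python's
statistics module, using the five timings above:

    import statistics

    # The five example timings from above, in seconds.
    times = [20.0, 18.0, 18.1, 27.0, 18.1]

    print("min  :", min(times))                          # 18.0  - the undisturbed run time
    print("mean :", round(statistics.mean(times), 2))    # 20.24 - pulled up by the two slow runs
    print("stdev:", round(statistics.pstdev(times), 2))  # ~3.46 - mostly measures the interference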

-- 
Nadav Har'El                        |           Monday, Jul 19 2010, 8 Av 5770
nyh at math.technion.ac.il             |-----------------------------------------
Phone +972-523-790466, ICQ 13349191 |How long a minute depends on what side of
http://nadav.harel.org.il           |the bathroom door you're on.
