<html style="direction: ltr;">
<head>
<meta content="text/html; charset=utf-8" http-equiv="Content-Type">
<style type="text/css">body p { margin-bottom: 0.2cm; margin-top: 0pt; } </style>
</head>
<body style="direction: ltr;" bidimailui-charset-is-forced="true"
bgcolor="#FFFFFF" text="#000000">
<div class="moz-cite-prefix">On 10/08/15 17:32, Matan Ziv-Av wrote:<br>
</div>
<blockquote cite="mid:alpine.LFD.2.20.1508101654450.4095@matan"
type="cite"><br>
At the end of this message is a python program with simple
implementations of both algorithms (Gauss elimination and
recursive). Run them for sizes 10 to 16 consecutively to see how
significant the difference between exponential and polynomial is
in this case.
<br>
</blockquote>
1. Ask for help finding tools on mailing list<br>
2. Get Oleg to question the viability of the suggested solution<br>
3. Matan writes the tool you need for you<br>
4. ????<br>
5. Profit<br>
<br>
<br>
<blockquote cite="mid:alpine.LFD.2.20.1508101654450.4095@matan"
type="cite">
The optimizations you suggested are possible (for both
algorithms), but they are all just scaling the time needed. Even
if you speed up by a factor of 10^9, the recursive algorithm will
just not work.
<br>
</blockquote>
I have not delved too deeply into the "det" calculation, but
wouldn't caching sub-results cause an actual reduction in the
algorithm's complexity?<br>
<br>
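To make the caching idea concrete, here is a minimal sketch (my own
illustration, not Matan's program) of a memoized cofactor expansion.
Keying sub-determinants by the set of remaining columns means each of
the 2^n column subsets is computed once, instead of being revisited
n! times by the naive recursion:<br>

```python
from functools import lru_cache

# Sketch only: memoized cofactor expansion of an integer determinant.
# Sub-determinants are keyed by the tuple of remaining columns, so the
# number of distinct subproblems drops from n! to 2^n.
def det_memoized(matrix):
    n = len(matrix)

    @lru_cache(maxsize=None)
    def det(cols):
        row = n - len(cols)  # rows are consumed top-down
        if not cols:
            return 1
        total = 0
        for i, c in enumerate(cols):
            minor = cols[:i] + cols[i + 1:]  # drop column c
            total += (-1) ** i * matrix[row][c] * det(minor)
        return total

    return det(tuple(range(n)))
```

Each subproblem costs O(n) work, so the whole thing is O(n * 2^n) --
still exponential, but a very different beast from n!.<br>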
I once wrote a program for solving Nonograms
(<a class="moz-txt-link-freetext" href="https://en.wikipedia.org/wiki/Nonogram">https://en.wikipedia.org/wiki/Nonogram</a>) where adding caching of
intermediate results caused a genuine complexity reduction. A riddle
previously solved in ~40 minutes was, after introducing caching,
solved in 150 milliseconds!<br>
<br>
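The same trick in miniature (a hypothetical helper, not my original
solver): counting the ways a single Nonogram row can satisfy its clue
sequence. Memoizing the (position, clue-index) states collapses an
exponential recursion into one pass over a polynomial number of
states:<br>

```python
from functools import lru_cache

# Sketch only: count the fillings of a nonogram row of `length` cells
# that satisfy the clue sequence `clues` (run lengths, left to right).
def count_row_fills(length, clues):
    @lru_cache(maxsize=None)
    def ways(pos, ci):
        if ci == len(clues):
            return 1  # no runs left; remaining cells stay blank
        run = clues[ci]
        total = 0
        # try every start position where the next run still fits
        for start in range(pos, length - run + 1):
            end = start + run
            # runs need a blank separator unless this one ends the row
            if ci == len(clues) - 1 or end < length:
                total += ways(end + 1, ci + 1)
        return total

    return ways(0, 0)
```

Without the cache, the same (pos, ci) pairs are recomputed over and
over; with it, there are only O(length * len(clues)) states in total.<br>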
I even considered giving a Haifux lecture titled "Complexity: why
programmers ignore it, and why it still matters".<br>
<br>
Shachar<br>
</body>
</html>