It is possible to pick 7 lattice points (i,j) with 1 <= i <= 7 and
1 <= j <= 7 such that all (7*6)/2 = 21 distances between the points
are distinct. One possible selection is the set (1,1), (1,3), (2,3),
(3,7), (4,1), (6,6), and (7,7). The distances between these points
are the square roots of 1, 2, 4, 5, 8, 9, 10, 13, 16, 17, 20, 25, 29,
34, 37, 40, 41, 45, 50, 52, and 72.
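To double-check the example, one can compare squared distances (keeping everything in exact integers, so no floating-point issues) against the list above. A minimal Python sketch, not part of the original discussion:

```python
from itertools import combinations

# The 7-point example from above, in a 7x7 grid.
points = [(1, 1), (1, 3), (2, 3), (3, 7), (4, 1), (6, 6), (7, 7)]

# Work with squared distances so everything stays an exact integer.
sq_dists = sorted((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
                  for a, b in combinations(points, 2))

print(len(sq_dists), len(set(sq_dists)))  # 21 21 -> all 21 distances distinct
print(sq_dists)  # matches the list of squares above, 1 through 72
```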
Your task is to pick as many lattice points (i,j) as possible with
1 <= i <= 19 and 1 <= j <= 19 such that all distances between the
chosen lattice points are distinct.
My task, on the other hand, is to try to improve upon my previous
record. ;-)
It's been several years since I first encountered this problem, so
computing power may have progressed far enough since then to make
short work of it. If that proves to be the case, then I'll simply
increase the lattice size and we can all have fun finding the best
solution. :-)
Something tells me that a brute force search will be terribly slow.
C will probably help here; it is roughly 10 times faster than Perl for some
kinds of tasks. Other scripting languages may be even slower.
I will start with Perl (probably tomorrow, no time today) to see whether
this task is solvable at all, then move to C. :) Of course, real speed
comes from a better algorithm, not from switching the programming language.
After choosing a good algorithm (more efficient than plain brute force),
I spent an hour implementing it in Perl. The script may be found in the
usual place: ~migo/math/scripts/DistinctDistanceLatticePoints .
It takes less than a minute on a good machine to list all solutions for N=7.
I see some optimizations that may improve the speed a bit, but not too much.
The good thing is that the script uses constant memory, around 2-3 MB, with no leaks.
I really don't want to rewrite it in C, that would be a real pain, so
I would rather just wait for the N=19 results. :)
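For reference, the pruned search described above can be sketched roughly like this (my own Python reconstruction, not the actual Perl script): extend a partial solution point by point in lexicographic order, and backtrack as soon as a new point would repeat an already-used squared distance.

```python
def best_distinct_set(n):
    """Largest point set found in an n x n grid with all pairwise
    distances distinct (exhaustive backtracking, exact for small n)."""
    pts = [(i, j) for i in range(1, n + 1) for j in range(1, n + 1)]
    best = []

    def extend(chosen, used, start):
        nonlocal best
        if len(chosen) > len(best):
            best = chosen[:]
        for k in range(start, len(pts)):
            p = pts[k]
            # Squared distances from the candidate to every chosen point.
            ds = [(p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 for q in chosen]
            # Prune unless all new distances are fresh and mutually distinct.
            if len(set(ds)) == len(ds) and used.isdisjoint(ds):
                extend(chosen + [p], used | set(ds), k + 1)

    extend([], set(), 0)
    return best

print(len(best_distinct_set(5)))  # 5, matching S = N for N <= 7
```

The pruning is what makes this feasible: a 6th point in a 5x5 grid would need 15 distinct squared distances, but only 14 values of dx^2 + dy^2 exist there, so whole subtrees die immediately.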
This seems to be the best number of points S for the given lattice size N:
N<=7 S=N
N=8 S=7
N=9 S=8 <- took 50 minutes to prove :(
N=10 S>=9
N=11 S>=9
N=12 S>=10
N=13 S>=10
N=14 S>=11
N=15 S>=11
N=16 S>=12
N=17 S>=12
N=18 S>=13
N=19 S>=13
I think I need to eliminate symmetric solutions, this should speed it up.
It is possible that the solution for N=19 will be better; I should wait. :)
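One way to eliminate mirrored and rotated duplicates (my own sketch, not what the script currently does): map every solution to a canonical representative under the 8 symmetries of the square grid, the dihedral group D4, and keep only the canonical ones.

```python
def canonical(points, n):
    """Canonical form of a point set in an n x n grid under the
    8 symmetries of the square (4 rotations x optional reflection)."""
    transforms = [
        lambda x, y: (x, y),                  # identity
        lambda x, y: (y, n + 1 - x),          # rotate 90
        lambda x, y: (n + 1 - x, n + 1 - y),  # rotate 180
        lambda x, y: (n + 1 - y, x),          # rotate 270
        lambda x, y: (y, x),                  # reflect about main diagonal
        lambda x, y: (n + 1 - x, y),          # reflect horizontally
        lambda x, y: (x, n + 1 - y),          # reflect vertically
        lambda x, y: (n + 1 - y, n + 1 - x),  # reflect about anti-diagonal
    ]
    return min(tuple(sorted(t(x, y) for x, y in points)) for t in transforms)

sol = [(1, 1), (1, 3), (2, 3), (3, 7), (4, 1), (6, 6), (7, 7)]
mirrored = [(8 - x, y) for x, y in sol]  # a horizontal mirror of sol
print(canonical(sol, 7) == canonical(mirrored, 7))  # True
```

Since the 8 transforms form a group, every mirrored or rotated variant of a solution maps to the same canonical tuple, so comparing canonical forms collapses each family of up to 8 solutions into one.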
I was away when it printed a solution with 14 points for N=19.
* * . * . . . * . . . . * . . . . . .
. . . . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . . . * .
. . . . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . * . . .
. . . . . . . . . . . . . . . . . . *
. . . . . . . . . . . . . . . . . . .
. . . * . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . . . . .
. . . . . . . . * . . . . . . . . . .
* . . . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . * . . .
. . . . . . . . . . . . . . . . . . .
. . . . . . . . . . . . . . . . . * .
. . . . . . . . . . . . * . . . . . .
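The posted grid is easy to machine-check. A small verifier sketch (mine, not part of migo's script), reading '*' as a chosen point at column i, row j, both 1-based:

```python
from itertools import combinations

grid = """\
**.*...*....*......
...................
.................*.
...................
...................
...............*...
..................*
...................
...*...............
...................
...................
...................
...................
........*..........
*..................
...............*...
...................
.................*.
............*......
"""

# Column i, row j, both 1-based, as in the puzzle statement.
points = [(i + 1, j + 1)
          for j, row in enumerate(grid.splitlines())
          for i, c in enumerate(row) if c == '*']

sq_dists = [(a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
            for a, b in combinations(points, 2)]

print(len(points))                          # 14
print(len(sq_dists) == len(set(sq_dists)))  # True -> all 91 distances distinct
```

The 14 points give C(14,2) = 91 pairwise distances, and they all come out distinct, confirming the solution.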
As you may see from the solution, the search is still far, far from
completion. Jason, what is your best result? Then I will know whether
to start rewriting the algorithm or not.
The best result found by any competitor in that puzzle was 14 lattice
points. The solution I have on hand matches yours, although I believe the
solution I submitted was slightly different, probably because my algorithm
was randomized to some degree. But I can't verify that since I don't have
my old email, or even my program for that particular puzzle. :(
At any rate, I don't know whether 14 is the largest set of points one can
find.
Ok, then I will not rewrite the algorithm, at least right now. :)
If there is one solution, then I suppose there are hundreds of solutions
with 14 points. Too bad, I ran it with the VERBOSE=1 parameter instead of
VERBOSE=2, so on that run I will not see any more 14-point solutions, only
15-point ones if any exist.
Any info about that contest you mentioned several times?
P.S. Does anyone have access to a supercomputer (1 THz or greater)? :)
This looks like a very interesting problem. I agree that a brute force
search looks intimidatingly slow. I suspect an exhaustive search of a
19x19 grid will prove to be intractable, at least with the algorithm
Mikhael posted. It's entirely possible that there doesn't exist any
algorithm that can solve this conclusively in a "reasonable" amount of
time.
Injecting some randomness into the search, as Jason's original algorithm
did, might speed up the finding of near-optimal solutions, but this
wouldn't help in proving that any of the found solutions are truly optimal.
If I ever have time to try writing a program to solve this (a dubious
prospect at best, given the limited amount of time I have), I may take a
slightly different approach from migo's script, just to get a different
perspective. Namely, I may try to find the smallest NxN grid that fits
a given number S of points.
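That inverted search could look like this (my own sketch, with hypothetical `fits` and `smallest_grid` helpers): for a target count s, try grids of increasing size until a backtracking feasibility check succeeds.

```python
def fits(n, s):
    """True if some s points in an n x n grid have all pairwise
    squared distances distinct (simple backtracking)."""
    pts = [(i, j) for i in range(1, n + 1) for j in range(1, n + 1)]

    def extend(chosen, used, start):
        if len(chosen) == s:
            return True
        for k in range(start, len(pts)):
            p = pts[k]
            ds = [(p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 for q in chosen]
            if len(set(ds)) == len(ds) and used.isdisjoint(ds):
                if extend(chosen + [p], used | set(ds), k + 1):
                    return True
        return False

    return extend([], set(), 0)

def smallest_grid(s):
    """Smallest n such that an n x n grid fits s such points."""
    n = 1
    while not fits(n, s):
        n += 1
    return n

print(smallest_grid(5))  # 5: five points need C(5,2) = 10 distinct distances
```

The small cases rule themselves out quickly: a 4x4 grid offers only 9 possible squared-distance values, fewer than the 10 that 5 points require.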
Jason, I'm curious, exactly how did you inject randomness into your search?
Did you repeatedly pick random points from the set of remaining valid
points?
> Any info about that contest that you mentioned several times?
My college's computer science department held semester-long competitions
while I was there, with roughly one problem per month. This lasted about
4 semesters, until participation dwindled. At the end of each semester,
the person with the most points would win a book of their choice. Somehow
I managed to fend off the graduate students enough times to earn the first
three volumes of Knuth's TAOCP. :^)
> Jason, I'm curious, exactly how did you inject randomness into your search?
> Did you repeatedly pick random points from the set of remaining valid
> points?
At least some points were randomly chosen. I can't remember if I did this
for all points, or just several of them. I figured that would give me a
leg up on programs that systematically searched the solution space. My
assumption was that with such a huge search space, good (optimal?)
solutions probably weren't "close to the beginning" or "clumped together".
So far, in this particular case it appears that my assumption wasn't
necessarily true. :-)
Jason, I think your assumption is indeed incorrect. :)
For example, for N=5 there are 280 solutions with 5 points.
Run this on freeshell to see these solutions:
~migo/math/scripts/DistinctDistanceLatticePoints 5 2
This runs 3 seconds for me, so it should not load freeshell (but please
don't run the script for more than half a minute, use your home machine).
I extended my script to properly gather full statistics about all existing
solutions with the given N. This was not a trivial task at all, since the
purpose was not to report any partial solutions (the ones that are included
in other solutions) and only to report the optimal solutions, i.e. the ones
that can't be improved by adding more points. I needed to solve some
combinatorial tasks to derive all partial solutions from each optimal one.
The bad thing is that with this optional full-statistics switch the script
now consumes a lot of memory, since it has to store all partial solutions
(and there is a huge number of them) to avoid reporting them.
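The combinatorial bookkeeping boils down to this: every nonempty proper subset of a valid solution inherits the distinct-distance property, so each optimal solution with k points forces 2^k - 2 subsets to be suppressed. A sketch of that enumeration (my own illustration, not the script's actual code):

```python
from itertools import combinations

def partial_solutions(solution):
    """All nonempty proper subsets of a solution; each one still has
    all distances distinct, so none should be reported separately."""
    pts = sorted(solution)
    for size in range(1, len(pts)):
        for subset in combinations(pts, size):
            yield frozenset(subset)

# The known 7-point solution for the 7x7 grid.
sol = [(1, 1), (1, 3), (2, 3), (3, 7), (4, 1), (6, 6), (7, 7)]
partials = set(partial_solutions(sol))
print(len(partials))  # 2**7 - 2 = 126 subsets to suppress
```

This makes the memory blow-up plausible: the number of stored subsets grows exponentially in the solution size, across every optimal solution found.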
But the results look really interesting. For example, for 8x8 there are
26800 solutions using 7 points, and for 7x7 there are 8 solutions using 7
points. In fact there is essentially only one solution for 7x7, since the
other 7 are just mirror images of it (my script reports all solutions,
even mirrored ones).
You may see these interesting results here:
~migo/math/solutions/DistinctDistanceLatticePoints
~migo/math/solutions/DistinctDistanceLatticePoints-N=5-full
~migo/math/solutions/DistinctDistanceLatticePoints-N=7-best
~migo/math/solutions/DistinctDistanceLatticePoints-N=10-best
I am still running the N=19 search; it has been running for 3 days and
there are no new results besides the posted S=14 solution. My rough
estimate is that if a solution with S=15 exists, it may be reported in a
month. I don't think I want to run it for that long. :) It is a completely
CPU-bound task; the memory is constant, about 14 MB for N=19, and 3 MB for
N=10. The jump here is only because of the way Perl allocates big hashes
to improve speed.
I am also running the full-stats search for N=10 (that PC has 2 CPUs); it
has been running for 7 hours now and consumes more than 1 GB, since there
are many billions of solutions, optimal and partial. I hope it will finish
today, so I can add the stats to solutions/DistinctDistanceLatticePoints.
I consider this CPU-consuming puzzle more or less solved; I don't see
better ways to attack it at the moment. It was fun to solve.