apgsearch: a high-performance soup searcher

For general discussion about Conway's Game of Life.
calcyman
Moderator
Posts: 2929
Joined: June 1st, 2009, 4:32 pm

apgsearch: a high-performance soup searcher

Post by calcyman » September 8th, 2014, 9:14 am

Greetings.

Over the last few weeks, I've been writing a Golly Python script along the lines of Nathaniel's soup-searcher and Andrzej Okrasinski's screensaver. The emphasis is on performance (100 soups per second per GHz) and reliability (correctly identifies at least 99.9999999% of objects generated) rather than simplicity.

Anyway, it seems to be in a beta-releasable state, so here's a ZIP file containing the .py script together with a bunch of .rule files used by the script.

To use this search program, execute Golly and run the .py script. In order to utilise your full CPU capacity, run n instances of Golly where n is the number of cores, each running an instance of the .py script. Modern desktop PCs (such as a Dell Optiplex 9020) can easily manage over 1000 soups per second.

You are prompted to enter a seed (which defaults to the ISO timestamp) when you run the script. The soups are SHA-256 hashes based on this seed, so including your username in the seed will give a proof-of-work that you discovered the soup yourself. It automatically identifies which soups are the most interesting (in terms of rarity of objects yielded) and lists the top 50 soups in the HTML output.
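To be concrete, here's a sketch of the derivation (not the exact script code, but it mirrors the hashing used by the census routines later in this thread: two digest bytes per 16-cell row, most significant bit first; the function name is just for illustration):

Code: Select all

```python
import hashlib

def soup_from_seed(seed, soupnum):
    """Derive a 16x16 soup from SHA-256(seed + soup number)."""
    digest = hashlib.sha256((seed + str(soupnum)).encode()).digest()
    rows = []
    for j in range(0, 32, 2):
        # two digest bytes give one 16-bit row, most significant bit first
        bits = (digest[j] << 8) | digest[j + 1]
        rows.append(''.join('o' if bits & (1 << (15 - k)) else '.'
                            for k in range(16)))
    return rows

for row in soup_from_seed('2014-09-08', 1):
    print(row)
```

The same seed always regenerates the same soups, which is why a soup number alone suffices to reproduce any result.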

If you go into ~/.golly/apgsearch/progress (or C:\Users\Adam\AppData\Roaming\Golly\apgsearch\progress if you're using Windows), you can find a list of .txt files (one for each search) containing census results. In a future version, I'll be adding functionality to read, verify and aggregate these files, so that you can accumulate .txt files from multiple machines and combine the census results.

Happy searching! Comments and suggestions for improvement are of course very welcome.
Attachments
apgsearch-2014-09-08-beta.zip
apgsearch v0.3 (beta release)
(19.84 KiB) Downloaded 796 times
What do you do with ill crystallographers? Take them to the mono-clinic!

Lewis
Posts: 337
Joined: March 17th, 2009, 5:26 pm
Location: UK

Re: apgsearch: a high-performance soup searcher

Post by Lewis » September 8th, 2014, 2:19 pm

I've been running this for most of the afternoon; nothing groundbreaking yet, just the occasional tripole and figure-8. A few random questions about the program:
  • Is there an easier way to convert the object codes (like "0g8ka9m88gz122dia521" etc.) into the object itself? I can work it out for individual objects, but it takes a while (drawing a little grid on graph paper, filling in the object one line at a time, then guessing what 'x' and 'z' do).
  • If a wickstretcher was ever to occur in this, would it be picked up the same way as a puffer/switch engine despite its output being one solid thing?
  • Are the rules "EradicateInfection", "HandlePlumes" and "PercolateInfection" rule-independent? With a bit of tinkering I've got the census to work in B35/S23 but as that's quite similar to Life not much needed to be changed. If I was to attempt setting the rule to something very different, like Day&Night or B3/S2456, would these still run OK?
EDIT: One final thing: at the top of the results page it says "Census results after ### soups (with # resets):". What does it mean by 'resets'?

calcyman
Moderator
Posts: 2929
Joined: June 1st, 2009, 4:32 pm

Re: apgsearch: a high-performance soup searcher

Post by calcyman » September 8th, 2014, 3:13 pm

Oh, I think I should probably remove the mention of 'resets'. Basically, testing whether a soup has stabilised occasionally returns false positives, which are picked up in a later 'error-correcting' stage. This then backs out the latest addition to the census (of 100 soups), runs all of the soups for an extra 2^18 generations in HashLife, and then repeats the census process. At one point I wanted to keep track of the number of resets (they happen roughly every 3000000 soups) to ensure that they wouldn't cause a substantial performance hit (since each reset is a rather costly operation, taking about a minute).

Everything is deeply hardcoded and optimised for B3/S23 in multiple places (both in the rule tables and throughout the script), so this script won't work in other rules.

And yes, any infinite-growth patterns (except for the two natural switch engines) would appear on the 'unidentified objects' list at the end of the census, and marked as 'PATHOLOGICAL' in the HTML file. The soup containing the infinite-growth pattern would be awarded 50 points, and thus jump to the top of the 'interesting soups' list in the HTML file. So this would find wickstretchers, guns, puffers, slideguns, spacefillers, et cetera.

You can indeed convert objects from Extended Wechsler Notation into a graphical representation; this is enabled for oscillators and spaceships but disabled for still-lifes to avoid cluttering the census results. It's probably possible to change one line of the Python script to enable it for still-lifes. Yes, w = 00, x = 000, y? = 00...00, z = newline.
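For reference, the scheme can be sketched as a standalone decoder (a hypothetical helper, not the script's own routine, but it assumes the same character values as the script's decoding code quoted later in this thread: '0'-'9' and 'a'-'v' are 5-bit columns, and 'w'/'x'/'y'/'z' take the values 32/33/34/35):

Code: Select all

```python
CHARS = "0123456789abcdefghijklmnopqrstuvwxyz"

def decode_object(code):
    """Return the set of live (x, y) cells for one object code."""
    cells = set()
    x = 0        # current column within the strip
    strip = 0    # each strip covers rows 5*strip .. 5*strip + 4
    i = 0
    while i < len(code):
        c = CHARS.index(code[i])
        if c < 32:          # ordinary character: a 5-bit column of cells
            for j in range(5):
                if c & (1 << j):
                    cells.add((x, 5 * strip + j))
            x += 1
        elif c == 35:       # 'z': start the next strip of five rows
            strip += 1
            x = 0
        else:               # 'w' = 2 blank columns, 'x' = 3 blank columns
            run = 2 if c == 32 else 3
            if c == 34:     # 'y' plus next char d = (4 + d) blank columns
                i += 1
                run = 4 + CHARS.index(code[i])
            x += run
        i += 1
    return cells

print(sorted(decode_object("33")))   # the block: [(0, 0), (0, 1), (1, 0), (1, 1)]
```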

U03BB
Posts: 4
Joined: September 8th, 2014, 4:21 pm

Re: apgsearch: a high-performance soup searcher

Post by U03BB » September 8th, 2014, 4:28 pm

I'm a longtime lurker, and I assumed I would stay that way.

I downloaded the script and tried to test it; unfortunately, it led to a complete freeze of Golly. (Next time, I will start it from the command line to get error messages.)

In the latest results, however, the least interesting soup in the eyes of the script was this gem:

Code: Select all

.o.o.o....o....o
o.oo...o..o..ooo
..o.o.oooo..o...
..o..ooo.o..o..o
.......oooo.o...
..o..o...o....o.
.oo.oo.oooo.o...
ooo.o....oo..oo.
ooo..oo.o...o...
o...o.....o...oo
..o.....oo.o.o.o
...oooo.o..o..oo
o...o..oo...oo..
o.oo...ooo.o..o.
oooo.....o..o.oo
..ooo.ooo..ooo.o
 

calcyman
Moderator
Posts: 2929
Joined: June 1st, 2009, 4:32 pm

Re: apgsearch: a high-performance soup searcher

Post by calcyman » September 8th, 2014, 5:20 pm

unfortunately, it led to a complete freeze of Golly.
That's very bizarre behaviour; what specification is your machine (and what seed did you use)? It very occasionally (once every few hours, on average) stalls for about a minute whilst it performs error-correction; it's possible you were just extremely unfortunate to have one of those reasonably early.

U03BB
Posts: 4
Joined: September 8th, 2014, 4:21 pm

Re: apgsearch: a high-performance soup searcher

Post by U03BB » September 8th, 2014, 5:55 pm

calcyman wrote:
unfortunately, it led to a complete freeze of Golly.
That's very bizarre behaviour; what specification is your machine (and what seed did you use)? It very occasionally (once every few hours, on average) stalls for about a minute whilst it performs error-correction; it's possible you were just extremely unfortunate to have one of those reasonably early.
It is most vexing, but I cannot reproduce it.

I had 3 or 4 consecutive crashes the first few times I tried, and now nothing: it runs smoothly for long stretches without any glitches, with both one and two instances of Golly.

I recall that I used a seed along the lines of 2014-09-08U03BB. I had 2 instances of Golly running, one with the script running and one to play around with my results. Also, I had Chromium and Iceweasel open.

The machine is a T410i with an i5 M430 and an nVidia NVS 3100M, running Debian Sid, up to date.

I will file a bug report if something like that happens again.

Andrew
Moderator
Posts: 911
Joined: June 2nd, 2009, 2:08 am
Location: Melbourne, Australia

Re: apgsearch: a high-performance soup searcher

Post by Andrew » September 8th, 2014, 9:09 pm

Amazing work Adam -- well done!

Minor glitch: the final soup.save_progress call needs to be done before the final g.show call (otherwise the status bar ends up showing "Writing progress file..." when the script ends).

I've added below what I think is a nicer display_census routine. It has these changes:

- Truncates very long object names so the table doesn't get too wide (it's very annoying to have to resize the help window to see the entire table width).

- Instead of using "lexpatt:" links it uses "open:" links. Objects are permanently saved in apgsearch/objects/ and soups are saved in a temporary directory (which is deleted when Golly quits). This makes the table more compact -- ie. much less scrolling to see the results.

Code: Select all

    # Display results in Help window:
    def display_census(self, numsoups, root):

        dirname = g.getdir("data")
        separator = dirname[-1]
        apgpath = dirname + "apgsearch" + separator
        objectspath = apgpath + "objects" + separator
        if not os.path.exists(objectspath):
            os.makedirs(objectspath)

        results = "<html>\n<title>Census results</title>\n<body bgcolor=\"#FFFFCE\">\n"
        results += "<p>Census results after processing " + str(numsoups) + " soups:\n"

        tlist = sorted(self.objectcounts.iteritems(), key=operator.itemgetter(1), reverse=True)    
        results += "<p><center>\n"
        results += "<table cellspacing=1 border=2 cols=2>\n"
        results += "<tr><td>&nbsp;Object&nbsp;</td><td align=center>&nbsp;Common name&nbsp;</td><td align=right>&nbsp;Count&nbsp;</td></tr>\n"
        for objname, count in tlist:
            if (objname[0] == 'x'):
                if (objname[1] == 'p'):
                    results += "<tr bgcolor=\"#CECECF\">"
                elif (objname[1] == 'q'):
                    results += "<tr bgcolor=\"#CEFFCE\">"
                else:
                    results += "<tr>"
            else:
                results += "<tr bgcolor=\"#FFCECE\">"
            results += "<td>"
            results += "&nbsp;"
            
            # Using "open:" link enables one to click on the object name to open the pattern in Golly:
            rlepath = objectspath + objname + ".rle"
            results += "<a href=\"open:" + rlepath + "\">"
            if len(objname) > 50:
                # truncate long name and append ellipsis
                results += objname[:50] + "&#8230;"
            else:
                results += objname
            results += "</a>"
            results += "&nbsp;"
            
            # save object in rlepath if it doesn't exist
            if not os.path.exists(rlepath):
                rledata = "x = 0, y = 0, rule = B3/S23\n"
                # http://ferkeltongs.livejournal.com/15837.html
                compact = objname.split('_')[1] + "z"
                i = 0
                strip = []
                while (i < len(compact)):
                    c = ord2(compact[i])
                    if (c >= 0):
                        if (c < 32):
                            # Conventional character:
                            strip.append(c)
                        else:
                            if (c == 35):
                                # End of line:
                                if (len(strip) == 0):
                                    strip.append(0)
                                for j in xrange(5):
                                    for d in strip:
                                        if ((d & (1 << j)) > 0):
                                            rledata += "o"
                                        else:
                                            rledata += "b"
                                    rledata += "$\n"
                                strip = []
                            else:
                                # Multispace character:
                                strip.append(0)
                                strip.append(0)
                                if (c >= 33):
                                    strip.append(0)
                                if (c == 34):
                                    strip.append(0)
                                    i += 1
                                    d = ord2(compact[i])
                                    for j in xrange(d):
                                        strip.append(0)
                    i += 1
                # End of pattern representation:
                rledata += "!\n"
                try:
                    f = open(rlepath, 'w')
                    f.write(rledata)
                    f.close()
                except:
                    g.warn("Unable to create object pattern:\n" + rlepath)
            
            results += "</td><td align=center>&nbsp;"
            if (objname in self.commonnames):
                results += self.commonnames[objname][0]
            results += "&nbsp;</td><td align=right>&nbsp;" + str(count) + "&nbsp;"
            results += "</td></tr>\n"
        results += "</table>\n</center>\n"

        ilist = sorted(self.soupscores.iteritems(), key=operator.itemgetter(1), reverse=True)
        results += "<p><center>\n"
        results += "<table cellspacing=1 border=2 cols=2>\n"
        results += "<tr><td>&nbsp;Soup number&nbsp;</td><td align=right>&nbsp;Score&nbsp;</td></tr>\n"
        for soupnum, score in ilist[:50]:
            results += "<tr><td>&nbsp;"
            
            # soup pattern will be stored in a temporary directory
            # (but we must replace illegal file name characters with hyphens)
            rlepath = root + str(soupnum)
            rlepath = rlepath.replace(":","-")
            rlepath = rlepath.replace("/","-")
            rlepath = rlepath.replace("\\","-")
            rlepath = g.getdir("temp") + rlepath + ".rle"
            
            results += "<a href=\"open:" + rlepath + "\">"
            results += root + str(soupnum)
            results += "</a>"
            results += "&nbsp;</td><td align=right>&nbsp;" + str(score) + "&nbsp;</td></tr>\n"
            
            rledata = "x = 0, y = 0, rule = B3/S23\n"
            hashdigest = hashlib.sha256(root + str(soupnum)).digest()
            for j in xrange(32):
                t = ord(hashdigest[j])
                for k in xrange(8):
                    if (t & (1 << (7 - k))):
                        rledata += "o"
                    else:
                        rledata += "b"
                if ((j % 2) == 1):
                    rledata += "$\n"
            rledata += "!\n"
            try:
                f = open(rlepath, 'w')
                f.write(rledata)
                f.close()
            except:
                g.warn("Unable to create soup pattern:\n" + rlepath)
        
        results += "</table>\n</center>\n"
        results += "</body>\n</html>\n"
        
        htmlname = apgpath + "latest_census.html"
        try:
            f = open(htmlname, 'w')
            f.write(results)
            f.close()
            g.open(htmlname)
        except:
            g.warn("Unable to create html file:\n" + htmlname)
Use Glu to explore CA rules on non-periodic tilings: DominoLife and HatLife

Lewis
Posts: 337
Joined: March 17th, 2009, 5:26 pm
Location: UK

Re: apgsearch: a high-performance soup searcher

Post by Lewis » September 9th, 2014, 5:06 am

Does the script detect bipoles properly? I haven't seen any occur so far (despite a fair number of tri/quadpoles), and looking at the 'Routing Logic' section of the script, it seems to list every other 8-cell object apart from the bipole (I think).

calcyman
Moderator
Posts: 2929
Joined: June 1st, 2009, 4:32 pm

Re: apgsearch: a high-performance soup searcher

Post by calcyman » September 9th, 2014, 6:20 am

Lewis wrote:Does the script detect bi-poles properly? I haven't seen any occur so far (despite a fair number of tri/quadpoles), and looking at the 'Routing Logic' section of the script, it has listed every other 8-cell object apart from the bipole (I think).
Argh, that's me being incredibly idiotic and forgetting to include it. Replace:

Code: Select all

                    objid = "xs8_69ic" #mango
with:

Code: Select all

                    if (dbreadth == 3):
                        objid = "xp2_31ago" #bipole
                    else:
                        objid = "xs8_69ic" #mango
Andrew wrote:I've added below what I think is a nicer display_census routine. It has these changes:
Thank you; that's so much more scalable!
U03BB wrote:I recall that I used a seed along the lines of 2014-09-08U03BB. I had 2 instances of Golly running, one with the script running and one to play around with my results. Also, I had Chromium and Iceweasel open.

The system is a T410i with an i5 M430 and an nVidia NVS 3100M. The system is Debian Sid, up to date.
There doesn't seem to be anything strange with your configuration. What version of Python are you using? (It's tested on version 2.7.)

I seem to recall that xrange() malfunctions if you try to feed it a number exceeding 2^31. I should probably alter the script to prevent that from happening.
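The fix I have in mind is along these lines (a sketch with a hypothetical helper name: iterate in fixed-size chunks so that no single xrange/range call ever receives an argument anywhere near 2^31):

Code: Select all

```python
def long_range(n, chunk=1 << 20):
    """Yield 0, 1, ..., n-1 without ever passing a huge value to range."""
    base = 0
    while base < n:
        step = min(chunk, n - base)
        for i in range(step):   # xrange(step) under Python 2
            yield base + i
        base += step

# Safe even for counts well beyond 2**31:
it = long_range(2 ** 40)
print(next(it), next(it), next(it))   # 0 1 2
```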

U03BB
Posts: 4
Joined: September 8th, 2014, 4:21 pm

Re: apgsearch: a high-performance soup searcher

Post by U03BB » September 9th, 2014, 5:09 pm

calcyman wrote:There doesn't seem to be anything strange with your configuration. What version of Python are you using? (It's tested on version 2.7.)

I seem to recall that xrange() malfunctions if you try to feed it a number exceeding 2^31. I should probably alter the script to prevent that from happening.
It is Python 2.7. The script has now run for 24 hours in two instances without failing. If I'm able to reproduce it, I will, but I don't have any idea what I could test.

Thanks for writing the script!

I have some questions:

Is there a function to resume interrupted searches?
Does the (X soups per second) line show an average?
Is there a way to show the soup(s) in which a particular object has arisen? I would like to see, in action, a soup which produces a switch engine.

dvgrn
Moderator
Posts: 10565
Joined: May 17th, 2009, 11:00 pm
Location: Madison, WI

Re: apgsearch: a high-performance soup searcher

Post by dvgrn » September 9th, 2014, 6:28 pm

U03BB wrote:Is there a way to show the soup(s) in which a particular object has arisen? I would like to see, in action, a soup which produces a switch engine.
The switch-engine soup wish is easy to grant, anyway. Try for example running 12,000 soups with a seed of "DaveSoup1" (without the quotes).

You should find that a block-laying switch engine appears in the census results. It's soup #11463 in this group, and being a switch engine it gets a good score, 14. So it's the second soup shown in the HTML results -- "DaveSoup111463" -- below #8022 with the two LWSSes.

Code: Select all

x = 16, y = 16, rule = B3/S23
2obo2b2o3bo2bo$3obob2obob2o2bo$6bob4o3bo$b2obob3o2b2obo$2obobob2ob2o2b
o$o2b4o2b3ob3o$b2o2b4o2bob2o$4b3obo$2bo4bob2o2b2o$o2b2o2bo5b3o$bo3bobo
2bob3o$3o2b3obo2bo2bo$bo2bob3o3b3o$5bob4o2bobo$4o2bob2o$o2bo4b2o4b2o!
The last million-soup run I tried turned up about a dozen of these block-layers, as I recall, and two or three glider-producing switch engines, which seems to be about right: Achim Flammenkamp's experiments showed a ratio of around 1:3. Achim's total number of switch engines seems to have been significantly lower -- maybe because he started with bigger patches of soup? More middle and less edge would tend to crash a lot of switch engines before they can be recognized, I suppose, and/or create a relatively large number of other object types.

Anyway, switch-engine soups are going to be pretty much a dime a dozen now, I would say! I'm getting to be more interested in the idea of "glider-constructible soups", where you hit a small random constellation with a glider, or multiple gliders, and/or a spaceship or two, and classify what comes out. Should be able to get switch engines out of collisions like that, about as often as out of random 16x16 patches -- within an order of magnitude or two, anyway.

Lewis
Posts: 337
Joined: March 17th, 2009, 5:26 pm
Location: UK

Re: apgsearch: a high-performance soup searcher

Post by Lewis » September 10th, 2014, 7:33 am

Probably the rarest oscillator I've had the script turn up so far, this unnamed 19-cell P2:

Code: Select all

x = 16, y = 16, rule = B3/S23
bob3obo3b3o$2ob4ob5o2bo$2ob5o3bo$3bo2bob2o4bo$2o3bobobo4bo$3bo3b2o6bo$
bob2o3b4obobo$o2b4obo3b3o$o2bobob2o2bo3bo$4obob2o2bo2b2o$4obobo2bo4bo$
obo5b3o4bo$b2o2b3obob3o$2bob2o6bobo$o3b2ob2o$3bo9bobo!
Most of the other high-scoring soups were from a combination of a few pulsars and *WSSs.

Also, this (in the rule B35/S23) came as a surprise:

Code: Select all

x = 16, y = 16, rule = B35/S23
4o3b4o3bo$2obo4bo2bo2bo$b2o5b2ob2o2bo$2obo2b2o3b3o$2ob2ob3ob3o2bo$2bo
6b3obo$2o2bo4bobob3o$o3bobo2b6o$2obob3obob4o$2o2b3o2bo2bobo$ob4o3b7o$o
3b2o7b3o$o3b3obobo2bo$6obo2b2o$3o2b3o3b5o$8obo3b2o!
...which made me think, how long until we start seeing soups which produce the c/7 loafer etc. in Life?

Also, anyone got any guesses as to what the next 'natural' spaceship (after the glider, the *WSSs and combinations of 2 interacting *WSSs) will be?

calcyman
Moderator
Posts: 2929
Joined: June 1st, 2009, 4:32 pm

Re: apgsearch: a high-performance soup searcher

Post by calcyman » September 10th, 2014, 11:16 am

Lewis wrote:Most of the other high-scoring soups were from a combination of a few pulsars and *WSSs.
Hmm, yes, I've noticed they seem to dominate the results. As such, I've modified the score table slightly for v0.4 (decreasing *WSS and pulsar by one point each; leaving PD constant; increasing switch engines and anything rarer than the PD by one point).
Lewis wrote:Probably the rarest oscillator I've had the script turn up so far, this unnamed 19-cell P2:
Hmm, yes, maybe I should score unrecognised still-lifes and p2 oscillators based on their size, rather than capping them at 10 and 20 points, respectively.
Lewis wrote:Also, this (in the rule B35/S23) came as a surprise:
Ooh, yes, it looks like conwaylife.com/soup never found any spaceships other than Glider 3736, whereas this is Glider 4793.

http://fano.ics.uci.edu/ca/rules/b35s23/

Very well done on actually managing to get my script to work in a different rule, by the way!

Just out of interest, is there much of a market for searching other rules? I can't handle explosive rules such as B37/S23, and even HighLife would be a massive headache to implement (because chaotic quadratic-growth patterns occur relatively frequently). I'm still slightly tempted to attempt HighLife, though, because it's a beautiful rule and being able to recognise replicators and growing spaceships is a fun challenge. It might be difficult to count the number of replicators, though!

In theory, the script could quite easily auto-generate all of the auxiliary rule tables at the start; this has the advantage of making everything self-contained. And I suppose it could 'intelligently' build the routing logic (in the form of a hashtable instead of being hardcoded) at the beginning of runtime, as well, by seeing which objects appear at least 20 times within the first 4000 soups.

I suppose that getting this to work for arbitrary Class-IV cellular automata would force me to implement the infinite-growth-handling logic properly, instead of the messy kludge of labelling anything that isn't a block-laying or glider-producing switch engine as 'PATHOLOGICAL'. Especially the fact that anything that grows at the rate of 1 cell every 9 generations is labelled as a 'block-laying switch engine'. (!)
Lewis wrote:...which made me think, how long until we start seeing soups which produce the c/7 loafer etc. in Life?
I would guess somewhere between 10^11 and 10^13 soups.
Also, anyone got any guesses as to what the next 'natural' spaceship (after the glider, the *WSSs and combinations of 2 interacting *WSSs) will be?
Unless there's a small reasonably-high-period spaceship out there (analogous to the loafer, but hitherto-undiscovered), it's likely to be one of the following five things:
  • Loafer;
  • Sidecar on HWSS;
  • p16 Coe engine on LWSS;
  • p12 Schick engine on two LWSSes;
  • That small unnamed c/3 spaceship with no known synthesis.
I'm really hoping that the fifth one emerges in the soup search, as it is likely to yield a synthesis.
U03BB wrote:Does the (X soups per second) line show an average?
Yes, the average soup-producing rate of this particular search.
U03BB wrote:Is there a function to resume interrupted searches?
Not yet. I haven't got around to implementing functionality to read progress files.

Lewis
Posts: 337
Joined: March 17th, 2009, 5:26 pm
Location: UK

Re: apgsearch: a high-performance soup searcher

Post by Lewis » September 10th, 2014, 11:43 am

calcyman wrote: Just out of interest, is there much of a market for searching other rules?
I'm not sure; I've done fairly extensive searches in other rules before but I don't think anyone else has ever expressed interest in it at all. However, back when the Soup Search was up and running on here, the other rules seemed to get a fair bit of attention.
I can't handle explosive rules such as B37/S23, and even HighLife would be a massive headache to implement (because chaotic quadratic-growth patterns occur relatively frequently). I'm still slightly tempted to attempt HighLife, though, because it's a beautiful rule and being able to recognise replicators and growing spaceships is a fun challenge. It might be difficult to count the number of replicators, though!
So far I've only managed to get it to run B35/S23, B3/S24 and B3/S2456. 2x2 works but goes really slowly, which I think is down to two things:
- I don't know how to properly write a "routing logic" section that quickly sorts out the small objects (I can't get my head around how things like 'dpop', 'dlength' and 'lbreadth' work), so it has to do nearly every object the longer way.
- with each group of 100 soups, there's bound to be a mix of p26, p14, p10 and maybe p22 oscillators, so the universe as a whole will only repeat after thousands of generations; I'm assuming this might have an effect on detecting when the universe has stabilised.
So while it does work, it crawls along at about 15 soups/second (for comparison, a single instance of Golly set to B35/S23 gets close to 800 soups/second, and in regular Life it's about 200-ish for me).

I reckon Day&Night would be easy enough to implement too; as far as I'm aware there's only one known natural infinite-growth mechanism that occurs with any frequency.
I suppose that getting this to work for arbitrary Class-IV cellular automata would force me to implement the infinite-growth-handling logic properly, instead of the messy kludge of labelling anything that isn't a block-laying or glider-producing switch engine as 'PATHOLOGICAL'. Especially the fact that anything that grows at the rate of 1 cell every 9 generations is labelled as a 'block-laying switch engine'. (!)
So does this mean that the script wouldn't be able to pick up a soup like the one posted here where both types of switch engine occur from the same soup?

calcyman
Moderator
Posts: 2929
Joined: June 1st, 2009, 4:32 pm

Re: apgsearch: a high-performance soup searcher

Post by calcyman » September 10th, 2014, 12:41 pm

- I don't know how to properly write a "routing logic" section that quickly sorts out the small objects (I can't get my head around how things like 'dpop', 'dlength' and 'lbreadth' work), so it has to do nearly every object the longer way.
Run it in CoalesceObjects for at least the entirety of its period. 'lpop' is the number of cells in state 1, and 'llength' and 'lbreadth' are the dimensions of the box bounding the state-1 cells. 'dpop' is the number of cells in state 2, and 'dlength' and 'dbreadth' are the dimensions of the box bounding the state-2 cells. (These numbers depend only on phase; different orientations of the same phase have the same 'invariants'.)
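In other words (a sketch with an assumed (x, y, state) cell-list format, not the script's actual representation), the invariants amount to:

Code: Select all

```python
def invariants(cells):
    """cells: iterable of (x, y, state) with state 1 (live) or 2 (history)."""
    def box(pts):
        if not pts:
            return (0, 0)
        xs = [x for x, y in pts]
        ys = [y for x, y in pts]
        # sort the two dimensions so that every orientation of the same
        # phase reports identical (length, breadth) numbers
        b, l = sorted((max(xs) - min(xs) + 1, max(ys) - min(ys) + 1))
        return (l, b)
    live = [(x, y) for x, y, s in cells if s == 1]
    dead = [(x, y) for x, y, s in cells if s == 2]
    llength, lbreadth = box(live)
    dlength, dbreadth = box(dead)
    return {'lpop': len(live), 'llength': llength, 'lbreadth': lbreadth,
            'dpop': len(dead), 'dlength': dlength, 'dbreadth': dbreadth}
```

The routing logic then just branches on these six numbers to pick out the common small objects cheaply.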

I suspect that bypassing the routing logic is indeed the reason for the slowdown; canonising objects takes a while since it needs to seed a new universe for each individual object.
- with each group of 100 soups, there's bound to be a mix of p26, p14, p10 and maybe p22 oscillators, so the universe as a whole will only repeat after thousands of generations; I'm assuming this might have an effect on detecting when the universe has stabilised.
Each soup is stabilised individually; the 'pages' of 100 soups are only created for running rules such as ClassifyObjects. Although my code does implicitly assume that really common oscillators have periods dividing 6, it shouldn't be too problematic.
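Roughly speaking, the test behaves like this toy version (not the script's exact code, which operates on Golly universes rather than population lists):

Code: Select all

```python
def seems_stabilised(pops, period=6, checks=4):
    """pops: population counts, one per generation. Returns True once the
    tail of the sequence has repeated with the given period several times,
    so any oscillator whose period divides 6 counts as settled."""
    needed = period * (checks + 1)
    if len(pops) < needed:
        return False
    return all(pops[-1 - i] == pops[-1 - i - period]
               for i in range(period * checks))
```

A p30 gun, say, would fail this check and trigger the longer error-correcting run.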
So while it does work, it crawls along at about 15 soups/second (for comparison, a single instance of Golly set to B35/S23 gets close to 800 soups/second, and in regular Life it's about 200-ish for me).
For 2x2, can you give readouts of qlifetime, ruletime and gridtime to see where the bottleneck lies? (These are printed in brackets when the script finishes.)

And wow -- B35/S23 runs four times more quickly than Life?! I guess that things must fade really quickly in that rule.
So does this mean that the script wouldn't be able to pick up a soup like the one posted here where both types of switch engine occur from the same soup?
No. The script identifies these as two separate objects (it's really good at separating objects correctly) and measures them independently. Consequently, the script would correctly identify the soup as having spawned both a GPSE and BLSE, and wouldn't enter Pathological Panic.

Lewis
Posts: 337
Joined: March 17th, 2009, 5:26 pm
Location: UK

Re: apgsearch: a high-performance soup searcher

Post by Lewis » September 10th, 2014, 1:13 pm

calcyman wrote:Run it in CoalesceObjects for at least the entirety of its period. 'lpop' is the number of cells in state 1, and 'llength' and 'lbreadth' are the dimensions of the box bounding the state-1 cells. 'dpop' is the number of cells in state 2, and 'dlength' and 'dbreadth' are the dimensions of the box bounding the state-2 cells. (These numbers depend only on phase; different orientations of the same phase have the same 'invariants'.)
How would I go about differentiating between two objects with the exact same lpop, dpop and bounding boxes (for example, the following two sets of objects):

Code: Select all

x = 8, y = 10, rule = B36/S125
bo3b2o$obo4bo$bo5bo5$2o3b2o$2bo2bo$2o4b2o!
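The best idea I've come up with so far is to compare a canonical form over all eight orientations (just a sketch, and presumably not how the real script canonises objects):

Code: Select all

```python
def canonical(cells):
    """cells: iterable of live (x, y) pairs. Returns a form shared by all
    rotations/reflections of the object, so congruent objects compare
    equal and incongruent ones (even with identical invariants) differ."""
    best = None
    for pts in (list(cells), [(y, x) for x, y in cells]):  # identity, flip
        for _ in range(4):
            pts = [(-y, x) for x, y in pts]                # rotate 90 degrees
            minx = min(x for x, y in pts)
            miny = min(y for x, y in pts)
            norm = tuple(sorted((x - minx, y - miny) for x, y in pts))
            if best is None or norm < best:
                best = norm
    return best
```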
For 2x2, can you give readouts of qlifetime, ruletime and gridtime to see where the bottleneck lies? (These are printed in brackets when the script finishes.)
Where does the script display qlifetime, ruletime and gridtime? I can't find them in any of the output files, and Golly itself just says "Writing progress file..." in the little yellow bit near the top. (This happens when running the regular Life script too, so I don't think it's just something I've accidentally broken whilst modifying the script for 2x2.)

Andrew
Moderator
Posts: 911
Joined: June 2nd, 2009, 2:08 am
Location: Melbourne, Australia

Re: apgsearch: a high-performance soup searcher

Post by Andrew » September 10th, 2014, 6:38 pm

Lewis wrote:Where does the script display qlifetime, ruletime and gridtime? I can't find them in any of the output files, and Golly itself just says "Writing progress file..."
See the mention of minor glitch in my earlier post.

Sphenocorona
Posts: 549
Joined: April 9th, 2013, 11:03 pm

Re: apgsearch: a high-performance soup searcher

Post by Sphenocorona » September 10th, 2014, 10:08 pm

Accidentally closed Golly during a search that had some interesting soups :s Thankfully I have a few copies of what I did find:

Pulsar and 14-cell still life:

Code: Select all

x = 17, y = 18, rule = B3/S23
$3o2b2obo2bo3bo$2bob5ob4o$b3obob3ob4o$obo2b10o$3b3ob2o5bo$b2o4bo$ob2ob
3o2b3o$5bo2bob2ob3o$b2o5bo2bob2o$o3bo2b2obobo2bo$bo2bo2bobob2o$obo2b3o
3b2o2bo$bo3b3obo5bo$b7o3bo2bo$2b5obob3o$2o3bo2b3o2bo!
Two very close HWSSes:

Code: Select all

x = 16, y = 16, rule = B3/S23
obo3b2o$o6bo3b4o$2b3ob3o2bob3o$2bo2b2o2b3o2bo$2bob4o2b2o$2o2b2obobo3b
3o$2b2o3b2o2bobo$obo2bo2bob2obo$b2ob2o2b3ob3o$7bobobo3bo$3b2o7bobo$ob
2ob2o2b2o2bo$o2bobo3b2obo$bob2ob10o$bo2bo2bo3b4o$o2b2obo5bob2o!
I also found two glider-producing switch engines in a short period of time (one was among the first 10000 or so soups, the next somewhere between 20000 and 40000), while it took much longer to find a block-laying switch engine.

User avatar
Lewis
Posts: 337
Joined: March 17th, 2009, 5:26 pm
Location: UK
Contact:

Re: apgsearch: a high-performance soup searcher

Post by Lewis » September 11th, 2014, 3:25 am

calcyman wrote:For 2x2, can you give readouts of qlifetime, ruletime and gridtime to see where the bottleneck lies? (These are printed in brackets when the script finishes.)
10000 soups processed in 259.699044811 (14.7483, 4.64597, 236.95079) secs.
This is with a half-finished Routing Logic thing I cobbled together yesterday evening, which sorts out all 2- and 3-cell objects, and most of the 4- and 5-cell ones.

EDIT: Tried it again and got the same number of soups, but with times (15.526, 5.6439, 523.8556). (This was with a second instance of Golly running, mind.)

User avatar
calcyman
Moderator
Posts: 2929
Joined: June 1st, 2009, 4:32 pm

Re: apgsearch: a high-performance soup searcher

Post by calcyman » September 12th, 2014, 9:00 am

(Note that on Windows 7 there's a substantial performance improvement, sometimes by as much as a factor of two, by having the Golly window minimised -- maybe this is responsible for the disparity between your two recent runs.)
Lewis wrote:10000 soups processed in 259.699044811 (14.7483, 4.64597, 236.95079) secs.
The first number is the time spent running patterns to stabilisation; this seems perfectly reasonable. The second is the time spent applying rules such as CoalesceObjects and ClassifyObjects; again, this is perfectly reasonable. The third is outrageously large, and means that it's wasting a lot of time identifying objects. This may be due to the routing logic not filtering enough of the common objects.

It might also be advisable to deal with the two-cell still lifes by modifying ExpungeObjects, in the same way I handle blocks, blinkers and beehives in B3/S23. This delegates the task of removing really common objects to Golly, rather than having to do it (much more expensively) in the Python script.
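For reference, the three bracketed numbers behave like per-phase stopwatches. A minimal hypothetical sketch (the names are illustrative, not the script's actual variables) of how such a breakdown could be accumulated:

```python
import time

# Hypothetical sketch of per-phase timing, mirroring the
# (qlifetime, ruletime, gridtime) breakdown described above.
timers = {"qlifetime": 0.0, "ruletime": 0.0, "gridtime": 0.0}

def timed(phase, func, *args):
    """Run func, charging its wall-clock time to the named phase."""
    start = time.time()
    result = func(*args)
    timers[phase] += time.time() - start
    return result

# Example usage with stand-in functions:
timed("qlifetime", lambda: sum(range(10000)))   # run soup to stabilisation
timed("ruletime", lambda: None)                 # apply CoalesceObjects etc.
timed("gridtime", lambda: None)                 # identify leftover objects

print(tuple(round(timers[k], 5) for k in ("qlifetime", "ruletime", "gridtime")))
```

A large third number, as in the run quoted above, means almost all of the time is spent in the object-identification phase.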
What do you do with ill crystallographers? Take them to the mono-clinic!

User avatar
Lewis
Posts: 337
Joined: March 17th, 2009, 5:26 pm
Location: UK
Contact:

Re: apgsearch: a high-performance soup searcher

Post by Lewis » September 12th, 2014, 10:48 am

I've now got a variation of ExpungeObjects working which gets rid of both 2-cell objects; that's brought the speed up to about 100 soups per second, which is a massive improvement. Interestingly, the 23/35 script I've been using didn't use any ExpungeObjects rules, and after adding them back in there doesn't seem to be much of an increase in speed (unless that's just down to other things running in the background, or the window not being minimised).

User avatar
calcyman
Moderator
Posts: 2929
Joined: June 1st, 2009, 4:32 pm

Re: apgsearch: a high-performance soup searcher

Post by calcyman » September 14th, 2014, 8:32 pm

Version 0.4 of apgsearch has now been released!

Plenty of changes since the last version (mostly in response to popular demand):
  • First natural occurrence of each object is recorded in the HTML results.
  • Soup scoring for B3/S23 has been improved.
  • Now analyses and canonises arbitrary linear-growth patterns (not just switch engines) rather than describing them as 'PATHOLOGICAL'.
  • Includes support for arbitrary non-explosive outer-totalistic rules (not just B3/S23).
  • Messy routing logic has been discarded and replaced with a cache.
  • Generates rule tables on demand when necessary, so no auxiliary files are required.
It currently struggles with HighLife because replicators exhibit nonlinear growth, as do replicator-based growing spaceships. But I can verify that it runs smoothly in any of the following rules:

B3/S23 (obviously)
B35/S23 (as suggested by Lewis)
B36/S125 (2x2, despite Lewis's modification of v0.3 running slowly)
B368/S245 (features a common puffer called yl170_1_8_4a9d539b04cbe36bc88b71b4c04b0035)

By the way, the performance speedup is reminiscent of HashLife. Specifically, the first couple of pages of soups will run rather slowly since it needs to cache lots of common objects. Once they have been entered into the cache, it accelerates profoundly.

Also, for rules other than B3/S23, ignore the soup scores since they're irrelevant.
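The caching behaviour described above can be pictured as a simple memoisation table (a hypothetical sketch; the function and key names are illustrative, not apgsearch's actual internals):

```python
# Hypothetical sketch of the object-identification cache described above.
# The expensive classifier runs once per distinct fingerprint; later
# occurrences of the same object are cheap dictionary lookups.

identification_cache = {}

def classify_slowly(fingerprint):
    """Stand-in for the expensive object-identification routine."""
    return "object_" + fingerprint

def identify(fingerprint):
    if fingerprint not in identification_cache:
        identification_cache[fingerprint] = classify_slowly(fingerprint)
    return identification_cache[fingerprint]

# The first few soups populate the cache; after that, lookups dominate:
for obj in ["33", "33", "1e8", "33"]:
    identify(obj)

print(identification_cache)  # only two entries, despite four identifications
```

This is why the first couple of pages of soups run slowly and later ones run quickly: the cache, like a HashLife hashtable, pays for itself once the common objects have been entered.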
Attachments
apgsearch-2014-09-14-beta.zip
apgsearch v0.4 (beta release)
(17.56 KiB) Downloaded 554 times
What do you do with ill crystallographers? Take them to the mono-clinic!

User avatar
calcyman
Moderator
Posts: 2929
Joined: June 1st, 2009, 4:32 pm

Re: apgsearch: a high-performance soup searcher

Post by calcyman » September 15th, 2014, 6:26 am

Andrew Trevorrow noticed a bug in v0.4 of apgsearch. There's a quick fix: insert a sanity check before the final line of the function linearlyse().

Code: Select all

    prehash = str(moments[1]) + "#" + str(moments[2])

    # Linear-growth patterns with growth rate zero are clearly errors!
    if (moments[0] == 0):
        return "unidentified"

    return "yl" + str(p) + "_" + str(q) + "_" + str(moments[0]) + "_" + hashlib.md5(prehash).hexdigest()
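The sanity check above amounts to rejecting a measured growth rate of zero. As a much-simplified illustration (the helper below is hypothetical, not the script's actual linearlyse()), genuine linear growth can be recognised from population samples taken one period apart:

```python
# Hypothetical sketch: detect linear population growth from samples
# taken at times t, t+p, t+2p, ... (one period apart).

def growth_rate(pops):
    """Return cells-per-period if growth is exactly linear, else None."""
    diffs = [b - a for a, b in zip(pops, pops[1:])]
    if len(set(diffs)) == 1:
        return diffs[0]
    return None

# A switch engine puffer gains a fixed number of cells per period:
assert growth_rate([100, 116, 132, 148]) == 16

# A stabilised pattern has rate 0 -- labelling it as a linear-growth
# ('yl...') object would be exactly the error the sanity check catches:
assert growth_rate([100, 100, 100]) == 0
```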
What do you do with ill crystallographers? Take them to the mono-clinic!

User avatar
codeholic
Moderator
Posts: 1147
Joined: September 13th, 2011, 8:23 am
Location: Hamburg, Germany

Re: apgsearch: a high-performance soup searcher

Post by codeholic » September 15th, 2014, 3:38 pm

Why did this soup get a score of 50 (v0.4)? Is it somehow related to the last fix?

Code: Select all

x = 16, y = 16, rule = B3/S23
bob5ob3obo$b6ob2obobobo$o3bo2bob2o3bo$o2b2o5b2o2b2o$2ob3o2b2obob2o$o4b
ob2ob2o2bo$2b9ob4o$bobobobo3bobo$obobo4b5obo$o2b3o8bo$3b2obo2b2obob2o$
2o3b3obo2bo2bo$2bo5b2o2bobo$bo5b3o2b2obo$3bobo3bo2bo$ob8o2b2o!
Ivan Fomichev

User avatar
calcyman
Moderator
Posts: 2929
Joined: June 1st, 2009, 4:32 pm

Re: apgsearch: a high-performance soup searcher

Post by calcyman » September 15th, 2014, 4:44 pm

codeholic wrote:Why did this soup get a score of 50 (v0.4)? Is it somehow related to the last fix?
Yes, you need to modify linearlyse() in the way I described.
What do you do with ill crystallographers? Take them to the mono-clinic!

Post Reply