LSSS memory, disk, and thread usage
Hello all, first post on this forum.
Being a long-time fan of GoL, I decided to participate in finding new patterns for B3/S23/C1, most notably spaceships and oscillators.
I realize that a lot of work has already been done on scripts and programs for finding these patterns, and a lot of archived material is available for me to check against. Great!
In order to find new patterns, I decided to familiarize myself with the current tools used in pattern searching. So far I have started using:
- WLS
- JLS
- gfind
- Logic Life Search (LLS) with various SAT solvers (glucose-syrup/minisat/lingeling and derivatives)
- Life Slice Ship Search (LSSS)
Apart from WLS, I run everything on Linux. I have a dedicated machine for this: dual Xeon, 4C/8T each, so 16 threads in total, 16 GB of memory, and about 250 GB of disk space.
Of all the programs I have compiled and run, LSSS is the most taxing one I have come across. Compiling with Rust went fine, and as a test run I tried to search for Copperhead, so I created a search_P10H1V0EVEN8.sh script hoping to rediscover it.
Unfortunately, after a couple of days the program exited with "error: Os { code: 28, kind: Other, message: "No space left on device" }", and sure enough, it had filled all of the 200 GB of free space I had left.
I already have 11 completed .partial.txt files (0011.partial.txt, 0013.partial.txt, up to 0031.partial.txt), but 0033 bombed out.
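For reference, code 28 is just Linux's ENOSPC surfaced through Rust's standard library, so the message itself is nothing LSSS-specific; a minimal snippet reproducing the same error value (no assumptions beyond the standard errno table):

```rust
use std::io;

fn main() {
    // OS error 28 on Linux is ENOSPC ("No space left on device").
    // Rust wraps raw errno values via io::Error::from_raw_os_error,
    // which is exactly what LSSS's error output shows.
    let e = io::Error::from_raw_os_error(28);
    println!("{e}");
    assert_eq!(e.raw_os_error(), Some(28));
}
```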
From Ganglia, a tool for monitoring system load, disk, and memory usage, I can see that LSSS uses all 16 threads, but not constantly. Memory usage also varies strongly; at one point it forced me to stop the search and add 8 GB of extra memory to cope with the increased usage.
Before I allocate a second disk for this, I am wondering whether I am doing something wrong, either in my choice of test pattern (Copperhead being a slow ship with a high period) or in the parameters (margin 8, --even_midline, --velocity (1,0)c/10 in B3/S23).
Could someone experienced with LSSS have a quick look at my search script? I've attached it to this post.