I'm glad it was useful!
> When I start the population with good individuals, it tends to return as the solution the best individual from the first generation
That is more or less the expected outcome, as noted in papers and blog posts: randomly initialized individuals usually have terrible fitness, so a run seeded with strong heuristic individuals is very prone to premature convergence because of the fitness disparity, assuming those strong individuals are not randomly lost during selection. I also noticed that in the gist example you add 5 copies of the known individual. This helps those individuals survive selection, but it also accelerates convergence: the number of copies can grow with each generation (they win every tournament they enter) until they saturate the population.
I see you are using a tournament size of 3 in the gist code. Lowering selection pressure would have been my first suggestion for this situation, and that is already pretty low. A larger random population could be a band-aid (but see the note on eaMuPlusLambda below), or you could inject the good individuals at the generation where the population's mean fitness is on par with the known solutions. That may help with the disparity issue in selection.
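That delayed-injection idea could look roughly like this. A minimal plain-Python sketch for illustration only: the helper names (`inject_when_ready`, `mean_fitness`) and the `(genome, fitness)` tuple representation are my own, not DEAP API.

```python
import random

def mean_fitness(pop):
    # Average fitness of the current population (higher is better here).
    return sum(fit for _, fit in pop) / len(pop)

def inject_when_ready(pop, known_good, threshold):
    """Replace the worst individuals with the known-good ones, but only
    once the population's mean fitness has caught up to `threshold`
    (e.g. roughly the fitness of the known solutions)."""
    if mean_fitness(pop) < threshold:
        # Too early: the good individuals would dominate selection.
        return pop
    pop = sorted(pop, key=lambda ind: ind[1])  # worst first
    pop[:len(known_good)] = known_good         # overwrite the worst
    return pop

# Toy usage: individuals are (genome, fitness) pairs.
random.seed(0)
pop = [((i,), random.random()) for i in range(10)]
known = [(("good",), 0.9), (("good2",), 0.95)]
pop = inject_when_ready(pop, known, threshold=0.4)
```

In a real DEAP run you would call something like this inside the generational loop (a local copy of the algorithm), checking the logbook's mean fitness each generation.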
Also, using eaMuPlusLambda with varOr may help with more exploration of the solution space (judging by the gist code, you may already be doing this in practice).
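To show why this scheme explores more while still being elitist, here is a minimal plain-Python sketch of the (mu + lambda) idea with varOr-style offspring, on a toy one-max problem. All of the names and the bit-string representation are my own simplification, not DEAP's implementation:

```python
import random

def one_max(ind):
    # Toy fitness: count of ones in a bit string (maximized).
    return sum(ind)

def var_or(parents, lambda_, cxpb, mutpb):
    """varOr-style variation: each child comes from crossover OR mutation
    OR plain reproduction, so lambda_ children are produced regardless of
    population size -- more exploration than varAnd's pairwise scheme."""
    offspring = []
    for _ in range(lambda_):
        r = random.random()
        if r < cxpb:                        # one-point crossover
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))
            offspring.append(a[:cut] + b[cut:])
        elif r < cxpb + mutpb:              # single bit-flip mutation
            ind = list(random.choice(parents))
            i = random.randrange(len(ind))
            ind[i] = 1 - ind[i]
            offspring.append(ind)
        else:                               # reproduction
            offspring.append(list(random.choice(parents)))
    return offspring

def ea_mu_plus_lambda(pop, mu, lambda_, cxpb, mutpb, ngen):
    for _ in range(ngen):
        offspring = var_or(pop, lambda_, cxpb, mutpb)
        # (mu + lambda): parents compete with offspring for survival, so
        # the best individuals are never lost, while lambda_ fresh
        # children keep exploring each generation.
        pop = sorted(pop + offspring, key=one_max, reverse=True)[:mu]
    return pop

random.seed(1)
pop = [[random.randint(0, 1) for _ in range(10)] for _ in range(20)]
best = ea_mu_plus_lambda(pop, mu=20, lambda_=40, cxpb=0.5, mutpb=0.4, ngen=30)[0]
```

In DEAP you would get this behavior from `algorithms.eaMuPlusLambda`, which uses `varOr` internally, rather than writing it yourself.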
> Also, would it be possible to implement GA operators that evolve through the generations?
One way to do this is to work with the algorithm locally (for example, copy eaSimple and varAnd into your script). You could then pass the current generation integer to the mate and mutate operators, and write custom mate and mutate operators that take their normal inputs plus the generation number, using it to guide how those operations behave.