
IDC6940

Weekly Report 3

Richard Henry & Delanyce Rose

June 26, 2025

We will continue with two symbolic regression libraries available for Python.

PySR

Since this library runs reliably in Google Colab but repeatedly crashes on a local machine running Jupyter Notebook, the first step is to ensure that the library is installed on the current Colab instance.
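The bare `!pip install` below re-runs pip on every execution. A small guard of our own (a helper we wrote for illustration, not part of PySR) skips the call when the package is already importable:

```python
import importlib.util
import subprocess
import sys

def ensure_installed(package: str) -> bool:
    """Install `package` with pip only if it is not already importable.

    Returns True if the package was already present, False if pip was run.
    """
    if importlib.util.find_spec(package) is not None:
        return True
    subprocess.check_call([sys.executable, "-m", "pip", "install", "-U", package])
    return False
```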

# Install libraries
!pip install -U pysr
Requirement already satisfied: pysr in /usr/local/lib/python3.11/dist-packages (1.5.8)

Next we force the installation of the Julia programming language; PySR downloads Julia automatically the first time the package is imported on the Python side:

# Importing pysr triggers juliacall to download and set up Julia if needed
import pysr

Now we will import everything else we need:

# Call libraries
import numpy as np
import pandas as pd
import sympy as sym
import matplotlib.pyplot as plt
from pysr import PySRRegressor
import time

Next, we rebuild our toy dataset and plot it.

# Build Two Copies of Toy Dataset
x4=np.arange(start=0.55,stop=1.075,step=0.025)
x5=np.arange(start=0.55,stop=1.075,step=0.025)
y4=141.5/x4-131.5
y5=141.5/x5-131.5
# Plot
plt.plot(x4,y4,marker="o",linestyle="None")
plt.xlabel("Specific Gravity")
plt.ylabel("API Gravity")
plt.title("First Toy Dataset")
plt.grid()
plt.show()
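The toy data follow the standard API gravity relation from petroleum engineering, API = 141.5/SG - 131.5, which is exactly invertible; a quick sanity check (plain Python, same grid as x4/x5) confirms the round trip:

```python
# Sanity check: the relation API = 141.5/SG - 131.5 inverts to
# SG = 141.5/(API + 131.5), so the round trip should recover SG.
sg_values = [0.55 + 0.025 * i for i in range(21)]   # same grid as x4/x5
for sg in sg_values:
    api = 141.5 / sg - 131.5
    sg_back = 141.5 / (api + 131.5)
    assert abs(sg - sg_back) < 1e-12
```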

Next, we:

  - Initiate the symbolic regressor
  - Use the regressor to fit the toy dataset
  - Predict the API gravity values using the regressor

The creators of this library chose to follow the scikit-learn workflow, so the terminology is similar to what we used in progress report 1, with the exception that we must reshape the data because we have only one predictor.
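Why the reshape: with a single predictor, `np.arange` returns a 1-D array, while scikit-learn-style estimators expect a 2-D `(n_samples, n_features)` matrix. A minimal sketch:

```python
import numpy as np

x = np.arange(start=0.55, stop=1.075, step=0.025)
print(x.shape)        # 1-D, e.g. (21,) -- sklearn-style fit() wants 2-D input
X = x.reshape(-1, 1)  # -1 lets numpy infer the number of rows
print(X.shape)        # (n_samples, 1): one column per predictor
```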

We are using defaults for everything except the random-number seed. (Note: as the warning below points out, setting `random_state` alone does not make the search deterministic; that also requires `deterministic=True` and `parallelism='serial'`.)

t0 = time.time()
myMod04=PySRRegressor(random_state=7)
myMod04.fit(x4.reshape(-1, 1),y4)
y_pred04=myMod04.predict(x4.reshape(-1, 1))
print("time to fit: ",time.time() - t0, 'seconds')
/usr/local/lib/python3.11/dist-packages/pysr/sr.py:2811: UserWarning: Note: it looks like you are running in Jupyter. The progress bar will be turned off.
  warnings.warn(
/usr/local/lib/python3.11/dist-packages/pysr/sr.py:1873: UserWarning: Note: Setting `random_state` without also setting `deterministic=True` and `parallelism='serial'` will result in non-deterministic searches.
  warnings.warn(
[ Info: Started!

Expressions evaluated per second: 3.660e+04
Progress: 185 / 3100 total iterations (5.968%)
════════════════════════════════════════════════════════════════════════════════════════════════════
───────────────────────────────────────────────────────────────────────────────────────────────────
Complexity  Loss       Score      Equation
1           1.321e+03  0.000e+00  y = 52.149
3           6.519e+02  3.532e-01  y = 44 / x₀
5           4.859e-11  1.511e+01  y = (141.5 / x₀) + -131.5
7           4.305e-11  6.057e-02  y = ((141.5 / x₀) + -60.092) - 71.408
9           3.196e-11  1.489e-01  y = ((x₀ + 141.5) / x₀) + (-61.905 - 70.595)
21          2.496e-11  2.061e-02  y = ((((186.51 / (x₀ + x₀)) + ((96.494 / (x₀ + x₀)) + -8.3...
                                      28)) + -115.6) + (x₀ - 7.5748)) - x₀
───────────────────────────────────────────────────────────────────────────────────────────────────
════════════════════════════════════════════════════════════════════════════════════════════════════
Press 'q' and then <enter> to stop execution early.

Expressions evaluated per second: 3.680e+04
Progress: 377 / 3100 total iterations (12.161%)
════════════════════════════════════════════════════════════════════════════════════════════════════
───────────────────────────────────────────────────────────────────────────────────────────────────
Complexity  Loss       Score      Equation
1           1.321e+03  0.000e+00  y = 52.149
3           6.519e+02  3.532e-01  y = 44 / x₀
5           4.859e-11  1.511e+01  y = (141.5 / x₀) + -131.5
7           3.196e-11  2.095e-01  y = ((x₀ + 141.5) / x₀) + -132.5
11          2.780e-11  3.484e-02  y = (((-50.177 - x₀) + (141.5 / x₀)) + x₀) - 81.323
13          2.434e-11  6.654e-02  y = (248.09 / (x₀ + x₀)) + (-131.5 + (34.908 / (x₀ + x₀)))
15          1.880e-11  1.292e-01  y = (((228.07 / (x₀ + x₀)) + -117.43) + (54.935 / (x₀ + x₀...
                                      ))) + -14.066
23          1.602e-11  1.994e-02  y = (x₀ + -19.14) + ((((94.138 / (x₀ + x₀)) + -98.461) + (...
                                      (188.86 / (x₀ + x₀)) - (x₀ - -7.3012))) - 6.5983)
───────────────────────────────────────────────────────────────────────────────────────────────────
════════════════════════════════════════════════════════════════════════════════════════════════════

Expressions evaluated per second: 3.730e+04
Progress: 579 / 3100 total iterations (18.677%)
════════════════════════════════════════════════════════════════════════════════════════════════════
───────────────────────────────────────────────────────────────────────────────────────────────────
Complexity  Loss       Score      Equation
1           1.321e+03  0.000e+00  y = 52.149
3           6.519e+02  3.532e-01  y = 44 / x₀
5           4.859e-11  1.511e+01  y = (141.5 / x₀) + -131.5
7           3.196e-11  2.095e-01  y = ((x₀ + 141.5) / x₀) + -132.5
11          2.780e-11  3.484e-02  y = (((-50.177 - x₀) + (141.5 / x₀)) + x₀) - 81.323
13          2.434e-11  6.654e-02  y = (248.09 / (x₀ + x₀)) + (-131.5 + (34.908 / (x₀ + x₀)))
15          1.724e-11  1.725e-01  y = ((54.935 / (x₀ + x₀)) + ((228.07 / (x₀ + x₀)) + -117.4...
                                      3)) + -14.066
19          9.441e-12  1.505e-01  y = (((188.86 / (x₀ + x₀)) - (x₀ - -7.3012)) + (x₀ + -124....
                                      2)) + (94.138 / (x₀ + x₀))
───────────────────────────────────────────────────────────────────────────────────────────────────
════════════════════════════════════════════════════════════════════════════════════════════════════

Expressions evaluated per second: 3.590e+04
Progress: 790 / 3100 total iterations (25.484%)
════════════════════════════════════════════════════════════════════════════════════════════════════
───────────────────────────────────────────────────────────────────────────────────────────────────
Complexity  Loss       Score      Equation
1           1.321e+03  0.000e+00  y = 52.149
3           6.519e+02  3.532e-01  y = 44 / x₀
5           4.859e-11  1.511e+01  y = (141.5 / x₀) + -131.5
7           3.196e-11  2.095e-01  y = ((x₀ + 141.5) / x₀) + -132.5
11          2.780e-11  3.484e-02  y = (((-50.177 - x₀) + (141.5 / x₀)) + x₀) - 81.323
13          1.914e-11  1.866e-01  y = (68.078 / (x₀ + x₀)) + ((214.92 / (x₀ + x₀)) + -131.5)
15          1.724e-11  5.243e-02  y = ((54.935 / (x₀ + x₀)) + ((228.07 / (x₀ + x₀)) + -117.4...
                                      3)) + -14.066
17          1.672e-11  1.531e-02  y = ((125.03 / (x₀ + x₀)) - x₀) + ((157.97 / (x₀ + x₀)) + ...
                                      (x₀ + -131.5))
19          9.441e-12  2.857e-01  y = (((188.86 / (x₀ + x₀)) - (x₀ - -7.3012)) + (x₀ + -124....
                                      2)) + (94.138 / (x₀ + x₀))
21          8.402e-12  5.832e-02  y = ((x₀ - x₀) + (122.15 / (x₀ + x₀))) + (-129.1 + (((x₀ -...
                                       -160.85) / (x₀ + x₀)) + -2.9023))
───────────────────────────────────────────────────────────────────────────────────────────────────
════════════════════════════════════════════════════════════════════════════════════════════════════

Expressions evaluated per second: 3.720e+04
Progress: 1015 / 3100 total iterations (32.742%)
════════════════════════════════════════════════════════════════════════════════════════════════════
───────────────────────────────────────────────────────────────────────────────────────────────────
Complexity  Loss       Score      Equation
1           1.321e+03  0.000e+00  y = 52.149
3           6.519e+02  3.532e-01  y = 44 / x₀
5           4.859e-11  1.511e+01  y = (141.5 / x₀) + -131.5
7           3.196e-11  2.095e-01  y = ((x₀ + 141.5) / x₀) + -132.5
9           2.018e-11  2.299e-01  y = (33.4 / x₀) + ((108.1 / x₀) + -131.5)
13          1.914e-11  1.322e-02  y = (68.078 / (x₀ + x₀)) + ((214.92 / (x₀ + x₀)) + -131.5)
15          8.402e-12  4.117e-01  y = (122.15 / (x₀ + x₀)) + (((x₀ - -160.85) / (x₀ + x₀)) +...
                                       -132)
───────────────────────────────────────────────────────────────────────────────────────────────────
════════════════════════════════════════════════════════════════════════════════════════════════════

Expressions evaluated per second: 4.150e+04
Progress: 1366 / 3100 total iterations (44.065%)
════════════════════════════════════════════════════════════════════════════════════════════════════
───────────────────────────────────────────────────────────────────────────────────────────────────
Complexity  Loss       Score      Equation
1           1.321e+03  0.000e+00  y = 52.149
3           6.519e+02  3.532e-01  y = 44 / x₀
5           4.859e-11  1.511e+01  y = (141.5 / x₀) + -131.5
7           3.196e-11  2.095e-01  y = ((x₀ + 141.5) / x₀) + -132.5
9           2.018e-11  2.299e-01  y = (33.4 / x₀) + ((108.1 / x₀) + -131.5)
13          1.464e-11  8.028e-02  y = (87.241 / (x₀ + x₀)) + (-131.5 + (195.76 / (x₀ + x₀)))
15          8.402e-12  2.776e-01  y = (122.15 / (x₀ + x₀)) + (((x₀ - -160.85) / (x₀ + x₀)) +...
                                       -132)
17          6.965e-12  9.381e-02  y = ((122.15 / (x₀ + x₀)) + (((x₀ - -160.85) / (x₀ + x₀)) ...
                                      + -131.94)) - 0.064223
───────────────────────────────────────────────────────────────────────────────────────────────────
════════════════════════════════════════════════════════════════════════════════════════════════════

Expressions evaluated per second: 5.260e+04
Progress: 1850 / 3100 total iterations (59.677%)
════════════════════════════════════════════════════════════════════════════════════════════════════
───────────────────────────────────────────────────────────────────────────────────────────────────
Complexity  Loss       Score      Equation
1           1.321e+03  0.000e+00  y = 52.149
3           6.519e+02  3.532e-01  y = 44 / x₀
5           4.859e-11  1.511e+01  y = (141.5 / x₀) + -131.5
7           3.196e-11  2.095e-01  y = ((x₀ + 141.5) / x₀) + -132.5
9           1.880e-11  2.654e-01  y = ((42.384 / x₀) + -131.5) + (99.116 / x₀)
13          1.464e-11  6.250e-02  y = (87.241 / (x₀ + x₀)) + (-131.5 + (195.76 / (x₀ + x₀)))
15          8.402e-12  2.776e-01  y = (122.15 / (x₀ + x₀)) + (((x₀ - -160.85) / (x₀ + x₀)) +...
                                       -132)
17          6.965e-12  9.381e-02  y = ((122.15 / (x₀ + x₀)) + (((x₀ - -160.85) / (x₀ + x₀)) ...
                                      + -131.94)) - 0.064223
───────────────────────────────────────────────────────────────────────────────────────────────────
════════════════════════════════════════════════════════════════════════════════════════════════════

Expressions evaluated per second: 6.590e+04
Progress: 2410 / 3100 total iterations (77.742%)
════════════════════════════════════════════════════════════════════════════════════════════════════
───────────────────────────────────────────────────────────────────────────────────────────────────
Complexity  Loss       Score      Equation
1           1.321e+03  0.000e+00  y = 52.149
3           6.519e+02  3.532e-01  y = 44 / x₀
5           4.859e-11  1.511e+01  y = (141.5 / x₀) + -131.5
7           3.196e-11  2.095e-01  y = ((x₀ + 141.5) / x₀) + -132.5
9           1.880e-11  2.654e-01  y = ((42.384 / x₀) + -131.5) + (99.116 / x₀)
13          1.464e-11  6.250e-02  y = (87.241 / (x₀ + x₀)) + (-131.5 + (195.76 / (x₀ + x₀)))
15          8.402e-12  2.776e-01  y = (122.15 / (x₀ + x₀)) + (((x₀ - -160.85) / (x₀ + x₀)) +...
                                       -132)
17          6.215e-12  1.508e-01  y = ((122.15 / (x₀ + x₀)) + (((x₀ - -160.85) / (x₀ + x₀)) ...
                                      + -132.46)) - -0.45653
───────────────────────────────────────────────────────────────────────────────────────────────────
════════════════════════════════════════════════════════════════════════════════════════════════════
Press 'q' and then <enter> to stop execution early.
[ Info: Final population:
[ Info: Results saved to:
  - outputs/20250626_201459_NJnrIf/hall_of_fame.csv
───────────────────────────────────────────────────────────────────────────────────────────────────
Complexity  Loss       Score      Equation
1           1.321e+03  0.000e+00  y = 52.149
3           6.519e+02  3.532e-01  y = 44 / x₀
5           4.859e-11  1.511e+01  y = (141.5 / x₀) + -131.5
7           3.196e-11  2.095e-01  y = ((x₀ + 141.5) / x₀) + -132.5
9           1.880e-11  2.654e-01  y = ((42.384 / x₀) + -131.5) + (99.116 / x₀)
13          1.464e-11  6.250e-02  y = (87.241 / (x₀ + x₀)) + (-131.5 + (195.76 / (x₀ + x₀)))
15          8.402e-12  2.776e-01  y = (122.15 / (x₀ + x₀)) + (((x₀ - -160.85) / (x₀ + x₀)) +...
                                       -132)
17          6.215e-12  1.508e-01  y = ((122.15 / (x₀ + x₀)) + (((x₀ - -160.85) / (x₀ + x₀)) ...
                                      + -132.46)) - -0.45653
───────────────────────────────────────────────────────────────────────────────────────────────────
time to fit:  49.09089636802673 seconds
plt.scatter(x4,y4,label="Raw")
plt.plot(x4,y_pred04,label="Fit")
plt.xlabel("Specific Gravity")
plt.ylabel("API Gravity")
plt.title("First Toy Dataset - PySR Fit")
plt.grid()
plt.legend()
plt.show()

We appear to have a wonderful fit, so let us look at the winning equation:

myEq04=myMod04.sympy()
myEq04
(x0 - 1*(-160.84647))/(x0 + x0) - 132.00002 + 122.15356/(x0 + x0)

At first glance, this equation looks too complicated. However, since it is in sympy format, we can ask for a simplified version:

sym.simplify(myEq04)
-131.50002 + 141.500015/x0

This is very close to our target equation. The big difference between PySR and gramEvol is that here the constants are calculated by regression instead of being picked from a list.

Just to be clear, here is the sympy version of the target equation:

x, y = sym.symbols('x y')
myTarg=sym.sympify(141.5/x-131.5)
myTarg
-131.5 + 141.5/x
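As a quick numeric cross-check (plain Python, with the winning constants transcribed from the PySR output above), the recovered equation agrees with the target to within about 1e-5 across the fitted SG range:

```python
def winner(x0):
    # Complexity-17 winner, constants transcribed from the PySR output above
    return (x0 - -160.84647) / (x0 + x0) - 132.00002 + 122.15356 / (x0 + x0)

def target(x0):
    # The known API gravity relation
    return 141.5 / x0 - 131.5

# Largest disagreement over the SG grid used for fitting
worst = max(abs(winner(0.55 + 0.025 * i) - target(0.55 + 0.025 * i))
            for i in range(21))
print(worst)  # well under 1e-4
```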

GPLearn

Next, we repeat the workflow with a similar Python library.

# Install libraries
!pip install gplearn
Requirement already satisfied: gplearn in /usr/local/lib/python3.11/dist-packages (0.4.2)
# Call libraries
#import numpy as np
#import pandas as pd
#import sympy as sym
#import matplotlib.pyplot as plt
from gplearn.genetic import SymbolicRegressor
#import time

This library also follows the scikit-learn workflow:

t1 = time.time()
myMod05=SymbolicRegressor(verbose=1)
myMod05.fit(x5.reshape(-1, 1),y5)
y_pred05=myMod05.predict(x5.reshape(-1, 1))
print("time to fit: ",time.time() - t1, 'seconds')
/usr/local/lib/python3.11/dist-packages/sklearn/base.py:474: FutureWarning: `BaseEstimator._validate_data` is deprecated in 1.6 and will be removed in 1.7. Use `sklearn.utils.validation.validate_data` instead. This function becomes public and is part of the scikit-learn developer API.
  warnings.warn(
    |   Population Average    |             Best Individual              |
---- ------------------------- ------------------------------------------ ----------
 Gen   Length          Fitness   Length          Fitness      OOB Fitness  Time Left
   0    48.77          230.199      127          16.0636              N/A      1.19m
   1    69.97           61.557       39          9.73818              N/A     53.49s
   2   105.94          109.546       75          6.65517              N/A     59.87s
   3   111.77           594.88      135           5.8833              N/A      1.22m
   4   101.34           430.16      111          4.87836              N/A      1.09m
   5   118.17          829.213      123          4.32692              N/A     26.02s
   6   123.19          636.761      111          3.81706              N/A     20.34s
   7   117.01          919.962      125          3.68446              N/A     18.69s
   8   114.20          1030.04      139          2.86918              N/A     16.70s
   9   118.03          1009.22      139          2.86918              N/A     21.09s
  10   121.82          490.347      139          2.82478              N/A     22.30s
  11   126.96          1320.42      119          2.55424              N/A     15.45s
  12   139.68          319.006      141          2.33987              N/A     11.55s
  13   143.51          724.901      175          1.91984              N/A     10.11s
  14   147.76          996.678      179          1.82939              N/A      8.36s
  15   149.28          675.834      175          1.57828              N/A      6.98s
  16   169.78          876.069      211          1.54558              N/A      5.82s
  17   178.75          967.873      211          1.54558              N/A      5.76s
  18   178.07          212.905      215          1.53758              N/A      2.19s
  19   181.07          586.638      283          1.49769              N/A      0.00s
time to fit:  47.844974517822266 seconds
plt.scatter(x5,y5,label="Raw")
plt.plot(x5,y_pred05,label="Fit")
plt.xlabel("Specific Gravity")
plt.ylabel("API Gravity")
plt.title("First Toy Dataset - GPLearn Fit")
plt.grid()
plt.legend()
plt.show()

The fit is almost as good as PySR's, and the run time was comparable (about 48 seconds versus 49 seconds).

Since we used nearly default parameters for this library, we need to provide a dictionary mapping gplearn's function names to sympy operations in order to convert the winning program:

converter = {
    'add': lambda x, y : x + y,
    'sub': lambda x, y : x - y,
    'mul': lambda x, y : x*y,
    'div': lambda x, y : x/y,
    'sqrt': lambda x : sym.sqrt(x),
    'log': lambda x : sym.log(x),
    'abs': lambda x : sym.Abs(x),
    'neg': lambda x : -x,
    'inv': lambda x : 1/x,
    'max': lambda x, y : sym.Max(x, y),
    'min': lambda x, y : sym.Min(x, y),
    'sin': lambda x : sym.sin(x),
    'cos': lambda x : sym.cos(x),
    'pow': lambda x, y : x**y,
}
myEq05 = sym.sympify(str(myMod05._program), locals=converter)
myEq05
2.86285714285714*(X0 - 0.357)*(-(2.710027100271*X0 + 0.059/X0)*(-0.8 + (3.16000385319333*X0*(X0 - 0.26) + 4.13139611542167*X0 + 3.61861904761905)/(X0 - 0.357 - 0.872428571428571/X0) + 1.43142857142857/X0) + (-2.80898876404494*X0*(-X0 - 0.306) + 0.501)/((3.19488817891374*X0 + 1)/(1.489*X0 - (2.710027100271*X0 + 0.059/X0)*(0.356 + (3.16000385319333*X0*(X0 - 0.26) + 4.13139611542167*X0 + 3.61861904761905)/(X0 - 0.357 - 0.872428571428571/X0) + 0.791139240506329*(3.19488817891374*X0 + 1)/X0) + 0.643 - 0.872428571428571/X0) - 1) - 231.122687272957/X0**2)/((1.84152184795823 - 16.8788528927984*X0)*(X0**2 + 0.26*X0))

Hmm. Let's have it simplified:

sym.simplify(myEq05)
(278.820002953071*X0**13 + 640.018813576176*X0**12 + 690.754307137746*X0**11 + 268.31036133222*X0**10 + 5457.87712812355*X0**9 + 7619.62577640509*X0**8 - 6937.47921256253*X0**7 - 16804.077869542*X0**6 + 1575.78354402274*X0**5 + 6273.6100798772*X0**4 - 102.256793911761*X0**3 + 2.9432049685599*X0**2 - 170.865461382981*X0 - 8.39525390570684)/(X0**3*(144.545366338012*X0**10 + 282.301566657126*X0**9 - 48.7993700979364*X0**8 - 481.522001438071*X0**7 - 191.930188319542*X0**6 + 109.695704741436*X0**5 + 59.6272490047645*X0**4 + 16.4463177493779*X0**3 + 1.59776694954208*X0**2 - 0.303370477591*X0 - 0.0170104154782206))

That did not change much. (:-<) The equation fits the data well, but it is far more complicated than it needs to be.
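One lever we did not try here (an assumption worth testing, not a result): gplearn's `parsimony_coefficient` penalizes program length during selection, so raising it above the default of 0.001 should bias the search toward shorter programs like 141.5/x - 131.5. A sketch of such a configuration:

```python
# Untested sketch; parameter names are from gplearn's SymbolicRegressor API.
sparse_settings = dict(
    function_set=('add', 'sub', 'mul', 'div'),  # the default primitive set, listed explicitly
    parsimony_coefficient=0.01,                 # stronger length penalty (default 0.001)
    population_size=1000,
    generations=20,
    random_state=7,
)
# myMod06 = SymbolicRegressor(**sparse_settings)  # then fit as before
```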

Tentative Conclusions

  1. Setting the random number seed does not guarantee the same answers in either library.
  2. Both libraries “work”, but one may need more tuning than the other to produce actionable results.
  3. It appears that domain knowledge may be necessary to judge whether a winning equation is “simple” enough.