Genetic algorithm to replace manual optimization in P123 classic

Thank you for sharing your progress with this algorithm.

I hadn’t come across this before. I checked in with ChatGPT and it does sound promising.

Would you mind sharing a bit more about what you’ve learned from using the CMA-ES method?

That’s such an important point — and not just for genetic algorithms, but for optimization in general. Whether we’re using a spreadsheet, a GA, a random forest, or XGBoost, we’re ultimately trying to find an optimum — ideally a global one.

Most of the ML algorithms at P123 rely on some variant of gradient descent to get there. But why stop there, as you point out?
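For readers unfamiliar with the idea: evolution strategies like CMA-ES need no gradient at all — they sample candidates around a mean, keep the best, and recenter. The sketch below is a deliberately stripped-down toy (the function and parameter names are my own, and real CMA-ES additionally adapts a full covariance matrix and step size online), but it shows the derivative-free loop:

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    # toy objective standing in for a backtest fitness score
    return float(np.sum(x ** 2))

def simple_es(f, x0, sigma=0.5, pop=20, elite=5, iters=200):
    """Minimal (mu, lambda) evolution strategy: sample offspring around
    the current mean, keep the elite, recenter. No gradients needed.
    CMA-ES refines this by adapting a covariance matrix and step size."""
    mean = np.asarray(x0, dtype=float)
    for _ in range(iters):
        offspring = mean + sigma * rng.standard_normal((pop, mean.size))
        scores = np.array([f(c) for c in offspring])
        best = offspring[np.argsort(scores)[:elite]]
        mean = best.mean(axis=0)
        sigma *= 0.98  # crude fixed decay; CMA-ES adapts this automatically
    return mean

x = simple_es(sphere, [3.0, -2.0])
print(sphere(x))  # converges close to 0
```

Because nothing here requires the objective to be smooth or differentiable, the same loop works when "fitness" is a ranked backtest statistic that gradient descent can’t touch.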

Out of curiosity: what metric are you using to evaluate fitness?

I recently shared some thoughts on the limitations of R² and MSE, especially when the true goal is ranking stocks:

👉 For ML Feature Selection, Can a Larger MSE Actually Mean a Better Model?

That post didn’t generate much discussion, but I think it’s relevant here. I doubt you’re using R² or MSE — and that might be part of why your results are working so well.
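To make the point concrete, here is a tiny made-up example (the numbers are invented for illustration, not from any backtest): one prediction has a much smaller MSE but scrambles the ordering, while another is wildly miscalibrated in level yet ranks the stocks perfectly — exactly what matters if you’re buying the top bucket:

```python
import numpy as np

def mse(y, p):
    return float(np.mean((y - p) ** 2))

def spearman(y, p):
    # rank-transform both series, then take the Pearson correlation
    ry = y.argsort().argsort()
    rp = p.argsort().argsort()
    return float(np.corrcoef(ry, rp)[0, 1])

y = np.array([1.0, 2.0, 3.0])       # "true" future returns
a = np.array([1.2, 0.9, 3.1])       # low MSE, but ordering of first two is wrong
b = np.array([11.0, 12.0, 13.0])    # huge MSE (level is off by 10), perfect ordering

print(mse(y, a), spearman(y, a))    # small error, rank corr 0.5
print(mse(y, b), spearman(y, b))    # large error, rank corr 1.0
```

A rank-based fitness metric (Spearman, or a top-bucket return) would prefer model b; MSE would prefer model a.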

Also, if anyone happens to know — I’d be curious what metric P123’s grid search is currently using. R² seems like the default, but I haven’t seen this explicitly confirmed.

Whatever Wycliffes is using might offer some useful guidance for future improvements to the AI/ML module.

Very advanced stuff — thanks again for sharing it.
