LightGBM random seeds may be handled in a few ways (illustrated in the sketch after this list):
- Provide fixed values for data_random_seed and feature_fraction_seed. LightGBM gives precedence to directly specified seeds.
- Provide a fixed value for random_state, which LightGBM uses internally to populate the other seeds, including data_random_seed and feature_fraction_seed.
- Omit seeds entirely from the hyperparameters. This is the default behavior of LightGBM on Portfolio123 and yields different output on each training run.
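As a concrete illustration of the three options, here is a minimal sketch using the open-source LightGBM Python API. The parameter names (data_random_seed, feature_fraction_seed, and random_state as an alias of LightGBM's seed) come from LightGBM's documentation; the synthetic data and the feature_fraction setting, which is needed so the seeds actually influence training, are assumptions made for the example and are not part of the Portfolio123 setup.

```python
import numpy as np
import lightgbm as lgb

# Synthetic regression data, purely for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = X @ rng.normal(size=10) + rng.normal(scale=0.1, size=500)

# feature_fraction < 1 so the seed choices actually affect training.
base = {"objective": "regression", "feature_fraction": 0.8, "verbosity": -1}

# Option 1: set the individual seeds directly; these take precedence.
params_direct = {**base, "data_random_seed": 42, "feature_fraction_seed": 42}

# Option 2: set only random_state (an alias of LightGBM's seed parameter);
# LightGBM derives data_random_seed, feature_fraction_seed, etc. from it.
params_single = {**base, "random_state": 42}

# Option 3: omit seeds entirely. Portfolio123 treats this as "randomize each
# training"; plain LightGBM would instead fall back to its fixed built-in
# default seeds, so running this locally only stands in for that behavior.
params_unseeded = dict(base)

for name, params in [("direct seeds", params_direct),
                     ("random_state only", params_single),
                     ("no seeds", params_unseeded)]:
    booster = lgb.train(params, lgb.Dataset(X, label=y), num_boost_round=20)
    print(name, booster.predict(X[:3]))
```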
The default randomness is intentional and can be useful for exploring how robust an algorithm is to an AI Factor's features. To support this use case, the system allows the same model to be added and trained multiple times on the same AI Factor. (Fixed seeds are probably preferable in Grids, however, since differing seeds can obscure the results when the goal is to tune hyperparameters.)