I think this is a great feature!!!
But P123's Target Information Regression does not work for me for some reason.
When I take my DataMiner download, with pretty much the same features, and run it through scikit-learn's mutual_info_regression, I get different answers that work for me and make sense.
You can see for yourself if you have any data. Here is my post on this: Target Information Regression seems to work well with DataMiner Downloads but not as expected with the AI Factor Beta - #4
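One quick way to quantify whether two sets of MI scores really disagree, rather than just being small reorderings, is to rank-correlate them. This is only a sketch; the numbers below are made-up placeholders, not real output from either tool:
from scipy.stats import spearmanr
# Placeholder MI scores for the same three factors from each tool (hypothetical)
p123_scores = [0.012, 0.034, 0.005]
sklearn_scores = [0.031, 0.008, 0.027]
# A rho near 1 means the two tools rank the factors the same way
rho, pval = spearmanr(p123_scores, sklearn_scores)
print(f'Spearman rho: {rho:.2f} (p = {pval:.2f})')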
I have no access to P123's code, so I cannot tell why there is a difference, or why I need to use a Jupyter Notebook to get answers that make sense to me.
Here is a repeat of the code, which you can use to see for yourself whether there is a significant difference:
from sklearn.feature_selection import mutual_info_regression
import numpy as np
import pandas as pd
# List of factor column names from the CSV (fill in the features you want to test)
factors = []
# Read data
df8 = pd.read_csv('~/Desktop/DataMiner/xs/DM8xs.csv')
# Copy the data for training
df_train = df8.copy() # .copy() avoids pandas' SettingWithCopyWarning
# Drop rows with missing values in any of the factors or target column
df_train.dropna(subset=factors + ['ExcessReturn'], inplace=True)
# Define features (X) and target (Y)
X = df_train[factors]
Y = df_train['ExcessReturn']
# Compute mutual information scores
# (the estimate depends on n_neighbors, and fixing random_state makes repeated runs match)
mi_scores = mutual_info_regression(X, Y, n_neighbors=3, random_state=42)
# Display the mutual information scores, largest first
mi_scores_df = pd.DataFrame({'Feature': factors, 'MI Score': mi_scores})
print(mi_scores_df.sort_values('MI Score', ascending=False))
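And if you do not have any data handy, here is a minimal synthetic-data sanity check (my own sketch, nothing to do with P123's code). Only the first feature drives the target, so mutual_info_regression should score it well above the two pure-noise features:
import numpy as np
from sklearn.feature_selection import mutual_info_regression
# Synthetic data: only the first column carries information about the target
rng = np.random.default_rng(42)
X_test = rng.normal(size=(1000, 3))
y_test = X_test[:, 0] ** 2 + 0.1 * rng.normal(size=1000)
# Expect the first score to be clearly positive and the other two near zero
print(mutual_info_regression(X_test, y_test, n_neighbors=3, random_state=42))
Note that the estimator behind mutual_info_regression is nearest-neighbor based, so the scores shift slightly from run to run unless random_state is fixed.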
Jim