Are these the coefficients of a simple multivariate regression presented in bar-graph form?

Yes. That would need nested cross-validation.
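Something like this is what I understand nested cross-validation to mean (just a sketch on synthetic data; the param_grid and fold counts are placeholder choices of mine, not anything P123 actually uses):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import GridSearchCV, KFold, cross_val_score

X, y = make_regression(n_samples=200, n_features=20, noise=0.1, random_state=0)

# Inner loop tunes hyperparameters; outer loop gives an unbiased performance estimate.
inner_cv = KFold(n_splits=3, shuffle=True, random_state=0)
outer_cv = KFold(n_splits=5, shuffle=True, random_state=0)

param_grid = {"max_features": ["sqrt", "log2", 1.0]}  # illustrative grid only
search = GridSearchCV(ExtraTreesRegressor(random_state=0), param_grid, cv=inner_cv)

scores = cross_val_score(search, X, y, cv=outer_cv)
print(scores.mean(), scores.std())
```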

Extra Trees Regressor seems so robust across different hyperparameters that I think I will opt out of optimizing them. Although I will probably use different defaults than those described above (without changing them afterward).

In fact, I am doing just RFE now (without doing anything with the hyperparameters).
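Roughly what I have in mind, sketched on synthetic data (the fold count and step size are just placeholders I picked; the Extra Trees Regressor is left at its defaults and RFECV decides how many features survive):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.feature_selection import RFECV
from sklearn.model_selection import KFold

X, y = make_regression(n_samples=300, n_features=25, n_informative=8,
                       noise=0.1, random_state=0)

# Recursive feature elimination with the estimator left at its defaults.
estimator = ExtraTreesRegressor(random_state=0)
selector = RFECV(estimator, step=1,
                 cv=KFold(n_splits=5, shuffle=True, random_state=0))
selector.fit(X, y)

print("Features kept:", selector.n_features_)
print("Mask of kept features:", selector.support_)
```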

I hope I fully understand your post now. VERY HELPFUL!!

You are essentially duplicating SelectKBest here. SelectKBest in sklearn can use different scoring metrics, including mutual_info_regression. Available now as a menu dropdown at P123!
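For anyone following along, this is the sklearn pattern I am talking about (a minimal sketch on synthetic data; k=10 is an arbitrary choice just for illustration):

```python
from sklearn.datasets import make_regression
from sklearn.feature_selection import SelectKBest, mutual_info_regression

X, y = make_regression(n_samples=300, n_features=20, n_informative=5,
                       noise=0.1, random_state=0)

# Keep the 10 features with the highest estimated mutual information with y.
selector = SelectKBest(score_func=mutual_info_regression, k=10)
X_reduced = selector.fit_transform(X, y)

print("MI scores:", selector.scores_.round(3))
print("Kept columns:", selector.get_support(indices=True))
```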

I cannot begin to tell you what a cool thing I think that is. To start, I think I will automatically remove any features with zero mutual information. Then I will at least try some different thresholds. I can fine-tune individual features later. See the discussion of, and links to, recursive feature elimination by Pitmaster here: Max_features = "log2" vs the default - #2 by pitmaster

But SelectKBest works on the same principle, or, more accurately, it is one of the feature-elimination methods discussed in Pitmaster's link.
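To make the zero-mutual-information cutoff and the threshold sweep concrete, here is a sketch (synthetic data again, and the 0.01 / 0.05 thresholds are just numbers I picked to illustrate the idea):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import mutual_info_regression

X, y = make_regression(n_samples=300, n_features=20, n_informative=5,
                       noise=0.1, random_state=0)

# Estimate mutual information per feature and drop anything at zero,
# then sweep a few thresholds to see how many features each would keep.
mi = mutual_info_regression(X, y, random_state=0)
keep_nonzero = np.where(mi > 0)[0]
print("Features with non-zero MI:", keep_nonzero)

for threshold in (0.0, 0.01, 0.05):
    kept = int((mi > threshold).sum())
    print(f"threshold={threshold}: {kept} features kept")
```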

Very nice!

Jim