Are these the coefficients of a simple multivariate regression presented in bar graph form?

Very nice!!!!! I learned something.

I have played with k-nearest neighbors but did not realize it could be used for estimating mutual information like this.

So I have never used this, but it seems it could replace feature_importances, and as far as I know it could be better. From the documentation:

"It can be used for univariate features selection," which ties in nice with @Pitmasters link here (link it to his post): Max_features = "log2" vs the default - #2

So random forest and extra trees regressors can be improved in two ways, as Pitmaster's link above suggests:

  1. Tuning a relatively small number of hyperparameters, especially compared to, say, XGBoost.

  2. Recursive feature elimination, which could be done with this, although I am not sure mutual information is the optimal ranking criterion for it. The idea is basically to remove the feature with the least mutual information relative to the target, one at a time, until you find the best-performing model (see the sketch after this list).
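Here is a minimal sketch of that elimination loop under the same stand-in assumptions as above: each round it scores an ExtraTreesRegressor on the surviving features via cross-validation, then drops the feature with the lowest mutual information against the target.

```python
# Sketch: drop the lowest-MI feature one round at a time and track the CV score.
# Dataset and model settings are illustrative stand-ins, not a recommended setup.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.feature_selection import mutual_info_regression
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=10, n_informative=4, random_state=0)
remaining = list(range(X.shape[1]))
results = []

while remaining:
    # Cross-validated R^2 of the model on the features that survive so far.
    model = ExtraTreesRegressor(n_estimators=200, random_state=0)
    score = cross_val_score(model, X[:, remaining], y, cv=5).mean()
    results.append((score, list(remaining)))
    if len(remaining) == 1:
        break
    # Rank the surviving features by mutual information with the target
    # and drop the weakest one before the next round.
    mi = mutual_info_regression(X[:, remaining], y, random_state=0)
    remaining.pop(int(np.argmin(mi)))

best_score, best_features = max(results)
print(f"best CV R^2 = {best_score:.3f} with features {best_features}")
```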

Do those two things with an ExtraTreesRegressor and you can be done in good time.
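For the tuning half, a small grid goes a long way since extra trees have so few hyperparameters that matter. A minimal sketch with GridSearchCV on the same stand-in data; the grid values are illustrative, not recommendations:

```python
# Sketch: tune the handful of ExtraTreesRegressor hyperparameters that
# usually matter, including the max_features choice from the linked thread.
from sklearn.datasets import make_regression
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=500, n_features=10, n_informative=4, random_state=0)

param_grid = {
    "n_estimators": [100, 300],
    "max_features": ["sqrt", "log2", None],
    "min_samples_leaf": [1, 5, 20],
}
search = GridSearchCV(ExtraTreesRegressor(random_state=0), param_grid, cv=5)
search.fit(X, y)
print("best params:", search.best_params_)
print(f"best CV R^2 = {search.best_score_:.3f}")
```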

That is a little quicker than keras III, it seems, which is still running (sorry about that; I did not know).

Anyway, I learned something. It seems really cool and I cannot wait to use it!!!!!

Jim