ML Integration Update

In addition to helping enisbe, letting members upload trained programs for prediction would require no processing time from P123 (or AWS).

Assuming people who use the API would not stick with just the API, they could find their own compute resources to train their models and then integrate those models into what you are planning, assuming some sort of Python interface.
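To make the idea concrete, here is a minimal sketch of that workflow in TensorFlow/Keras: train a model on your own machine, save it to a file, and reload it for prediction. The data is synthetic and the filename `my_model.keras` is just a placeholder, since nothing here depends on any actual P123 interface.

```python
import numpy as np
from tensorflow import keras

# Train on your own machine (or Colab) with your own data.
# Synthetic stand-in data: 64 rows, 3 features, a known linear target.
X = np.random.rand(64, 3).astype("float32")
y = X @ np.array([[1.0], [2.0], [3.0]], dtype="float32")

model = keras.Sequential([keras.Input(shape=(3,)), keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, verbose=0)

# The saved file is what you would hand to a hypothetical upload interface.
model.save("my_model.keras")

# Prediction from a saved model is cheap compared with training,
# which is why serving uploaded models would cost P123 so little.
restored = keras.models.load_model("my_model.keras")
preds = restored.predict(X[:5], verbose=0)
```

The point of the round trip through `save`/`load_model` is that the expensive step (training) happens entirely on the member's hardware; the host only ever runs the cheap `predict` call.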

It seems like enisbe might have a good suggestion.

But are you going to have a Python interface? Perhaps one without the TensorFlow library installed? No TensorFlow libraries, but possibly your own: “having our own ML libraries in our system….”

I look forward to seeing what you and the people you are working with have in mind.

Jim

Whatever works. I am not personally motivated to see P123 use neural nets. This is an interesting topic, and others have shown interest in it. Enisbe seems to use it for profit.

For individuals, Colab offers a significant upgrade in resources for $10 per month. The free version claims to provide access to GPUs (Graphics Processing Units), but I do not find it to be faster than my MacBook Pro. And make no mistake, it is an old MacBook Pro: a 2-core 2015 model.
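If you want to check what hardware a Colab session (free or paid) is actually giving you, TensorFlow can report the devices it sees. This is a quick diagnostic, not a benchmark; on the free tier the GPU list may well be empty depending on availability.

```python
import tensorflow as tf

# List any GPUs TensorFlow can see in this session.
# On the free Colab tier this may be empty or a single shared GPU.
gpus = tf.config.list_physical_devices("GPU")
print("GPUs visible to TensorFlow:", gpus)

# Whether this TensorFlow build was compiled with CUDA support at all.
print("Built with CUDA support:", tf.test.is_built_with_cuda())
```

If the list comes back empty, training runs on CPU, which would explain the free tier feeling no faster than an old laptop.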

Here is the link to Colab: Colab

Colab IS TensorFlow and Google. It was probably created to help a new generation learn TensorFlow so that Google (which created TensorFlow) can recruit people who already use it. So you would expect Colab to have some solution that works for most people.

I have been able to create slow models that never finish with a wide variety of ML algorithms. I am pretty good at that.

In my experience, the most important factor in an otherwise standardized neural-net model is the optimization method. Plain “Stochastic Gradient Descent” is the easiest way to create a neural-net program that never finishes running.

Some books tout Stochastic Gradient Descent, and I can see its advantages, but I would generally recommend sticking with Nadam or Adam. Nadam is an adaptive algorithm that is effective at adjusting the learning rate: going fast when it can and slower when it needs to.
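In Keras, swapping optimizers is a one-line change at `compile` time, which makes it easy to see the difference yourself. A minimal sketch with synthetic data (the architecture and learning rates here are illustrative, not a recommendation for any particular P123 model):

```python
import numpy as np
from tensorflow import keras

# Toy regression data: a stand-in for whatever features/labels you train on.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 10)).astype("float32")
y = (X.sum(axis=1, keepdims=True)
     + rng.normal(scale=0.1, size=(256, 1))).astype("float32")

def make_model():
    return keras.Sequential([
        keras.Input(shape=(10,)),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(1),
    ])

# Nadam (Adam with Nesterov momentum) adapts per-parameter step sizes,
# so it tends to converge with far less hand-tuning than plain SGD.
model = make_model()
model.compile(optimizer=keras.optimizers.Nadam(learning_rate=1e-3), loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# Plain SGD with a poorly chosen fixed learning rate is the classic
# "never finishes" failure mode: training crawls or diverges.
slow = make_model()
slow.compile(optimizer=keras.optimizers.SGD(learning_rate=1e-4), loss="mse")
```

The only difference between the fast and slow setups is the `optimizer=` argument, which is exactly why the choice of optimization method matters more than most other knobs on an otherwise standardized model.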

In my experience, most people are more aware of deep learning than of boosting, and most associate deep learning with AI. Neural nets will be a marketing tool if you can provide them, even if they may not be better than XGBoost for most models.

But it is also true that people can already develop a TensorFlow model with the API, without paying the ten dollars a month at Colab, in my experience.

I think TensorFlow can be made to work on a variety of systems if you want to use it for marketing. But it is not that hard to create a model that never finishes running, with any ML algorithm.

FWIW.

Jim