
Automated machine learning or AutoML explained

The two biggest barriers to the application of machine learning (both classical machine learning and deep learning) are skills and computing resources. You can solve the second problem by throwing money at it, either by buying accelerated hardware such as computers with high-end GPUs, or by renting computing resources in the cloud, such as instances with attached GPUs, TPUs, and FPGAs.

On the other hand, solving the skills problem is harder. Data scientists often command hefty salaries and may still be hard to recruit. Google was able to train many of its employees on its own TensorFlow framework, but most firms barely have people skilled enough to build machine learning and deep learning models themselves, much less teach others how.

What is AutoML?

Automated machine learning, or AutoML, aims to reduce or eliminate the need for skilled data scientists to build machine learning and deep learning models. Instead, an AutoML system lets you supply labeled training data as input and receive an optimized model as output.

There are several ways of going about this. One approach is for the software to simply train every kind of model on the data and pick the one that works best. A refinement of this would be for it to build one or more ensemble models that combine the other models, which sometimes (but not always) gives better results.
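As a concrete illustration, here is a minimal sketch of that "train several candidates, keep the best, and optionally ensemble them" idea using scikit-learn. The Iris dataset and the particular list of candidate models are illustrative choices for the example, not what any specific AutoML product does.

```python
# Minimal sketch: score several candidate models, keep the best,
# and optionally combine them into an ensemble (scikit-learn).
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

candidates = {
    "logreg": LogisticRegression(max_iter=1000),
    "svm": SVC(probability=True),
    "forest": RandomForestClassifier(n_estimators=100),
}

# Score every candidate with cross-validation and keep the best one.
scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}
best_name = max(scores, key=scores.get)
print("best single model:", best_name, scores[best_name])

# Optionally combine the candidates into an ensemble, which sometimes
# (but not always) beats the best single model.
ensemble = VotingClassifier(list(candidates.items()), voting="soft")
print("ensemble:", cross_val_score(ensemble, X, y, cv=5).mean())
```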

A second technique is to optimize the hyperparameters (explained below) of the best model or models to train an even better model. Feature engineering (also explained below) is a valuable addition to any model training. One way of de-skilling deep learning is to apply transfer learning, essentially customizing a well-trained general model for specific data.

What is hyperparameter optimization?

All machine learning models have parameters, meaning the weights for each variable or feature in the model. These are normally determined by back-propagation of the errors, plus iteration under the control of an optimizer such as stochastic gradient descent.

Most machine learning models also have hyperparameters that are set outside of the training loop. These often include the learning rate, the dropout rate, and model-specific parameters such as the number of trees in a random forest.

Hyperparameter tuning, or hyperparameter optimization (HPO), is an automatic way of sweeping or searching through one or more of the hyperparameters of a model to find the set that results in the best trained model. This can be time-consuming, since you need to train the model again (the inner loop) for each set of hyperparameter values in the sweep (the outer loop). If you train many models in parallel, you can reduce the time required at the expense of using more hardware.
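Here is a minimal hyperparameter sweep sketch using scikit-learn's RandomizedSearchCV; the random forest, the parameter ranges, and the Iris dataset are illustrative assumptions for the example.

```python
# Minimal sketch of a hyperparameter sweep with RandomizedSearchCV.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = load_iris(return_X_y=True)

param_distributions = {
    "n_estimators": [50, 100, 200, 400],   # number of trees
    "max_depth": [None, 4, 8, 16],
    "min_samples_leaf": [1, 2, 4],
}

# The outer loop samples hyperparameter sets; the inner loop retrains
# the model (here with 5-fold cross-validation) for each sampled set.
# n_jobs=-1 trains candidates in parallel to cut wall-clock time.
search = RandomizedSearchCV(
    RandomForestClassifier(),
    param_distributions,
    n_iter=20,
    cv=5,
    n_jobs=-1,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```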

What is feature engineering?

A feature is an individual measurable property or characteristic of a phenomenon being observed. The concept of a “feature” is related to that of an explanatory variable, which is used in statistical techniques such as linear regression. A feature vector combines all of the features for a single row into a numerical vector. Feature engineering is the process of finding the best set of variables and the best data encoding and normalization for input to the model training process.

Part of the art of selecting features is to pick a minimum set of independent variables that explain the problem. If two variables are highly correlated, either they need to be combined into a single feature, or one should be dropped. Sometimes people perform principal component analysis (PCA) to convert correlated variables into a set of linearly uncorrelated variables.
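Here is a minimal PCA sketch with scikit-learn, run on synthetic data with two highly correlated columns; the data and the two-component choice are purely illustrative.

```python
# Minimal sketch: replace correlated columns with linearly
# uncorrelated principal components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = 0.9 * x1 + rng.normal(scale=0.1, size=200)  # highly correlated with x1
X = np.column_stack([x1, x2])

# Standardize first so one column's scale doesn't dominate the components.
X_scaled = StandardScaler().fit_transform(X)
pca = PCA(n_components=2)
components = pca.fit_transform(X_scaled)
print(pca.explained_variance_ratio_)  # most variance lands on component 1
```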

To use categorical data for machine classification, you need to encode the text labels into another form. There are two common encodings.

One is label encoding, which means that each text label value is replaced with a number. The other is one-hot encoding, which means that each text label value is turned into a column with a binary value (1 or 0). Most machine learning frameworks have functions that do the conversion for you. In general, one-hot encoding is preferred, as label encoding can sometimes confuse the machine learning algorithm into thinking that the encoded column is ordered.
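A minimal sketch of both encodings, using pandas and scikit-learn; the color column is an illustrative example.

```python
# Minimal sketch of label encoding versus one-hot encoding.
import pandas as pd
from sklearn.preprocessing import LabelEncoder

colors = pd.DataFrame({"color": ["red", "green", "blue", "green"]})

# Label encoding: each text value becomes an integer (implies an order).
labels = LabelEncoder().fit_transform(colors["color"])
print(labels)  # e.g. [2 1 0 1]

# One-hot encoding: each text value becomes its own 0/1 column.
onehot = pd.get_dummies(colors["color"])
print(onehot)
```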

To use numeric data for machine regression, you usually need to normalize the data. Otherwise, the numbers with larger ranges might tend to dominate the Euclidean distance between feature vectors, their effects could be magnified at the expense of the other fields, and the steepest-descent optimization might have difficulty converging. There are a number of ways to normalize and standardize data for machine learning, including min-max normalization, mean normalization, standardization, and scaling to unit length. This process is often called feature scaling.
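A minimal feature scaling sketch with scikit-learn; the tiny income/age array is made up for illustration.

```python
# Minimal sketch of two common feature scaling methods.
import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

X = np.array([[50_000.0, 25.0],
              [82_000.0, 40.0],
              [120_000.0, 61.0]])  # columns: income, age

# Min-max normalization: rescale each column to the [0, 1] range.
print(MinMaxScaler().fit_transform(X))

# Standardization: rescale each column to zero mean and unit variance.
print(StandardScaler().fit_transform(X))
```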

Some of the transformations that people use to construct new features or reduce the dimensionality of feature vectors are simple. For instance, subtract Year of Birth from Year of Death and you construct Age at Death, which is a prime independent variable for lifetime and mortality analysis. In other cases, feature construction may not be so obvious.
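The Age at Death construction might look like this in pandas; the column names and values are made up for illustration.

```python
# Minimal sketch of a simple constructed feature.
import pandas as pd

people = pd.DataFrame({
    "year_of_birth": [1890, 1921, 1950],
    "year_of_death": [1975, 2001, 2020],
})
people["age_at_death"] = people["year_of_death"] - people["year_of_birth"]
print(people)
```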

What is transfer learning?

Transfer learning is sometimes called custom machine learning, and sometimes called AutoML (mostly by Google). Rather than starting from scratch when training models from your data, Google Cloud AutoML implements automatic deep transfer learning (meaning that it starts from an existing deep neural network trained on other data) and neural architecture search (meaning that it finds the right combination of extra network layers) for language pair translation, natural language classification, and image classification.

That’s a different process than what’s usually meant by AutoML, and it doesn’t cover as many use cases. On the other hand, if you need a customized deep learning model in a supported area, transfer learning will often produce a superior model.
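As a rough illustration of what deep transfer learning looks like in code (not Google Cloud AutoML itself), here is a Keras sketch that starts from an ImageNet-trained backbone, freezes it, and trains a new classification head on your own data. The MobileNetV2 backbone, the input size, and the five-class head are assumptions for the example.

```python
# Minimal sketch of deep transfer learning with Keras.
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # keep the pretrained weights fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(5, activation="softmax"),  # your own classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=5)  # supply your own data
```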

AutoML implementations

There are many implementations of AutoML that you can try. Some are paid services, and some are free source code. The lists below are by no means complete or final.

AutoML services

All of the big three cloud services have some kind of AutoML. Amazon SageMaker does hyperparameter tuning, but doesn’t automatically try multiple models or perform feature engineering. Azure Machine Learning has both AutoML, which sweeps through features and algorithms, and hyperparameter tuning, which you typically run on the best algorithm chosen by AutoML. Google Cloud AutoML, as I discussed earlier, is deep transfer learning for language pair translation, natural language classification, and image classification.

A number of smaller firms offer AutoML services as well. For instance, DataRobot, which claims to have invented AutoML, has a strong reputation in the market. And while dotData has a tiny market share and a mediocre UI, it has strong feature engineering capabilities and covers many enterprise use cases. H2O.ai Driverless AI, which I have reviewed, can help a data scientist turn out models like a Kaggle master, doing feature engineering, algorithm sweeps, and hyperparameter optimization in a unified way.

AutoML frameworks

AdaNet is a lightweight TensorFlow-based framework for automatically learning high-quality models with minimal expert intervention. Auto-Keras is an open-source software library for automated machine learning, built at Texas A&M, that provides functions to automatically search for architectures and hyperparameters of deep learning models. NNI (Neural Network Intelligence) is a toolkit from Microsoft to help users design and tune machine learning models (e.g., hyperparameters, neural network architectures, or a complex system’s parameters) in an efficient and automatic way.
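As a taste of the framework style, here is a minimal Auto-Keras sketch; the MNIST dataset, the max_trials setting, and the epoch count are illustrative, and the exact API may differ between Auto-Keras versions.

```python
# Minimal sketch of an Auto-Keras architecture/hyperparameter search.
import autokeras as ak
from tensorflow.keras.datasets import mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()

clf = ak.ImageClassifier(max_trials=3)   # try a few candidate architectures
clf.fit(x_train, y_train, epochs=5)
print(clf.evaluate(x_test, y_test))
```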

