
Introducing Gradient Boosting Machines

The Gradient Boosting Machine (GBM) is currently one of the state-of-the-art algorithms in machine learning. GBMs can be used for regression and classification, typically with decision trees as base learners. The learning procedure consecutively fits new models to provide a progressively more accurate estimate of the response variable. The principal idea behind the algorithm is to construct each new base learner to be maximally correlated with the negative gradient of the loss function associated with the whole ensemble. The loss function can be arbitrary, but to build intuition: if it is the classic squared-error loss, the learning procedure reduces to consecutively fitting the residual errors. Read more at: http://www.dataminingblog.com/
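To make the squared-error case concrete, here is a minimal sketch of gradient boosting in NumPy. The function names (`fit_stump`, `gbm_fit`, `gbm_predict`) and the one-split decision stump as base learner are illustrative choices, not part of any particular library: each round fits a stump to the current residuals, which are exactly the negative gradient of the squared-error loss.

```python
import numpy as np

def fit_stump(x, target):
    """Fit a one-split decision stump by exhaustive threshold search."""
    best_err, best = np.inf, None
    for t in np.unique(x)[:-1]:
        left, right = target[x <= t], target[x > t]
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if err < best_err:
            best_err, best = err, (t, left.mean(), right.mean())
    t, lm, rm = best
    return lambda z: np.where(z <= t, lm, rm)

def gbm_fit(x, y, n_rounds=50, lr=0.1):
    """Gradient boosting with squared-error loss: each stump fits the residuals."""
    f0 = y.mean()                        # initial constant model
    pred = np.full_like(y, f0)
    stumps = []
    for _ in range(n_rounds):
        residual = y - pred              # negative gradient of 0.5 * (y - f)^2
        stump = fit_stump(x, residual)
        pred = pred + lr * stump(x)      # shrink each step by the learning rate
        stumps.append(stump)
    return f0, stumps, lr

def gbm_predict(model, x):
    f0, stumps, lr = model
    pred = np.full(len(x), f0)
    for s in stumps:
        pred = pred + lr * s(x)
    return pred
```

With a step-shaped target, each round multiplies the residual by (1 - lr), so fifty rounds fit the data almost exactly; real implementations add subsampling, deeper trees, and regularization on top of this core loop.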
