Implementation of AdaBoost classifier

Description

This is an implementation of the AdaBoost algorithm for two-class classification problems. The algorithm sequentially applies a weak classifier to reweighted versions of the training data. By increasing the weights of the misclassified observations, each weak learner focuses on the errors of the previous one. The final prediction is a weighted majority vote of all the weak learners.

Methods

AdaBoost algorithm:
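The repository contains the full code; the snippet below is only a minimal sketch of the boosting loop, following AdaBoost.M1 as presented in Algorithm 10.1 of The Elements of Statistical Learning. It assumes scikit-learn decision stumps as the weak learners and class labels encoded as {-1, +1}; the function name `adaboost_fit_predict` and its signature are illustrative, not the repository's actual API.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def adaboost_fit_predict(X_train, y_train, X_test, n_iterations=50):
    """Discrete AdaBoost (AdaBoost.M1) for labels in {-1, +1}."""
    n = X_train.shape[0]
    w = np.full(n, 1.0 / n)                    # start with uniform observation weights
    train_score = np.zeros(n)
    test_score = np.zeros(X_test.shape[0])

    for _ in range(n_iterations):
        # Weak learner: a depth-1 decision tree (stump) fit to the weighted data
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X_train, y_train, sample_weight=w)
        pred_train = stump.predict(X_train)
        pred_test = stump.predict(X_test)

        # Weighted misclassification error of this weak learner
        miss = (pred_train != y_train).astype(float)
        err = np.dot(w, miss) / w.sum()
        err = np.clip(err, 1e-10, 1 - 1e-10)   # guard against log(0) and division by zero

        # Classifier weight: more accurate stumps get a larger say in the vote
        alpha = np.log((1.0 - err) / err)

        # Increase the weights of the misclassified observations, then renormalize
        w *= np.exp(alpha * miss)
        w /= w.sum()

        # Accumulate the weighted votes of the weak learners
        train_score += alpha * pred_train
        test_score += alpha * pred_test

    # Final prediction: sign of the weighted majority vote
    return (np.where(train_score >= 0, 1.0, -1.0),
            np.where(test_score >= 0, 1.0, -1.0))
```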

Example

Using the Hastie (10.2) simulated dataset, we observe a significant reduction in the test error rate as the number of boosting iterations increases.
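A minimal sketch of that experiment, assuming scikit-learn's `make_hastie_10_2` generator for the Section 10.2 data and the hypothetical `adaboost_fit_predict` helper sketched above:

```python
import numpy as np
from sklearn.datasets import make_hastie_10_2
from sklearn.model_selection import train_test_split

# ESL Section 10.2 data: ten standard Gaussian features, y = +1 when the sum of
# squared features exceeds the chi-squared(10) median, and -1 otherwise
X, y = make_hastie_10_2(n_samples=12000, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=1)

for n_iter in (1, 10, 50, 100, 400):
    _, test_pred = adaboost_fit_predict(X_train, y_train, X_test, n_iterations=n_iter)
    print(f"iterations={n_iter:4d}  test error rate={np.mean(test_pred != y_test):.3f}")
```

Each setting refits the ensemble from scratch, which is wasteful but keeps the example self-contained.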

References

  • Trevor Hastie, Robert Tibshirani, and Jerome Friedman, The Elements of Statistical Learning, Springer.
