Error Awareness by Lower and Upper Bounds in Ensemble Learning

    Ensemble learning systems could lower down the risk of overfitting that often appears in a single learning model. Different to those ensemble learning approaches by re-sampling, negative correlation learning trains all learners in an ensemble simultaneously and cooperatively. However, overfitting had sometimes been observed in negative correlation learning. Two error bounds are therefore introduced into negative correlation learning for preventing overfitting. One is the upper bound of error output (UBEO) which divides the training data into two groups based on the distances between the data and the formed decision boundary. The other is the lower bound of error rate (LBER) which is set as a learning switch. Before the performance measured by error rates is higher than LBER, negative correlation learning is applied on the whole training set. As soon as the performance is lower than LBER, negative correlation learning will only be applied to the group of data whose distances to the current decision boundary are within the range of UBEO. The other group of data outside of this range will not be learned anymore. Further learning on the data points in the later group would make the learned decision boundary too complex to classify the unseen data well. Experimental results would explore how LBER and UBEO would lead negative correlation learning towards a robust decision boundary.