Suykens, J. Vandewalle, and B. De Moor. Intelligence and cooperative search by coupled local minimizers. International Journal of Bifurcation and Chaos, 11(8).
Tapia, J. Gonzalez, and J. A generalized class of boosting algorithms based on recursive decoding models.
Committee machines. In Handbook on Neural Network Signal Processing. CRC Press.
Volker Tresp. Mixtures of Gaussian processes.
Viola and M. Robust real-time object recognition. ICCV.
Walker, O. Rambow, and M. SPoT: A trainable sentence planner.
Wickramaratna, S. Holden, and B. Performance degradation in boosting.
Yan and D. Critic-driven ensemble classification via a learning method akin to boosting.
A big mistake concerning boosting.
Zemel and T. A gradient-based boosting algorithm for regression problems. MIT Press.
T. Statistical behavior and consistency of classification methods based on convex risk minimization.
E. Allwein, R. Schapire, and Y. Reducing multiclass to binary: A unifying approach for margin classifiers. Journal of Machine Learning Research.
K. Bennett, A. Demiriz, and J. A column generation algorithm for boosting. Morgan Kaufmann.
L. Some infinite theory for predictor ensembles.
Collins, R. Logistic regression, AdaBoost and Bregman distances.
H. Effect of pruning and early stopping on performance of a boosted ensemble.
Duffy and D. Leveraging for regression.
Potential boosters? In Solla, T. Leen, and K., editors. MIT Press.
Escudero, L. Boosting applied to word sense disambiguation.
D. Freitag and N. Boosted wrapper induction.
Freund and R.
Friedman, T. Hastie and R. The Annals of Statistics, 38(2).
Huang, Z. Zhou, H. Zhang, and T. Pose invariant face recognition.
Iyer, D. Lewis, R. Schapire, Y. Singer, and A. Boosting for document routing.
W. Does boosting overfit: Views from an exact solution.
Is regularization unnecessary for boosting?
On weak base hypotheses and their implications for boosting regression and classification.
Process consistency for AdaBoost.
Some results on weakly accurate base learners for boosting regression and classification.
N-tuple network, CART and bagging. Neural Computation, 12(2).
Koltchinskii and D. Bounding the generalization error of neural networks and combined classifiers.
Empirical margin distributions and bounding the generalization error of combined classifiers. Submitted.
Bounding the generalization error of convex combinations of classifiers: Balancing the dimensionality and the margins.
Liu, X. Yao, and T. Evolutionary ensembles with negative correlation learning.