Article reference:

M. Egmont-Petersen, W.R.M. Dassen, J.H.C. Reiber. "Sequential selection of discrete features for neural networks - a Bayesian approach to building a cascade," Pattern Recognition Letters, Vol. 20, No. 11-13, pp. 1439-1448, 1999.


A feature-selection procedure successively removes features one by one from a statistical classifier via an iterative backward search, so each classifier uses a smaller feature subset than the classifier in the previous iteration. The classifiers are then combined into a cascade: each classifier in the cascade classifies the cases to which it can assign a reliable class label, and the remaining cases are propagated to the next classifier, which also uses the value of an additional feature. Experiments demonstrate the feasibility of building cascades of classifiers (neural networks for the prediction of atrial fibrillation (AF)) using this backward feature-selection scheme.
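A minimal sketch of the idea, not the paper's method: it assumes binary features, substitutes a simple empirical-count classifier for the paper's neural networks, and uses a hypothetical fixed confidence threshold tau to decide when a stage's label is "reliable" enough to stop propagating a case.

```python
from collections import Counter, defaultdict
from itertools import product

def fit_table(X, y, feats):
    """Empirical class counts for each value combination of the chosen features."""
    table = defaultdict(Counter)
    for xi, yi in zip(X, y):
        table[tuple(xi[f] for f in feats)][yi] += 1
    return table

def posterior(table, x, feats):
    """Most likely class and its empirical posterior for one case (None if unseen)."""
    cell = table.get(tuple(x[f] for f in feats))
    if not cell:
        return None, 0.0
    label, n = cell.most_common(1)[0]
    return label, n / sum(cell.values())

def accuracy(table, X, y, feats):
    return sum(posterior(table, xi, feats)[0] == yi for xi, yi in zip(X, y)) / len(y)

def backward_search(X, y, feats):
    """Iterative backward search: drop the feature whose removal hurts
    accuracy least; return the nested subsets, largest first."""
    subsets, current = [list(feats)], list(feats)
    while len(current) > 1:
        current = max(
            ([f for f in current if f != drop] for drop in current),
            key=lambda sub: accuracy(fit_table(X, y, sub), X, y, sub),
        )
        subsets.append(current)
    return subsets

def build_cascade(X, y, subsets):
    """Order the stages from the smallest feature subset to the largest."""
    return [(fit_table(X, y, sub), sub) for sub in reversed(subsets)]

def cascade_classify(stages, x, tau=0.95):
    """Accept a stage's label only if its posterior reaches tau; otherwise
    propagate the case to the next stage, which uses one more feature."""
    label = None
    for table, feats in stages:
        label, conf = posterior(table, x, feats)
        if label is not None and conf >= tau:
            return label, feats
    return label, stages[-1][1]  # fall back to the last (full-feature) stage

# Toy data (an assumption, not from the paper): class = x0 OR x1; feature 2 is noise.
X = [list(bits) for bits in product([0, 1], repeat=3)] * 4
y = [xi[0] | xi[1] for xi in X]

stages = build_cascade(X, y, backward_search(X, y, [0, 1, 2]))
label, used = cascade_classify(stages, [1, 0, 1])
```

On this toy data the case [1, 0, 1] is not reliably classifiable from feature 1 alone, so it is propagated to the second stage, which uses features 0 and 1 and assigns the label with full confidence.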

For an electronic reprint, contact me: michael * (with * indicating @)