%GMMB_CLASSIFY Classify data using a Bayesian or Mahalanobis distance classifier.
%
% T = GMMB_CLASSIFY(S, data, ...) Classifies D-dimensional data (N points)
%    into K classes using the Gaussian Mixture Model Bayesian classifier
%    in struct S.
%    S is a bayesS struct, see readme.txt.
%
% See also GMMB_CREATE.
%
% ***
% This is a legacy interface that is no longer developed.
% This does not use gmmb_generatehist but relies on the histS
% created during classifier training.
% ***
%
% Optional parameters:
%   values       Return numbers instead of class labels.
%                For Mahalanobis: boolean; return the Mahalanobis distances.
%                For Bayesian:  0  class labels
%                               1  posterior probabilities
%                               2  posterior likelihoods (no scaling)
%                Default 0. T is an N x K matrix, if enabled.
%
%   mahalanobis  Use Mahalanobis distances as the classification rule
%                instead of Bayesian. Boolean, default 0.
%
%   quantile     Use a 'trash class' based on the training data density
%                quantile. The function gmmb_generatehist will be called.
%                Not available with the Mahalanobis distance method.
%                Range [0, 1], default 0 (not in use).
%                If a sample goes to the trash class:
%                  class label = 0
%                  all posterior probabilities = 0
%                  all posterior likelihoods = 0
%                If values ~= 0 and a sample does not belong to a class,
%                the respective posterior will be zero.
%
% References:
%   [1] Duda, R.O., Hart, P.E., Stork, D.G., Pattern Classification,
%       2nd ed., John Wiley & Sons, Inc., 2001.
%
% Author(s):
%   Joni Kamarainen <Joni.Kamarainen@lut.fi>
%   Pekka Paalanen <pekka.paalanen@lut.fi>
%
% Copyright:
%
%   Bayesian Classifier with Gaussian Mixture Model Pdf
%   functionality is Copyright (C) 2003, 2004 by Pekka Paalanen and
%   Joni-Kristian Kamarainen.
%
% $Name: $ $Revision: 1.2 $ $Date: 2005/04/14 10:33:34 $
%

function [t] = gmmb_classify(bayesS, data_, varargin)

conf = struct(...
    'values', 0, ...
    'mahalanobis', 0, ...
    'quantile', 0);
conf = getargs(conf, varargin);

N = size(data_, 1);   % number of points; data_ is N x D
K = size(bayesS, 2);  % number of classes

% Compute per-class likelihood thresholds for the trash class
% from the training data density quantile.
if conf.quantile ~= 0
    histS = gmmb_generatehist(bayesS, 1000);
    conf_thresh = gmmb_frac2lhood(histS, conf.quantile*ones(1,K));
    clear histS;
end

if conf.mahalanobis
    %disp('Using Mahalanobis distance.');
    % Mahalanobis distance classifier:
    % squared distance to the nearest mixture component of each class.
    sqrmdist = zeros(N, K);
    for k = 1:K
        C = size(bayesS(k).mu, 2);
        sqrdist = zeros(N, C);
        for c = 1:C
            invs = inv(bayesS(k).sigma(:,:,c));
            mu = bayesS(k).mu(:,c).';
            % (x - mu) * inv(sigma) * (x - mu)' expanded for all rows of data_
            sqrdist(:,c) = sum((data_*invs).*conj(data_), 2) ...
                - data_*invs*mu' ...
                - (mu*invs*data_').' ...
                + mu*invs*mu';
        end
        sqrmdist(:,k) = min(real(sqrdist), [], 2);
    end
    if conf.values ~= 0
        t = sqrmdist;
    else
        [a, b] = min(sqrmdist, [], 2);
        t = b';
    end
else
    % GMM Bayesian classifier

    % classify all points simultaneously
    pxomega = gmmb_pdf(data_, bayesS);
    tulo = gmmb_weightprior(pxomega, bayesS);
    % tulo is the product of GMM pdf values and class a priori probabilities

    % Zero out class likelihoods of samples that do not belong
    % to the class, based on the density quantile.
    if conf.quantile ~= 0
        mask = (pxomega < repmat(conf_thresh, N, 1));
        tulo(mask) = 0;
    end

    % Compute posteriors if requested.
    if conf.values == 1
        t = gmmb_normalize(tulo);
    else
        t = tulo;
    end

    % Find the classification outcome if requested.
    % Samples that fell into none of the known classes get label 0.
    if (conf.values == 0)
        t = gmmb_decide(t);
    end
end
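A minimal usage sketch follows, assuming a bayesS struct trained with GMMB_CREATE (see readme.txt for the exact training call and struct fields). The variable names traindata, trainlabels and testdata are hypothetical, and the sketch assumes the optional parameters are passed as 'name', value pairs, which is how getargs parses them above; details may differ between toolbox versions.

% Hypothetical training step; the gmmb_create argument list shown here is
% illustrative, check GMMB_CREATE and readme.txt for your toolbox version.
S = gmmb_create(traindata, trainlabels);

labels = gmmb_classify(S, testdata);                   % class labels 1..K
post   = gmmb_classify(S, testdata, 'values', 1);      % N x K posterior probabilities
lhood  = gmmb_classify(S, testdata, 'values', 2);      % N x K unscaled likelihoods
mdist  = gmmb_classify(S, testdata, ...
                       'mahalanobis', 1, 'values', 1); % N x K Mahalanobis distances
trash  = gmmb_classify(S, testdata, 'quantile', 0.05); % label 0 marks the trash class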