Home > gmmbayestb-v1.0 > gmmb_gem.m

gmmb_gem

PURPOSE

GMMB_GEM - Greedy EM estimated GMM parameters

SYNOPSIS

function [estimate, varargout] = gmmb_gem(data, varargin);

DESCRIPTION

GMMB_GEM    - Greedy EM estimated GMM parameters
 Produces a bayesS struct without the 'apriories' field.
 This is a wrapper around the Vlassis greedy EM algorithm implementation.

 estimate = GMMB_GEM(data[, parameters])
 [estimate,stats] = GMMB_GEM(...)

 Parameters (default):
   verbose    print some progress numbers (false)
   animate    plot data points and component ellipses while the algorithm runs (false)
   Cmax       the maximum number of GMM components (ceil(min(50, N/(D*D)/3)) for N points of dimension D)
   ncand      number of candidate locations for each new component (10)
 At least 'Cmax' should be set explicitly.
 example:
    estS = gmmb_gem(data, 'Cmax', 10, 'animate', true);
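 The call above can be expanded into a short sketch; the field names and
 dimensions below are taken from the source code at the bottom of this page
 (D = data dimension, C = final number of components):

```matlab
data = randn(200, 2);               % 200 two-dimensional training points
estS = gmmb_gem(data, 'Cmax', 5);   % fit a GMM with at most 5 components
% the returned bayesS-style struct (no 'apriories' field):
%   estS.mu     - D x C matrix of component means
%   estS.sigma  - D x D x C array of component covariance matrices
%   estS.weight - C x 1 vector of mixture weights
```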

 References:
   [1] Vlassis, N., Likas, A., A Greedy EM Algorithm for Gaussian Mixture
   Learning, Neural Processing Letters 15, Kluwer Academic Publishers, 2002.
   http://carol.wins.uva.nl/~vlassis/research/learning/index_en.html

 Author(s):
    Pekka Paalanen <pekka.paalanen@lut.fi>

 Copyright:

   Bayesian Classifier with Gaussian Mixture Model Pdf
   functionality is Copyright (C) 2003 by Pekka Paalanen and
   Joni-Kristian Kamarainen.

   $Name:  $ $Revision: 1.1 $  $Date: 2004/11/02 08:32:22 $


 Logging
   parameters

      logging   What kind of logging to do:
        0 - no logging
        1 - normal logging
        2 - extra logging: store all intermediate mixtures
      If the 'stats' output parameter is defined, then 'logging'
      defaults to 1, otherwise it is forced to 0.

  the 'stats' struct:
      iterations: EM iteration count
      loglikes:   vector, one entry per iteration, of the log-likelihood
                NOTE: stores mean(log(p)), not the conventional sum(log(p))
    extra logging: (not supported yet)
      initialmix: parameters for the initial mixture
      mixtures:   parameters for all intermediate mixtures
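 Requesting the second output turns logging on, so the per-iteration
 log-likelihood trace can be inspected afterwards; a minimal sketch:

```matlab
% 'logging' defaults to 1 because the 'stats' output is requested
[estS, stats] = gmmb_gem(randn(200, 2), 'Cmax', 5);
stats.iterations        % total EM iteration count
plot(stats.loglikes);   % mean(log(p)) recorded at each iteration
```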

CROSS-REFERENCE INFORMATION

This function calls: getargs, gmmbvl_em, warning_wrap
This function is called by:

SOURCE CODE

0001 %GMMB_GEM    - Greedy EM estimated GMM parameters
0002 % Produces a bayesS struct without 'apriories'
0003 % This is just a wrapper for the Vlassis Greedy EM algorithm implementation.
0004 %
0005 % estimate = GMMB_GEM(data[, parameters])
0006 % [estimate,stats] = GMMB_GEM(...)
0007 %
0008 % Parameters (default):
0009 %   verbose    print some progress numbers (false)
0010 %   animate    plot data and ellipses during algorithm evaluation (false)
0011 %   Cmax    the maximum number of GMM components
0012 %   ncand    number of candidate locations for each new component (10)
0013 % At least Cmax should be set explicitly.
0014 % example:
0015 %    estS = gmmb_gem(data, 'Cmax', 10, 'animate', true);
0016 %
0017 % References:
0018 %   [1] Vlassis, N., Likas, A., A Greedy EM Algorithm for Gaussian Mixture
0019 %   Learning, Neural Processing Letters 15, Kluwer Academic Publishers, 2002.
0020 %   http://carol.wins.uva.nl/~vlassis/research/learning/index_en.html
0021 %
0022 % Author(s):
0023 %    Pekka Paalanen <pekka.paalanen@lut.fi>
0024 %
0025 % Copyright:
0026 %
0027 %   Bayesian Classifier with Gaussian Mixture Model Pdf
0028 %   functionality is Copyright (C) 2003 by Pekka Paalanen and
0029 %   Joni-Kristian Kamarainen.
0030 %
0031 %   $Name:  $ $Revision: 1.1 $  $Date: 2004/11/02 08:32:22 $
0032 %
0033 %
0034 % Logging
0035 %   parameters
0036 %
0037 %      logging   What kind of logging to do:
0038 %        0 - no logging
0039 %        1 - normal logging
0040 %        2 - extra logging: store all intermediate mixtures
0041 %      If the 'stats' output parameter is defined, then 'logging'
0042 %      defaults to 1, otherwise it is forced to 0.
0043 %
0044 %  the 'stats' struct:
0045 %      iterations: EM iteration count
0046 %      loglikes:   iterations long vector of the log-likelihood
0047 %                NOTE: mean(log(p)), not sum(log(p)) as it should(?)
0048 %    extra logging: (not supported yet)
0049 %      initialmix: parameters for the initial mixture
0050 %      mixtures:   parameters for all intermediate mixtures
0051 %
0052 
0053 function [estimate, varargout] = gmmb_gem(data, varargin);
0054 
0055 [N, D] = size(data);    % number of points (n), dimensions (d)
0056 
0057 % defaults
0058 conf = struct(...
0059     'verbose', 0, ...
0060     'animate', 0, ...
0061     'Cmax', ceil(min(50, N/(D*D)/3)), ...
0062     'ncand', 10, ...
0063     'logging', 0 ...
0064     );
0065 
0066 if nargout>1
0067     conf.logging = 1;
0068     varargout{1} = [];
0069 end
0070 
0071 conf = getargs(conf, varargin);
0072 
0073 C = conf.Cmax;
0074 
0075 N_limit = (D+D*(D+1)/2+1)*3;
0076 if N < N_limit
0077     warning_wrap('gmmb_gem:data_amount', ...
0078        ['Training data may be insufficient. ' ...
0079         'Have: ' num2str(N) ', recommended: >' num2str(N_limit) ...
0080         ' points.']);
0081 end
0082 
0083 
0084 if nargout<2
0085     conf.logging=0;
0086 end
0087 
0088 [W, M, R, stats] = gmmbvl_em(data, C, conf.ncand, conf.animate, ...
0089                              conf.verbose, conf.logging);
0090 
0091 Cfinal = size(R,1);
0092 sigma = zeros(D, D, Cfinal);
0093 for c = 1:Cfinal
0094     Rk = reshape(R(c,:),D,D);
0095     sigma(:,:,c) = Rk' * Rk;
0096 end
0097 
0098 estimate = struct('mu', M.',...
0099     'sigma', sigma,...
0100     'weight', W);
0101 
0102 if(conf.logging>0)
0103     varargout{1} = stats;
0104 end

Generated on Thu 14-Apr-2005 13:50:22 by m2html © 2003