Using Bayesbrain together with FieldTrip data
Create the random variables
factors = cell(1,3);
factors{1} = gaussian_cpd(1,[],3,[0; 0],{[]; []},[1; 1]);
factors{2} = gaussian_cpd(2,[],3,[0; 0],{[]; []},[1; 1]);
factors{3} = multinomial_cpd(3,[],[0.5; 0.5]);

% optionally add names to the factors
factors{1}.name = 'MLO32';
factors{2}.name = 'MRO32';
factors{3}.name = 'orientation';
factors{3}.statenames = {'left attention' 'right attention'};
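The model defined above has a naive-Bayes structure: the discrete orientation variable (node 3) is the parent of the two Gaussian channel variables. As an illustration of the generative process this implies, here is a minimal sketch in Python (the zero means, unit variances, and uniform prior mirror the initial parameter values above; this is independent of the toolbox):

```python
import random

def sample_trial(prior=(0.5, 0.5),
                 means=((0.0, 0.0), (0.0, 0.0)),
                 variances=((1.0, 1.0), (1.0, 1.0))):
    """Sample one (MLO32, MRO32, orientation) trial from the model.

    The orientation state is drawn from the multinomial prior; each
    channel is then drawn from the Gaussian whose mean and variance
    belong to that state.
    """
    # pick the orientation state (0 = left attention, 1 = right attention)
    k = 0 if random.random() < prior[0] else 1
    mlo32 = random.gauss(means[0][k], variances[0][k] ** 0.5)
    mro32 = random.gauss(means[1][k], variances[1][k] ** 0.5)
    return mlo32, mro32, k
```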
Create a simple Bayesian network
bn = bayesnet(factors);
Plotting the network shows the discrete orientation node as the parent of the two Gaussian channel nodes.
The log likelihood of this model is low, since we have not yet trained its parameters:
bn.loglik(data)
ans = -3.4906e+03
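The log likelihood reported by bn.loglik is the sum, over trials, of the log joint density the model assigns to each trial. A hedged sketch of that computation for this particular model, in Python for illustration (the data layout, one (MLO32, MRO32, orientation) tuple per trial, is an assumption):

```python
import math

def gauss_logpdf(x, mu, var):
    """Log density of a univariate Gaussian."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def loglik(data, prior, means, variances):
    """Sum of log joint densities over trials.

    data: list of (mlo32, mro32, k) tuples, k indexing the orientation state.
    means[c][k], variances[c][k]: parameters of channel c in state k.
    """
    total = 0.0
    for mlo32, mro32, k in data:
        total += math.log(prior[k])  # log P(orientation = k)
        total += gauss_logpdf(mlo32, means[0][k], variances[0][k])
        total += gauss_logpdf(mro32, means[1][k], variances[1][k])
    return total
```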
Learn parameters from complete data
bn = bn.learn_parameters(data);
The log likelihood has increased:
bn.loglik(data)
ans = -2.2308
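With complete data, parameter learning reduces to closed-form maximum-likelihood estimates: state frequencies for the multinomial node, and per-state sample means and variances for the Gaussian nodes. A sketch of those estimates in Python, independent of the toolbox's own implementation:

```python
def ml_estimates(data, n_states=2):
    """Closed-form ML estimates from complete data.

    data: list of (mlo32, mro32, k) tuples with k the orientation state.
    Returns (prior, means, variances), where means[c][k] is the mean of
    channel c in state k.
    """
    n = len(data)
    prior = [sum(1 for row in data if row[2] == s) / n
             for s in range(n_states)]
    means, variances = [], []
    for c in range(2):  # the two channel variables
        mu, var = [], []
        for s in range(n_states):
            xs = [row[c] for row in data if row[2] == s]
            m = sum(xs) / len(xs)
            mu.append(m)
            var.append(sum((x - m) ** 2 for x in xs) / len(xs))
        means.append(mu)
        variances.append(var)
    return prior, means, variances
```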
Plot the estimated prior distributions; the continuous ones are conditional Gaussians of the form N(x; mu_k, sigma_k^2), one per orientation state k.
subplot(1,3,1); bn.factors{1}.plot(); legend('left attention','right attention');
subplot(1,3,2); bn.factors{2}.plot(); legend('left attention','right attention');
subplot(1,3,3); bn.factors{3}.plot();
set(gcf,'Position',[100 100 1500 400]);

Create an inference engine
ie = canonical_jtree_ie(bn);
triangulating model
constructing potentials
constructing junction tree
computing messages
Add some evidence: we observe MRO32 = -59.5, while the other two variables remain hidden (nan)
ie.enter_evidence([nan -59.5 nan]);
Compute marginals
m1 = normalize(ie.marginalize(1));
m3 = normalize(ie.marginalize(3));
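For a network this small, the posterior the engine computes can be checked by hand with Bayes' rule: P(orientation = k | MRO32 = x) is proportional to P(k) * N(x; mu_k, sigma_k^2). A sketch in Python, using hypothetical learned parameters in the example call (the actual values depend on your data):

```python
import math

def gauss_pdf(x, mu, var):
    """Density of a univariate Gaussian."""
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def orientation_posterior(x, prior, means, variances):
    """P(orientation = k | MRO32 = x) via Bayes' rule.

    prior[k], means[k], variances[k]: the orientation prior and the
    MRO32 Gaussian parameters per state.
    """
    joint = [prior[k] * gauss_pdf(x, means[k], variances[k])
             for k in range(len(prior))]
    z = sum(joint)  # normalizing constant P(MRO32 = x)
    return [j / z for j in joint]
```

With the evidence value -59.5 used above and (hypothetical) state means of -60 and -50, the posterior concentrates on the first state, as the junction tree marginal would.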
Plot the marginals after evidence propagation
figure
subplot(1,2,1); m1.plot();
subplot(1,2,2); m3.plot();
