This is the base class for all nodes in a Bayesian network. Classes that inherit from this class must implement three pure virtual methods: isDiscrete, isSupported, and logLikelihood. Note that the GUnivariateDistribution class has an isDiscrete and an isSupported method, so if your class wraps a GUnivariateDistribution, two of them are taken care of for you. In order to implement logLikelihood, your class will probably need references to its parent nodes so that it can obtain their values to use as parameters for its distribution. You can implement your network structure however you like. When you have your network set up, you're ready to use MCMC to infer values for the network. To do this, just create a loop that calls sample on each node in the network, and the whole network should eventually converge to good values. (Also, you need to make GBayesianNetworkChildIterator work, which I haven't worked out yet.)
#include <GBayesianNetwork.h>
Public Member Functions

  GBayesianNetworkNode (double priorMean, double priorDeviation)
  virtual ~GBayesianNetworkNode ()
  virtual bool isDiscrete ()=0
      This should return true iff this node supports only discrete values.
  virtual bool isSupported (double val)=0
      This should return true iff val is within the range of supported values. (If isDiscrete returns true, val will be a discrete value, so you don't need to check for that.)
  virtual double logLikelihood (double x, GBayesianNetworkNode *pSpecialParent, double specialParentValue)=0
      Compute the log-likelihood of the value "x" given the current values of all of this node's parent nodes. (To facilitate Gibbs sampling, if one of the parent nodes is "pSpecialParent", then "specialParentValue" should be used in place of that parent's current value.) If this function returns NaN or anything <= -1e200, the candidate will be rejected without consideration. (Note that these likelihoods don't need to be normalized. It's okay if they sum/integrate to a constant instead of to 1.)
  double currentValue ()
  void sample (GRand *pRand)
      Uses a combination of Metropolis and Gibbs sampling to resample the node. (This also dynamically adjusts the variance of the sampling distribution.)

Protected Member Functions

  double gibbs (double x)
      Computes the log-probability of x (as a value for this node) given the current values of the entire rest of the network (aka the complete conditional), which, by the properties of Gibbs sampling, is equal to the log-probability of x given this node's Markov blanket, which can be computed efficiently.
  bool metropolis (GRand *pRand)
      Resamples this node in a manner that can be proven to converge to the true joint distribution of the network. Returns true if the new candidate value is accepted.

Protected Attributes

  double m_currentMean
  double m_currentDeviation
  unsigned int m_nSamples
  unsigned int m_nNewValues
  double m_sumOfValues
  double m_sumOfSquaredValues
Detailed Description
This is the base class for all nodes in a Bayesian network. Classes that inherit from this class must implement three pure virtual methods: isDiscrete, isSupported, and logLikelihood. Note that the GUnivariateDistribution class has an isDiscrete and an isSupported method, so if your class wraps a GUnivariateDistribution, two of them are taken care of for you. In order to implement logLikelihood, your class will probably need references to its parent nodes so that it can obtain their values to use as parameters for its distribution. You can implement your network structure however you like. When you have your network set up, you're ready to use MCMC to infer values for the network. To do this, just create a loop that calls sample on each node in the network, and the whole network should eventually converge to good values. (Also, you need to make GBayesianNetworkChildIterator work, which I haven't worked out yet.)
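As a sketch of what a subclass looks like, here is a minimal discrete node implementing the three pure virtual methods. The Node and Fixed classes below are hypothetical stand-ins for the documented interface (the real GBayesianNetworkNode also carries the Metropolis/Gibbs machinery), and BernoulliNode is an invented example, not part of the library.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical stand-in for the documented base-class interface.
class Node {
public:
    virtual ~Node() {}
    virtual bool isDiscrete() = 0;
    virtual bool isSupported(double val) = 0;
    virtual double logLikelihood(double x, Node* pSpecialParent,
                                 double specialParentValue) = 0;
    double currentValue() { return m_value; }
protected:
    double m_value = 0.0;
};

// A root node pinned to a constant value.
class Fixed : public Node {
public:
    explicit Fixed(double v) { m_value = v; }
    bool isDiscrete() override { return false; }
    bool isSupported(double) override { return true; }
    double logLikelihood(double, Node*, double) override { return 0.0; }
};

// A Bernoulli node whose success probability is the current value of a
// parent node (assumed to lie in [0, 1]).
class BernoulliNode : public Node {
public:
    explicit BernoulliNode(Node* pProbParent) : m_pProbParent(pProbParent) {}
    bool isDiscrete() override { return true; }            // values are 0 or 1
    bool isSupported(double val) override { return val == 0.0 || val == 1.0; }
    double logLikelihood(double x, Node* pSpecialParent,
                         double specialParentValue) override {
        // Honor the Gibbs-sampling substitution described in the docs.
        double p = (pSpecialParent == m_pProbParent)
            ? specialParentValue : m_pProbParent->currentValue();
        return (x != 0.0) ? std::log(p) : std::log(1.0 - p);
    }
private:
    Node* m_pProbParent;
};
```

Once every node in the network implements this interface, the MCMC loop the paragraph above describes is just repeated calls to sample on each node.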
Constructor & Destructor Documentation
GClasses::GBayesianNetworkNode::GBayesianNetworkNode (double priorMean, double priorDeviation)
virtual GClasses::GBayesianNetworkNode::~GBayesianNetworkNode () [virtual]
Member Function Documentation
double GClasses::GBayesianNetworkNode::currentValue () [inline]
double GClasses::GBayesianNetworkNode::gibbs (double x) [protected]
Computes the log-probability of x (as a value for this node) given the current values of the entire rest of the network (aka the complete conditional), which, by the properties of Gibbs sampling, is equal to the log-probability of x given this node's Markov blanket, which can be computed efficiently.
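The Markov-blanket computation can be sketched as follows: the node's own likelihood given its parents, plus each child's likelihood evaluated as if this node's value were x (this node plays the pSpecialParent role). The Node, Gaussian, and Fixed classes below are hypothetical stand-ins, and the children list is passed in explicitly here, whereas the real class would walk it with GBayesianNetworkChildIterator.

```cpp
#include <cassert>
#include <vector>

// Hypothetical stand-in for the documented base-class interface.
class Node {
public:
    virtual ~Node() {}
    virtual double logLikelihood(double x, Node* pSpecialParent,
                                 double specialParentValue) = 0;
    double currentValue() { return m_value; }
protected:
    double m_value = 0.0;
};

// A unit-deviation Gaussian node whose mean is its parent's value.
class Gaussian : public Node {
public:
    explicit Gaussian(Node* pParent) : m_pParent(pParent) {}
    double logLikelihood(double x, Node* pSpecial, double specialVal) override {
        double mean = (pSpecial == m_pParent) ? specialVal
                                              : m_pParent->currentValue();
        double d = x - mean;
        return -0.5 * d * d; // unnormalized log-density is fine
    }
private:
    Node* m_pParent;
};

// A root node pinned to a constant value.
class Fixed : public Node {
public:
    explicit Fixed(double v) { m_value = v; }
    double logLikelihood(double, Node*, double) override { return 0.0; }
};

// log p(x | everything else) = log p(x | parents)
//                            + sum over children of log p(child | x).
double completeConditional(Node* pNode, const std::vector<Node*>& children,
                           double x) {
    double logProb = pNode->logLikelihood(x, nullptr, 0.0);
    for (Node* pChild : children)
        logProb += pChild->logLikelihood(pChild->currentValue(), pNode, x);
    return logProb;
}
```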
virtual bool GClasses::GBayesianNetworkNode::isDiscrete () [pure virtual]
This should return true iff this node supports only discrete values.
virtual bool GClasses::GBayesianNetworkNode::isSupported (double val) [pure virtual]
This should return true iff val is within the range of supported values. (If isDiscrete returns true, val will be a discrete value, so you don't need to check for that.)
virtual double GClasses::GBayesianNetworkNode::logLikelihood (double x, GBayesianNetworkNode *pSpecialParent, double specialParentValue) [pure virtual]
Compute the log-likelihood of the value "x" given the current values of all of this node's parent nodes. (To facilitate Gibbs sampling, if one of the parent nodes is "pSpecialParent", then "specialParentValue" should be used in place of that parent's current value.) If this function returns NaN or anything <= -1e200, the candidate will be rejected without consideration. (Note that these likelihoods don't need to be normalized. It's okay if they sum/integrate to a constant instead of to 1.)
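A sketch of an implementation that honors the pSpecialParent substitution, for a node with two parents. The Node and Fixed classes are hypothetical stand-ins for the documented interface, and SumGaussian (whose mean is the sum of its two parents' values) is an invented example; the key point is that the special-parent check must be applied to each parent individually.

```cpp
#include <cassert>

// Hypothetical stand-in for the documented base-class interface.
class Node {
public:
    virtual ~Node() {}
    virtual double logLikelihood(double x, Node* pSpecialParent,
                                 double specialParentValue) = 0;
    double currentValue() { return m_value; }
protected:
    double m_value = 0.0;
};

// A root node pinned to a constant value.
class Fixed : public Node {
public:
    explicit Fixed(double v) { m_value = v; }
    double logLikelihood(double, Node*, double) override { return 0.0; }
};

// A Gaussian node whose mean is the sum of two parents' values.
class SumGaussian : public Node {
public:
    SumGaussian(Node* a, Node* b, double dev) : m_a(a), m_b(b), m_dev(dev) {}
    double logLikelihood(double x, Node* pSpecial, double specialVal) override {
        // Each parent's value is read through the substitution helper.
        double mean = parentValue(m_a, pSpecial, specialVal)
                    + parentValue(m_b, pSpecial, specialVal);
        double d = (x - mean) / m_dev;
        return -0.5 * d * d; // unnormalized is fine, per the docs
    }
private:
    // Use specialVal when p is the special parent; else its current value.
    static double parentValue(Node* p, Node* pSpecial, double specialVal) {
        return (p == pSpecial) ? specialVal : p->currentValue();
    }
    Node *m_a, *m_b;
    double m_dev;
};
```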
bool GClasses::GBayesianNetworkNode::metropolis (GRand *pRand) [protected]
Resamples this node in a manner that can be proven to converge to the true joint distribution of the network. Returns true if the new candidate value is accepted.
void GClasses::GBayesianNetworkNode::sample (GRand *pRand)
Uses a combination of Metropolis and Gibbs sampling to resample the node. (This also dynamically adjusts the variance of the sampling distribution.)
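A toy, self-contained version of the Metropolis step in the spirit of sample()/metropolis(): propose a Gaussian jump around the current value and accept with probability min(1, exp(newLogLik - oldLogLik)). Here targetLogLik stands in for gibbs(), std::mt19937 stands in for GRand, and the proposal deviation is fixed rather than dynamically adjusted; none of these names come from the library.

```cpp
#include <cmath>
#include <random>

// Unnormalized log-density of N(2, 1), standing in for gibbs().
double target(double x) { double d = x - 2.0; return -0.5 * d * d; }

// One Metropolis step; returns true if the candidate was accepted.
bool metropolisStep(double& value, double proposalDev, std::mt19937& rng,
                    double (*targetLogLik)(double)) {
    std::normal_distribution<double> jump(0.0, proposalDev);
    double candidate = value + jump(rng);
    double logNew = targetLogLik(candidate);
    if (std::isnan(logNew) || logNew <= -1e200)
        return false; // rejected without consideration, per the docs
    double logOld = targetLogLik(value);
    std::uniform_real_distribution<double> u(0.0, 1.0);
    if (std::log(u(rng)) < logNew - logOld) {
        value = candidate; // candidate accepted
        return true;
    }
    return false;
}
```

Running this step in a loop (after some burn-in) yields draws whose empirical mean approaches the target's mean, which is the same convergence behavior the network-wide sampling loop relies on.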
Member Data Documentation