GClasses

Class List

Here are the classes, structs, unions and interfaces with brief descriptions:
GClasses::ArrayHolder< T >Just like Holder, except for arrays
GClasses::SOM::BatchTrainingImplements the batch training algorithm for self-organizing maps as described in T. Kohonen "Self Organizing Maps" Third Edition, 2001, published by Springer
GClasses::ComplexNumber
GClasses::SOM::DummyTrainingAlgorithmA training algorithm that throws an exception when train is called - stub for fully serializing training algorithms
GClasses::FileHolderCloses a file when this object goes out of scope
GClasses::G2DRegionGraphImplements a region adjacency graph for 2D images, and lets you merge similar regions to create a hierarchical breakdown of the image
GClasses::G3dLetterMaker
GClasses::G3DMatrixRepresents a 3x3 matrix
GClasses::G3DVectorRepresents a 3D vector
GClasses::GActionPath
GClasses::GActionPathSearchThis is the base class of search algorithms that can only perform a discrete set of actions (as opposed to jumping to anywhere in the search space), and seeks to minimize the error of a path of actions
GClasses::GActionPathState
GClasses::GActivationAlgebraicAn algebraic sigmoid-shaped activation function
GClasses::GActivationArcTanThe arctan activation function
GClasses::GActivationBendThis provides an alternative to using GActivationIdentity on the output layer for regression problems. It may add more power because it is non-linear, but like the identity function, its co-domain is the same as its domain
GClasses::GActivationBiDirThis is an output-layer activation function shaped like a sigmoid, but with both a co-domain and a domain that span the continuous values
GClasses::GActivationFunctionThe base class for activation functions. Typically, these are sigmoid-shaped functions used to "squash" the output of a network node. They are typically used in conjunction with the GNeuralNet class
GClasses::GActivationGaussianThis is a simple Gaussian function
GClasses::GActivationIdentityUse this function when you do not want to squash the net. For example, using this activation function with a network that has no hidden layers makes a perceptron model. Also, it is common to use this activation function on the output layer for regression problems
GClasses::GActivationLogisticThe logistic activation function
GClasses::GActivationPiecewiseThis is an experimental activation function intended to reduce the required computation involved in inverting neural networks
GClasses::GActivationSincThis is a canonical wavelet
GClasses::GActivationTanHThe hyperbolic tangent activation function
GClasses::GAdaBoost
GClasses::GAgentActionIteratorIterates through all the actions that are valid in the current state. If actions are continuous or very numerous, this should sample valid actions in a random order. The caller may decide that it has sampled enough at any time
GClasses::GAgglomerativeClustererThis merges each cluster with its closest neighbor. (The distance between clusters is computed as the distance between the closest members of the clusters times (n^b), where n is the total number of points from both clusters, and b is a balancing factor.)
GClasses::GAgglomerativeTransducerThis is a semi-supervised agglomerative clusterer. It can only handle one output, and it must be nominal. All inputs must be continuous. Also, it assumes that all output values are represented in the training set
GClasses::GAnnealingThis algorithm tries the current direction and a slightly perturbed direction at each step. If the perturbed direction resulted in faster improvement, it becomes the new current direction. As long as the current direction yields improvement, it accelerates, otherwise it decelerates
GClasses::GAppContains some generally useful functions for launching applications
GClasses::GArffAttribute
GClasses::GArffRelationARFF = Attribute-Relation File Format. This stores richer information than GRelation. This includes a name, a name for each attribute, and names for each supported nominal value
GClasses::GArgReaderParses command-line args and provides methods to conveniently process them
GClasses::GAtomicCycleFinderThis finds all of the atomic cycles (cycles that cannot be divided into two smaller cycles) in a graph
GClasses::GAttributeSelectorGenerates subsets of data that contain only the most relevant features for predicting the labels. The train method of this class produces a ranked ordering of the feature attributes by training a single-layer neural network, and deselecting the weakest attribute until all attributes have been deselected. The transform method uses only the highest-ranked attributes
GClasses::SOM::GaussianWindowFunctionUses a unit-height, zero-mean Gaussian weighting with the width as sigma, truncated to 0 at 5 standard deviations
GClasses::GBackPropThis class performs backpropagation on a neural network. (I made it a separate class because it is only needed during training. There is no reason to waste this space after training is complete, or if you choose to use a different technique to train the neural network.)
GClasses::GBackPropLayerAn internal class used by GBackProp
GClasses::GBackPropNeuronAn internal class used by GBackProp
GClasses::GBackPropWeightAn internal class used by GBackProp
GClasses::GBagBAG stands for bootstrap aggregator. It represents an ensemble of voting modelers. Each model is trained with a slightly different training set, which is produced by drawing randomly from the original training set with replacement until we have a new training set of the same size. Each model is given equal weight in the vote
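As an illustration of the bootstrap step described above, here is a minimal C++ sketch (not the GBag API; the function name, use of std::mt19937_64, and index-based representation are assumptions) that draws a training set of the same size by sampling row indices with replacement:

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Draws rowCount indices uniformly with replacement, producing one bootstrap
// replicate of the original training set. Each ensemble member would be
// trained on a different replicate.
std::vector<std::size_t> bootstrapSample(std::size_t rowCount, std::mt19937_64& rng)
{
    std::uniform_int_distribution<std::size_t> pick(0, rowCount - 1);
    std::vector<std::size_t> sample;
    sample.reserve(rowCount);
    for(std::size_t i = 0; i < rowCount; i++)
        sample.push_back(pick(rng)); // sampling with replacement
    return sample;
}
```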
GClasses::GBagOfRecommendersThis class performs bootstrap aggregation with collaborative filtering algorithms
GClasses::GBaselineLearnerAlways outputs the label mean (for continuous labels) and the most common class (for nominal labels)
GClasses::GBaselineRecommenderThis class always predicts the average rating for each item, no matter to whom it is making the recommendation. The purpose of this algorithm is to serve as a baseline for comparison
GClasses::GBayesianModelAveragingThis is an ensemble that uses the bagging approach for training, and Bayesian Model Averaging to combine the models. That is, it trains each model with data drawn randomly with replacement from the original training data. It combines the models with weights proportional to their likelihood as computed using Bayes' law
GClasses::GBayesianModelCombination
GClasses::GBayesianNetworkChildIteratorIterates through all the children of the specified node in a Bayesian network
GClasses::GBayesianNetworkNodeThis is the base class for all nodes in a Bayesian network. Classes that inherit from this class must implement three pure virtual methods. Note that the GUnivariateDistribution class has an IsDiscrete and an IsSupported method, so if your class wraps a GUnivariateDistribution then two of them are taken care of for you. In order to implement ComputeLogLikelihood, your class will probably need references to its parent nodes so that it can obtain their values to use as parameters for its distribution. You can implement your network structure however you like. When you have your network set up, you're ready to use MCMC to infer values for the network. To do this, just create a loop that calls Sample on each node in the network, and the whole network should eventually converge to good values. (Also, you need to make GBayesianNetworkChildIterator work, which I haven't worked out yet.)
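The MCMC usage pattern described above can be sketched as follows; this is written against a hypothetical node interface (the actual GBayesianNetworkNode method signatures may differ), and only the overall loop structure follows the description:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical stand-in for a Bayesian-network node with a Sample method.
struct HypotheticalBayesNode
{
    virtual ~HypotheticalBayesNode() {}
    virtual void Sample() = 0; // resample this node's value given its neighbors
};

// Repeatedly samples every node; after enough iterations the network should
// converge to good values, as described above.
void runMcmc(std::vector<HypotheticalBayesNode*>& nodes, std::size_t iterations)
{
    for(std::size_t i = 0; i < iterations; i++)
        for(HypotheticalBayesNode* pNode : nodes)
            pNode->Sample();
}
```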
GClasses::GBetaDistributionThe Beta distribution
GClasses::GBezierRepresents a Bezier curve
GClasses::GBigIntRepresents an integer of arbitrary size, and provides basic arithmetic functionality. Also contains functionality for implementing RSA public-key cryptography
GClasses::GBillboardThis is a billboard (a 2-D image in a 3-D world) for use with GBillboardWorld. You can set m_repeatX and/or m_repeatY to make the image repeat across the billboard
GClasses::GBillboardWorldThis class represents a world of billboards, and provides a rendering engine
GClasses::GBitReverser_imp< T, numBits >Template used for reversing numBits of type T. You shouldn't need this, use the function reverseBits
GClasses::GBitReverser_imp< T, 1 >Base case of template used for reversing numBits of type T. You shouldn't need this, use the function reverseBits
GClasses::GBitsContains various functions for bit analysis
GClasses::GBitTableRepresents a table of bits
GClasses::GBlobIncomingThis class is for deserializing blobs. It takes care of Endianness issues and protects against buffer overruns. This class would be particularly useful for writing a network protocol
GClasses::GBlobOutgoingThis class is for serializing objects. It is the complement to GBlobIncoming
GClasses::GBlobQueueThis is a special queue for handling blobs that come in and go out in varying sizes. It is particularly designed for streaming things that must travel or be parsed in packets that may differ in size from how they are sent or transmitted
GClasses::GBrandesBetweennessCentralityComputes the number of times that the shortest-path between every pair of points passes over each edge and vertex
GClasses::GBreadthFirstUnfoldingA manifold learning algorithm that reduces dimensionality in local neighborhoods, and then stitches the reduced local neighborhoods together using the Kabsch algorithm
GClasses::GBruteForceNeighborFinderFinds neighbors by measuring the distance to all points. This one should work properly even if the distance metric does not support the triangle inequality
GClasses::GBruteForceSearchThis performs a brute force search with uniform sampling over the unit hypercube with increasing granularity. (Your target function should scale the candidate vectors as necessary to cover the desired space.)
GClasses::GBucketWhen Train is called, this performs cross-validation on the training set to determine which learner is the best. It then trains that learner with the entire training set
GClasses::GCameraThis camera assumes the canvas is specified in cartesian coordinates. The 3D space is based on a right-handed coordinate system. (So if x goes to the right and y goes up, then z comes out of the screen toward you.)
GClasses::GCategoricalDistributionThis is a distribution that specifies a probability for each value in a set of nominal values
GClasses::GCategoricalSamplerThis class is for efficiently drawing random values from a categorical distribution with a large number of categories
GClasses::GCategoricalSamplerBatch
GClasses::GCharSetThis is a helper-class used by GTokenizer. Use GTokenizer::charSet to create
GClasses::GChessBoardRepresents the state of a chess board, and provides some basic functionality for implementing a chess game
GClasses::GChessMoveIteratorIterates through all the possible moves for the specified color. It iterates through the pieces in a random order. It also iterates through the moves for each piece in a random order, but it will visit each move for the current piece before considering the next piece
GClasses::GClustererThe base class for clustering algorithms. Classes that inherit from this class must implement a method named "cluster" which performs clustering, and a method named "whichCluster" which reports which cluster the specified row is determined to be a member of
GClasses::GCollaborativeFilterThe base class for collaborative filtering recommender systems
GClasses::GCompressorThis implements a simple compression/decompression algorithm
GClasses::GConstStringHashTableHash table based on keys of constant strings (or at least strings that won't change during the lifetime of the hash table). It's a good idea to use a GHeap in connection with this class
GClasses::GConstStringToIndexHashTableHash table based on keys of constant strings (or at least strings that won't change during the lifetime of the hash table). It's a good idea to use a GHeap in connection with this class
GClasses::GCoordVectorIteratorAn iterator for an n-dimensional coordinate vector. For example, suppose you have a 4-dimensional 2x3x2x1 grid, and you want to iterate through its coordinates: (0000, 0010, 0100, 0110, 0200, 0210, 1000, 1010, 1100, 1110, 1200, 1210). This class will iterate over coordinate vectors in this manner. (For 0-dimensional coordinate vectors, it behaves as though the origin is the only valid coordinate.)
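The iteration order described above can be pictured as an odometer that carries from the last dimension toward the first. Here is a small conceptual sketch (not the GCoordVectorIterator interface; the helper name and container types are assumptions):

```cpp
#include <cstddef>
#include <vector>

// Advances coords to the next coordinate in the grid defined by ranges
// (e.g. {2,3,2,1}); returns false once every coordinate has been visited.
bool advanceCoords(std::vector<std::size_t>& coords, const std::vector<std::size_t>& ranges)
{
    for(std::size_t i = coords.size(); i > 0; i--)
    {
        std::size_t d = i - 1;
        if(++coords[d] < ranges[d])
            return true; // no carry needed
        coords[d] = 0;   // wrap this dimension and carry into the next one
    }
    return false; // wrapped all the way around
}
```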
GClasses::GCosineSimilarityThis is a similarity metric that computes the cosine of the angle between two sparse vectors
GClasses::GCryptoThis is a symmetric-key block-cipher. It utilizes a 2048-byte internal state which is initialized using the passphrase. It uses repeated applications of sha-512 to advance the internal state, and to generate a 1024-byte pad that it XORs with your data to encrypt or decrypt it. Warning: You use this algorithm at your own risk. Many encryption algorithms eventually turn out to be insecure, and to my knowledge, this algorithm has not yet been extensively scrutinized
GClasses::GCycleCutThis finds the shortcuts in a table of neighbors and replaces them with INVALID_INDEX
GClasses::GDecisionTreeThis is an efficient learning algorithm. It divides on the attributes that reduce entropy the most, or alternatively can make random divisions
GClasses::GDenseClusterRecommenderThis class clusters the rows according to a dense distance metric, then uses the baseline vector in each cluster to make predictions
GClasses::GDiffThis class finds the differences between two text files. It is case- and whitespace-sensitive, but is tolerant of Unix/Windows/Mac line endings. It uses lines as the atomic unit. It accepts matching lines in a greedy manner
GClasses::GDiffLineThis is a helper struct used by GDiff
GClasses::GDijkstraFinds the shortest path from an origin vertex to all other vertices. Implemented with a binary-heap priority-queue. If the graph is sparse on edges, it will run in about O(n log(n)) time. If the graph is dense, it runs in about O(n^2 log(n))
GClasses::GDirListThis class contains a list of files and a list of folders. The constructor populates these lists with the names of files and folders in the current working directory
GClasses::GDiscreteActionIteratorThis is a simple and common action iterator that can be used when there is a discrete set of possible actions
GClasses::GDiscretizeThis transform uses buckets to convert continuous data into discrete data. It is common to use GFilter to combine this with your favorite modeler (which only supports discrete values) to create a modeler that also supports continuous values
GClasses::GDistanceMetricThis class enables you to define a distance (or dissimilarity) metric between two vectors. pScaleFactors is an optional parameter (it can be NULL) that lets the calling class scale the significance of each dimension. Distance metrics that do not mix with this concept may simply ignore any scale factors. Typically, classes that use this should be able to assume that the triangle inequality will hold, but do not necessarily enforce the parallelogram law
GClasses::GDistribution
GClasses::GDomA Document Object Model. This represents a document as a hierarchy of objects. The DOM can be loaded-from or saved-to a file in JSON (JavaScript Object Notation) format. (See http://json.org.) In the future, support for XML and/or other formats may be added
GClasses::GDomListIteratorThis class iterates over the items in a list node
GClasses::GDomNodeRepresents a single node in a DOM
GClasses::GDoubleRectRepresents a rectangular region with doubles
GClasses::GDynamicPageServer
GClasses::GDynamicPageSession
GClasses::GDynamicPageSessionExtension
GClasses::GDynamicSystemStateAlignerThis uses graph-cut to divide the data into two clusters. It then trains a linear regression model for each cluster to map from inputs to change-in-state. It then aligns the smaller cluster with the larger one such that the linear models are in agreement (as much as possible)
GClasses::GEmpiricalGradientDescentThis algorithm does a gradient descent by feeling a small distance out in each dimension to measure the gradient. For efficiency reasons, it only measures the gradient in one dimension (which it cycles round-robin style) per iteration and uses the remembered gradient in the other dimensions
GClasses::GEnsembleThis is a base-class for ensembles that combine the predictions from multiple weighted models
GClasses::GEvolutionaryOptimizerUses an evolutionary process to optimize a vector
GClasses::GExtendedKalmanFilterThis is an implementation of the Extended Kalman Filter. This class is used by alternately calling advance and correct
GClasses::GFileContains some useful routines for manipulating files
GClasses::GFloatRectRepresents a rectangular region with floats
GClasses::GFloydWarshallComputes the shortest-cost path between all pairs of vertices in a graph. Takes O(n^3) time
GClasses::GFolderDeserializerThis class complements GFolderSerializer
GClasses::GFolderSerializerThis turns a file or a folder (and its contents recursively) into a stream of bytes
GClasses::GFourierFourier transform
GClasses::GFourierWaveProcessorThis is an abstract class that processes a wave file in blocks. Specifically, it divides the wave file up into overlapping blocks, converts them into Fourier space, calls the abstract "process" method with each block, converts back from Fourier space, and then interpolates to create the wave output
GClasses::GFunctionThis class represents a math function. (It might be used, for example, in a plotting tool.)
GClasses::GFunctionParserThis class parses math equations. (This is useful, for example, for plotting tools.)
GClasses::GFuzzyKMeansA K-means clustering algorithm where every point has partial membership in each cluster. This algorithm is specified in Li, D. and Deogun, J. and Spaulding, W. and Shuart, B., Towards missing data imputation: A study of fuzzy K-means clustering method, In Rough Sets and Current Trends in Computing, Springer, pages 573--579, 2004
GClasses::GGammaDistributionThe Gamma distribution
GClasses::GGraphCutThis implements an optimized max-flow/min-cut algorithm described in "An experimental comparison of min-cut/max-flow algorithms for energy minimization in vision" by Boykov, Y. and Kolmogorov, V. This implementation assumes that edges are undirected
GClasses::GGraphCutTransducerA transduction algorithm that uses a max-flow/min-cut graph-cut algorithm to partition the data until each class is in a separate cluster. Unlabeled points are then assigned the label of the cluster in which they fall
GClasses::GGraphEdgeIteratorIterates over the edges that connect to the specified node
GClasses::GHashTableImplements a typical hash table. (It doesn't take ownership of the objects you add, so you must still delete them yourself.)
GClasses::GHashTableBaseThe base class of hash tables
GClasses::GHashTableEnumeratorThis class iterates over the values in a hash table
GClasses::GHeapProvides a heap in which to put strings or whatever you need to store. If you need to allocate space for a lot of small objects, it's much more efficient to use this class than the C++ heap. Plus, you can delete them all by simply deleting the heap. You can't, however, reuse the space for individual objects in this heap
GClasses::GHiddenMarkovModel
GClasses::GHillClimber
GClasses::GHistogramGathers values and puts them in bins
GClasses::GHtmlThis class is for parsing HTML files. It's designed to be very simple. This class might be useful, for example, for building a web-crawler or for extracting readable text from a web page
GClasses::GHttpClientThis class allows you to get files using the HTTP protocol
GClasses::GHttpMultipartParser
GClasses::GHttpParamParserA class for parsing the name/value pairs that follow the "?" in a URL
GClasses::GHttpServerThis class allows you to implement a simple HTTP daemon
GClasses::GIdentityFunctionThis is an implementation of the identity function. It might be useful, for example, as the observation function in a GRecurrentModel if you want to create a Jordan network
GClasses::GImageRepresents an image
GClasses::GImageJittererGiven an image encoded as a rasterized row of channel values, this class computes a single pixel drawn from the image as if the image had been rotated, translated, and zoomed by a small random amount. (The purpose of this class is to make it possible to train GUnsupervisedBackProp to understand these common image-based transformations.)
GClasses::GImputeMissingVals
GClasses::GIncrementalLearnerThis is the base class of supervised learning algorithms that can learn one row at a time
GClasses::GIncrementalLearnerQAgentThis is an implementation of GQLearner that uses an incremental learner for its Q-table and a SoftMax strategy (usually pick the best action, but sometimes pick an action at random) to balance exploration and exploitation. To use this class, you need to supply an incremental learner (see the comment for the constructor for more details) and to implement the GetRewardForLastAction method
GClasses::GIncrementalTransformThis is the base class of algorithms that can transform data one row at a time without supervision
GClasses::GIndexVecUseful functions for operating on vectors of indexes
GClasses::GInstanceRecommenderThis class makes recommendations by finding the nearest-neighbors (as determined by evaluating only overlapping ratings), and assuming that the ratings of these neighbors will be predictive of your ratings
GClasses::GInstanceTableThis represents a grid of values. It might be useful as a Q-table with Q-learning
GClasses::GInverseGammaDistributionThe inverse Gamma distribution
GClasses::GIsomapIsomap is a manifold learning algorithm that uses the Floyd-Warshall algorithm to compute an estimate of the geodesic distance between every pair of points using local neighborhoods, and then uses classic multidimensional scaling to compute a low-dimensional projection
GClasses::GKdTreeAn efficient algorithm for finding neighbors
GClasses::GKernelThe base class for kernel functions. Classes which implement this must provide an "apply" method that applies the kernel to two vectors. Kernels may be combined together to form a more complex kernel, to which the kernel trick will still apply
GClasses::GKernelAddAn addition kernel
GClasses::GKernelExpThe Exponential kernel
GClasses::GKernelGaussianRBFA Gaussian RBF kernel
GClasses::GKernelIdentityThe identity kernel
GClasses::GKernelMultiplyA multiplication kernel
GClasses::GKernelNormalizeA Normalizing kernel
GClasses::GKernelPolynomialA polynomial kernel
GClasses::GKernelPowA power kernel
GClasses::GKernelScaleA scalar kernel
GClasses::GKernelTranslateA translation kernel
GClasses::GKeyPair
GClasses::GKMeansAn implementation of the K-means clustering algorithm
GClasses::GKMeansSparseAn implementation of the K-means clustering algorithm
GClasses::GKMedoidsAn implementation of the K-medoids clustering algorithm
GClasses::GKMedoidsSparseAn implementation of the K-medoids clustering algorithm for sparse data
GClasses::GKNNThe k-Nearest Neighbor learning algorithm
GClasses::GLearnerLoaderThis class is for loading various learning algorithms from a DOM. When any learning algorithm is saved, it calls baseDomNode, which creates (among other things) a field named "class" which specifies the class name of the algorithm. This class contains methods that will recognize any of the classes in this library and load them. If it doesn't recognize a class, it will either return NULL or throw an exception, depending on the flags you pass to the constructor. Obviously this loader won't recognize any classes that you make. Therefore, you should overload the corresponding method in this class with a new method that will first recognize and load your classes, and then call these methods to handle other types
GClasses::GLinearProgramming
GClasses::GLinearRegressorA linear regression model. Let f be a feature vector of real values, and let l be a label vector of real values, then this model estimates l=Bf+e, where B is a matrix of real values, and e is a vector of real values. (In the Wikipedia article on linear regression, B is called "beta", and e is called "epsilon". The approach used by this model to compute beta and epsilon, however, is much more efficient than the approach currently described in that article.)
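To make the model form concrete, here is a minimal sketch of applying an already-estimated model l=Bf+e (the container types and function name are assumptions, not the GLinearRegressor interface):

```cpp
#include <cstddef>
#include <vector>

// Computes l = B*f + e, where B is stored row-major (one row per output).
std::vector<double> predictLinear(const std::vector<std::vector<double> >& B,
    const std::vector<double>& e, const std::vector<double>& f)
{
    std::vector<double> l(B.size());
    for(std::size_t i = 0; i < B.size(); i++)
    {
        double sum = e[i]; // the "epsilon" term for output i
        for(std::size_t j = 0; j < f.size(); j++)
            sum += B[i][j] * f[j];
        l[i] = sum;
    }
    return l;
}
```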
GClasses::GLLELocally Linear Embedding is a manifold learning algorithm that uses sparse matrix techniques to efficiently compute a low-dimensional projection
GClasses::GLNormDistanceInterpolates between Manhattan distance (norm=1), Euclidean distance (norm=2), and Chebyshev distance (norm=infinity). For nominal attributes, Hamming distance is used
GClasses::GManifoldThis class stores static methods that are useful for manifold learning
GClasses::GManifoldLearnerThis is the base class of manifold learning (aka non-linear dimensionality reducing) algorithms
GClasses::GManifoldSculptingManifold Sculpting. A non-linear dimensionality reduction algorithm. (See Gashler, Michael S. and Ventura, Dan and Martinez, Tony. Iterative non-linear dimensionality reduction with manifold sculpting. In Advances in Neural Information Processing Systems 20, pages 513–520, MIT Press, Cambridge, MA, 2008.)
GClasses::GMathProvides some useful math functions
GClasses::GMatrixRepresents a matrix or a database table. Elements can be discrete or continuous. References a GRelation object, which stores the meta-information about each column
GClasses::GMatrixArrayRepresents an array of matrices or datasets that all have the same number of columns
GClasses::GMatrixFactorizationThis factors the sparse matrix of ratings, M, such that M = PQ^T where each row in P gives the principal preferences for the corresponding user, and each row in Q gives the linear combination of those preferences that map to a rating for an item. (Actually, P and Q also contain an extra column added for a bias.) This class is implemented according to the specification on page 631 in Takacs, G., Pilaszy, I., Nemeth, B., and Tikk, D. Scalable collaborative filtering approaches for large recommender systems. The Journal of Machine Learning Research, 10:623–656, 2009. ISSN 1532-4435., except with the addition of learning-rate decay and a different stopping criterion
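Once the factorization M = PQ^T has been learned, a single rating is predicted as the dot product of the user's row of P with the item's row of Q. A minimal sketch (the names are assumptions, and the bias column mentioned above is assumed to be folded into the vectors):

```cpp
#include <cstddef>
#include <vector>

// Predicted rating = dot product of the user's preference vector with the
// item's weight vector (including the bias column, if present).
double predictRating(const std::vector<double>& userPrefs, const std::vector<double>& itemWeights)
{
    double rating = 0.0;
    for(std::size_t i = 0; i < userPrefs.size(); i++)
        rating += userPrefs[i] * itemWeights[i];
    return rating;
}
```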
GClasses::GMeanMarginsTreeA GMeanMarginsTree is an oblique decision tree specified in Gashler, Michael S. and Giraud-Carrier, Christophe and Martinez, Tony. Decision Tree Ensemble: Small Heterogeneous Is Better Than Large Homogeneous. In The Seventh International Conference on Machine Learning and Applications, Pages 900 - 905, ICMLA '08. 2008. It divides features as follows: It finds the mean and principal component of the output vectors. It divides all the vectors into two groups, one that has a positive dot-product with the principal component (after subtracting the mean) and one that has a negative dot-product with the principal component (after subtracting the mean). Next it finds the average input vector for each of the two groups. Then it finds the mean and principal component of those two vectors. The dividing criterion for this node is to subtract the mean and then see whether the dot-product with the principal component is positive or negative
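The dividing test described above reduces to a sign check. A minimal sketch (not the GMeanMarginsTree API; the function and parameter names are assumptions):

```cpp
#include <cstddef>
#include <vector>

// Subtracts the node's stored mean from the input and checks the sign of the
// dot product with the node's stored principal component.
bool goesToPositiveChild(const std::vector<double>& input,
    const std::vector<double>& mean, const std::vector<double>& component)
{
    double dot = 0.0;
    for(std::size_t i = 0; i < input.size(); i++)
        dot += (input[i] - mean[i]) * component[i];
    return dot >= 0.0; // positive side goes to one child, negative to the other
}
```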
GClasses::GMergeDataHolderThis class guarantees that the rows in b are merged vertically back into a when this object goes out of scope
GClasses::GMixedRelation
GClasses::GMixtureOfGaussiansThis class uses Expectation Maximization to find the mixture of Gaussians that best approximates the data in a specified real attribute of a data set
GClasses::GMomentumGreedySearchAt each iteration this algorithm moves in only one dimension. If the situation doesn't improve it tries the opposite direction. If both directions are worse, it decreases the step size for that dimension, otherwise it increases the step size for that dimension
GClasses::GMultivariateNormalDistributionA multivariate Normal distribution. It can compute the likelihood of a specified vector, and can also generate random vectors from the distribution
GClasses::GNaiveBayesA naive Bayes classifier
GClasses::GNaiveInstanceThis is an instance-based learner. Instead of finding the k-nearest neighbors of a feature vector, it finds the k-nearest neighbors in each dimension. That is, it finds n*k neighbors, considering each dimension independently. It then combines the label from all of these neighbors to make a prediction. Finding neighbors in this way makes it more robust to high-dimensional datasets. It tends to perform worse than k-nn in low-dimensional space, and better than k-nn in high-dimensional space. (It may be thought of as a cross between a k-nn instance learner and a Naive Bayes learner.) It only supports continuous features and labels (so it is common to wrap it in a Categorize filter, which will convert nominal features to a categorical distribution of continuous values)
GClasses::GNeighborFinderFinds the k-nearest neighbors of any vector in a dataset
GClasses::GNeighborFinderCacheWrapperThis wraps a neighbor finding algorithm. It caches the queries for neighbors for the purpose of improving runtime performance
GClasses::GNeighborFinderGeneralizingFinds the k-nearest neighbors (in a dataset) of an arbitrary vector (which may or may not be in the dataset)
GClasses::GNeighborTransducerAn instance-based transduction algorithm
GClasses::GNeuralNetAn artificial neural network
GClasses::GNeuralNetInverseLayerA helper class used by GNeuralNetPseudoInverse
GClasses::GNeuralNetLayerRepresents a layer of neurons in a neural network
GClasses::GNeuralNetPseudoInverseComputes the pseudo-inverse of a neural network
GClasses::GNeuronRepresents a single neuron in a neural network
GClasses::GNeuroPCAThis class is a generalization of PCA. When the bias is clamped, and the activation function is "identity", it is strictly equivalent to PCA. By default, however, the bias is allowed to drift from the mean, which gives better results. Also, by default, the activation function is "logistic", which enables it to find non-linear components in the data. (GUnsupervisedBackProp is a multi-layer generalization of this algorithm.)
GClasses::GNodeHashTableThis is a hash table that uses any object which inherits from HashTableNode as the key
GClasses::GNoiseGeneratorJust generates Gaussian noise
GClasses::GNominalToCatThis is sort-of the opposite of discretize. It converts each nominal attribute to a categorical distribution by representing each value using the corresponding row of the identity matrix. For example, if a certain nominal attribute has 4 possible values, then a value of 3 would be encoded as the vector 0 0 1 0. When predictions are converted back to nominal values, the mode of the categorical distribution is used as the predicted value. (This is similar to Weka's NominalToBinaryFilter.)
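The encoding described above is a one-hot (identity-row) encoding. A minimal sketch (the function name is an assumption, not the GNominalToCat interface):

```cpp
#include <cstddef>
#include <vector>

// Encodes a nominal value as the corresponding row of the identity matrix.
// For example, with 4 possible values, the third value -> 0 0 1 0.
std::vector<double> oneHot(std::size_t valueIndex, std::size_t valueCount)
{
    std::vector<double> encoded(valueCount, 0.0);
    encoded[valueIndex] = 1.0;
    return encoded;
}
```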
GClasses::GNonlinearPCAThis class trains a neural network to fit to the ratings. Although the name implies that it is an extension of PCA, I think it is better described as a non-linear generalization of matrix factorization. This algorithm was published in Scholz, M. Kaplan, F. Guy, C. L. Kopka, J. Selbig, J., Non-linear PCA: a missing data approach, In Bioinformatics, Vol. 21, Number 20, pp. 3887-3895, Oxford University Press, 2005
GClasses::GNormalDistributionThis is the Normal (a.k.a. Gaussian) distribution
GClasses::GNormalizeThis transform scales and shifts continuous values to make them fall within a specified range
GClasses::GNurbsNURBS = Non-Uniform Rational B-Spline. Periodic = closed loop
GClasses::GOptimizerThis is the base class of all search algorithms that can jump to any vector in the search space and seek the vector that minimizes error
GClasses::GOverrunSentinelPlacing these on the stack can help catch buffer overruns
GClasses::GPackageClientThis class abstracts a client that speaks a home-made protocol that guarantees packages will arrive in the same order and size as when they were sent. This protocol is a simple layer on top of TCP
GClasses::GPackageServerThis class abstracts a server that speaks a home-made protocol that guarantees packages will arrive in the same order and size as when they were sent. This protocol is a simple layer on top of TCP
GClasses::GPairProductGenerates data by computing the product of each pair of attributes. This is useful for augmenting data
GClasses::GParallelOptimizersThis class simplifies simultaneously solving several optimization problems
GClasses::GParticleSwarmAn optimization algorithm inspired by flocking birds
GClasses::GPassiveConsoleThis class provides a non-blocking method for reading characters from stdin. (If there are no characters ready in stdin, it immediately returns '\0'.) The constructor sets flags on the console so that it passes characters to the stream immediately (instead of when Enter is pressed), and so that it doesn't echo the keys (if desired), and it makes stdin non-blocking. The destructor puts all those things back the way they were
GClasses::GPCAPrincipal Component Analysis. (Computes the principal components about the mean of the data when you call train. The transformed (reduced-dimensional) data will have a mean about the origin.)
GClasses::GPCARotateOnlyPrincipal Component Analysis without the projection. It only rotates axes to align with the first few principal components
GClasses::GPeachAgentThis is an experimental policy-learning algorithm. It's currently too slow to be practical
GClasses::GPearsonCorrelationThis is a similarity metric that computes the Pearson correlation between two sparse vectors
GClasses::GPipeThis class wraps the handle of a pipe. It closes the pipe when it is destroyed. This class is useful in conjunction with GApp::systemExecute for reading from, or writing to, the standard i/o streams of a child process
GClasses::GPlotLabelSpacerIf you need to place grid lines or labels at regular intervals (like 1000, 2000, 3000, 4000... or 20, 25, 30, 35... or 0, 2, 4, 6, 8, 10...) this class will help you pick where to place the labels so that there are a reasonable number of them, and they all land on nice label values
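One common way to choose such "nice" label values (not necessarily the exact rule this class uses) is to round the raw spacing up to 1, 2, or 5 times a power of ten:

```cpp
#include <cmath>

// Returns a label interval of the form {1,2,5} * 10^k that yields a
// reasonable number of labels (roughly maxLabels) over the given range.
// Assumes rangeMax > rangeMin.
double niceInterval(double rangeMin, double rangeMax, int maxLabels)
{
    double raw = (rangeMax - rangeMin) / maxLabels;
    double mag = std::pow(10.0, std::floor(std::log10(raw)));
    double frac = raw / mag;
    if(frac <= 1.0) return 1.0 * mag;
    if(frac <= 2.0) return 2.0 * mag;
    if(frac <= 5.0) return 5.0 * mag;
    return 10.0 * mag;
}
```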
GClasses::GPlotLabelSpacerLogarithmicSimilar to GPlotLabelSpacer, except for logarithmic grids. To plot in logarithmic space, set your plot window to have a range from log_e(min) to log_e(max). When you actually plot things, plot them at log_e(x), where x is the position of the thing you want to plot
GClasses::GPlotWindowThis class makes it easy to plot points and functions on 2D cartesian coordinates
GClasses::GPoissonDistributionThe Poisson distribution
GClasses::GPolicyLearnerThis is the base class for algorithms that learn a policy
GClasses::GPolynomialThis regresses a multi-dimensional polynomial to fit the data
GClasses::GPredictionThis class is used to represent the predicted distribution made by a supervised learning algorithm. (It is just a shallow wrapper around GDistribution.) It is used in conjunction with calls to GSupervisedLearner::predictDistribution. The predicted distributions will be either categorical distributions (for nominal values) or Normal distributions (for continuous values)
GClasses::GPriorityQueueAn implementation of a double-ended heap-based priority queue. (Note that the multimap STL class can also be used to implement a double-ended priority queue, but the STL does not currently provide a heap-based double-ended priority queue, which is asymptotically more efficient for insertions.)
GClasses::GPriorityQueueEntryAn internal class used by GPriorityQueue. You should not use this class directly
GClasses::GProbeSearchThis is somewhat of a multi-dimensional version of binary-search. It greedily probes the best choices first, but then starts trying the opposite choices at the higher divisions so that it can also handle non-monotonic target functions. Each iteration performs a binary (divide-and-conquer) search within the unit hypercube. (Your target function should scale the candidate vectors as necessary to cover the desired space.) Because the high-level divisions are typically less correlated with the quality of the final result than the low-level divisions, it searches through the space of possible "probes" by toggling choices in the order from high level to low level. In low-dimensional space, this algorithm tends to quickly find good solutions, especially if the target function is somewhat smooth. In high-dimensional space, the number of iterations to find a good solution seems to grow exponentially
GClasses::GQLearnerThe base class of a Q-Learner. To use this class, there are four abstract methods you'll need to implement. See also the comment for GPolicyLearner
GClasses::GRandThis is a 64-bit pseudo-random number generator
GClasses::GRandomForest
GClasses::GRandomSearchAt each iteration, this tries a random vector from the unit hypercube. (Your target function should scale the candidate vectors as necessary to cover the desired space.)
GClasses::GRayTraceAreaLightRepresents a light source with area
GClasses::GRayTraceBoundingBoxBaseA class used for making ray-tracing faster
GClasses::GRayTraceBoundingBoxInteriorA class used for making ray-tracing faster
GClasses::GRayTraceBoundingBoxLeafA class used for making ray-tracing faster
GClasses::GRayTraceCameraRepresents the camera for a ray tracing scene
GClasses::GRayTraceColorThis class represents a color. It's more precise than GColor, but takes up more memory. Note that the ray tracer ignores the alpha channel because the material specifies a unique transmission color
GClasses::GRayTraceDirectionalLightRepresents directional light in a ray-tracing scene
GClasses::GRayTraceImageTexture
GClasses::GRayTraceLightRepresents a source of light in a ray-tracing scene
GClasses::GRayTraceMaterial
GClasses::GRayTraceObjectAn object in a ray-tracing scene
GClasses::GRayTracePhysicalMaterialRepresents the material of which an object is made in a ray-tracing scene
GClasses::GRayTracePointLightRepresents a point light in a ray-tracing scene
GClasses::GRayTraceSceneRepresents a scene that you can ray-trace
GClasses::GRayTraceSphereA sphere in a ray-tracing scene
GClasses::GRayTraceTriangleA single triangle in a ray-tracing scene
GClasses::GRayTraceTriMeshRepresents a triangle mesh in a ray-tracing scene
GClasses::GRectRepresents a rectangular region with integers
GClasses::GRecurrentModelThis class can be used to implement recurrent neural networks, or recurrent forms of other supervised models
GClasses::GRegionAjacencyGraphThe base class for region adjacency graphs. These are useful for breaking down an image into patches of similar color
GClasses::GRegionAreaIteratorIterates over all the pixels in an image that have the same color and are transitively adjacent. In other words, if you were to flood-fill at the specified point, this returns all the pixels that would be changed
GClasses::GRegionBorderIteratorIterates the border of a 2D region by running around the border and reporting the coordinates of each interior border pixel and the direction to the edge. It goes in a counter-clockwise direction
GClasses::GRelationHolds the metadata for a dataset, including which attributes are continuous or nominal, and how many values each nominal attribute supports
GClasses::GReleaseDataHolderThis is a special holder that guarantees the data set will release all of its data before it is deleted
GClasses::SOM::GridTopologySet the nodes to lie on an integer grid within the given maxima. A grid with 10,10 maximum is assumed to go from 0..9. NOTE: if the difference between a dimensional maximum and the nearest integer is less than 1e-6 then the maximum is taken to be that integer. Otherwise it is taken to be the maximum rounded down
GClasses::GRowDistanceThis uses a combination of Euclidean distance for continuous attributes, and Hamming distance for nominal attributes. In particular, for each attribute it computes pA[i]-pB[i], squares it, sums these squares over all attributes, and takes the square root of the sum. For nominal attributes, pA[i]-pB[i] is 0 if they are the same and 1 if they are different
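A minimal sketch of this mixed distance (the container types and nominal flags are assumptions, not the GRowDistance interface):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Squared differences for continuous attributes, 0/1 mismatch for nominal
// attributes, then the square root of the sum.
double rowDistance(const std::vector<double>& a, const std::vector<double>& b,
    const std::vector<bool>& isNominal)
{
    double sum = 0.0;
    for(std::size_t i = 0; i < a.size(); i++)
    {
        double d = isNominal[i] ? (a[i] == b[i] ? 0.0 : 1.0) : (a[i] - b[i]);
        sum += d * d;
    }
    return std::sqrt(sum);
}
```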
GClasses::GRowDistanceScaledThis uses a combination of Euclidean distance for continuous attributes, and Hamming distance for nominal attributes. This version honors scale factors given by the user. See comments on GRowDistance
GClasses::GRubberBallSwarmThis is an algorithm for finding good starting points within a constrained optimization problem. It works by simulating "rubber balls" which bounce around inside the constrained region. After many iterations, they tend to be spread somewhat uniformly, even with very complex constrained shapes. The balls learn to approximate the shape of the shell, so if the room is wider than it is tall, the balls will learn to bounce sideways more often than vertically
GClasses::GSaffronThis class implements the SAFFRON intelligent neighbor-finding algorithm published in Gashler, Michael S. and Martinez, Tony. Tangent space guided intelligent neighbor finding. In Proceedings of the IEEE International Joint Conference on Neural Networks IJCNN’11, pages 2617–2624, IEEE Press, 2011. This class intelligently selects neighbors for each point in a dataset, such that the neighbors define a good neighborhood for manifold learning. A relaxation technique is used to ensure that neighbors lie on a consistent tangent-space while remaining close to the point. This makes manifold learning possible with difficult (sometimes even self-intersecting) manifolds
GClasses::GSampleClimberThis is a variant of empirical gradient descent that tries to estimate the gradient using a minimal number of samples. It is more efficient than empirical gradient descent, but it only works well if the optimization surface is quite locally linear
GClasses::GSDLA collection of routines that are useful when interfacing with SDL
GClasses::GSelfOrganizingMapAn implementation of a Kohonen self-organizing map
GClasses::GSequenceNeighborFinderA simple neighbor-finder that reports the nearest neighbors in the sequence. (That is, the previous and next rows are the closest neighbors.) The distance is sequential distance to the neighbor (not squared)
GClasses::GShortcutPrunerThis finds the shortcuts in a table of neighbors and replaces them with INVALID_INDEX
GClasses::GSignalHandlerTemporarily handles certain signals. (When this object is destroyed, it puts all the signal handlers back the way they were.) Periodically call "check" to see if a signal has occurred
GClasses::GSmtpFor sending email to an SMTP server
GClasses::GSoftImpulseDistribution
GClasses::GSparseClustererThis is a base class for clustering algorithms that operate on sparse matrices
GClasses::GSparseClusterRecommenderThis class clusters the rows according to a sparse similarity metric, then uses the baseline vector in each cluster to make predictions
GClasses::GSparseMatrixThis class stores a row-compressed sparse matrix. That is, each row consists of a map from a column-index to a value
GClasses::GSparseSimilarityThe base class for similarity metrics that operate on sparse vectors
GClasses::GSpinLockA spin-lock for synchronization purposes
GClasses::GSpinLockHolder
GClasses::GStemmerThis class just wraps the Porter Stemmer. It finds the stems of words. Examples: "cats"->"cat" "dogs"->"dog" "fries"->"fri" "fishes"->"fish" "pies"->"pi" "lovingly"->"lovingli" "candy"->"candi" "babies"->"babi" "bus"->"bu" "busses"->"buss" "women"->"women" "hasty"->"hasti" "hastily"->"hastili" "fly"->"fly" "kisses"->"kiss" "goes"->"goe" "brought"->"brought" As you can see the stems aren't always real words, but that's okay as long as it produces the same stem for words that have the same etymological roots. Even then it still isn't perfect (notice it got "bus" wrong), but it should still improve analysis somewhat in many cases
GClasses::GStringChopperThis class chops a big string at word breaks so you can display it intelligently on multiple lines
GClasses::GSubImageFinderThis class uses Fourier phase correlation to efficiently find sub-images within a larger image
GClasses::GSubImageFinder2This class uses heuristics to find sub-images within a larger image. It is slower, but more stable than GSubImageFinder
GClasses::GSupervisedLearnerThis is the base class of algorithms that learn with supervision and have an internal hypothesis model that allows them to generalize rows that were not available at training time
GClasses::GSystemLearnerThis is the base class for algorithms that learn to model dynamical systems
GClasses::GTargetFunctionThe optimizer seeks to find values that minimize this target function
GClasses::GTCPClientThis class is an abstraction of a TCP client socket connection
GClasses::GTCPConnectionThis class is used by GTCPServer to represent a connection with one of the clients. (If you want to associate some additional objects with each connection, you can inherit from this class, and overload GTCPServer::makeConnection to return your own custom object.)
GClasses::GTCPServerThis class is an abstraction of a TCP server, which maintains a set of socket connections
GClasses::GTempBufHelperA helper class used by the GTEMPBUF macro
GClasses::GTemporalNeighborFinderA neighbor finder that specializes in dynamical systems. It determines neighbors by searching for the shortest path of actions between observations, and computes the distance as the number of time-steps in that path. This algorithm was published in Gashler, Michael S. and Martinez, Tony. Temporal nonlinear dimensionality reduction. In Proceedings of the International Joint Conference on Neural Networks IJCNN’11, pages 1959–1966, IEEE Press, 2011
GClasses::GThreadA wrapper for PThreads on Linux and for some corresponding WIN32 api on Windows
GClasses::GTimeProvides some time-related functions
GClasses::GTokenizerThis is a simple tokenizer that reads a file, one token at-a-time
GClasses::GTokenizerMapComparerThis is a helper-class used by GTokenizer
GClasses::GTransducerThis is the base class of supervised learning algorithms (that may or may not have an internal model allowing them to generalize rows that were not available at training time). Note that the literature typically refers to supervised learning algorithms that can't generalize (because they lack an internal hypothesis model) as "Semi-supervised". (You cannot generalize with a semi-supervised algorithm--you have to train again with the new rows.)
GClasses::GTransformThis is the base class of algorithms that transform data without supervision
GClasses::GTriMeshBuilder
GClasses::GTwoWayIncrementalTransformThis is the base class of algorithms that can transform data one row at a time without supervision, and can (un)transform a row back to its original form if necessary
GClasses::GTwoWayTransformChainerThis wraps two two-way-incremental-transforms to form a single combination transform
GClasses::GUniformDistributionThis is a continuous uniform distribution
GClasses::GUniformRelationA relation with a minimal memory footprint that assumes all attributes are continuous, or all of them are nominal and have the same number of possible values
GClasses::GUnivariateDistributionThis is the base class for univariate distributions
GClasses::GUnsupervisedBackPropA manifold learning algorithm that uses back-propagation to train a neural net model to map from low-dimensional space to high-dimensional space
GClasses::GVecContains some useful functions for operating on vectors
GClasses::GVecBufHolds an array of doubles that can be resized. This class is slightly lighter-weight than the C++ vector class, and it allows access to the buffer in the form of an array of doubles. Basically, it is useful when working with C-style functions that expect parameters in the form of an array of doubles, rather than as a vector of doubles
GClasses::GVocabularyThis is a helper class which is useful for text-mining. It collects words, stems them, filters them through a list of stop-words, and assigns a discrete number to each word
GClasses::GWagThis model trains several multi-layer perceptrons, then averages their weights together in an intelligent manner
GClasses::GWaveCurrently only supports PCM wave format
GClasses::GWaveIteratorThis class iterates over the samples in a WAVE file. Regardless of the bits-per-sample, this iterator will convert all samples to doubles with a range from -1 to 1
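For instance, a 16-bit PCM sample would be mapped to a double in roughly the range -1 to 1 like this (an illustrative sketch, not the GWaveIterator interface):

```cpp
// 32768 = 2^15, the magnitude of the most negative 16-bit sample value.
double sampleToDouble(short pcm16)
{
    return pcm16 / 32768.0;
}
```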
GClasses::GWebSocketClient
GClasses::GWeightedModelThis is a helper-class used by GBag
GClasses::GWidgetThe base class of all GUI widgets
GClasses::GWidgetAnimationAn image with multiple frames
GClasses::GWidgetAtomicThe base class of all atomic widgets (widgets that are not composed of other widgets)
GClasses::GWidgetBulletGroupThis creates a whole group of bullets arranged either horizontally or vertically at regular intervals
GClasses::GWidgetBulletHoleThe easiest way to do bullets is to use the GWidgetBulletGroup class, but if you really want to manage individual bullets yourself, you can use this class to do it
GClasses::GWidgetCanvasA painting canvas
GClasses::GWidgetCheckBox
GClasses::GWidgetCommon
GClasses::GWidgetDialogA form or dialog
GClasses::GWidgetFileSystemBrowser
GClasses::GWidgetGrid
GClasses::GWidgetGroupThe base class of all widgets that are composed of other widgets
GClasses::GWidgetGroupBoxThis just draws a rectangular box
GClasses::GWidgetHorizScrollBarMakes a horizontal scroll bar
GClasses::GWidgetHorizSlider
GClasses::GWidgetImageButtonA button with an image on it. The left half of the image is the unpressed image and the right half is the pressed image
GClasses::GWidgetImageLabel
GClasses::GWidgetProgressBarAutomatically determines whether to be horizontal or vertical based on dimensions. Progress ranges from 0 to 1, or from 0 to -1 if you want it to go the other way
GClasses::GWidgetSliderTabThis widget is not meant to be used by itself. It creates one of the parts of a scroll bar or slider bar
GClasses::GWidgetTextBoxThis is a box in which the user can enter text
GClasses::GWidgetTextButtonA button with text on it
GClasses::GWidgetTextLabelA text label
GClasses::GWidgetTextTabRepresents a tab (like for tabbed menus, etc.)
GClasses::GWidgetVCRButtonA button with a common icon on it
GClasses::GWidgetVertScrollBarMakes a vertical scroll bar
GClasses::GWidgetVertSlider
GClasses::GWidgetWave
GClasses::GWordIteratorThis iterates over the words in a block of text
GClasses::GWordStatsStores statistics about each word in a GVocabulary
GClasses::HashBucketThis is an internal structure used by GHashTable
GClasses::HashTableNodeObjects used with GNodeHashTable should inherit from this class. They must implement two methods (to hash and compare the nodes)
GClasses::Holder< T >This class is very similar to the standard C++ class auto_ptr, except it throws an exception if you try to make a copy of it. This way, it will fail early if you use it in a manner that could result in non-deterministic behavior. (For example, if you create a vector of auto_ptrs, weird things happen if an out-of-memory exception is thrown while resizing the buffer--part of the data will be lost when it reverts back to the original buffer. But if you make a vector of these, it will fail quickly, thus alerting you to the issue.)
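The idea can be sketched as a scoped pointer; note that this sketch simply forbids copying at compile time, whereas the description above says Holder throws an exception at run time (names and details are assumptions, not the actual Holder< T > implementation):

```cpp
// Deletes the held object when it goes out of scope; copying is forbidden
// rather than silently transferring ownership.
template<typename T>
class ScopedHolder
{
    T* m_p;
    ScopedHolder(const ScopedHolder&);            // not implemented
    ScopedHolder& operator=(const ScopedHolder&); // not implemented
public:
    explicit ScopedHolder(T* p) : m_p(p) {}
    ~ScopedHolder() { delete m_p; }
    T* get() const { return m_p; }
};
```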
GClasses::SOM::IterationIntervalReporterCalls its sub-reporter on start, the first iteration of a block of "interval" iterations, and finally on stop
GClasses::SOM::NeighborhoodWindowFunctionFunction that, given a width and a distance from the center of the neighborhood, returns a weight to be used to calculate the influence of neighboring nodes at that distance. For each radius, it can tell a distance d (possibly infinity) from the center for which all weights for distances greater than or equal to d will be 0
GClasses::SOM::NodeA node in a self-organizing map
GClasses::SOM::NodeAndDistanceUsed for creating an array of nodes sorted by nearness to a source node
GClasses::SOM::NodeLocationInitializationWay of initializing the node positions according to a given topology - for example: points on a grid, on a triangular lattice, or random points in space
GClasses::SOM::NodeWeightInitializationAlgorithm to initialize the weights of the nodes in the network before training
GClasses::SOM::NodeWeightInitializationTrainingSetSampleInitializes the weights to a random sample of rows from the training set
GClasses::SOM::NoReportingA reporter that does nothing
GClasses::PathDataHelper struct to hold the results from GFile::ParsePath
GClasses::SOM::SVG2DWeightReporter::Point
GClasses::SOM::ReporterReports periodically on the training of a self-organizing map - writing status to a stream every so many seconds or iterations, writing visualizations of the network or the network itself to sequentially named files
GClasses::SOM::ReporterChainA ReporterChain contains a list of Reporter objects. When a method is called on the ReporterChain, it calls the same method on each of its sub-objects in turn
GClasses::smart_ptr< T >A reference-counting smart-pointer
GClasses::smart_ptr_ref_counter< T >A helper class used by the smart_ptr class
GClasses::strComp
GClasses::SOM::SVG2DWeightReporterWrites out sequentially numbered svg files giving the weight locations in 2 dimensions of input space connected by a mesh that connects each weight with its nearest neighbors. Writes one file each time newStatus is called and once when stop is called. The output of stop may duplicate the last newStatus's output, but is not guaranteed to
GClasses::SOM::TraditionalTrainingImplements the traditional step-wise training of self-organizing maps
GClasses::SOM::TrainingAlgorithmAn algorithm for training self-organizing maps. Before training is started, it is expected that the nodes are allocated and that the geometry of the map has been set by giving each node a position and a distance function. However, the weight vectors and the output dimensionality will be completely overwritten by training
GClasses::SOM::UniformWindowFunctionUses a unit-height, zero-mean Uniform weighting with the width being the radius of the circle; anything beyond the width is 0
GClasses::VectorOfPointersHolder< T >Deletes all of the pointers in a vector when this object goes out of scope