Adds a BiasOrigin, which operates in parallel with an existing Origin, such that the effective weights of the
two origins together are all of the same sign (as is normally the case with synaptic weights in the brain).
Adds a one-dimensional termination to the network.
This allows the user to specify the dimension in which the input value should be stored,
as opposed to sending in a weight matrix to do so.
A multiplier transform is also expected.
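As a sketch of the equivalence described above (class and method names here are hypothetical, not the network's actual API), the dimension index plus multiplier is shorthand for a weight matrix with a single non-zero entry:

```java
public class OneDTermination {
    // Builds the (dimensions x 1) transform implied by a 1-D termination:
    // the scalar input is routed into one state dimension, scaled by the multiplier.
    public static float[][] transform(int dimensions, int targetDim, float multiplier) {
        float[][] t = new float[dimensions][1]; // all entries default to 0
        t[targetDim][0] = multiplier;
        return t;
    }
}
```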
To be used in place of the Mac look & feel apple.laf.AquaTreeUI (and CUIAquaTree), which seem not to
respect differences in tree cell size, or to expand tree cells when they change size.
An editor component (from ConfigurationHandler.getEditor(...)) must
implement EditorProxy in order to allow retrieval of a new value when
editing is complete.
This method is publicly exposed because normal deviates are often needed,
and static access allows the compiler to inline the call, which brings a
small performance advantage.
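For illustration, a static normal-deviate generator can be written with the polar Box-Muller method (a common approach; this is a sketch, not necessarily the implementation used here):

```java
public class Gaussian {
    // Polar Box-Muller: returns one standard normal deviate (mean 0, variance 1).
    public static double nextNormal(java.util.Random r) {
        double u, v, s;
        do {
            // Sample a point uniformly inside the unit disc (excluding the origin).
            u = 2.0 * r.nextDouble() - 1.0;
            v = 2.0 * r.nextDouble() - 1.0;
            s = u * u + v * v;
        } while (s >= 1.0 || s == 0.0);
        return u * Math.sqrt(-2.0 * Math.log(s) / s);
    }
}
```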
Euler's method of numerical integration: x(t+h) ≈ x(t) + h*x'(t)
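A minimal sketch of one Euler step (hypothetical class and method names, not the simulator's integrator API):

```java
import java.util.function.DoubleBinaryOperator;

public class EulerStep {
    // One Euler step for dx/dt = f(t, x): x(t + h) ≈ x(t) + h * f(t, x(t)).
    public static double step(DoubleBinaryOperator f, double t, double x, double h) {
        return x + h * f.applyAsDouble(t, x);
    }

    public static void main(String[] args) {
        // Integrate dx/dt = -x from x(0) = 1 over [0, 1] in 1000 steps;
        // the result should be close to exp(-1) ≈ 0.3679.
        double x = 1.0, h = 0.001;
        for (int i = 0; i < 1000; i++) {
            x = step((t, xx) -> -xx, i * h, x, h);
        }
        System.out.println(x);
    }
}
```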
TODO: test
TODO: should there be some means for aborting early (aside from exceptions, e.g.
A computationally efficient model of the difference between a NEFEnsemble's DecodedOrigin output
in DIRECT mode and DEFAULT mode (at the level of state variables).
Model of spike generation in medium-spiny striatal neurons from: Gruber, Solla, Surmeier & Houk (2003)
Modulation of striatal single units by expected reward: a spiny neuron model displaying dopamine-induced
bistability, J Neurophysiol 90: 1095-1114.
Gets all the necessary data from the nodes and projections which are assigned to run on GPUs
and puts it in a form appropriate for passing to the native setup function.
Employs the multi-level Kernighan-Lin graph partitioning heuristic to partition
a network into a given number of partitions such that the amount of information
passed along the projections that cross partitions is minimized, while making sure
the number of neurons in each partition is relatively balanced.
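The core move of a Kernighan-Lin refinement pass can be sketched as follows (a toy 2-way version with hypothetical names; the multi-level heuristic above applies this idea at coarsened levels of the graph): repeatedly swap the vertex pair whose exchange most reduces the weight crossing the cut.

```java
public class KLPass {
    // Total weight of edges crossing the 2-way partition.
    public static int cut(int[][] w, boolean[] side) {
        int c = 0;
        for (int i = 0; i < w.length; i++)
            for (int j = i + 1; j < w.length; j++)
                if (side[i] != side[j]) c += w[i][j];
        return c;
    }

    // One greedy Kernighan-Lin step: swap the cross-partition vertex pair
    // with the largest positive gain (reduction in cut weight), if any.
    public static boolean swapBestPair(int[][] w, boolean[] side) {
        int n = w.length, bestGain = 0, bi = -1, bj = -1;
        // d[v] = (external cost) - (internal cost) of vertex v
        int[] d = new int[n];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                d[i] += (side[i] != side[j]) ? w[i][j] : -w[i][j];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                if (side[i] != side[j]) {
                    int gain = d[i] + d[j] - 2 * w[i][j];
                    if (gain > bestGain) { bestGain = gain; bi = i; bj = j; }
                }
        if (bi < 0) return false; // no improving swap
        side[bi] = !side[bi];
        side[bj] = !side[bj];
        return true;
    }
}
```

Swapping vertices in pairs (rather than moving one at a time) keeps the two sides the same size, which is how the heuristic maintains the balance constraint mentioned above.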
Note: setters are private, because Origins typically make copies for each output dimension,
which would then not be updated with changes to the original.
Non-Linear Network
This network is a model of pyramidal cells found in the central nervous system.
These cells contain an active dendritic tree, with functional computation occurring
within the dendrites themselves.
Note: by-products of decoding are sometimes cached, so if these are changed it may be
necessary to call setReuseApproximators(false) for the change to take effect.
Sets the object to run in either the given mode or the closest mode that it supports
(all ModeConfigurables must support SimulationMode.DEFAULT, and must default to this mode).
By default, attempts to call method setX(y) on Configurable, where X is the name of the property (with
first letter capitalized) and y is the value (changed to a primitive if it's a primitive wrapper).
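The setter lookup can be sketched with plain reflection (hypothetical helper and bean names; the real Configurable dispatch may differ in detail):

```java
import java.lang.reflect.Method;

public class SetterDispatch {
    // Invokes setX(y) by reflection, where X is the capitalized property name.
    // Method.invoke auto-unboxes wrapper arguments onto primitive parameters.
    public static void setProperty(Object target, String name, Object value) throws Exception {
        String setter = "set" + Character.toUpperCase(name.charAt(0)) + name.substring(1);
        for (Method m : target.getClass().getMethods()) {
            if (m.getName().equals(setter) && m.getParameterCount() == 1) {
                m.invoke(target, value);
                return;
            }
        }
        throw new NoSuchMethodException(setter);
    }

    // Example bean: property "scale" maps to setScale(float).
    public static class Bean {
        private float scale;
        public void setScale(float s) { scale = s; }
        public float getScale() { return scale; }
    }
}
```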
A linear time-invariant system with the following properties:
A diagonal dynamics matrix
A zero passthrough matrix
This implementation will run faster than an instance of the superclass that
has these properties.
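The speedup comes from the diagonal dynamics matrix: each state variable evolves independently, so the A*x term costs O(n) per step instead of the O(n^2) of a dense matrix-vector product. A sketch (hypothetical names, not this class's implementation):

```java
public class DiagonalLTI {
    // One Euler step of dx/dt = A x + B u where A is diagonal:
    // the dynamics term is a single multiply per state, with no inner loop over A.
    public static void eulerStep(double[] aDiag, double[][] b,
                                 double[] x, double[] u, double h) {
        for (int i = 0; i < x.length; i++) {
            double dxdt = aDiag[i] * x[i]; // diagonal dynamics
            for (int j = 0; j < u.length; j++) dxdt += b[i][j] * u[j];
            x[i] += h * dxdt;
        }
    }
}
```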
A LinearApproximator in which error is evaluated at a fixed set of points, and
the cost function that is minimized is a weighted integral of squared error.
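Minimizing a weighted sum of squared errors over fixed evaluation points reduces to solving the weighted normal equations. A toy two-coefficient version (hypothetical names; the actual approximator handles many decoders and regularization):

```java
public class WeightedLSQ {
    // Finds d minimizing sum_k w_k * (phi_k · d - f_k)^2 for two basis
    // functions, by solving the 2x2 weighted normal equations directly.
    public static double[] fit(double[][] phi, double[] f, double[] w) {
        double a00 = 0, a01 = 0, a11 = 0, b0 = 0, b1 = 0;
        for (int k = 0; k < f.length; k++) {
            a00 += w[k] * phi[k][0] * phi[k][0];
            a01 += w[k] * phi[k][0] * phi[k][1];
            a11 += w[k] * phi[k][1] * phi[k][1];
            b0  += w[k] * phi[k][0] * f[k];
            b1  += w[k] * phi[k][1] * f[k];
        }
        double det = a00 * a11 - a01 * a01; // assumes a well-conditioned system
        return new double[]{ (b0 * a11 - b1 * a01) / det,
                             (a00 * b1 - a01 * b0) / det };
    }
}
```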
X -
Static variable in interface ca.nengo.model.nef.NEFEnsemble
Standard name for the Origin corresponding to the decoded estimate of the state variables
that the Ensemble represents (X is a standard name for state variables in state-space models).