Hopfield Networks and Boltzmann Machines

Recently I have been comparing two kinds of recurrent neural networks. Both the Boltzmann machine and the Hopfield network are recurrent neural networks whose units have on and off states. The Boltzmann machine, however, is stochastic: a unit's state depends on a logistic function whose input is the dot product of the connection weights and the states of the unit's predecessors. Both networks use the concept of energy to relate the on/off states and the weights. Lower energy indicates that a state vector (the set of on/off values for all units) is more likely. Strangely, the concept of temperature also appears in the Boltzmann case; rather than reducing the connection weights, it divides the energy gap inside the logistic. That is also what makes the network deterministic and equivalent to a Hopfield network at temperature 0: as T shrinks toward 0, the logistic sharpens into a step function, so each unit simply turns on whenever its net input exceeds its threshold. The Boltzmann machine training literature discusses sampling from distributions rather than the output of TLUs on training inputs, which is confusing.
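To make the temperature story concrete, here is a minimal sketch of a single stochastic unit update (NumPy, with illustrative names like `weights`, `states`, and `threshold` that are not from any particular source). Dividing the energy gap by T inside the logistic is what collapses the update into the deterministic Hopfield threshold rule as T goes to 0.

```python
import numpy as np

rng = np.random.default_rng(0)

def update_unit(weights, states, threshold, T):
    """One stochastic Boltzmann update for a single binary unit.

    gap is the energy gap between the unit's off and on states: the
    weighted sum of the neighbouring units' states minus the threshold.
    """
    gap = np.dot(weights, states) - threshold
    if T == 0:
        # Zero-temperature limit: the logistic becomes a step function,
        # which is exactly the deterministic Hopfield update rule.
        return 1 if gap > 0 else 0
    p_on = 1.0 / (1.0 + np.exp(-gap / T))  # logistic of gap / T
    return 1 if rng.random() < p_on else 0
```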
Simulated annealing is used to search the state space for the optimal configuration, the one with the lowest energy. Apparently this procedure can be combined with a rule that updates the weights until the network models the input data. The criterion for convergence seems to be minimizing energy over the input set (equivalently, maximizing its probability under the model), not minimizing a classification error.
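As a sketch of what that search could look like, the following toy annealer (illustrative parameter names; it assumes the standard energy E(s) = -½ sᵀWs + θ·s with a symmetric W and zero self-connections) runs stochastic unit updates while geometrically lowering the temperature, so the state tends to settle into a low-energy configuration:

```python
import numpy as np

def energy(s, W, theta):
    """Hopfield/Boltzmann energy: E(s) = -1/2 s^T W s + theta . s."""
    return -0.5 * s @ W @ s + theta @ s

def anneal(W, theta, T0=10.0, cooling=0.95, sweeps=200, seed=0):
    """Search for a low-energy state by cooling stochastic updates.

    Assumes W is symmetric with a zero diagonal (no self-connections).
    """
    rng = np.random.default_rng(seed)
    n = len(theta)
    s = rng.integers(0, 2, n).astype(float)   # random initial state vector
    T = T0
    for _ in range(sweeps):
        for i in rng.permutation(n):          # one sweep over all units
            gap = W[i] @ s - theta[i]         # energy gap for unit i
            p_on = 1.0 / (1.0 + np.exp(-gap / T))
            s[i] = 1.0 if rng.random() < p_on else 0.0
        T *= cooling                          # geometric cooling schedule
    return s, energy(s, W, theta)
```

Note that with T held fixed instead of cooled, the same inner loop is Gibbs sampling from the model's equilibrium distribution, which is the sampling the training literature keeps referring to.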
The visible layer of the Boltzmann machine would seem to be the inputs, i.e. the units clamped to observed data, with any remaining hidden units free to capture latent structure.
It is unclear how to map this model onto a practical problem such as the pricing of mortgage-backed securities. Perhaps s1 could indicate that the loan reaches a certain maturity, s2 a prepayment of principal, and s3 a default. The model probability of the maturity bit and the prepayment bit both being true would then be a marginal: the sum, over all bit vectors in which both of those bits are one, of the probability of each bit vector.
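A brute-force sketch of that marginal, under the hypothetical encoding above (s1 = maturity, s2 = prepayment, s3 = default; the function and variable names are mine, not from any source): enumerate all 2^n bit vectors, weight each by exp(-E/T), and sum the weights of the vectors whose fixed bits are one, normalizing by the total, i.e. the partition function.

```python
import itertools
import numpy as np

def boltzmann_marginal(W, theta, fixed, T=1.0):
    """P(the fixed bits take their given values) under the Boltzmann distribution.

    fixed: dict of unit index -> required bit, e.g. {0: 1, 1: 1} for
    'maturity (s1) and prepayment (s2) both true'.  Brute force, so only
    feasible for small networks (2^n state vectors).
    """
    n = len(theta)
    total = 0.0     # partition function Z
    matching = 0.0  # unnormalised weight of the matching states
    for bits in itertools.product([0, 1], repeat=n):
        s = np.array(bits, dtype=float)
        E = -0.5 * s @ W @ s + theta @ s  # energy of this state vector
        w = np.exp(-E / T)                # unnormalised Boltzmann weight
        total += w
        if all(bits[i] == v for i, v in fixed.items()):
            matching += w
    return matching / total

# Example: a 3-unit network with random symmetric weights, zero diagonal.
rng = np.random.default_rng(1)
W = rng.normal(size=(3, 3)); W = (W + W.T) / 2; np.fill_diagonal(W, 0.0)
theta = rng.normal(size=3)
print(boltzmann_marginal(W, theta, {0: 1, 1: 1}))  # P(s1=1 and s2=1)
```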