Probabilistic Knowledge Representation
- L. Ngo, P. Haddawy. Answering queries from context-sensitive
probabilistic knowledge bases.
- L. Ngo, P. Haddawy, J. Helwig. A theoretical framework for
context-sensitive temporal probability model construction with
application to plan projection. UAI-95.
To my understanding, these two
papers introduce another logic-programming approach to probabilistic
reasoning. The basic idea is that a small Bayes network can be
constructed from such a logic program, with only the query-relevant nodes included.
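The relevance-driven construction can be sketched as follows (a toy illustration of my own, not the papers' algorithm; the rule base and atom names are made up): backward-chain from the query atom, collecting only its ancestors, so the constructed network stays small.

```python
# Sketch of knowledge-based model construction: backward-chain from a
# query atom over rule structure, keeping only the relevant ancestors.
# Rules map a head atom to its body atoms (CPTs omitted for brevity).
RULES = {
    "alarm":      ["burglary", "earthquake"],
    "john_calls": ["alarm"],
    "mary_calls": ["alarm"],
}

def relevant_nodes(query, rules):
    """Return the set of atoms reachable backward from the query."""
    seen, stack = set(), [query]
    while stack:
        atom = stack.pop()
        if atom in seen:
            continue
        seen.add(atom)
        stack.extend(rules.get(atom, []))  # the atom's parents
    return seen

# mary_calls is irrelevant to a query about john_calls, so it is
# excluded from the constructed network.
print(sorted(relevant_nodes("john_calls", RULES)))
```

Only the four ancestors of the query enter the network; everything else in the knowledge base is ignored.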
- T. Sato. A statistical learning method for logic programs with
distribution semantics. ICLP-95.
This paper introduces a formal
distribution semantics for logic programs with deterministic rules
and random facts, and claims that this semantics bridges logic
programming and statistical learning.
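The core idea can be illustrated with a toy program (my own example, not Sato's): facts are sampled independently with given probabilities, deterministic rules then fix the truth of every derived atom, and this induces a distribution over models.

```python
import random

# Toy instance of the distribution semantics: two independent random
# facts and one deterministic rule,  win :- heads1, heads2.
FACT_PROBS = {"heads1": 0.5, "heads2": 0.5}

def sample_model(rng):
    """Sample the random facts, then close under the deterministic rule."""
    facts = {f for f, p in FACT_PROBS.items() if rng.random() < p}
    if "heads1" in facts and "heads2" in facts:
        facts.add("win")
    return facts

# Estimate P(win) by sampling; it should approach 0.5 * 0.5 = 0.25.
rng = random.Random(0)
n = 10000
p_win = sum("win" in sample_model(rng) for _ in range(n)) / n
print(round(p_win, 2))
```

The query probability is the measure of the sampled models in which the atom holds, which is what makes the semantics amenable to statistical learning.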
- J. Binder, D. Koller, S. Russell, K. Kanazawa. Adaptive
probabilistic networks with hidden variables.
The authors argue for the
importance of hidden variables in BNs: introducing them can greatly
reduce the complexity (number of parameters) of the networks. (Not yet
read in detail.)
- KR&R Book.
Section 12.4.3 gives a Bayesian
interpretation of the production-system-like tip calculation in Section
12.4.2. Can we get a similar result for RPS?
Machine Learning and Connectionist Models
- K. B. Laskey. Adapting connectionist learning to Bayes networks.
Int. J. of Approximate Reasoning 1990.
In an attempt to unify symbolic and "sub-symbolic" reasoning, this
paper identifies a close relationship between Boltzmann machines and
Bayes networks, with Markov random fields as the intermediate
bridge. The author suggests that the learning algorithm for BMs can be
adapted to adjust conditional probabilities in BNs.
- D. H. Ackley, G. E. Hinton, T. J. Sejnowski. A learning algorithm
for Boltzmann machines. Cognitive Science 1985.
The classic paper introducing the simulated-annealing-based algorithm
for learning Boltzmann machine parameters.
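The weight update at the heart of the paper is simple to state: raise w_ij when units i and j co-occur more often with the data clamped than when the network runs freely. A minimal sketch (the correlation values below are made up for illustration):

```python
# Boltzmann machine learning rule:
#   delta w_ij = lr * (<s_i s_j>_clamped - <s_i s_j>_free)
# where the two expectations are co-activation frequencies estimated
# by sampling (via simulated annealing) in the clamped and free phases.

def bm_weight_update(p_clamped, p_free, lr=0.1):
    """Return the change to weight w_ij for one learning step."""
    return lr * (p_clamped - p_free)

# Units co-fire 80% of the time clamped, 30% free -> weight goes up.
delta = bm_weight_update(p_clamped=0.8, p_free=0.3)
print(delta)
```

This is the rule Laskey's paper above proposes adapting to adjust conditional probabilities in BNs.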
General AI and Others
- C. Baral, J. Lobo. Characterizing production systems using logic
programming and situation calculus.
This paper gives production
systems a declarative semantics by mapping production rules to logic
programs that encode the situation calculus. This should help in
formally analyzing properties of production systems.
- E. Baum. A working hypothesis for general intelligence.
The author argues for the
important role of an extrapolated version of Occam's razor in general
intelligence, and offers an evolutionary view of the formation of
intelligence. While I agree with the hypothesis in the first section, I
doubt that his introspection experiments will ultimately lead to general
intelligence.
- R. J. McEliece, D. MacKay, J. F. Cheng. Turbo decoding as an
instance of Pearl's "belief propagation" algorithm. IEEE J. on Selected
Areas in Communications 1998.
Identifies the relationship
between turbo decoding in information theory and belief propagation in
Bayes networks. (Reading in progress.)
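For my own reference, the sum-product message passing at issue can be sketched on a tiny chain A - B - C (only the forward/predictive pass, with no evidence; all the numbers below are made up). On a tree, the propagated marginal agrees with exact enumeration:

```python
# Forward (pi) messages of belief propagation on a 3-node binary chain.
PRIOR_A     = [0.6, 0.4]                    # P(A)
P_B_GIVEN_A = [[0.7, 0.3], [0.2, 0.8]]      # P(B|A)
P_C_GIVEN_B = [[0.9, 0.1], [0.4, 0.6]]      # P(C|B)

def normalize(v):
    s = sum(v)
    return [x / s for x in v]

# pi_B(b) = sum_a P(A=a) P(B=b|A=a); then propagate once more to C.
pi_B = [sum(PRIOR_A[a] * P_B_GIVEN_A[a][b] for a in range(2)) for b in range(2)]
pi_C = [sum(pi_B[b] * P_C_GIVEN_B[b][c] for b in range(2)) for c in range(2)]

# Brute-force P(C) by summing over all joint configurations, as a check.
brute = [0.0, 0.0]
for a in range(2):
    for b in range(2):
        for c in range(2):
            brute[c] += PRIOR_A[a] * P_B_GIVEN_A[a][b] * P_C_GIVEN_B[b][c]

print([round(x, 4) for x in normalize(pi_C)])
print([round(x, 4) for x in normalize(brute)])
```

The paper's point, as I understand it so far, is that running this same message passing on the loopy graph of a turbo code reproduces the turbo decoding iterations.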