On the Hardness of Probabilistic Inference Relaxations.
Supratik Chakraborty, Kuldeep S. Meel and Moshe Y. Vardi.
In Proceedings of the AAAI Conference on Artificial Intelligence (AAAI), January 2019.
Probabilistic inference is increasingly being used in applications that compute with uncertain data. A promising approach to inference that has attracted recent attention exploits its inter-reducibility with model counting. Since probabilistic inference and model counting are #P-complete, various relaxations are used in practice, in the hope that these relaxations lend themselves to efficient computation while also providing rigorous approximation guarantees. In this paper, we show that, contrary to commonly held belief, several relaxations used in the probabilistic inference literature do not actually lead to computational efficiency in a complexity-theoretic sense. Our arguments proceed by showing the corresponding relaxed notions of counting to be computationally hard. We argue that approximate counting (and hence, inference) with multiplicative tolerance and probabilistic guarantees of correctness is the only class of relaxations that provably simplifies the problem, given access to an NP oracle. Finally, we show that for applications that compare probability estimates with a threshold, a new notion of relaxation with a gap between low and high thresholds can be used. This new relaxation allows efficient decision making in practice, given access to an NP oracle, while also bounding the approximation error.
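For reference, the multiplicative-tolerance relaxation mentioned in the abstract is the standard (epsilon, delta) guarantee from approximate model counting; as a sketch of the two notions (the paper's exact formulations may differ), an approximate counter for a formula F with model count |R_F| returns an estimate c satisfying

\[ \Pr\left[\frac{|R_F|}{1+\varepsilon} \;\le\; c \;\le\; (1+\varepsilon)\,|R_F|\right] \;\ge\; 1-\delta, \]

which, by Stockmeyer's classical construction, is computable in randomized polynomial time given an NP oracle. The gap relaxation for threshold comparisons can similarly be read as: for thresholds \(T_{\mathrm{lo}} < T_{\mathrm{hi}}\), the procedure must answer "high" whenever the true probability \(p \ge T_{\mathrm{hi}}\) and "low" whenever \(p \le T_{\mathrm{lo}}\), while either answer is acceptable in the gap \(T_{\mathrm{lo}} < p < T_{\mathrm{hi}}\).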
@inproceedings{CMV19,
  title     = {On the Hardness of Probabilistic Inference Relaxations},
  author    = {Chakraborty, Supratik and Meel, Kuldeep S. and Vardi, Moshe Y.},
  booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence (AAAI)},
  year      = {2019},
  month     = jan,
  bib2html_dl_pdf  = {../Papers/aaai19-cmv.pdf},
  bib2html_pubtype = {Refereed Conference},
  bib2html_rescat  = {Counting}
}