
**WAPS: Weighted and Projected Sampling**.

Rahul Gupta, Shubham Sharma, Subhajit Roy and Kuldeep S. Meel.

In *Proceedings of Tools and Algorithms for the Construction and Analysis of Systems (TACAS)*, April 2019.

Previous work on applying knowledge compilation has focused on uniform sampling over all the variables. Since constraints are typically written in high-level languages such as Verilog, popular CNF encoding schemes such as Tseitin encoding introduce additional auxiliary variables; the resulting CNF formulas are not equivalent but equisatisfiable. In particular, for a formula $G$ specified in a high-level language, we obtain a CNF formula $F$ such that $G(X) = \exists Y\, F(X,Y)$. This raises the question of whether knowledge compilation based techniques can be extended to sample over a subset of variables. Furthermore, languages such as Verilog allow users to attach weights to constraints, so there is a need to sample according to the resulting posterior distribution. In this paper, we answer both questions affirmatively: we propose WAPS, which samples over a user-defined subset of variables from the posterior distribution for a given prior distribution defined over product spaces.
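To make the sampling task concrete, here is a minimal brute-force sketch of weighted projected sampling. It is for intuition only: WAPS itself avoids enumeration by compiling the formula to a d-DNNF representation. The formula $F$, the literal weights, and the variable names below are illustrative assumptions, not taken from the paper.

```python
import itertools
import random

def projected_weighted_sample(f, xs, ys, weight, rng=random.Random(0)):
    """Sample an assignment to the projection variables `xs` with probability
    proportional to the total weight of full assignments (x, y) that satisfy f.
    This realizes sampling over G(X) = exists Y . F(X, Y)."""
    table = []
    for x in itertools.product([False, True], repeat=len(xs)):
        # x contributes the summed weight of all witnesses y;
        # the contribution is 0 when no witness exists (G(x) is false).
        w = sum(weight(dict(zip(xs, x)) | dict(zip(ys, y)))
                for y in itertools.product([False, True], repeat=len(ys))
                if f(dict(zip(xs, x)) | dict(zip(ys, y))))
        if w > 0:
            table.append((dict(zip(xs, x)), w))
    # Draw proportionally to the accumulated projected weights.
    total = sum(w for _, w in table)
    r = rng.random() * total
    for assignment, w in table:
        r -= w
        if r <= 0:
            return assignment
    return table[-1][0]

# Illustrative formula F(x1, x2, y) = (x1 or y) and (x2 or not y),
# with made-up literal weights on the projection variables.
def f(a):
    return (a["x1"] or a["y"]) and (a["x2"] or not a["y"])

def weight(a):
    return (0.7 if a["x1"] else 0.3) * (0.6 if a["x2"] else 0.4)

sample = projected_weighted_sample(f, ["x1", "x2"], ["y"], weight)
```

For this formula, the assignment $x_1 = x_2 = \text{False}$ has no witness $y$, so it is never sampled; every returned sample sets at least one of `x1`, `x2`. The enumeration is exponential in the number of variables, which is exactly the blow-up that the knowledge compilation approach of the paper is designed to avoid.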

@inproceedings{GSRM19,
  title = {WAPS: Weighted and Projected Sampling},
  author = {Gupta, Rahul and Sharma, Shubham and Roy, Subhajit and Meel, Kuldeep S.},
  bib2html_pubtype = {Refereed Conference},
  bib2html_dl_pdf = {../Papers/tacas19.pdf},
  booktitle = TACAS,
  month = apr,
  year = {2019},
  bib2html_rescat = {Sampling},
  code = {https://github.com/meelgroup/waps},
  abstract = {Previous work on applying knowledge compilation has focused on uniform sampling over all the variables. Since constraints are typically written in high-level languages such as Verilog, popular CNF encoding schemes such as Tseitin encoding introduce additional auxiliary variables; the resulting CNF formulas are not equivalent but equisatisfiable. In particular, for a formula $G$ specified in a high-level language, we obtain a CNF formula $F$ such that $G(X) = \exists Y\, F(X,Y)$. This raises the question of whether knowledge compilation based techniques can be extended to sample over a subset of variables. Furthermore, languages such as Verilog allow users to attach weights to constraints, so there is a need to sample according to the resulting posterior distribution. In this paper, we answer both questions affirmatively: we propose WAPS, which samples over a user-defined subset of variables from the posterior distribution for a given prior distribution defined over product spaces.},
}

Generated by bib2html.pl (written by Patrick Riley with layout from Sanjit A. Seshia ) on Sun Apr 14, 2024 11:15:51