
**A Scalable Shannon Entropy Estimator**.

Priyanka Golia, Brendan Juba and Kuldeep S. Meel.

In *Proceedings of the International Conference on Computer-Aided Verification (CAV)*, August 2022.

**Invited to FMSD Special Issue dedicated to the Best Papers from CAV-22**

Quantified information flow (QIF) has emerged as a rigorous approach to quantitatively measuring confidentiality; the information-theoretic underpinnings of QIF allow end users to link the computed quantities with the computational effort an adversary must expend to gain access to the desired confidential information. In this work, we focus on estimating the Shannon entropy of a given program P. As a first step, we consider the case in which a Boolean formula F(X, Y) captures the relationship between the inputs X and outputs Y of P. Such formulas F(X, Y) have the property that for every valuation of X, there exists exactly one valuation of Y such that F is satisfied. Existing techniques require O(2^m) model counting queries, where m = |Y|. We propose the first efficient algorithmic technique, called EntropyEstimation, to estimate the Shannon entropy of F with PAC-style guarantees, i.e., the computed estimate is guaranteed to lie within a (1 + ε)-factor of the ground truth with confidence at least 1 − δ. Furthermore, EntropyEstimation makes only O(min(m, n)) counting and sampling queries, where m = |Y| and n = |X|, thereby achieving a significant reduction in the number of model counting queries. We demonstrate the practical efficiency of our algorithmic framework via a detailed experimental evaluation, which shows that the proposed framework scales to formulas beyond the reach of previously known approaches.
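The abstract's core idea, that entropy can be estimated from sampling plus per-sample counting queries rather than one counting query per output value, can be illustrated with a toy sketch. This is not the paper's EntropyEstimation algorithm: it is a minimal Monte Carlo illustration of the identity H(Y) = E_x[−log2 p(f(x))] for a function f induced by a circuit, with the model counter replaced by hypothetical brute-force enumeration (`count_preimage` below is an illustrative stand-in; real tools would invoke a SAT-based model counter).

```python
import math
import random


def shannon_entropy_exact(f, n):
    # Exact entropy of the output distribution of f over uniform n-bit inputs,
    # by enumerating all 2^n inputs (feasible only for tiny n).
    counts = {}
    for x in range(2 ** n):
        y = f(x)
        counts[y] = counts.get(y, 0) + 1
    total = 2 ** n
    return -sum(c / total * math.log2(c / total) for c in counts.values())


def shannon_entropy_sampled(f, n, samples=1000, seed=0):
    # Monte Carlo estimate using H(Y) = E_x[-log2 p(f(x))]: draw a uniform
    # input x, query the probability of its output, and average -log2 p.
    rng = random.Random(seed)
    total = 2 ** n

    def count_preimage(y):
        # Hypothetical counting oracle: size of {x : f(x) = y}, done here by
        # brute force; the paper's setting would use a model counter instead.
        return sum(1 for x in range(total) if f(x) == y)

    acc = 0.0
    for _ in range(samples):
        x = rng.randrange(total)
        p = count_preimage(f(x)) / total
        acc += -math.log2(p)
    return acc / samples
```

For example, with f(x) = x mod 2 over 3-bit inputs the output is a fair coin, so both estimators return 1.0 bit; a constant f gives 0.0. The paper's contribution, by contrast, is bounding the number of such counting and sampling queries by O(min(m, n)) while retaining PAC-style guarantees.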

@inproceedings{GJM22,
  title = {A Scalable Shannon Entropy Estimator},
  author = {Golia, Priyanka and Juba, Brendan and Meel, Kuldeep S.},
  booktitle = CAV,
  year = {2022},
  month = aug,
  note = {Invited to FMSD Special Issue dedicated to the Best Papers from CAV-22},
  bib2html_rescat = {Distribution Testing},
  bib2html_pubtype = {Refereed Conference, Award Winner},
  bib2html_dl_pdf = {../Papers/cav22.pdf},
  abstract = {Quantified information flow (QIF) has emerged as a rigorous approach to quantitatively measuring confidentiality; the information-theoretic underpinnings of QIF allow end users to link the computed quantities with the computational effort an adversary must expend to gain access to the desired confidential information. In this work, we focus on estimating the Shannon entropy of a given program P. As a first step, we consider the case in which a Boolean formula F(X, Y) captures the relationship between the inputs X and outputs Y of P. Such formulas F(X, Y) have the property that for every valuation of X, there exists exactly one valuation of Y such that F is satisfied. Existing techniques require O(2^m) model counting queries, where m = |Y|. We propose the first efficient algorithmic technique, called EntropyEstimation, to estimate the Shannon entropy of F with PAC-style guarantees, i.e., the computed estimate is guaranteed to lie within a (1 + ε)-factor of the ground truth with confidence at least 1 − δ. Furthermore, EntropyEstimation makes only O(min(m, n)) counting and sampling queries, where m = |Y| and n = |X|, thereby achieving a significant reduction in the number of model counting queries. We demonstrate the practical efficiency of our algorithmic framework via a detailed experimental evaluation, which shows that the proposed framework scales to formulas beyond the reach of previously known approaches.},
}

Generated by bib2html.pl (written by Patrick Riley with layout from Sanjit A. Seshia ) on Sun Apr 14, 2024 11:15:51