
The Fine-Grained Complexity of Andersen's Pointer Analysis

Research output: Working paper/Preprint › Working paper › Research

Pointer analysis is one of the fundamental problems in static program analysis. Given a set of pointers, the task is to produce a useful over-approximation of the memory locations that each pointer may point to at runtime. The most common formulation is Andersen's Pointer Analysis (APA), defined as an inclusion-based set of $m$ pointer constraints over a set of $n$ pointers. Existing algorithms solve APA in $O(n^2\cdot m)$ time, and the problem has long been conjectured to admit no truly sub-cubic algorithm, though a proof has so far remained elusive. It is also well known that APA can be solved in $O(n^2)$ time under certain sparsity conditions that hold naturally in some settings. Beyond these simple bounds, the complexity of the problem has remained poorly understood. In this work we draw a rich fine-grained complexity landscape of APA, and present upper and lower bounds. First, we establish an $O(n^3)$ upper bound for general APA, improving over $O(n^2\cdot m)$ since $n=O(m)$. Second, we show that sparse instances can be solved in $O(n^{3/2})$ time, improving the current $O(n^2)$ bound. Third, we show that even on-demand APA ("may a specific pointer $a$ point to a specific location $b$?") has an $\Omega(n^3)$ (combinatorial) lower bound under standard complexity-theoretic hypotheses. This formally establishes the long-conjectured "cubic bottleneck" of APA, and shows that our $O(n^3)$-time algorithm is optimal. Fourth, we show that under mild restrictions, APA is solvable in $\tilde{O}(n^{\omega})$ time, where $\omega$ is the matrix-multiplication exponent.
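
To make the inclusion-based constraint system mentioned in the abstract concrete, below is a minimal Python sketch of an Andersen-style fixpoint. The four constraint forms (address-of, copy, load, store) and the naive iteration strategy are standard for inclusion-based analysis, but the encoding and the solver here are illustrative assumptions, not the algorithms contributed by the paper.

# A minimal sketch of an inclusion-based (Andersen-style) points-to analysis.
# Constraints are assumed to be encoded as tuples in one of four standard forms:
#   ("addr",  a, b) : a = &b   ->  b is in pts(a)
#   ("copy",  a, b) : a = b    ->  pts(b) is a subset of pts(a)
#   ("load",  a, b) : a = *b   ->  for each o in pts(b), pts(o) is a subset of pts(a)
#   ("store", a, b) : *a = b   ->  for each o in pts(a), pts(b) is a subset of pts(o)

from collections import defaultdict

def andersen(constraints):
    pts = defaultdict(set)  # pointer -> over-approximated points-to set
    for kind, a, b in constraints:
        if kind == "addr":  # seed with address-of facts
            pts[a].add(b)
    changed = True
    while changed:  # iterate to a fixpoint, re-applying every constraint per pass
        changed = False
        for kind, a, b in constraints:
            if kind == "copy":
                new = pts[b] - pts[a]
            elif kind == "load":
                new = set().union(*(pts[o] for o in list(pts[b]))) - pts[a]
            elif kind == "store":
                for o in list(pts[a]):
                    extra = pts[b] - pts[o]
                    if extra:
                        pts[o] |= extra
                        changed = True
                continue
            else:
                continue
            if new:
                pts[a] |= new
                changed = True
    return pts

# Example: x = &o1; y = x; z = &o2; *y = z  yields pts(o1) = {o2}.
cs = [("addr", "x", "o1"), ("copy", "y", "x"),
      ("addr", "z", "o2"), ("store", "y", "z")]
print({p: sorted(s) for p, s in andersen(cs).items()})

This naive loop makes the bounds in the abstract tangible: each pass may propagate up to $n$ locations across each of the $m$ constraints, and up to $O(n)$ passes may be needed, which is where the classical $O(n^2\cdot m)$-style behaviour comes from.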
Original language: Undefined/Unknown
Publisher: ArXiv
Publication status: Published - Jun 2020

