NRCL - A Model Building Approach to the Bernays-Schönfinkel Fragment
Abstract
We combine constrained literals for model representation with key concepts from first-order superposition and propositional conflict-driven clause learning (CDCL) to create the new calculus Non-Redundant Clause Learning (NRCL), which decides the Bernays-Schönfinkel fragment. For compact model representation, we use first-order literals constrained by disequalities between tuples of terms. From superposition, NRCL inherits the abstract redundancy criterion and the monotone model operator. CDCL adds the dynamic, conflict-driven search for a model. As a result, NRCL effectively finds a clause that is false modulo the current model candidate. It guides the derivation of a first-order ordered resolvent that is never redundant. As with 1UIP learning in CDCL, the learned resolvent induces backtracking and, by blocking the previous conflict state via propagation, enforces progress towards finding a model or a refutation. The non-redundancy result also implies that NRCL can generate only finitely many clauses on the Bernays-Schönfinkel fragment, which proves termination.
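To make the model representation concrete, the following is a minimal, hypothetical sketch in Python, not the authors' implementation. The `ConstrainedLiteral` type, the `ground_instances` helper, and the Prolog-style naming convention are illustrative assumptions; an actual NRCL implementation would manipulate the constraints symbolically instead of ever enumerating the ground instances.

```python
from dataclasses import dataclass
from itertools import product

# Assumed naming convention for this sketch: variables start with an
# uppercase letter ("X"), constants with a lowercase one ("a"). In the
# Bernays-Schönfinkel fragment there are no function symbols, so every
# term is either a variable or a constant.

@dataclass(frozen=True)
class ConstrainedLiteral:
    """A literal P(t1, ..., tn) together with disequality constraints
    between tuples of terms; it stands for all ground instances of the
    literal whose grounding substitution satisfies every disequation."""
    pred: str
    args: tuple           # argument terms
    diseqs: tuple = ()    # pairs (lhs, rhs), read as lhs != rhs

def ground_instances(lit: ConstrainedLiteral, domain: tuple) -> set:
    """Enumerate the ground atoms a constrained literal represents over
    a finite domain of constants. For illustration only: the point of
    the representation is that this set is never materialised."""
    variables = sorted({t for t in lit.args if t[0].isupper()})
    atoms = set()
    for values in product(domain, repeat=len(variables)):
        sigma = dict(zip(variables, values))
        inst = lambda t: sigma.get(t, t)    # apply the substitution
        if all(tuple(map(inst, lhs)) != tuple(map(inst, rhs))
               for lhs, rhs in lit.diseqs):
            atoms.add((lit.pred, tuple(map(inst, lit.args))))
    return atoms

# P(X, Y) constrained by (X, Y) != (a, a): every instance except P(a, a).
lit = ConstrainedLiteral("P", ("X", "Y"), ((("X", "Y"), ("a", "a")),))
print(sorted(ground_instances(lit, ("a", "b"))))
# [('P', ('a', 'b')), ('P', ('b', 'a')), ('P', ('b', 'b'))]
```

A single constrained literal thus stands for a whole set of ground atoms minus finitely many exceptions, which is what makes the model candidates compact compared to an explicit enumeration.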