Benavente-Fokina, E., & Lempp, S. (2025). Syntactic characterization of learnability of structures with mind changes. Information and Computation, 307, Article 105381. https://doi.org/10.1016/j.ic.2025.105381
We study the learnability of classes of computable structures under models that allow finitely many mind changes. Extending the classical notions of explanatory learning from informant and from text, we introduce new paradigms in which the information source and the convergence requirements are modified. In particular, we define Δ⁰₂-learning, where each atomic fact may be presented with finitely many errors before stabilizing, and c.e.- and d.c.e.-learning, where information is restricted to positive atomic facts that may either never be retracted (c.e.) or be retracted at most once (d.c.e.). We provide syntactic characterizations for these notions: Δ⁰₂-learning coincides with definability by finite existential sentences, and c.e.-learning coincides with TxtEx-learning. For d.c.e.-learning we give a partial characterization in restricted languages. Furthermore, for n-learning from informant, where learners are allowed at most n mind changes, we establish a complete syntactic characterization in terms of specific infinitary formulas of bounded depth.
Project title:
Strukturen durch Lernen Klassifizieren (Classifying Structures via Learning): P 36781-N (FWF - Austrian Science Fund)
Research Areas:
Logic and Computation: 5%
Fundamental Mathematics Research: 95%