arxiv:2603.06027

Agnostic learning in (almost) optimal time via Gaussian surface area

Published on Mar 6

Abstract

The complexity of learning a concept class under Gaussian marginals in the difficult agnostic model is closely related to its L_1-approximability by low-degree polynomials. For any concept class with Gaussian surface area at most Γ, Klivans et al. (2008) show that degree d = O(Γ²/ε⁴) suffices to achieve an ε-approximation. This leads to the best-known bounds on the complexity of learning a variety of concept classes. In this note, we improve their analysis by showing that degree d = Õ(Γ²/ε²) is enough. In light of lower bounds due to Diakonikolas et al. (2021), this yields (near) optimal bounds on the complexity of agnostically learning polynomial threshold functions in the statistical query model. Our proof relies on a direct analogue of a construction of Feldman et al. (2020), who considered L_1-approximation on the Boolean hypercube.
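The abstract's main contribution is the quadratic-in-ε improvement of the degree bound. A sketch of the two bounds side by side, with the subscripts "KOS" (for the Klivans et al. bound) and "new" being labels introduced here for illustration, not notation from the paper:

```latex
% Degree needed for an eps-accurate L_1 polynomial approximation of a
% concept class with Gaussian surface area at most Gamma.
% "KOS" and "new" are illustrative labels, not the paper's notation.
\[
  d_{\mathrm{KOS}} \;=\; O\!\left(\frac{\Gamma^2}{\varepsilon^4}\right)
  \qquad\longrightarrow\qquad
  d_{\mathrm{new}} \;=\; \widetilde{O}\!\left(\frac{\Gamma^2}{\varepsilon^2}\right)
\]
```

Since the runtime of the low-degree polynomial regression approach scales with the dimension raised to the degree d, shaving the exponent on 1/ε directly improves the overall learning complexity.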

