A Closed-Form Upper Bound for Admissible Learning-Rate Steps in Belief-Space Dynamics
Abstract
Admissibility of a learning-rate step is characterized by contractivity in KL/Bregman geometry, yielding a closed-form upper bound rather than a tunable parameter.
Learning-rate steps are usually treated as hyperparameters. This paper isolates a local belief-space calculation: when an update is modeled as a projected forward step on the probability simplex, admissibility means contractivity in the natural KL/Bregman geometry. Under this model, the upper bound of an admissible step is not a tuning slogan but a formula.
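As a concrete reading of the abstract's model, a "projected forward step on the probability simplex" in KL/Bregman geometry is the exponentiated-gradient (mirror-descent) update, and "admissibility" means the KL divergence to the fixed point does not increase from one step to the next. The sketch below is illustrative only: the toy objective, the fixed point `target`, and the particular step size `eta` are assumptions for demonstration, and the paper's closed-form upper bound is not reproduced here.

```python
import numpy as np

def md_step(p, grad, eta):
    """One exponentiated-gradient step on the simplex: the forward step
    p * exp(-eta * grad), followed by the KL (Bregman) projection back
    onto the simplex, which here is just renormalization."""
    q = p * np.exp(-eta * grad)
    return q / q.sum()

def kl(a, b):
    """KL divergence KL(a || b) for strictly positive distributions."""
    return float(np.sum(a * np.log(a / b)))

# Toy convex objective f(p) = 0.5 * ||p - target||^2, whose minimizer
# `target` lies in the simplex and acts as the belief-space fixed point.
# (Both the objective and `target` are illustrative assumptions.)
target = np.array([0.5, 0.3, 0.2])
grad = lambda p: p - target

p = np.ones(3) / 3      # uniform initial belief
eta = 0.5               # a step small enough to be admissible here
gaps = []               # KL(target, p_t) along the trajectory
for _ in range(20):
    gaps.append(kl(target, p))
    p = md_step(p, grad(p), eta)

# Admissibility in this model: the KL gap to the fixed point contracts.
# A sufficiently large eta would break this monotonicity.
assert all(g2 <= g1 + 1e-12 for g1, g2 in zip(gaps, gaps[1:]))
```

Under this reading, the paper's claim is that the largest `eta` for which this contraction is guaranteed is given by a formula in local problem quantities, not found by trial and error.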
Paper ID: 2605.06741