
Splitting Adam

Based on your interest in "Splitting Adam," you are likely referring to research surrounding the Adam optimizer widely used in machine learning. There isn't one single paper with that exact title, but several interesting papers analyze splitting the algorithm's components or its behavior in complex ways:

1. The Sign, Magnitude and Variance of Stochastic Gradients
This work dissects Adam's update into the sign of the stochastic gradient and its magnitude. By testing these separately, researchers found that "Stochastic Sign Descent" can actually outperform standard Adam on specific datasets like MNIST and CIFAR-10 (a minimal sketch of the split appears after this list).

2. Adaptive Multilevel Splitting (ADAM)
If you are coming from a statistics or rare-event simulation background, "ADAM" refers to Adaptive Multilevel Splitting, a sequential Monte Carlo method for estimating rare-event probabilities. It's often applied to power grid reliability or particle transport (see the second sketch after the list).

3. Adam Reduces a Unique Form of Sharpness
It shows that Adam minimizes a specific form of sharpness, namely the trace of the square root of the Hessian, which is fundamentally different from how SGD behaves. A more recent and highly regarded paper (2025) investigates what happens when Adam "wanders" around the manifold of minimizers (a numerical comparison of the two sharpness measures follows the list).

4. Better Embeddings with Coupled Adam
It identifies Adam's per-parameter second-moment estimates as a cause of anisotropic token embeddings in language models and proposes Coupled Adam to fix this specific side effect (a hedged sketch of the coupling idea closes out the sketches below).
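For item 1, here is a minimal sketch contrasting a standard Adam step with a pure sign-descent step, to make the sign/magnitude split concrete. The function names and hyperparameter defaults are illustrative choices, not taken from the paper.

```python
import numpy as np

def adam_step(theta, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    # Standard Adam: the update blends the gradient's direction (via the
    # first moment m) with a per-coordinate magnitude (via 1/sqrt(v)).
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    m_hat = m / (1 - b1 ** t)  # bias-corrected first moment
    v_hat = v / (1 - b2 ** t)  # bias-corrected second moment
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

def sign_descent_step(theta, g, lr=1e-3):
    # "Stochastic Sign Descent": discard the magnitude entirely and step
    # using only the sign of the stochastic gradient.
    return theta - lr * np.sign(g)
```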
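For item 2, here is a minimal sketch of Adaptive Multilevel Splitting for a static rare event, assuming the fixed-k "kill and rebranch" variant. The helper names (score, sample0, mcmc_move) are mine, and the MCMC kernel must leave the target distribution invariant on the surviving region.

```python
import numpy as np

def ams_estimate(score, sample0, mcmc_move, level, n=1000, k=100, seed=0):
    # Adaptive Multilevel Splitting: estimate P(score(X) >= level) by
    # repeatedly killing the lowest-scoring particles and rebranching them
    # from survivors, accumulating the survival fraction at each level.
    rng = np.random.default_rng(seed)
    particles = [sample0(rng) for _ in range(n)]
    scores = np.array([score(x) for x in particles])
    p = 1.0
    while True:
        threshold = np.partition(scores, k - 1)[k - 1]  # k-th smallest score
        if threshold >= level:  # target level reached: finish the estimate
            return p * np.mean(scores >= level)
        alive = np.flatnonzero(scores >= threshold)
        dead = np.flatnonzero(scores < threshold)
        p *= alive.size / n  # fraction surviving this intermediate level
        for i in dead:  # clone a survivor, then mutate it above the threshold
            x = mcmc_move(particles[rng.choice(alive)], threshold, rng)
            particles[i] = x
            scores[i] = score(x)

# Example: P(X >= 4) for X ~ N(0, 1); the exact value is about 3.2e-5.
def gaussian_move(x, thr, rng):
    # Metropolis kernel targeting N(0, 1) restricted to {x >= thr}.
    for _ in range(20):
        y = x + 0.5 * rng.standard_normal()
        if y >= thr and rng.random() < np.exp(0.5 * (x * x - y * y)):
            x = y
    return x

p_hat = ams_estimate(score=lambda x: x,
                     sample0=lambda rng: rng.standard_normal(),
                     mcmc_move=gaussian_move, level=4.0)
```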
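For item 3, a small numerical illustration of how the two sharpness measures can disagree. Tr(H) is the measure commonly tied to SGD in implicit-bias analyses, while the text above attributes Tr(H^{1/2}) to Adam; the toy Hessians below are my own choices.

```python
import numpy as np

# Two toy Hessians at two different minima (diagonal for readability).
H_a = np.diag([4.0, 0.25])  # one sharp direction, one very flat direction
H_b = np.diag([2.0, 2.0])   # uniformly moderate curvature

def tr(H):
    return float(np.trace(H))  # Tr(H)

def tr_sqrt(H):
    eig = np.clip(np.linalg.eigvalsh(H), 0.0, None)
    return float(np.sum(np.sqrt(eig)))  # Tr(H^{1/2})

print(tr(H_a), tr(H_b))            # 4.25 vs 4.00 -> Tr(H) prefers H_b
print(tr_sqrt(H_a), tr_sqrt(H_b))  # 2.50 vs 2.83 -> Tr(H^{1/2}) prefers H_a
```

The two measures rank the minima in opposite orders, which is the sense in which Adam and SGD can be drawn toward different solutions.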
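For item 4, a heavily hedged sketch of the coupling idea as summarized above: instead of keeping an independent second-moment estimate per embedding parameter, share it across the embedding matrix's vocabulary axis. The function name, the choice of averaging over axis 0, and the hyperparameters are my assumptions, not the paper's reference implementation.

```python
import numpy as np

def coupled_adam_embedding_step(E, G, m, v, t, lr=1e-3, b1=0.9, b2=0.999,
                                eps=1e-8):
    # One Adam-style step on an embedding matrix E of shape (vocab, dim),
    # with the second moment coupled across the vocabulary axis so every
    # token's embedding sees the same adaptive scaling (an assumption about
    # the mechanism, flagged in the text above).
    m = b1 * m + (1 - b1) * G
    v = b2 * v + (1 - b2) * G ** 2
    v_coupled = v.mean(axis=0, keepdims=True)  # shared across all tokens
    m_hat = m / (1 - b1 ** t)
    v_hat = v_coupled / (1 - b2 ** t)
    return E - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```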