From 1fa7e842c55202a825e35de6f27e0fcc5861b649 Mon Sep 17 00:00:00 2001
From: Jaron Kent-Dobias
Date: Mon, 10 Feb 2025 17:21:17 -0300
Subject: Wrote abstract

---
 abstract.txt | 7 +++++++
 1 file changed, 7 insertions(+)
 create mode 100644 abstract.txt

diff --git a/abstract.txt b/abstract.txt
new file mode 100644
index 0000000..2019c0a
--- /dev/null
+++ b/abstract.txt
@@ -0,0 +1,7 @@
+
+Fitting with more parameters than data points: Topology of the solutions to overparameterized problems
+
+In university we're taught to be conservative about how many parameters we use when fitting data, and with good reason: many-parameter models are susceptible to overfitting. As the number of parameters approaches the number of data points, the model passes exactly through every data point but goes haywire in the space between them. The past few decades of work in machine learning have revealed a surprise: when the number of parameters is made much larger than the number of data points, good fits, sometimes even better ones, are still possible. I will review what is known about when and why overparameterized fits work, touching on the relationship between algorithms, their initialization, and solution geometry. Then I will describe recent work to quantify the topology of the space of "perfect" fits and discuss what we might glean from this information.
+
+See more: On the topology of solutions to random continuous constraint satisfaction problems, Jaron Kent-Dobias, arXiv:2409.12781 [cond-mat.dis-nn], https://arxiv.org/abs/2409.12781
+
--
cgit v1.2.3-70-g09d2
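
As an aside, the phenomenon the abstract describes is easy to see numerically. The following is a minimal sketch, not part of the patch or the paper: the target function, random-feature model, and problem sizes are arbitrary choices made purely for illustration.

import numpy as np

rng = np.random.default_rng(0)

# A handful of noisy samples of a smooth target function.
n = 10
x = np.linspace(-1, 1, n)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(n)
grid = np.linspace(-1, 1, 200)

# Critically parameterized fit: a degree-(n-1) polynomial has exactly n
# coefficients, so it passes through every data point.
poly = np.polynomial.Polynomial.fit(x, y, deg=n - 1)

# Overparameterized fit: p >> n random Fourier features. For an
# underdetermined consistent system, np.linalg.lstsq returns the
# minimum-norm interpolant.
p = 1000
w = 3.0 * rng.standard_normal(p)
b = rng.uniform(0.0, 2.0 * np.pi, p)

def phi(t):
    # Random-feature design matrix, shape (len(t), p).
    return np.cos(np.outer(t, w) + b)

coef, *_ = np.linalg.lstsq(phi(x), y, rcond=None)

# Both fits interpolate the data (errors near zero) ...
print("interp. error, polynomial:", np.abs(poly(x) - y).max())
print("interp. error, overparam.:", np.abs(phi(x) @ coef - y).max())
# ... but behave very differently between the data points: the critical
# fit typically swings far outside the data range, while the
# overparameterized minimum-norm fit typically stays tame.
print("range between points, polynomial:", poly(grid).min(), poly(grid).max())
print("range between points, overparam.:",
      (phi(grid) @ coef).min(), (phi(grid) @ coef).max())

The choice of minimum-norm solution here stands in for the implicit bias of gradient-based algorithms that the abstract alludes to: which of the many perfect fits an algorithm lands on depends on the algorithm and its initialization.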