Today we’re joined by Ilias Diakonikolas, a faculty member in the Computer Science department at the University of Wisconsin–Madison and author of the paper "Distribution-Independent PAC Learning of Halfspaces with Massart Noise," recipient of the NeurIPS 2019 Outstanding Paper award. The paper is regarded as the first progress on distribution-independent learning with noise since the 1980s. In our conversation, we explore robustness in ML, the problems corrupt data poses in high-dimensional settings, and, of course, the paper itself.
By Sam Charrington · 4.7 (419 ratings)