


Today's episode overviews the perceptron algorithm. This rather simple approach has a few defining features: it updates its weights after seeing each example rather than in a batch, it uses a step function as its activation function, and it is only appropriate for linearly separable data, on which it is guaranteed to converge to a solution. Being a fairly simple algorithm, it can run very efficiently. Although we don't discuss it in this episode, multi-layer perceptron networks are what make this technique most attractive.
By Kyle Polich
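To make the description above concrete, here is a minimal sketch of a perceptron, not taken from the episode itself: it uses a step activation, updates its weights after every single example, and stops once the (linearly separable) toy data is classified perfectly. The function names and the AND-gate data are purely illustrative.

```python
import numpy as np

def step(z):
    """Step activation: output 1 if the weighted sum is non-negative, else 0."""
    return 1 if z >= 0 else 0

def train_perceptron(X, y, lr=0.1, epochs=100):
    """Online perceptron: weights are updated after each individual example."""
    w = np.zeros(X.shape[1])  # one weight per feature
    b = 0.0                   # bias term
    for _ in range(epochs):
        errors = 0
        for xi, target in zip(X, y):
            pred = step(np.dot(w, xi) + b)
            update = lr * (target - pred)   # zero when the prediction is correct
            w += update * xi
            b += update
            errors += int(update != 0.0)
        if errors == 0:  # converged: every training example classified correctly
            break
    return w, b

# Toy linearly separable data (logical AND), for illustration only.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print(w, b, [step(np.dot(w, xi) + b) for xi in X])  # predictions should match y
```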
