


Today's episode overviews the perceptron algorithm. This rather simple approach is characterized by a few particular features. It updates its weights after seeing every example, rather than in a batch. It uses a step function as its activation function. It is appropriate only for linearly separable data, and it is guaranteed to converge to a solution when the data meets that criterion. Being a fairly simple algorithm, it runs very efficiently. Although we don't discuss them in this episode, multi-layer perceptron networks are what make this technique most attractive.
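The features described above can be sketched in a few lines of code. This is a minimal, assumed implementation (not taken from the episode): weights are updated immediately after each example, a step function is the activation, and training stops once every example is classified correctly, which the perceptron convergence theorem guarantees will happen for linearly separable data.

```python
import numpy as np

def step(z):
    # Step activation: fire (1) if the weighted sum is non-negative, else 0.
    return 1 if z >= 0 else 0

def train_perceptron(X, y, lr=0.1, max_epochs=100):
    X = np.asarray(X, dtype=float)
    w = np.zeros(X.shape[1])  # one weight per feature
    b = 0.0                   # bias term
    for _ in range(max_epochs):
        errors = 0
        for xi, target in zip(X, y):
            pred = step(w @ xi + b)
            update = lr * (target - pred)  # zero when the example is correct
            w += update * xi               # update per example, not per batch
            b += update
            errors += int(update != 0.0)
        if errors == 0:  # converged: every training example classified correctly
            break
    return w, b

# Usage: learn logical AND, a linearly separable problem.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 0, 0, 1]
w, b = train_perceptron(X, y)
```

Note that on data that is not linearly separable (e.g. XOR), the loop above never reaches zero errors and simply stops at `max_epochs`, which is why the multi-layer extension matters.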
By Kyle Polich
