
The supply chain attacks on npm continue, and this week CrowdStrike’s npm packages fell victim to the “Shai-Hulud” worm.
To reduce the risk of pulling these malicious packages into your own projects, consider pinning exact package versions in your JS projects and requiring 2FA to publish new package versions to npm.
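As a rough sketch of what exact pinning looks like (the package names and versions below are placeholders, not recommendations), you can tell npm to stop saving ^ ranges in .npmrc:

    ; have npm record exact versions instead of ^ ranges on install
    save-exact=true

and list dependencies in package.json as single versions with no ^ or ~ prefix:

    "dependencies": {
      "lodash": "4.17.21",
      "zod": "3.23.8"
    }

Committing your package-lock.json and installing with npm ci covers transitive dependencies the same way. On the publishing side, npm accounts can require 2FA for writes (for example via npm profile enable-2fa auth-and-writes), so a stolen token alone isn’t enough to push a compromised release.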
Also this week, the WebAssembly (Wasm) specification reached version 3.0. This release dramatically expands the memory Wasm apps can address with 64-bit memories, supports multiple memories per module, and adds built-in garbage collection.
It’s been a while since we last covered LLM options for folks who want to run their own models locally or in the browser, so Jack gives a quick rundown of some of the best options out today.
There’s WebLLM from MLC, MediaPipe from Google, and ONNX Runtime from Microsoft, and although none of them is easily interchangeable with the others, if cost, privacy, or offline support is a concern for your LLM-enabled app, these are good options to explore.
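As a very rough TypeScript sketch of what the in-browser approach looks like with WebLLM (the model ID below is a placeholder, and the exact API surface should be checked against the MLC docs):

    import { CreateMLCEngine } from "@mlc-ai/web-llm";

    // Downloads and caches the model weights in the browser on first run,
    // then exposes an OpenAI-style chat completions API that runs locally.
    const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f16_1-MLC");

    const reply = await engine.chat.completions.create({
      messages: [{ role: "user", content: "Summarize Wasm 3.0 in one sentence." }],
    });
    console.log(reply.choices[0]?.message.content);

Everything runs client-side (on WebGPU under the hood), which is what makes the cost, privacy, and offline arguments work.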
Chapter Markers:
Links:
Thanks as always to our sponsor, the Blue Collar Coder channel on YouTube. You can join us in our Discord channel, explore our website and reach us via email, or talk to us on X, Bluesky, or YouTube.
By TJ VanToll, Paige Niedringhaus, and Jack Herrington
