


Enterprises are wrestling with delivering data to fuel their AI efforts, hitting roadblocks around data security and privacy concerns and sifting through use cases and models to put it to work. Too many are making high-stakes gambles, feeding vast quantities of data into massive models. Jesse Robbins, one of the founders of Chef, a progenitor of the DevOps movement, a builder of early Internet infrastructure and now a partner at Heavybit, joins host Eric Hanselman to look at alternatives to the path that many are taking in pursuit of successful AI projects. In much the same way that DevOps patterns shifted application development toward smaller, incremental changes with a pipeline that drives continuous improvement, AI projects can work with smaller models and localized datasets to manage risk and iterate faster. It's a pattern that avoids the concerns of pushing sensitive data to cloud-based offerings by working locally. Using smaller models also reduces infrastructure costs and the need for vast quantities of GPUs.
Larger model sizes and datasets create two problems: they require more computational power and supporting infrastructure, and more data complicates data provenance, security and ownership issues. Starting smaller and expecting to iterate on the results locally can have multiple benefits. If the data being used never leaves the local confines, security concerns are constrained to local environments. Tools like the open source project Ollama can deliver a choice of models to fit a variety of use cases and infrastructure capacities. Just like DevOps patterns, starting small and iterating quickly can get further, faster and with lower risk.
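As a minimal sketch of the local-first pattern described above, a small model can be pulled and run on a laptop with the Ollama CLI. The model tag below is only an example; the available models and their sizes vary, so check the Ollama library for current options.

```shell
# Assumes a local Ollama install (https://ollama.com).
# Pull a small model -- "llama3.2" is an example tag; smaller or
# quantized variants fit more modest hardware.
ollama pull llama3.2

# Run an interactive session entirely on local hardware;
# prompts and data never leave the machine.
ollama run llama3.2

# List the models already downloaded locally.
ollama list
```

Because inference runs against local weights, experiments can iterate on sensitive datasets without any data leaving the environment, which is the risk-containment benefit the episode highlights.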
More S&P Global Content:
Credits:
Other Resources:
By S&P Global Market Intelligence
4.9 (2,828 ratings)