
In an interview with The New Stack, renowned technologist Adrian Cockcroft discussed how to customize Large Language Models (LLMs) through prompt engineering and fine-tuning. Cockcroft, known for his roles at Netflix and Amazon Web Services, explained how to obtain tailored programming advice from an LLM: by crafting specific prompts, such as asking the model to write code in the style of an expert programmer like Java creator James Gosling, users can guide the AI's output.
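As a minimal sketch of that kind of style-steering prompt, the snippet below uses the OpenAI Python SDK; the model name, system prompt, and task are illustrative assumptions rather than anything Cockcroft prescribed.

```python
# Minimal sketch: steering code style via the system prompt.
# Assumptions: the OpenAI Python SDK (>= 1.0) is installed and
# OPENAI_API_KEY is set; the model name is illustrative.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # any chat-capable model would do
    messages=[
        {
            "role": "system",
            "content": (
                "You are an expert Java programmer. Answer in the style of "
                "James Gosling: idiomatic, well-commented Java with brief "
                "explanations of the design choices."
            ),
        },
        {
            "role": "user",
            "content": "Write a thread-safe LRU cache with a fixed capacity.",
        },
    ],
)

print(response.choices[0].message.content)
```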
Prompt engineering involves setting up a conversation that biases the AI's responses. These prompts are becoming more advanced, with plugins and pre-loaded information shaping the model's behavior before the user ever asks a question. Cockcroft also highlighted fine-tuning, where a model is adapted beyond what a single prompt can contain: companies feed in vast amounts of internal data, such as wiki pages and corporate documents, so the model learns their specific domain and processes.
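As a rough illustration of what that adaptation step can look like, here is a hedged sketch that turns curated internal Q&A pairs into the JSONL chat format commonly used for fine-tuning. The document list, file name, and system message are hypothetical; a real pipeline would pull from a wiki or document store and do far more cleanup.

```python
# Hedged sketch: converting internal documents into fine-tuning examples.
# The wiki_pages list and output path are made up for illustration.
import json

wiki_pages = [
    {
        "question": "How do we roll back a failed deployment?",
        "answer": "Follow the deploy-rollback runbook and notify the on-call channel.",
    },
    # one entry per curated question/answer pair drawn from corporate docs
]

with open("finetune_data.jsonl", "w", encoding="utf-8") as f:
    for page in wiki_pages:
        example = {
            "messages": [
                {"role": "system", "content": "You are our internal ops assistant."},
                {"role": "user", "content": page["question"]},
                {"role": "assistant", "content": page["answer"]},
            ]
        }
        f.write(json.dumps(example) + "\n")
```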
Cockcroft pointed out ChatGPT's effectiveness at certain tasks, illustrated by his own experience using it for data analysis and programming assistance. He also discussed the growing need for better results from LLMs, which has driven demand for vector databases. These databases store the meanings of words and documents as weighted vectors (embeddings), enabling fuzzy matching that improves information retrieval for LLMs. In essence, Cockcroft emphasized the multifaceted process of shaping and optimizing LLMs through prompt engineering and fine-tuning, reflecting the evolving landscape of AI-human interaction.
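To make the fuzzy-matching idea concrete, here is a small, self-contained sketch of ranking documents by cosine similarity between embedding vectors. The four-dimensional vectors are toy stand-ins for real embeddings, and a production system would use a vector database rather than a Python loop.

```python
# Toy sketch of the fuzzy matching a vector database performs.
# Real embeddings have hundreds or thousands of dimensions and come
# from an embedding model; these vectors are invented for illustration.
import numpy as np

documents = {
    "rollback runbook": np.array([0.9, 0.1, 0.0, 0.2]),
    "holiday policy":   np.array([0.0, 0.8, 0.3, 0.1]),
    "oncall schedule":  np.array([0.7, 0.2, 0.1, 0.6]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pretend this is the embedding of the query "how do I undo a bad deploy?"
query = np.array([0.85, 0.15, 0.05, 0.3])

# Rank documents by similarity to the query; the top hit is the fuzzy match.
ranked = sorted(documents.items(),
                key=lambda kv: cosine_similarity(query, kv[1]),
                reverse=True)
for name, _ in ranked:
    print(name)
```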
Learn more from The New Stack about LLMs and Prompt Engineering:
Top 5 Large Language Models and How to Use Them Effectively
The Pros (And Con) of Customizing Large Language Models
Prompt Engineering: Get LLMs to Generate the Content You Want
Developer Tips in AI Prompt Engineering
Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
