AI Today Podcast: Artificial Intelligence Insights, Experts, and Opinion

Prompt Engineering Best Practices: Using Custom Instructions [AI Today Podcast]

04.19.2024 - By AI & Data Today


As people continue to use LLMs, best practices are emerging to help users get the most out of them. OpenAI's ChatGPT lets users tailor responses to match their tone and desired output goals through custom instructions. Many have reported that using custom instructions produces much more accurate, precise, consistent, and predictable results. But why would you want to do this, and why does it matter? In this episode, hosts Kathleen Walch and Ron Schmelzer discuss why this is a best practice.

What are custom instructions in ChatGPT?

In ChatGPT, custom instructions are set by answering two questions in the settings menu; your answers are then sent along with every prompt:

What would you like ChatGPT to know about you to provide better responses?

How would you like ChatGPT to respond?

It's important to note that once created, these instructions apply to all future chat sessions (not previous or existing ones). This allows you to make somewhat permanent settings that don't have to be constantly reset. Custom instructions are short, generally limited to about 1500 characters, so keep them precise and concise.
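For listeners who work with OpenAI's models through the API rather than the ChatGPT interface, the same idea can be approximated by sending your answers to those two questions as a system message with every request. The sketch below is not something covered in the episode; it assumes the official openai Python SDK (v1+), and the model name and instruction text are purely illustrative.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative answers to the two custom-instruction questions
about_me = "I'm a project manager learning about AI; I prefer plain language."
response_style = "Answer concisely, use bullet points, and avoid jargon."

# Custom instructions behave roughly like a system message prepended to every chat
completion = client.chat.completions.create(
    model="gpt-4",  # illustrative model choice
    messages=[
        {"role": "system", "content": f"{about_me}\n{response_style}"},
        {"role": "user", "content": "Summarize the benefits of prompt engineering."},
    ],
)
print(completion.choices[0].message.content)
```

The key point is the same in either setting: the instructions ride along with every prompt, so you state your context and preferred response style once instead of repeating them in each message.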

Show Notes:

Free Intro to CPMAI course

CPMAI Certification

Subscribe to Cognilytica newsletter on LinkedIn

Properly Scoping AI Projects [AI Today Podcast]

Prompt Engineering Best Practices: What is Prompt Chaining? [AI Today Podcast]

AI Today Podcast: AI Glossary Series – OpenAI, GPT, DALL-E, Stable Diffusion

AI Today Podcast: AI Glossary Series – Tokenization and Vectorization

More episodes from AI Today Podcast: Artificial Intelligence Insights, Experts, and Opinion