AI: post transformers

DSPy and TextGrad: Compiling Language Model Systems


These two academic papers introduce programming models for systematically optimizing compound AI systems, particularly those built from Large Language Model (LLM) calls. The first source presents **DSPy**, a framework that replaces hard-coded LLM pipelines with parameterized, declarative modules that a compiler and **teleprompters** optimize automatically, outperforming hand-crafted prompts on tasks such as math word problems. The second source introduces **TextGrad**, a general optimization framework in which LLMs generate and propagate **natural language gradients**—textual feedback—through computation graphs; this "textual differentiation" approach is applied successfully across diverse domains, including prompt optimization, code refinement, and scientific applications such as molecule design and medical treatment planning. Both works highlight the shift from expert prompt engineering to systematic, programmatic optimization of compound AI systems.
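
To make the two programming models concrete, here is a minimal sketch of the DSPy style of declaring and then compiling a pipeline. The module and teleprompter names follow the paper's examples; the model name, metric, and training examples below are illustrative assumptions and may not match the current dspy release.

```python
# Sketch of DSPy's declarative programming model (names per the paper;
# the LM name, metric, and trainset are illustrative placeholders).
import dspy
from dspy.teleprompt import BootstrapFewShot

dspy.settings.configure(lm=dspy.OpenAI(model="gpt-3.5-turbo"))  # placeholder LM

class CoTQA(dspy.Module):
    """A one-step pipeline: chain-of-thought from question to answer."""
    def __init__(self):
        super().__init__()
        # The signature "question -> answer" declares *what* the module does;
        # the compiler decides *how* to prompt for it.
        self.generate_answer = dspy.ChainOfThought("question -> answer")

    def forward(self, question):
        return self.generate_answer(question=question)

# Tiny illustrative trainset; real use would supply task-specific examples.
trainset = [
    dspy.Example(question="What is 4 + 5?", answer="9").with_inputs("question"),
    dspy.Example(question="What is 12 - 3?", answer="9").with_inputs("question"),
]

# A teleprompter "compiles" the program by bootstrapping few-shot
# demonstrations against a metric, replacing hand-written prompts.
teleprompter = BootstrapFewShot(
    metric=lambda example, pred, trace=None: example.answer == pred.answer
)
compiled_qa = teleprompter.compile(CoTQA(), trainset=trainset)
print(compiled_qa(question="What is 7 * 8?").answer)
```

And a sketch of TextGrad's textual "backpropagation" loop, following the pattern in the paper's examples; the engine name, question, and evaluation instruction are assumptions, and class names may differ in the released textgrad package.

```python
# Sketch of TextGrad's optimize-via-textual-feedback loop
# (engine name, question, and instruction are illustrative placeholders).
import textgrad as tg

tg.set_backward_engine("gpt-4o")  # LLM that writes the textual "gradients"

model = tg.BlackboxLLM("gpt-4o")
question = tg.Variable(
    "If 3 workers build a wall in 6 hours, how long do 9 workers take?",
    role_description="question to the LLM",
    requires_grad=False,
)

answer = model(question)  # forward pass builds the computation graph
answer.set_role_description("concise and correct answer to the question")

# The "loss" is itself natural language: an LLM critiques the answer.
loss_fn = tg.TextLoss("Critically evaluate this answer for correctness and clarity.")
loss = loss_fn(answer)

loss.backward()                          # propagate textual feedback backward
optimizer = tg.TGD(parameters=[answer])  # Textual Gradient Descent
optimizer.step()                         # rewrite the answer using the feedback
print(answer.value)
```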


Sources:

October 5, 2023

DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines

https://arxiv.org/pdf/2310.03714


June 11, 2024

TextGrad: Automatic “Differentiation” via Text

https://arxiv.org/pdf/2406.07496


By mcgrof