Advanced Generative AI
- Speaker: Thomas Winters
- Type: Talk
- Date: 2024-04-17
- Location: KU Leuven KULAK
These days, we are surrounded by creative text and image generators like GPT and diffusion models that seem to be able to generate anything we want. But how do we ensure that these types of AI truly aid us in overcoming our unique challenges? This talk sheds light on several techniques for controlling such generative models. We look at several powerful prompt engineering techniques – the art of enhancing our communication with AI – as well as useful ways of connecting these generators to other systems.
We dive into the world of autoregressive text generators, learning their inner mechanisms and the training phases they went through to become the current state-of-the-art text generators. These insights help explain why certain prompt engineering techniques (such as few-shot prompting, role-prompting and chain-of-thought prompting) are able to outperform simpler prompting methods.
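To make these techniques concrete, here is a minimal sketch of how such a prompt could be assembled. The role text, the worked example and the `build_prompt` helper are all illustrative assumptions, not material from the talk; a real application would send the resulting string to an LLM API.

```python
# Hypothetical sketch combining three prompt engineering techniques:
# role-prompting, few-shot prompting and chain-of-thought prompting.
# The example content is invented for illustration.

def build_prompt(question: str) -> str:
    """Assemble a prompt that layers the three techniques."""
    # Role-prompting: assign the model a persona suited to the task
    role = "You are a careful math tutor."
    # Few-shot prompting: demonstrate the desired behavior with an example
    examples = (
        "Q: A pen costs 2 euros and a notebook costs 3 euros more. "
        "What do both cost together?\n"
        "A: The notebook costs 2 + 3 = 5 euros. Together: 2 + 5 = 7 euros.\n"
    )
    # Chain-of-thought prompting: nudge the model to reason before answering
    cot = "Let's think step by step."
    return f"{role}\n\n{examples}\nQ: {question}\nA: {cot}"

prompt = build_prompt("A book costs 4 euros and a bag costs twice as much. Total?")
```

The assembled string would then be passed as the user message to a text generator.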
We also briefly look at several other techniques for overcoming the limitations of such models, such as retrieval-augmented generation and function calling. Similarly, we uncover the workings of diffusion models and show several techniques for gaining more control over the generated images. We show how even some of AI's classic hard problems, such as humor generation, come further within reach thanks to these large language models and their prompt engineering techniques.
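As a rough illustration of retrieval-augmented generation, the sketch below retrieves the most relevant document for a question and grounds the prompt in it. The corpus, the word-overlap scoring and the helper names are toy assumptions; a production system would use embedding-based search and then send the prompt to an LLM.

```python
# A minimal, illustrative RAG sketch: retrieve a relevant document, then
# build a prompt that asks the model to answer from that context only.
# The corpus and the overlap heuristic are placeholders for a real
# vector-search pipeline.

CORPUS = [
    "KU Leuven is a university in Belgium, founded in 1425.",
    "Diffusion models generate images by iteratively denoising random noise.",
    "Autoregressive language models predict text one token at a time.",
]

def retrieve(question: str) -> str:
    """Return the corpus document sharing the most words with the question."""
    q_words = set(question.lower().split())
    return max(CORPUS, key=lambda doc: len(q_words & set(doc.lower().split())))

def build_rag_prompt(question: str) -> str:
    """Prepend the retrieved context so the model's answer stays grounded."""
    context = retrieve(question)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_rag_prompt("How do diffusion models generate images?")
```

Function calling follows a similar pattern: instead of retrieved text, the model is given tool descriptions and may respond with a structured call that the surrounding system executes.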
Thomas Winters is a postdoctoral researcher in creative artificial intelligence at KU Leuven. In his research, he develops neuro-symbolic AI models that aid in co-creative humor writing tasks.
- Links:
- Notebooks: Jupyter Notebooks for some practical exercises.
- GPT-4 platform
- ChatGPT
- Stable Diffusion on Huggingface
- OpenArt Prompt Book: learn to prompt with Stable Diffusion