Lab 5: AI Prototype
What the heck are interrupt routines??
Learning Goals
The goal of this prototype is to experiment with using an LLM as a coding assistant. By the end of this experiment, you should be able to:
- Demonstrate how an LLM can be used as a coding partner.
- Analyze the quality of the LLM-generated code.
- Document specific tips on how to use the LLM effectively.
Prototype
Fire up your favorite LLM. ChatGPT is a good place to start, but you might also try others such as Claude or Gemini. Enter the prompt below.
Write me interrupt handlers to interface with a quadrature encoder. I’m using the STM32L432KC, what pins should I connect the encoder to in order to allow it to easily trigger the interrupts?
Once the LLM responds, drop its code in as a replacement for what you wrote. Does it run? If so, how does it compare with your original design? If not, see if you can prompt the LLM to help you debug it.
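To give a sense of the shape of answer you might get back, here is a rough sketch of an EXTI-based handler written against the STM32Cube HAL. The pin assignments (encoder channel A on PA0, channel B on PA1), the use of the HAL, and the helper name encoder_gpio_init are all assumptions for illustration; the LLM's answer, and your own register-level setup, may look quite different.

```c
#include "stm32l4xx_hal.h"

/* Hypothetical wiring: encoder channel A on PA0 (EXTI line 0), channel B on PA1. */
#define ENC_A_PIN  GPIO_PIN_0
#define ENC_B_PIN  GPIO_PIN_1
#define ENC_PORT   GPIOA

static volatile int32_t encoder_count = 0;   /* signed position, updated in the ISR */

/* Configure channel A to interrupt on both edges and channel B as a plain input. */
static void encoder_gpio_init(void)
{
    GPIO_InitTypeDef init = {0};

    __HAL_RCC_GPIOA_CLK_ENABLE();

    init.Pin  = ENC_A_PIN;
    init.Mode = GPIO_MODE_IT_RISING_FALLING;   /* 2x decoding: count every A edge */
    init.Pull = GPIO_NOPULL;
    HAL_GPIO_Init(ENC_PORT, &init);

    init.Pin  = ENC_B_PIN;
    init.Mode = GPIO_MODE_INPUT;               /* B is sampled inside the ISR */
    HAL_GPIO_Init(ENC_PORT, &init);

    HAL_NVIC_SetPriority(EXTI0_IRQn, 2, 0);
    HAL_NVIC_EnableIRQ(EXTI0_IRQn);
}

/* Vector-table entry for EXTI line 0 (PA0); the HAL helper clears the pending flag. */
void EXTI0_IRQHandler(void)
{
    HAL_GPIO_EXTI_IRQHandler(ENC_A_PIN);
}

/* The HAL calls this for any EXTI line once the pending bit has been cleared. */
void HAL_GPIO_EXTI_Callback(uint16_t GPIO_Pin)
{
    if (GPIO_Pin == ENC_A_PIN) {
        GPIO_PinState a = HAL_GPIO_ReadPin(ENC_PORT, ENC_A_PIN);
        GPIO_PinState b = HAL_GPIO_ReadPin(ENC_PORT, ENC_B_PIN);
        /* At an A edge, whether A and B match tells you the direction;
           which sign counts as "forward" depends on how the encoder is wired. */
        encoder_count += (a != b) ? 1 : -1;
    }
}
```

Whatever the LLM produces, check that it actually clears the interrupt flag, handles noisy edges, and keeps the shared count in a volatile variable; these are easy details to miss.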
Reflect
Write up a few paragraphs reflecting on your experience using the LLM to help you code. Feel free to make full use of screenshots, code snippets, and other media as you write your reflections.
Here are a few ideas for what you might comment on:
- How would you rate the quality of the output and why?
- Did the LLM generate any code for you? If so, how does it compare to your setup?
- How does the LLM’s explanation compare to your reasoning for choosing your timer? (A sketch of the timer-based approach follows this list.)
- Does the LLM work well as a sounding board, rather than as the code generator it was in previous prototypes?
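For the timer question above, recall that the STM32L432KC's general-purpose timers (TIM1 and TIM2) can decode quadrature in hardware, with no interrupts at all. Purely as a hedged point of comparison, and again assuming the STM32Cube HAL and the same hypothetical PA0/PA1 wiring, that route might look roughly like this; a register-level version would set the same bits in TIMx_SMCR and TIMx_CCMR1 directly.

```c
#include "stm32l4xx_hal.h"

TIM_HandleTypeDef htim2;

/* Route PA0/PA1 to TIM2_CH1/CH2 (AF1 on the STM32L432KC) and start encoder mode. */
static void encoder_timer_init(void)
{
    GPIO_InitTypeDef gpio = {0};
    TIM_Encoder_InitTypeDef enc = {0};

    __HAL_RCC_GPIOA_CLK_ENABLE();
    __HAL_RCC_TIM2_CLK_ENABLE();

    gpio.Pin       = GPIO_PIN_0 | GPIO_PIN_1;    /* PA0 = TIM2_CH1, PA1 = TIM2_CH2 */
    gpio.Mode      = GPIO_MODE_AF_PP;
    gpio.Pull      = GPIO_NOPULL;
    gpio.Alternate = GPIO_AF1_TIM2;
    HAL_GPIO_Init(GPIOA, &gpio);

    htim2.Instance           = TIM2;
    htim2.Init.Prescaler     = 0;
    htim2.Init.CounterMode   = TIM_COUNTERMODE_UP;
    htim2.Init.Period        = 0xFFFFFFFF;        /* TIM2 is a 32-bit counter on the L4 */
    htim2.Init.ClockDivision = TIM_CLOCKDIVISION_DIV1;

    enc.EncoderMode  = TIM_ENCODERMODE_TI12;      /* count edges on both channels (4x) */
    enc.IC1Polarity  = TIM_ICPOLARITY_RISING;
    enc.IC1Selection = TIM_ICSELECTION_DIRECTTI;
    enc.IC1Prescaler = TIM_ICPSC_DIV1;
    enc.IC1Filter    = 4;                         /* modest digital input filtering */
    enc.IC2Polarity  = TIM_ICPOLARITY_RISING;
    enc.IC2Selection = TIM_ICSELECTION_DIRECTTI;
    enc.IC2Prescaler = TIM_ICPSC_DIV1;
    enc.IC2Filter    = 4;

    HAL_TIM_Encoder_Init(&htim2, &enc);
    HAL_TIM_Encoder_Start(&htim2, TIM_CHANNEL_ALL);
}
```

Once the encoder interface is running, the position is read straight from TIM2->CNT, which is a useful baseline when judging whether the LLM's interrupt-driven version is worth the extra complexity.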