I've been looking for something like this! Does it optimize the prompt template for LangChain only, or is there a way I can get it to generate a raw system prompt that I can pass to the OpenAI API directly?
Hello, I'm glad you find it useful. I aimed to create something that serves a real purpose. If you can provide me with details about the use case you are trying to solve, I may add a feature to llmdantic to support it. Right now:
After initializing llmdantic, you can get the prompt by running the following code:
"""
from llmdantic import LLMdantic, LLMdanticConfig