Getting data out of an LLM anywhere outside a raw chat interface is like scooping water with a leaky bucket. You keep plugging more holes and adding extra buckets to catch the spills. Structured JSON output helps in many cases, but constraining generation to a schema tends to make the model less capable at the text itself.
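As a rough illustration of the "extra buckets," here is a minimal sketch of the parse-and-retry loop most people end up writing around JSON output. The `call_llm` function is a stand-in for whatever client you actually use; the fence-stripping and retry logic are the assumed patches, not a specific library's API.

```python
import json

def call_llm(prompt: str) -> str:
    """Placeholder: swap in your actual client (OpenAI, Anthropic, a local model, ...)."""
    raise NotImplementedError

def extract_json(prompt: str, max_retries: int = 2) -> dict:
    """Ask for JSON, then patch the most common leaks: stray code fences and invalid JSON."""
    for _ in range(max_retries + 1):
        raw = call_llm(prompt + "\n\nRespond with a single JSON object and nothing else.")
        # Models often wrap output in markdown fences despite instructions; strip them.
        cleaned = (
            raw.strip()
            .removeprefix("```json")
            .removeprefix("```")
            .removesuffix("```")
            .strip()
        )
        try:
            return json.loads(cleaned)
        except json.JSONDecodeError:
            continue  # another bucket under the leak: ask again
    raise ValueError("Model never produced parseable JSON")
```

Each `except` branch here is one of the holes being plugged; real pipelines usually grow several more (schema validation, field coercion, re-prompting with the error message).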