OpenAI API Batch Processor
A FastAPI application for automating bulk question processing with OpenAI models, designed for data collection and analysis at scale. The tool reads input questions from CSV files, sends each one to the selected model (e.g., GPT-4), and saves the responses along with detailed metadata, including token usage and a per-request cost breakdown. Robust error handling keeps a batch running even when individual requests fail, fees are calculated automatically from token consumption, and all results are persisted back to CSV.

The application was built with researchers in mind: it is well suited to examining how large language models respond to subtle changes in input, such as variations in question phrasing or shifts in self-reported political affiliation, when studying effects on persona generation or output tone. API keys are managed securely through environment-based configuration, and a simple REST endpoint triggers each batch run. This makes it useful for research workflows, survey automation, customer-support prototyping, or any high-volume use of OpenAI models where full transparency over process and cost matters.
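As a rough illustration, a batch endpoint of this kind might look like the sketch below. The route name (/process), the CSV column (question), and the per-token rates are assumptions made for the example, not the project's actual identifiers or prices.

```python
import csv

from fastapi import FastAPI
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical per-1K-token rates (USD); real pricing depends on the model and date.
PRICING = {"gpt-4": {"prompt": 0.03, "completion": 0.06}}


@app.post("/process")
def process_batch(input_csv: str = "questions.csv",
                  output_csv: str = "responses.csv",
                  model: str = "gpt-4"):
    """Read questions from a CSV file, query the model, and persist results with metadata."""
    rows = []
    with open(input_csv, newline="", encoding="utf-8") as f:
        for record in csv.DictReader(f):  # assumes a 'question' column in the input CSV
            try:
                resp = client.chat.completions.create(
                    model=model,
                    messages=[{"role": "user", "content": record["question"]}],
                )
                usage = resp.usage
                rates = PRICING.get(model, {"prompt": 0.0, "completion": 0.0})
                cost = (usage.prompt_tokens * rates["prompt"]
                        + usage.completion_tokens * rates["completion"]) / 1000
                rows.append({
                    "question": record["question"],
                    "answer": resp.choices[0].message.content,
                    "prompt_tokens": usage.prompt_tokens,
                    "completion_tokens": usage.completion_tokens,
                    "cost_usd": round(cost, 6),
                    "error": "",
                })
            except Exception as exc:  # keep the batch running when a single request fails
                rows.append({"question": record["question"], "answer": "",
                             "prompt_tokens": 0, "completion_tokens": 0,
                             "cost_usd": 0.0, "error": str(exc)})

    with open(output_csv, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys() if rows else ["question"])
        writer.writeheader()
        writer.writerows(rows)
    return {"processed": len(rows), "output": output_csv}
```

With an endpoint shaped like this, a batch run could be triggered with a single request, for example:

```bash
curl -X POST "http://localhost:8000/process?input_csv=questions.csv&model=gpt-4"
```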