In the first two parts of this series, we laid the foundation for our weather forecast API and explored how to handle forecasts and historical revisions efficiently. In this part, we'll focus on optimizing our Pydantic models to make the API documentation more accessible to AI tools, such as those that scrape OpenAPI specs for training or dynamic usage.
The Problem 🧠
While your API might work perfectly for humans, AI tools often need more context and structure to understand it effectively. For example:
- Missing or vague field descriptions can confuse AI.
- Models without examples or metadata make it harder for AI to generate relevant queries.
- Unclear relationships between endpoints, models, and responses can lead to poor insights.
By optimizing Pydantic models, we can enhance the OpenAPI spec generated by FastAPI, ensuring that AI tools like ChatGPT, Postman, or custom integrations can make better use of your API.
Step 1: Enhancing Pydantic Models
Add Field Descriptions
Pydantic allows you to use the `Field` class to add descriptions and constraints for each field.
Original Model:
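A bare model like this works, but gives AI tools nothing beyond names and types to go on (the exact fields of the series' model aren't shown here, so these are illustrative assumptions):

```python
from datetime import date

from pydantic import BaseModel


class ForecastCreate(BaseModel):
    # No descriptions or constraints -- AI tools see only names and types
    forecast_date: date
    temperature: float
    conditions: str
```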
Enhanced Model:
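The same model, enriched with `Field` descriptions and constraints (the field names, units, and ranges are illustrative assumptions, not the series' exact schema):

```python
from datetime import date

from pydantic import BaseModel, Field


class ForecastCreate(BaseModel):
    forecast_date: date = Field(
        ..., description="Calendar date the forecast applies to (YYYY-MM-DD)."
    )
    temperature: float = Field(
        ...,
        description="Forecast air temperature in degrees Celsius.",
        ge=-90,
        le=60,
    )
    conditions: str = Field(
        ..., description="Short weather summary, e.g. 'light snow' or 'partly cloudy'."
    )
```

The `ge`/`le` bounds do double duty: they validate incoming data and they show up in the OpenAPI spec, so an AI tool knows the plausible range without guessing.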
Use `Config` for Model Metadata
Pydantic models can include a `Config` class to add model-level metadata.
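A sketch using Pydantic v1's `Config` class (in Pydantic v2 the same metadata moves to `model_config` with `json_schema_extra`); the model fields are illustrative:

```python
from datetime import date

from pydantic import BaseModel


class ForecastCreate(BaseModel):
    forecast_date: date
    temperature: float
    conditions: str

    class Config:
        # Human-readable title shown in the generated OpenAPI spec
        title = "Forecast Creation Request"
        # A whole-model example merged into the generated JSON schema
        schema_extra = {
            "example": {
                "forecast_date": "2025-01-15",
                "temperature": 3.5,
                "conditions": "light snow",
            }
        }
```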
Step 2: Enhancing the OpenAPI Spec
FastAPI automatically generates an OpenAPI spec from your models and endpoints. You can enhance it further by:
Adding Tags and Summaries to Endpoints:
```python
@router.post("/", response_model=ForecastCreate, tags=["Forecasts"], summary="Create a new forecast")
async def create_forecast(
    forecast_data: ForecastCreate,
    db: AsyncSession = Depends(get_db),
):
    """
    Create a new weather forecast or update an existing forecast.
    """
    return await upsert_forecast_data(db, forecast_data)
```
Customizing Responses:
```python
from fastapi.responses import JSONResponse


@router.get("/current", response_model=list[ForecastResponse], tags=["Forecasts"], summary="Get current forecasts")
async def get_current_forecasts(days: int, db: AsyncSession = Depends(get_db)):
    """
    Retrieve the latest forecasts for the next specified number of days.
    """
    # Fetch the forecasts before building the response (helper name assumed)
    forecasts = await get_forecast_data(db, days)
    return JSONResponse(
        content={"message": "Success", "data": forecasts},
        status_code=200,
    )
```
Step 3: Adding Examples
AI tools benefit from clear examples in your OpenAPI spec. Add examples using the `example` parameter in Pydantic or through FastAPI response annotations.
Adding Field-Level Examples
In Pydantic models, use the `example` parameter in `Field` to provide example values.
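For instance (the field names are assumed; note that `Field(example=...)` is the Pydantic v1 spelling, while v2 prefers `examples=[...]`):

```python
from datetime import date

from pydantic import BaseModel, Field


class ForecastCreate(BaseModel):
    # Per-field examples surface in the OpenAPI schema for each property
    forecast_date: date = Field(..., example="2025-01-15")
    temperature: float = Field(..., example=3.5)
    conditions: str = Field(..., example="light snow")
```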
Adding Endpoint-Level Examples
You can include examples in the OpenAPI spec for endpoints using FastAPI's `example` argument in body parameters or response models.
Example:
Step 4: Testing the AI-Friendliness
Generate OpenAPI Spec
Run your FastAPI app and visit the `/docs` endpoint to view the updated documentation:
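For example, assuming your FastAPI instance is named `app` in `main.py`:

```shell
# Start the development server with auto-reload
uvicorn main:app --reload
```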
Visit `http://127.0.0.1:8000/docs` in your browser to confirm that descriptions, examples, and metadata are showing correctly.
Export OpenAPI JSON
You can also export the OpenAPI spec as a JSON file for external use:
Test with AI Tools
Use AI tools like ChatGPT or Postman to interact with your API. For example:
- Upload the OpenAPI spec to Postman for automated testing.
- Use the OpenAPI JSON with AI agents to simulate integration.
What’s Next? ⏩
In the final part of this series, we’ll:
- Deploy the API using Docker for containerized environments.
- Explore monitoring and logging strategies to ensure stability.
Stay tuned! 🌟