In the first two parts of this series, we laid the foundation for our weather forecast API and explored how to handle forecasts and their historical revisions efficiently. In this part, we’ll focus on optimizing our Pydantic models to make the API documentation more accessible to AI tools, such as those that scrape OpenAPI specs for training or dynamic use.


The Problem 🧠

While your API might work perfectly for humans, AI tools often need more context and structure to understand it effectively. For example:

  1. Missing or vague field descriptions can confuse AI tools.
  2. Models without examples or metadata make it harder for AI to generate relevant queries.
  3. Unclear relationships between endpoints, models, and responses can lead to incorrect assumptions about how the API works.

By optimizing Pydantic models, we can enhance the OpenAPI spec generated by FastAPI, ensuring that AI tools like ChatGPT, Postman, or custom integrations can make better use of your API.


Step 1: Enhancing Pydantic Models

Add Field Descriptions

Pydantic’s Field function lets you attach a description and validation constraints to each field.

Original Model:

from pydantic import BaseModel
from datetime import datetime

class ForecastCreate(BaseModel):
    start_time: datetime
    end_time: datetime
    temperature: int
    temperature_unit: str
    relative_humidity: int
    wind_speed: str
    wind_direction: str
    short_forecast: str
    icon: str

Enhanced Model:

from pydantic import BaseModel, Field
from datetime import datetime

class ForecastCreate(BaseModel):
    start_time: datetime = Field(..., description="The start time of the forecast period in ISO 8601 format (UTC).")
    end_time: datetime = Field(..., description="The end time of the forecast period in ISO 8601 format (UTC).")
    temperature: int = Field(..., description="The forecasted temperature.", example=72)
    temperature_unit: str = Field(..., description="The unit for temperature (e.g., Fahrenheit, Celsius).", example="Fahrenheit")
    relative_humidity: int = Field(..., description="The relative humidity as a percentage.", ge=0, le=100, example=45)
    wind_speed: str = Field(..., description="The wind speed (e.g., '15 mph').", example="15 mph")
    wind_direction: str = Field(..., description="The wind direction (e.g., 'NE').", example="NE")
    short_forecast: str = Field(..., description="A brief summary of the forecast.", example="Partly Cloudy")
    icon: str = Field(..., description="A URL to an icon representing the forecast.", example="https://example.com/icons/partly-cloudy.png")
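
To confirm that these descriptions and examples actually make it into the generated schema, you can print the model’s JSON Schema directly. Below is a quick sketch; the import path is a placeholder for wherever your schemas live, and it assumes the Pydantic v1 API used above (on Pydantic v2, call model_json_schema() instead).

from app.schemas import ForecastCreate  # placeholder path; adjust to your project

# Pydantic v1: dump the JSON Schema that FastAPI embeds in the OpenAPI spec.
# On Pydantic v2, use ForecastCreate.model_json_schema() instead.
print(ForecastCreate.schema_json(indent=2))

# Each property should now include its "description" (and "example"), e.g.:
# "temperature": {"title": "Temperature", "type": "integer",
#                 "description": "The forecasted temperature.", "example": 72}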

Use Config for Model Metadata

Pydantic models can include an inner Config class (Pydantic v1) to add model-level metadata such as a schema title and description.

class ForecastCreate(BaseModel):
    start_time: datetime = Field(..., description="The start time of the forecast period in ISO 8601 format (UTC).")
    end_time: datetime = Field(..., description="The end time of the forecast period in ISO 8601 format (UTC).")
    # other fields...

    class Config:
        title = "Forecast Creation Model"
        # A bare `description` attribute on Config is ignored by Pydantic v1;
        # put it in schema_extra (or the model docstring) so it reaches the schema.
        schema_extra = {
            "description": (
                "This model is used to create or update weather forecasts. "
                "It includes all necessary fields for defining a forecast."
            )
        }
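
If your project is on Pydantic v2, the inner Config class is replaced by model_config. Here is a minimal sketch of the equivalent declaration (field definitions abbreviated):

from datetime import datetime

from pydantic import BaseModel, ConfigDict, Field

class ForecastCreate(BaseModel):
    # Pydantic v2 equivalent of the Config class above.
    model_config = ConfigDict(
        title="Forecast Creation Model",
        json_schema_extra={
            "description": (
                "This model is used to create or update weather forecasts. "
                "It includes all necessary fields for defining a forecast."
            )
        },
    )

    start_time: datetime = Field(..., description="The start time of the forecast period in ISO 8601 format (UTC).")
    end_time: datetime = Field(..., description="The end time of the forecast period in ISO 8601 format (UTC).")
    # other fields...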

Step 2: Enhancing the OpenAPI Spec

FastAPI automatically generates an OpenAPI spec from your models and endpoints. You can enhance it further by:

  1. Adding Tags and Summaries to Endpoints:

    @router.post("/", response_model=ForecastCreate, tags=["Forecasts"], summary="Create a new forecast")
    async def create_forecast(
        forecast_data: ForecastCreate, db: AsyncSession = Depends(get_db)
    ):
        """
        Create a new weather forecast or update an existing forecast.
        """
        return await upsert_forecast_data(db, forecast_data)
    
  2. Customizing Responses:

    from fastapi import Depends
    from fastapi.encoders import jsonable_encoder
    from fastapi.responses import JSONResponse

    @router.get("/current", response_model=list[ForecastResponse], tags=["Forecasts"], summary="Get current forecasts")
    async def get_current_forecasts(days: int, db: AsyncSession = Depends(get_db)):
        """
        Retrieve the latest forecasts for the next specified number of days.
        """
        forecasts = await get_latest_forecasts(db, days)  # replace with your query helper from Part 2
        # Returning a JSONResponse bypasses response_model serialization,
        # so encode the ORM objects manually.
        return JSONResponse(
            content={"message": "Success", "data": jsonable_encoder(forecasts)},
            status_code=200,
        )
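
Beyond per-endpoint tags and summaries, the tag names themselves can be described at the application level, which gives AI tools a short explanation of each group of endpoints. A minimal sketch, assuming the FastAPI app object set up in the earlier parts of the series (the title and description strings are placeholders):

from fastapi import FastAPI

# Placeholder metadata; adjust to match how the app is created in earlier parts.
tags_metadata = [
    {
        "name": "Forecasts",
        "description": "Create, update, and query weather forecasts and their revisions.",
    },
]

app = FastAPI(
    title="Weather Forecast API",
    description="API for managing weather forecasts and their historical revisions.",
    version="1.0.0",
    openapi_tags=tags_metadata,
)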
    

Step 3: Adding Examples

AI tools benefit from clear examples in your OpenAPI spec. You can add them at the field level with Pydantic’s example parameter or at the endpoint level through FastAPI’s body and response declarations.

Adding Field-Level Examples

In Pydantic models, use the example parameter in Field to provide example values.

from pydantic import BaseModel, Field

class ForecastCreate(BaseModel):
    temperature: int = Field(..., description="Temperature in Fahrenheit", example=72)
    wind_speed: str = Field(..., description="Wind speed (e.g., '15 mph')", example="15 mph")

Adding Endpoint-Level Examples

You can include endpoint-level examples by declaring the request body with FastAPI’s Body parameter and passing openapi_examples (or the older example argument); response examples can be added through the responses argument on the route decorator.

Example:

from fastapi import APIRouter, Body

example_forecast = {
    "summary": "Example Forecast",
    "value": {
        "temperature": 72,
        "wind_speed": "15 mph",
        "short_forecast": "Partly Cloudy",
    },
}

router = APIRouter()

@router.post("/forecast", response_model=ForecastCreate, tags=["Forecasts"], summary="Create a forecast")
async def create_forecast(
    # `openapi_examples` is available in recent FastAPI releases; older versions
    # used the `example=` argument on Body instead.
    forecast_data: ForecastCreate = Body(..., openapi_examples={"partly_cloudy": example_forecast}),
):
    """
    Create a forecast entry based on user input.
    """
    return forecast_data

Step 4: Testing the AI-Friendliness

Generate OpenAPI Spec

Run your FastAPI app and visit the /docs endpoint to view the updated documentation:

uvicorn app.main:app --reload

Visit http://127.0.0.1:8000/docs in your browser to confirm that descriptions, examples, and metadata appear correctly.

Export OpenAPI JSON

You can also export the OpenAPI spec as a JSON file for external use:

curl http://127.0.0.1:8000/openapi.json -o openapi.json
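
If you prefer not to hit a running server, the same document can be dumped straight from the application object. A quick sketch, assuming the app is importable from app.main as in the earlier parts (adjust the import to your project):

import json

from app.main import app  # adjust to wherever your FastAPI app is defined

# app.openapi() returns the same document that is served at /openapi.json.
with open("openapi.json", "w") as f:
    json.dump(app.openapi(), f, indent=2)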

Test with AI Tools

Use AI tools like ChatGPT or Postman to interact with your API. For example:

  • Upload the OpenAPI spec to Postman for automated testing.
  • Use the OpenAPI JSON with AI agents to simulate integration.
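
Before handing the spec to an AI tool, a small script can also sanity-check it for the metadata AI relies on most. A minimal sketch, assuming the spec was exported to openapi.json as shown above (the file name and checks are illustrative):

import json

# Load the exported spec and flag schema properties that are missing the
# "description" or "example" keys that AI tools rely on.
with open("openapi.json") as f:
    spec = json.load(f)

for name, schema in spec.get("components", {}).get("schemas", {}).items():
    for prop, details in schema.get("properties", {}).items():
        missing = [key for key in ("description", "example") if key not in details]
        if missing:
            print(f"{name}.{prop}: missing {', '.join(missing)}")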

What’s Next?

In the final part of this series, we’ll:

  1. Deploy the API using Docker for containerized environments.
  2. Explore monitoring and logging strategies to ensure stability.

Stay tuned! 🌟