Python Conversation History
This guide continues from Python Getting Started and shows how to record conversation history and expose APIs for frontend applications.
Prerequisites
Starting checkpoint: View the code from the previous section at python/examples/langchain/doc-checkpoints/02-with-memory
Make sure you’ve completed the Python Getting Started guide first. You should have:
- A working LangChain agent with Memory Service checkpointing
- Memory Service running via Docker Compose
- OIDC authentication configured
Also complete Step 2 in Python Dev Setup (build the local memory-service-langchain wheel and set UV_FIND_LINKS); this step is temporary until the package is published.
Enable Conversation History Recording
In the previous guide you added conversation memory, but frontend conversation views read from the history channel, and nothing writes entries to it yet.
To enable that, wire MemoryServiceHistoryMiddleware into the agent and bind the conversation id into the request context:
```python
checkpointer = MemoryServiceCheckpointSaver.from_env()
history_middleware = MemoryServiceHistoryMiddleware.from_env()

agent = create_agent(
    model=model,
    tools=[],
    checkpointer=checkpointer,
    middleware=[history_middleware],
    system_prompt="You are a Python memory-service demo agent.",
)

app = FastAPI(title="Python LangChain Agent With Conversation History")
```

What changed: MemoryServiceHistoryMiddleware.from_env() is constructed and passed into create_agent(middleware=[history_middleware]). The chat endpoint wraps the agent.invoke(...) call in with memory_service_scope(conversation_id): instead of only setting thread_id in configurable.

Why: The middleware intercepts the agent's model call, writing a USER entry to the history channel before the call and an AI entry after it. memory_service_scope(conversation_id) sets a context variable that the middleware reads to know which conversation to write to; without this scope, the middleware has nowhere to record the entries. The checkpointer continues to store LangGraph state in the context channel, unchanged.
Make sure you have a shell function that fetches a bearer token for the bob user:
```shell
function get-token() {
  curl -sSfX POST http://localhost:8081/realms/memory-service/protocol/openid-connect/token \
    -H "Content-Type: application/x-www-form-urlencoded" \
    -d "client_id=memory-service-client" \
    -d "client_secret=change-me" \
    -d "grant_type=password" \
    -d "username=bob" \
    -d "password=bob" \
    | jq -r '.access_token'
}
```

Now test it again:
```shell
curl -NsSfX POST http://localhost:9090/chat/0ef33d7b-11b1-4992-9785-681e222dbcd2 \
  -H "Content-Type: text/plain" \
  -H "Authorization: Bearer $(get-token)" \
  -d "Give me a random number between 1 and 100."
```

Example output:

```
42
```

Expose Conversation Entries API
Checkpoint 03 exposes the same conversation endpoints as the Quarkus checkpoint:
- GET /v1/conversations/{conversation_id}
- GET /v1/conversations/{conversation_id}/entries
The entries endpoint forces channel=history so frontend clients always receive recorded conversation turns.
```python
@app.get("/v1/conversations/{conversation_id}")
async def get_conversation(conversation_id: str):
    response = await proxy.get_conversation(conversation_id)
    return to_fastapi_response(response)


@app.get("/v1/conversations/{conversation_id}/entries")
async def get_entries(conversation_id: str, request: Request):
    response = await proxy.list_conversation_entries(
        conversation_id,
        after_cursor=request.query_params.get("afterCursor"),
        limit=int(limit) if (limit := request.query_params.get("limit")) is not None else None,
        channel="history",
    )
    return to_fastapi_response(response)
```

What changed: proxy.list_conversation_entries(...) is called with the hardcoded channel="history" parameter. The get_conversation endpoint delegates entirely to the proxy without modification.
Why: The Memory Service stores two kinds of data per conversation: raw LangGraph checkpoint state in the context channel and human-readable turns in the history channel. Forcing channel="history" here ensures frontend clients always receive the readable turn-by-turn view and never accidentally see internal LangGraph state blobs. The MemoryServiceProxy forwards the user's Bearer token automatically, so no manual header construction is needed.
Test it with curl:
```shell
curl -sSfX GET http://localhost:9090/v1/conversations/0ef33d7b-11b1-4992-9785-681e222dbcd2 \
  -H "Authorization: Bearer $(get-token)" | jq
```

Example output:

```json
{
  "id": "0ef33d7b-11b1-4992-9785-681e222dbcd2",
  "ownerUserId": "bob",
  "accessLevel": "owner",
  "title": "Give me a random number between 1 and 100."
}
```

```shell
curl -sSfX GET http://localhost:9090/v1/conversations/0ef33d7b-11b1-4992-9785-681e222dbcd2/entries \
  -H "Authorization: Bearer $(get-token)" | jq
```

Example output:
```json
{
  "afterCursor": null,
  "data": [
    {
      "id": "bedf7599-4b34-4310-83de-d70e95922774",
      "conversationId": "0ef33d7b-11b1-4992-9785-681e222dbcd2",
      "userId": "bob",
      "clientId": "checkpoint-agent",
      "channel": "history",
      "contentType": "history",
      "createdAt": "2026-03-06T14:58:32.349103Z",
      "content": [
        {
          "role": "USER",
          "text": "Give me a random number between 1 and 100."
        }
      ]
    },
    {
      "id": "3bfad611-ef90-40b5-a5e3-0ea20397e416",
      "conversationId": "0ef33d7b-11b1-4992-9785-681e222dbcd2",
      "userId": "bob",
      "clientId": "checkpoint-agent",
      "channel": "history",
      "contentType": "history",
      "createdAt": "2026-03-06T14:58:32.42779Z",
      "content": [
        {
          "role": "AI",
          "text": "The random number between 1 and 100 is 42."
        }
      ]
    }
  ]
}
```

Expose Conversation Listing API
To let users see all conversations they can access, expose GET /v1/conversations:
```python
@app.get("/v1/conversations")
async def list_conversations(request: Request):
    response = await proxy.list_conversations(
        mode=request.query_params.get("mode"),
        after_cursor=request.query_params.get("afterCursor"),
        limit=int(limit) if (limit := request.query_params.get("limit")) is not None else None,
        query=request.query_params.get("query"),
    )
    return to_fastapi_response(response)
```

What changed: A new GET /v1/conversations endpoint is added that calls proxy.list_conversations(...).

Why: The endpoint passes the mode, afterCursor, limit, and query parameters through to the Memory Service unchanged, so the frontend can support pagination and keyword filtering without the agent app needing to understand those semantics. The proxy forwards the Bearer token automatically, keeping the endpoint free of authentication logic.
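On the client side, the pass-through afterCursor makes page iteration straightforward. Here is a sketch of the cursor loop, written against an injectable fetch_page callable so the cursor handling is easy to test; in a real client, fetch_page would issue GET /v1/conversations with httpx or requests:

```python
def iter_conversations(fetch_page):
    """Yield every item, following afterCursor until the server returns null."""
    cursor = None
    while True:
        page = fetch_page(cursor)  # e.g. GET /v1/conversations?afterCursor=...
        yield from page["data"]
        cursor = page.get("afterCursor")
        if cursor is None:
            break

# Fake two-page backend to show the cursor handling:
pages = {
    None: {"data": [{"id": "a"}, {"id": "b"}], "afterCursor": "next"},
    "next": {"data": [{"id": "c"}], "afterCursor": None},
}
ids = [c["id"] for c in iter_conversations(lambda cur: pages[cur])]
print(ids)  # ['a', 'b', 'c']
```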
Test it with curl:
```shell
curl -sSfX GET http://localhost:9090/v1/conversations \
  -H "Authorization: Bearer $(get-token)" | jq
```

Example output:
```json
{
  "afterCursor": null,
  "data": [
    {
      "id": "0ef33d7b-11b1-4992-9785-681e222dbcd2",
      "title": "Give me a random number between 1 and 100.",
      "ownerUserId": "bob",
      "metadata": {},
      "createdAt": "2026-03-06T14:58:32.343885Z",
      "updatedAt": "2026-03-06T14:58:32.458576Z",
      "accessLevel": "owner"
    }
  ]
}
```

Completed Checkpoint
Completed code: View the full implementation at python/examples/langchain/doc-checkpoints/03-with-history
Next Steps
Continue to:
- Indexing and Search — Add search indexing and semantic search to your conversations
- Conversation Forking — Branch conversations to explore alternative paths
- Response Recording and Resumption — Streaming responses with resume and cancel support