Quarkus Getting Started
This guide walks you through integrating Memory Service with a Quarkus AI agent. You’ll start with basic chat memory and can progressively add features in the follow-up guides.
Make sure you’ve completed the prerequisites before starting this guide.
Step 1: Create a Simple LangChain4j App
Starting checkpoint: View the complete code at java/quarkus/examples/doc-checkpoints/01-basic-agent
First, let’s create a new Quarkus application:

```shell
quarkus create app example
cd example
```
Add the LangChain4j OpenAI dependency to your `pom.xml`:

```xml
<dependency>
    <groupId>io.quarkiverse.langchain4j</groupId>
    <artifactId>quarkus-langchain4j-openai</artifactId>
    <version>${langchain4j.version}</version>
</dependency>
```

Why: This extension integrates LangChain4j with Quarkus and provides built-in support for calling OpenAI-compatible LLMs, along with CDI-managed AI service beans via `@RegisterAiService`.
Create a simple AI service interface:

```java
package org.acme;

import io.quarkiverse.langchain4j.RegisterAiService;
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
@RegisterAiService
public interface Agent {
    String chat(String userMessage);
}
```

Why: LangChain4j generates the LLM-calling implementation at build time from the interface signature. Declaring it as an interface means you never write networking code; Quarkus injects a fully configured AI service bean wherever you use `@Inject Agent`.
Create a REST resource to expose the agent:

```java
package org.acme;

import jakarta.inject.Inject;
import jakarta.ws.rs.Consumes;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("/chat")
public class ChatResource {

    @Inject
    Agent agent;

    @POST
    @Consumes(MediaType.TEXT_PLAIN)
    @Produces(MediaType.TEXT_PLAIN)
    public String chat(String userMessage) {
        return agent.chat(userMessage);
    }
}
```

Why: This is the entry point that HTTP clients hit. The resource delegates immediately to the AI service, keeping the REST layer thin and the LLM interaction logic in the `Agent` interface.
Configure your LLM in `application.properties` so it reads the OpenAI credentials from environment variables, and change the HTTP port to 9090 to avoid conflicts with services started later:

```properties
quarkus.http.port=9090
quarkus.langchain4j.openai.api-key=${OPENAI_API_KEY}
quarkus.langchain4j.openai.base-url=${OPENAI_BASE_URL:https://api.openai.com}/v1

# Readiness endpoint
quarkus.http.auth.permission.ready.paths=/ready
quarkus.http.auth.permission.ready.policy=permit
```

Why: Reading credentials from environment variables keeps secrets out of source control. The port is moved to 9090 so it doesn’t conflict with the Memory Service and Keycloak, which will be started on ports 8082 and 8081 in later steps.
Run your agent:

```shell
export OPENAI_API_KEY=your-api-key
mvn quarkus:dev
```

Test it with curl:
```shell
curl -NsSfX POST http://localhost:9090/chat \
  -H "Content-Type: text/plain" \
  -d "Hi, I'm Hiram, who are you?"
```

Example output:

```
I am Claude, an AI assistant created by Anthropic. I'm here to help answer questions and have conversations on a wide variety of topics. How can I assist you today?
```

Expected: The agent responds but has no memory of your name.
```shell
curl -NsSfX POST http://localhost:9090/chat \
  -H "Content-Type: text/plain" \
  -d "Who am I?"
```

Example output:

```
That's a great question! I don't have information about your identity from our conversation so far. Could you tell me more about yourself?
```

Let’s add conversation memory.
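Before wiring in real memory, it helps to see the failure in miniature. The sketch below is plain Java with hypothetical names (no LangChain4j or Quarkus); it contrasts a prompt built from only the latest message with one that replays stored history, which is essentially what chat memory does for the LLM.

```java
import java.util.ArrayList;
import java.util.List;

public class MemorySketch {

    // Without memory: the prompt is only the current message,
    // so earlier turns never reach the LLM.
    static String statelessPrompt(String userMessage) {
        return userMessage;
    }

    // With memory: the stored history is replayed with every call,
    // so the LLM sees the whole conversation.
    static String statefulPrompt(List<String> history, String userMessage) {
        history.add(userMessage);
        return String.join("\n", history);
    }

    public static void main(String[] args) {
        // Stateless: "Who am I?" arrives with no context at all.
        System.out.println(statelessPrompt("Who am I?"));

        // Stateful: the earlier introduction is included automatically.
        List<String> history = new ArrayList<>();
        statefulPrompt(history, "Hi, I'm Hiram");
        System.out.println(statefulPrompt(history, "Who am I?")); // includes "Hi, I'm Hiram"
    }
}
```

The rest of this guide delegates exactly this replay-the-history job to the Memory Service instead of an in-process list.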
Step 2: Add Memory Service Extension
Starting checkpoint: View the complete code at java/quarkus/examples/doc-checkpoints/02-with-memory
Add the Memory Service extension to your `pom.xml`:

```xml
<dependency>
    <groupId>io.github.chirino.memory-service</groupId>
    <artifactId>memory-service-extension</artifactId>
    <version>999-SNAPSHOT</version>
</dependency>
```
Update the `Agent` interface to accept a conversation ID using the `@MemoryId` annotation:

```java
package org.acme;

import dev.langchain4j.service.MemoryId;
import io.quarkiverse.langchain4j.RegisterAiService;
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
@RegisterAiService
public interface Agent {
    String chat(@MemoryId String conversationId, String userMessage);
}
```

Why: The `@MemoryId` annotation tells LangChain4j which conversation the call belongs to. The Memory Service extension uses this ID to load and save the agent’s memory so each conversation is isolated and the LLM receives only the history for that specific conversation.
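Conceptually, the value passed as `@MemoryId` becomes the key under which memory is loaded and saved. A hypothetical in-process stand-in for that isolation is sketched below (plain Java, illustrative only; the real extension stores memory in the Memory Service over HTTP rather than in a map):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for the Memory Service: each conversation ID
// maps to its own message list, so histories never mix.
public class ConversationMemoryStore {

    private final Map<String, List<String>> memories = new HashMap<>();

    // Returns the memory for one conversation, creating it on first use.
    public List<String> load(String conversationId) {
        return memories.computeIfAbsent(conversationId, id -> new ArrayList<>());
    }

    // Appends a message to one conversation's memory only.
    public void append(String conversationId, String message) {
        load(conversationId).add(message);
    }
}
```

Two callers using different conversation IDs therefore never see each other’s messages, which is the isolation guarantee `@MemoryId` gives you.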
Update the REST resource to accept a conversation ID:

```java
package org.acme;

import io.smallrye.common.annotation.Blocking;
import jakarta.inject.Inject;
import jakarta.ws.rs.Consumes;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

@Path("/chat")
public class ChatResource {

    @Inject
    Agent agent;

    @POST
    @Path("/{conversationId}")
    @Blocking
    @Consumes(MediaType.TEXT_PLAIN)
    @Produces(MediaType.TEXT_PLAIN)
    public String chat(@PathParam("conversationId") String conversationId, String userMessage) {
        return agent.chat(conversationId, userMessage);
    }
}
```

What changed: The endpoint path now includes `/{conversationId}` and the method accepts it as a `@PathParam`. The `@Blocking` annotation was also added.

Why: Passing the conversation ID in the URL makes each conversation addressable by the caller. `@Blocking` is required here because the Memory Service client makes synchronous HTTP calls to store and retrieve memory, which would deadlock Vert.x’s non-blocking event loop.
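To see why blocking the event loop is so dangerous, the self-contained sketch below simulates one with a single-threaded executor (plain JDK, illustrative only, no Vert.x): a task that parks the loop’s only thread while waiting for other work that needs that same thread can never make progress. A timeout is used so the demonstration doesn’t hang forever.

```java
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

// A single-threaded executor stands in for a Vert.x event loop.
public class EventLoopSketch {

    static boolean demonstrateStarvation() {
        ExecutorService eventLoop = Executors.newSingleThreadExecutor();
        try {
            Future<String> outer = eventLoop.submit(() -> {
                // The inner task needs the same (only) thread we are on.
                Future<String> inner = eventLoop.submit(() -> "memory loaded");
                // Blocking here parks the loop's only thread, so the inner
                // task can never start; without the timeout this would hang.
                return inner.get(200, TimeUnit.MILLISECONDS);
            });
            outer.get();
            return false; // never reached: the inner wait always times out
        } catch (ExecutionException e) {
            // The inner wait timed out: blocking the loop starved it.
            return e.getCause() instanceof TimeoutException;
        } catch (InterruptedException e) {
            return false;
        } finally {
            eventLoop.shutdownNow();
        }
    }

    public static void main(String[] args) {
        System.out.println("starved: " + demonstrateStarvation());
    }
}
```

`@Blocking` avoids this by moving the request handler onto a worker thread, where synchronous calls are safe.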
Start the Memory Service in Docker Compose following the Getting Started guide.
Configure the agent to connect to the Memory Service and set up OIDC authentication in `application.properties`:

```properties
# Memory Service Client
memory-service.client.url=http://localhost:8082
memory-service.client.api-key=agent-api-key-1

# OIDC
quarkus.oidc.auth-server-url=http://localhost:8081/realms/memory-service
quarkus.oidc.token.issuer=http://localhost:8081/realms/memory-service
quarkus.oidc.client-id=memory-service-client
quarkus.oidc.credentials.secret=change-me
quarkus.http.auth.permission.authenticated.paths=/chat/*
quarkus.http.auth.permission.authenticated.policy=authenticated
```

What changed: Added the Memory Service client URL and API key, the OIDC server configuration, and a route policy that requires authentication on `/chat/*`.

Why: The Memory Service uses a dual-authentication pattern: your Quarkus app authenticates to it using a service account API key, while the user’s Bearer token is passed through to enforce ownership of conversations. Enabling OIDC on the Quarkus side allows it to validate and propagate the user’s identity automatically.
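When debugging this setup it can help to see which claims (issuer, username, expiry) a user token actually carries. The helper below decodes a JWT payload without verifying it; `jwt_claims` is a name invented for this guide, it relies only on `cut`, `tr`, and `base64`, and unverified decoding must never be used for authentication decisions:

```shell
# Hypothetical helper: print the payload claims of a JWT without verification.
jwt_claims() {
  local payload
  # The payload is the second dot-separated segment, base64url-encoded.
  payload=$(printf '%s' "$1" | cut -d. -f2 | tr '_-' '/+')
  # Pad base64url to a multiple of 4 before decoding.
  case $(( ${#payload} % 4 )) in
    2) payload="${payload}==";;
    3) payload="${payload}=";;
  esac
  printf '%s' "$payload" | base64 -d
}

# Example with a fabricated token (header.payload.signature):
jwt_claims "eyJhbGciOiJub25lIn0.eyJwcmVmZXJyZWRfdXNlcm5hbWUiOiJib2IifQ.x"
```

Piping the output through `jq` makes the claims easier to read.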
Run your agent again:

```shell
export OPENAI_API_KEY=your-api-key
mvn quarkus:dev
```

Define a shell function that fetches a bearer token for the bob user:
```shell
function get-token() {
  curl -sSfX POST http://localhost:8081/realms/memory-service/protocol/openid-connect/token \
    -H "Content-Type: application/x-www-form-urlencoded" \
    -d "client_id=memory-service-client" \
    -d "client_secret=change-me" \
    -d "grant_type=password" \
    -d "username=bob" \
    -d "password=bob" \
    | jq -r '.access_token'
}
```

Test it with curl, this time with conversation memory:
```shell
curl -NsSfX POST http://localhost:9090/chat/6a6e7762-4569-4f63-b01c-c3666a27428b \
  -H "Content-Type: text/plain" \
  -H "Authorization: Bearer $(get-token)" \
  -d "Hi, I'm Hiram, who are you?"
```

Example output:

```
Hi Hiram! I'm an AI created to help answer questions and provide information. How can I assist you today?
```

Expected: The agent now remembers your name from the previous message.
```shell
curl -NsSfX POST http://localhost:9090/chat/6a6e7762-4569-4f63-b01c-c3666a27428b \
  -H "Content-Type: text/plain" \
  -H "Authorization: Bearer $(get-token)" \
  -d "Who am I?"
```

Example output:

```
You are Hiram.
```

If you browse to the demo agent app at http://localhost:8080/, you will see that a conversation has been created with the ID 6a6e7762-4569-4f63-b01c-c3666a27428b. But it won’t show any messages. That’s because we are not yet storing the conversation history: the only thing being stored is the agent memory, and that’s not typically what you want to display to a user in a UI.
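The distinction between agent memory and conversation history can be sketched in plain Java (hypothetical roles and fields; the real services store richer structures): memory holds everything the LLM needs, including entries like system prompts or tool calls, while history is the filtered, user-visible transcript a UI would display.

```java
import java.util.ArrayList;
import java.util.List;

public class MemoryVsHistory {

    // One entry in the agent's memory; userVisible marks whether a UI
    // should show it. (Illustrative shape, not the service's actual model.)
    record Entry(String role, String text, boolean userVisible) {}

    // Conversation history = the user-visible subset of memory.
    static List<String> displayableHistory(List<Entry> memory) {
        List<String> history = new ArrayList<>();
        for (Entry e : memory) {
            if (e.userVisible()) {
                history.add(e.role() + ": " + e.text());
            }
        }
        return history;
    }

    public static void main(String[] args) {
        List<Entry> memory = List.of(
            new Entry("system", "You are a helpful agent.", false),
            new Entry("user", "Hi, I'm Hiram", true),
            new Entry("tool", "lookup(user=Hiram)", false),
            new Entry("assistant", "Hi Hiram!", true));
        // Only the user and assistant turns survive the filter.
        System.out.println(displayableHistory(memory));
    }
}
```

The next guide adds exactly this user-facing layer: recording history alongside memory so the UI has something to show.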
Next Steps
Continue to Conversation History to learn how to:
- Record conversation history for frontend display
- Expose conversation APIs (messages, listing)
- Build a complete chat UI experience
Or jump ahead to:
- Conversation Forking - Branch conversations to explore alternative paths
- Response Recording and Resumption - Streaming responses with resume and cancel support