The Micronaut Foundation released Micronaut Framework 4.7.0 in December 2024, four months after the release of version 4.6.0. This version provides LangChain4j support to integrate LLMs into Java applications. The new Micronaut Graal Languages module provides integration with Graal-based dynamic languages, such as Python via the Micronaut GraalPy feature.
Experimental support for LangChain4j is available via version 0.2.0 of the Micronaut LangChain4j module, which requires the specification of an annotation processor in the Maven compiler plugin configuration of the Maven POM file:
<annotationProcessorPaths>
    <path>
        <groupId>io.micronaut.langchain4j</groupId>
        <artifactId>micronaut-langchain4j-processor</artifactId>
    </path>
</annotationProcessorPaths>
In addition, the corresponding dependency should be declared in the POM file:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-core</artifactId>
</dependency>
Micronaut supports various Chat Language Models such as Anthropic, Azure, Amazon Bedrock, HuggingFace, Mistral AI, Ollama, OpenAI, Google Gemini, Google Vertex AI, and Google Vertex AI Gemini. The example provided here uses the Ollama language model and requires the following dependency:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-ollama</artifactId>
</dependency>
Micronaut also provides dependencies to test the language model:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-ollama-testresource</artifactId>
    <scope>testResourcesService</scope>
</dependency>
Alternatively, the project may be configured with other build tools such as Gradle.
The model name can be configured via YAML, TOML, Groovy, HOCON, JSON, or a properties file with the following content:
langchain4j.ollama.model-name=orca-mini
Now it’s possible to define LangChain4j’s AI Services in a Java file:
@AiService
public interface MyAiService {

    @SystemMessage("Let's have a conversation")
    String chat(String message);
}
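LangChain4j AI Services also support parameterized prompts. The following sketch is based on LangChain4j's @UserMessage and @V annotations, which the AI Services abstraction uses for prompt templates; the interface name and prompt text are illustrative assumptions, not part of the Micronaut documentation:

@AiService
public interface SummaryAiService {

    // Hypothetical example: the {{text}} placeholder is filled
    // from the parameter annotated with @V("text")
    @SystemMessage("You are a concise assistant")
    @UserMessage("Summarize the following text in one sentence: {{text}}")
    String summarize(@V("text") String text);
}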
Lastly, the AI service can be used in any Micronaut component. For example, consider the following test:
@MicronautTest
public class AiServiceTest {

    @Test
    void testAiService(MyAiService myAiService) {
        String result = myAiService.chat("What is 42?");
        assertNotNull(result);
    }
}
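Beyond tests, the generated AI service may be injected like any other Micronaut bean. The following controller is a hypothetical sketch; the class name, route, and use of constructor injection are illustrative assumptions:

@Controller("/chat")
public class ChatController {

    private final MyAiService myAiService;

    // The AI service is injected like any other Micronaut bean
    public ChatController(MyAiService myAiService) {
        this.myAiService = myAiService;
    }

    @Get("/{message}")
    public String chat(String message) {
        return myAiService.chat(message);
    }
}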
Micronaut provides configurations for various Embedding Stores to store data: Elasticsearch, MongoDB, Neo4j, Oracle, OpenSearch, PGVector, Redis, and Qdrant.
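The embedding store integrations build on LangChain4j's EmbeddingStore and EmbeddingModel types. The following sketch is an assumption about how such beans could be used once an embedding store and an embedding model are configured; the DocumentIndexer class is purely illustrative, while the embed() and add() calls come from LangChain4j itself:

@Singleton
public class DocumentIndexer {

    private final EmbeddingModel embeddingModel;
    private final EmbeddingStore<TextSegment> embeddingStore;

    public DocumentIndexer(EmbeddingModel embeddingModel,
                           EmbeddingStore<TextSegment> embeddingStore) {
        this.embeddingModel = embeddingModel;
        this.embeddingStore = embeddingStore;
    }

    // Embeds a piece of text and stores it together with the original segment
    public String index(String text) {
        TextSegment segment = TextSegment.from(text);
        Embedding embedding = embeddingModel.embed(segment).content();
        return embeddingStore.add(embedding, segment);
    }
}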
The source code and Gradle configuration for a generic example project and an OpenAI example project are available on GitHub.
The Micronaut Foundation also released version 1.0.0 of the Micronaut Graal Languages module, which integrates Graal-based dynamic languages. GraalPy is a Python 3.11 compliant runtime built on top of GraalVM. Oracle released GraalPy version 24.1.2 in January 2025. GraalPy is available as a separate download for Oracle GraalVM and GraalVM Community Edition, in two flavors: a native standalone version with a Native Image compiled launcher, and a JVM standalone version that runs Python on the JVM.
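GraalPy code is executed through the GraalVM polyglot API. The snippet below is a minimal sketch using the plain org.graalvm.polyglot Context and Value classes rather than the Micronaut GraalPy integration itself; the embedded Python expression is illustrative only:

// Create a polyglot context for the Python language provided by GraalPy
try (Context context = Context.newBuilder("python").allowAllAccess(true).build()) {
    // Evaluate a Python expression and read the result back as a Java int
    Value result = context.eval("python", "21 * 2");
    System.out.println(result.asInt()); // prints 42
}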
Oracle also released the Graal Development Kit for Micronaut (GDK) version 4.7.3.1 in February 2025. The GDK may be used as an alternative to plain Micronaut: it contains a curated set of Micronaut framework modules and their corresponding dependencies, and can be used to build portable cloud-native applications that start instantly and have a reduced hardware footprint.
A complete overview of all changes can be found in the release notes and more information is available in the Micronaut documentation.