
# Comprehend code with AI

**io.moderne.prethink.ComprehendCode**

Use an LLM to generate descriptions for classes and methods in the codebase. Descriptions are cached based on source code checksums to avoid regenerating descriptions for unchanged code.
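The checksum-based caching described above can be sketched as follows. This is a minimal illustration under assumed names, not Moderne's actual implementation: the in-memory cache and the `generate` callback stand in for the recipe's real cache store and LLM call.

```python
import hashlib

# Hypothetical in-memory cache mapping source checksum -> description.
_description_cache = {}

def checksum(source: str) -> str:
    """SHA-256 checksum of a method's source code, used as the cache key."""
    return hashlib.sha256(source.encode("utf-8")).hexdigest()

def describe(source: str, generate) -> str:
    """Return the cached description when the source is unchanged;
    otherwise invoke the (expensive) LLM `generate` callback once."""
    key = checksum(source)
    if key not in _description_cache:
        _description_cache[key] = generate(source)
    return _description_cache[key]
```

Repeated calls with identical source hit the cache; any edit to the method changes the checksum and triggers regeneration.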

## Recipe source

This recipe is only available to users of Moderne.

This recipe is available under the Moderne Proprietary License.

## Options

| Type | Name | Description | Example |
| ---- | ---- | ----------- | ------- |
| `String` | provider | *Optional*. LLM provider: `openai`, `gemini`, or `poolside`. | `poolside` |
| `String` | apiKey | *Optional*. API key for the LLM provider. | `sk-...` |
| `String` | model | *Optional*. Model name to use. | `malibu` |
| `String` | baseUrl | *Optional*. Custom base URL for the LLM provider (for self-hosted instances). | `https://divers.poolsi.de/openai/v1/` |
| `Integer` | requestsPerMinute | *Optional*. Rate limit for LLM requests. | `60` |
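The `requestsPerMinute` option caps how often the recipe calls the LLM provider. One common way to enforce such a cap, shown here only as a sketch and not as Moderne's implementation, is to space successive requests by a minimum interval:

```python
import time

class RateLimiter:
    """Spaces out calls so at most `requests_per_minute` occur per minute."""

    def __init__(self, requests_per_minute: int):
        # Minimum number of seconds between consecutive requests.
        self.interval = 60.0 / requests_per_minute
        self.last_call = None

    def acquire(self):
        """Block until the next request is allowed, then record the call time."""
        now = time.monotonic()
        if self.last_call is not None:
            wait = self.interval - (now - self.last_call)
            if wait > 0:
                time.sleep(wait)
        self.last_call = time.monotonic()
```

With `requestsPerMinute=60`, each LLM request would be spaced at least one second apart.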

## Used by

This recipe is used as part of the following composite recipes:

## Usage

This recipe has no required configuration options. Users of Moderne can run it via the Moderne CLI.

You will need to have configured the Moderne CLI on your machine before you can run the following command.

```shell
mod run . --recipe ComprehendCode
```

If the recipe is not available locally, then you can install it using:

```shell
mod config recipes jar install io.moderne.recipe:rewrite-prethink:0.3.1
```

## See how this recipe works across multiple open-source repositories

Run this recipe on OSS repos at scale with the Moderne SaaS.

The community edition of the Moderne platform enables you to easily run recipes across thousands of open-source repositories.

Please contact Moderne for more information about safely running recipes on your own codebase in a private SaaS.

## Data Tables

### Method descriptions

**io.moderne.prethink.table.MethodDescriptions**

AI-generated descriptions of methods in the codebase with inference time and token usage metrics.

| Column Name | Description |
| ----------- | ----------- |
| Source path | The path to the source file containing the method. |
| Class name | The fully qualified name of the class containing the method. |
| Signature | The method signature, including parameter types. |
| Checksum | SHA-256 checksum of the method source code, used for cache validation. |
| Description | AI-generated description of what the method does. |
| Return value description | AI-generated description of what the method returns. |
| Technique 1 | First programming technique or pattern used in the method. |
| Technique 2 | Second programming technique or pattern used in the method. |
| Technique 3 | Third programming technique or pattern used in the method. |
| Inference time (ms) | Time taken for the LLM to generate the description, in milliseconds. |
| Input tokens | Number of tokens in the input prompt sent to the LLM. |
| Output tokens | Number of tokens in the response generated by the LLM. |