
Find Node.js test coverage

io.moderne.prethink.calm.FindNodeTestCoverage

Identifies test methods in Jest, Mocha, and Vitest test files by detecting describe(), it(), and test() blocks, and populates the TestMapping data table.
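For illustration, the kind of structure the recipe looks for can be sketched with a small scanner. This is a hypothetical, regex-based simplification, not the recipe's actual implementation (which analyzes the parsed source tree); the function name `findTestBlocks` is invented here.

```javascript
// Simplified sketch: find describe(), it(), and test() blocks in a source
// string and report each block's kind and title. Regex-based for brevity;
// the real recipe works on parsed JavaScript, not text matching.
function findTestBlocks(source) {
  const pattern = /\b(describe|it|test)\s*\(\s*(['"`])((?:(?!\2).)*)\2/g;
  const blocks = [];
  let match;
  while ((match = pattern.exec(source)) !== null) {
    blocks.push({ kind: match[1], name: match[3] });
  }
  return blocks;
}

// Example input using the describe/it/test API shared by Jest, Mocha, and Vitest.
const example = `
describe('Calculator', () => {
  it('adds two numbers', () => {});
  test('subtracts two numbers', () => {});
});
`;

console.log(findTestBlocks(example));
```

Running this prints one entry per block: a `describe` named `Calculator`, an `it` named `adds two numbers`, and a `test` named `subtracts two numbers`.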

Recipe source

This recipe is only available to users of Moderne.

This recipe is available under the Moderne Proprietary License.

Used by

This recipe is used as part of the following composite recipes:

Usage

This recipe has no required configuration options. Users of Moderne can run it via the Moderne CLI.

You will need to have configured the Moderne CLI on your machine before you can run the following command.

```shell
mod run . --recipe FindNodeTestCoverage
```

If the recipe is not available locally, then you can install it using:

```shell
mod config recipes jar install io.moderne.recipe:rewrite-prethink:0.3.0
```

See how this recipe works across multiple open-source repositories

Run this recipe on OSS repos at scale with the Moderne SaaS.

The community edition of the Moderne platform enables you to easily run recipes across thousands of open-source repositories.

Please contact Moderne for more information about safely running recipes on your own codebase in a private SaaS.

Data Tables

Test mapping

io.moderne.prethink.table.TestMapping

Maps test methods to implementation methods with optional AI-generated summaries and inference metrics.

| Column Name | Description |
| ----------- | ----------- |
| Test source path | The path to the source file containing the test. |
| Test class | The fully qualified name of the test class. |
| Test method | The signature of the test method. |
| Implementation source path | The path to the source file containing the implementation. |
| Implementation class | The fully qualified name of the implementation class. |
| Implementation method | The signature of the implementation method being tested. |
| Test summary | AI-generated summary of what the test is verifying. |
| Test checksum | SHA-256 checksum of the test method source code, used for cache validation. |
| Inference time (ms) | Time taken by the LLM to generate the summary, in milliseconds. |
| Input tokens | Number of tokens in the input prompt sent to the LLM. |
| Output tokens | Number of tokens in the response generated by the LLM. |