A comprehensive testing framework for Nim with automatic resource management, CLI testing, and CI/CD integration.
Get started quickly with comprehensive testing utilities:

```shell
# One command to test like a god:
nimble install nimtest && nimtest
```
```nim
import nimtest/api

var ctx = createTestContext()
try:
  let file = createTestFile(ctx, "test.txt", "hello")
  discard assertFileContains(file, "hello")
finally:
  ctx.cleanup()
```

| Feature | Killer Detail |
|---|---|
| TestContext | `createTestContext()` → `ctx.cleanup()` in `finally` |
| File Testing | `assertFileContains()`, `createTempTestDir()` |
| CLI Testing | `runCliCommand()`, `assertExitCode()` |
| Perf Testing | `benchmark("op", 10_000): proc()` |
| Reporting | `saveReport(rfJunit, "ci.xml")` |
| Progress Bars | `pbsGlobe`, `pbsPulse`, `pbsDots`, `pbsBlocks` (lock-free) |
| Cross-Platform | Linux, macOS, Windows |
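The CLI helpers from the table can be combined along these lines. This is a sketch only: the exact signatures of `runCliCommand` and `assertExitCode`, and the `output` field on the result, are assumptions here — check the API Reference for the authoritative forms.

```nim
import nimtest/api

# Sketch: run a command and assert on its result.
# The runCliCommand/assertExitCode signatures and the
# `output` field are assumed, not confirmed by the API docs.
let res = runCliCommand("nim --version")
discard assertExitCode(res, 0)
discard assertOutputContains(res.output, "Nim Compiler")
```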
```yaml
# .github/workflows/ci.yml
name: CI
on: [push, pull_request]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: jiro4989/setup-nim-action@v2
      - run: nimble install nimtest
      - run: nim c -r examples/test_all.nim
      - uses: actions/upload-artifact@v4
        with:
          name: junit-report
          path: test-results.xml
```

This documentation covers all aspects of the nimtest framework:
- Quick Start Guide - Get up and running in 5 minutes
- Configuration Guide - Setup and configuration options
- User Guide - Day-to-day usage instructions
- API Reference - Full API documentation
- Architecture - Framework design and architecture
- Best Practices - Recommended patterns and practices
- Examples and Patterns - Common testing scenarios
- Contribution Guidelines - How to contribute to the project
- CI/CD Guide - Integration with CI/CD systems
The recommended way to import nimtest is:

```nim
import nimtest/api
```

This provides access to all core functionality in a clean namespace.
The TestContext manages temporary resources and ensures proper cleanup:
```nim
# Create a test context
var ctx = createTestContext()
try:
  # Create temporary files and directories
  let tempDir = createTempTestDir(ctx, "test_prefix")
  # ... your test code
finally:
  # Clean up all registered resources
  ctx.cleanup()
```

Comprehensive utilities for file and directory testing:
```nim
# Basic assertions (return bool, raise an exception on failure)
discard assertFileExists("path/to/file")
discard assertFileContains("path/to/file", "expected content")
discard assertDirExists("path/to/directory")
```

Built-in benchmarking and timing utilities:
```nim
# Time a code block
let duration = measureTime("operation"):
  proc() =
    # Your code here
    sleep(100)

# Run benchmarks
let results = benchmark("test operation", 1000):
  proc() =
    # Code to benchmark
    discard 1 + 1
```

Five different animated progress bar styles:
```nim
let bar = newProgressBar(pbsGlobe, total = 100, message = "Processing...")
for i in 0..99:
  # Do work
  updateProgress(bar, i + 1)
bar.finish("Complete!")
```

Multiple output formats for different use cases:
```nim
var report = newTestSuiteReport("My Tests")
# ... add test results
generateConsoleReport(report)             # Human-readable output
saveReport(report, rfJunit, "junit.xml")  # CI/CD integration
saveReport(report, rfJson, "report.json") # Programmatic access
```

nimtest is organized into focused modules:
```
src/nimtest/
├── api.nim        # Public API facade - import this
├── core.nim       # TestContext, basic utilities
├── helpers.nim    # Advanced assertions, extended utilities
├── reporting.nim  # Test results, multiple output formats
├── progress.nim   # Progress bar implementations
└── config.nim     # Project configuration constants
```
We welcome contributions! See our Contribution Guidelines for details on how to get involved.
MIT License - see LICENSE for details.

```nim
# Use nimtest utilities in your tests
let testDir = ctx.createTempTestDir("basic_test")
let testFile = testDir / "sample.txt"
writeFile(testFile, "Hello, nimtest!")

# Verify with assertions
discard assertFileExists(testFile)
discard assertFileContains(testFile, "Hello, nimtest!")

# Test passes if no assertion fails
check true == true
```
### Key Framework Components
#### Resource Management
```nim
# Create test context for automatic cleanup
var ctx = createTestContext()
defer: ctx.cleanup()

# Create temporary resources
let testDir = ctx.createTempTestDir("my_test")
```

```nim
discard assertFileExists("path/to/file")
discard assertDirExists("path/to/directory")
discard assertFileContains("config.json", "expected_content")
discard assertOutputContains(output, "expected_text")
```

```nim
# Example usage with command output stored in a variable
let output = "Version: 1.0.0\nBuild Date: 2025-01-01"
discard assertOutputContains(output, "1.0.0")
```

```nim
measureTime("operation name"):
  performOperation()

benchmark("operation", 1000):
  performOperation()
```

```nim
# Create a progress bar with different styles
let bar = newProgressBar(pbsGlobe, width = 40, total = 100, message = "Processing...")

# Update progress
updateProgress(bar, 50, "Halfway done...")
bar.display()

# Complete the progress bar
bar.finish("All done!")
```

Configure nimtest by editing src/nimtest/config.nim in your project:

```nim
const
  ProjectName* = "yourproject"          # Your project name
  ProjectDisplayName* = "Your Project"  # Human-readable name
  TempDirPrefix* = "nimtest_temp"       # Prefix for temporary directories
  TestSuiteVersion* = "0.1.0"           # Version for test reports
```

Organize your tests in a logical directory structure:
```
tests/
├── unit/          # Unit tests for individual functions/modules
├── integration/   # Integration tests for multiple components
├── performance/   # Performance and benchmark tests
├── cli/           # CLI command tests (if applicable)
├── fixtures/      # Test data and fixture files
├── helpers.nim    # Shared test utilities specific to your project
└── test_all.nim   # Main test runner
```
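With this layout, `test_all.nim` can be as simple as importing each suite module; when the binary is compiled and run, `std/unittest` executes every registered suite. The module names below are placeholders for your own test files, not part of nimtest.

```nim
# tests/test_all.nim - main test runner (sketch; module names are placeholders)
import unit/test_example
import integration/test_workflow
import performance/test_benchmarks
# Each imported module registers its suites with std/unittest,
# which runs them when this binary executes.
```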
nimtest is designed to work well in CI/CD environments:
- Compatible with GitHub Actions, GitLab CI, and other systems
- Cross-platform support (Linux, Windows, macOS)
- Multiple report formats (JSON, JUnit XML) for CI integration
- Console output formatted for CI logs
- Always use TestContext: Create and clean up contexts properly for resource management
- Use setup/teardown: Initialize resources in setup, clean up in teardown
- Use descriptive test names: Make test names clear and specific
- Test one thing per test: Keep tests focused on a single functionality
- Use meaningful assertions: Provide clear messages for failed assertions
- Clean up resources: Always ensure temporary files and directories are cleaned up
- Use performance utilities: Measure and track performance of critical operations
- Generate reports: Use reporting utilities to track test results over time
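Several of these practices can be combined in one suite. The sketch below uses `std/unittest` setup/teardown around the helpers shown earlier; the `TestContext` type name is an assumption about the exported context type.

```nim
import std/[os, unittest]
import nimtest/api

suite "config loading":
  var ctx: TestContext   # assumed name of the exported context type

  setup:
    ctx = createTestContext()

  teardown:
    ctx.cleanup()        # always clean up temporary resources

  test "reads a well-formed config file":
    let dir = createTempTestDir(ctx, "config_test")
    let cfg = dir / "app.json"
    writeFile(cfg, """{"name": "demo"}""")
    discard assertFileContains(cfg, "demo")
```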
nimtest has an ambitious roadmap for 2026 focused on usability, performance, and observability. See ROADMAP.md for the complete strategic plan including:
- Q1 2026 (v1.1): Macro DSL for tests, CLI runner binary, enhanced assertions
- Q2 2026 (v1.2): Parallel execution, async/await support, E2E integration lanes
- Q3 2026 (v1.3): Interactive HTML reports, coverage integration, fuzzing hooks
- Q4 2026+: Wild cards including AI-assisted test generation and compile-time fuzzing
Contribute: Help shape the future by participating in roadmap discussions on the Nim Forum or GitHub Discussions.
Requirements:

- `nim >= 2.0.0`
- `std/unittest` (built-in)
- `std/os` (built-in)
- `std/osproc` (built-in)
- `std/json` (built-in)
- `std/parseutils` (built-in)
MIT License - See LICENSE file for details