Integration Testing Guide

Version: 1.0.0 | Last Updated: 2025-10-06

This guide provides comprehensive documentation for the provisioning platform integration testing suite.

Table of Contents

  1. Overview
  2. Test Infrastructure
  3. Running Tests Locally
  4. Running Tests on OrbStack
  5. Writing New Tests
  6. Test Organization
  7. CI/CD Integration
  8. Troubleshooting

Overview

The integration testing suite validates all four execution modes of the provisioning platform:

  • Solo Mode: Single-user, minimal services (orchestrator, CoreDNS, OCI registry)
  • Multi-User Mode: Multi-user support with Gitea, PostgreSQL, RBAC
  • CI/CD Mode: Automation mode with API server, service accounts
  • Enterprise Mode: Full enterprise features (Harbor, KMS, Prometheus, Grafana, ELK)

Key Features

  • Comprehensive Coverage: Tests for all 4 modes, 15+ services
  • OrbStack Integration: Tests deployable to OrbStack machine "provisioning"
  • Parallel Execution: Run independent tests in parallel for speed
  • Automatic Cleanup: Resources cleaned up automatically after tests
  • Multiple Report Formats: JUnit XML, HTML, JSON
  • CI/CD Ready: GitHub Actions and GitLab CI integration

Test Infrastructure

Prerequisites

  1. OrbStack Installed:

    # Install OrbStack (macOS)
    brew install --cask orbstack
    
  2. OrbStack Machine Named "provisioning":

    # Create OrbStack machine
    orb create provisioning
    
    # Verify machine is running
    orb status provisioning
    
  3. Nushell 0.107.1+:

    # Install Nushell
    brew install nushell
    
  4. Docker CLI:

    # Verify Docker is available
    docker version
    

Test Configuration

The test suite is configured via provisioning/tests/integration/test_config.yaml:

# OrbStack connection
orbstack:
  machine_name: "provisioning"
  connection:
    type: "docker"
    socket: "/var/run/docker.sock"

# Service endpoints
services:
  orchestrator:
    host: "172.20.0.10"
    port: 8080

  coredns:
    host: "172.20.0.2"
    port: 53

  # ... more services

Key Settings:

  • orbstack.machine_name: Name of OrbStack machine to use
  • services.*: IP addresses and ports for deployed services
  • test_execution.parallel.max_workers: Number of parallel test workers
  • test_execution.timeouts.*: Timeout values for various operations
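The excerpt above omits the test_execution block referenced by the last two settings. A minimal sketch of what it might contain, using only the keys named above (the actual file may use different defaults):

test_execution:
  parallel:
    max_workers: 4              # number of parallel test workers
  timeouts:
    test_timeout_seconds: 300   # per-test timeout; raise this if services are slow to start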

Running Tests Locally

Quick Start

  1. Setup Test Environment:

    # Setup solo mode environment
    nu provisioning/tests/integration/setup_test_environment.nu --mode solo
    
  2. Run Tests:

    # Run all tests for solo mode
    nu provisioning/tests/integration/framework/test_runner.nu --mode solo
    
    # Run specific test file
    nu provisioning/tests/integration/modes/test_solo_mode.nu
    
  3. Teardown Test Environment:

    # Cleanup all resources
    nu provisioning/tests/integration/teardown_test_environment.nu --force
    

Test Runner Options

nu provisioning/tests/integration/framework/test_runner.nu \
  --mode <mode>              # Test specific mode (solo, multiuser, cicd, enterprise)
  --filter <pattern>         # Filter tests by regex pattern
  --parallel <n>             # Number of parallel workers (default: 1)
  --verbose                  # Detailed output
  --report <path>            # Generate HTML report
  --skip-setup               # Skip environment setup
  --skip-teardown            # Skip environment teardown

Examples:

# Run all tests for all modes
nu provisioning/tests/integration/framework/test_runner.nu

# Run only solo mode tests
nu provisioning/tests/integration/framework/test_runner.nu --mode solo

# Run tests matching pattern
nu provisioning/tests/integration/framework/test_runner.nu --filter "dns"

# Run tests in parallel with 4 workers
nu provisioning/tests/integration/framework/test_runner.nu --parallel 4

# Generate HTML report
nu provisioning/tests/integration/framework/test_runner.nu --report /tmp/test-report.html

# Run tests without cleanup (for debugging)
nu provisioning/tests/integration/framework/test_runner.nu --skip-teardown

Running Tests on OrbStack

Setup OrbStack Machine

  1. Create OrbStack Machine:

    # Create machine named "provisioning"
    orb create provisioning --cpu 4 --memory 8192 --disk 100
    
    # Verify machine is created
    orb list
    
  2. Configure Machine:

    # Start machine
    orb start provisioning
    
    # Verify Docker is accessible
    docker -H /var/run/docker.sock ps
    

Deploy Platform to OrbStack

The test setup automatically deploys platform services to OrbStack:

# Deploy solo mode
nu provisioning/tests/integration/setup_test_environment.nu --mode solo

# Deploy multi-user mode
nu provisioning/tests/integration/setup_test_environment.nu --mode multiuser

# Deploy CI/CD mode
nu provisioning/tests/integration/setup_test_environment.nu --mode cicd

# Deploy enterprise mode
nu provisioning/tests/integration/setup_test_environment.nu --mode enterprise

Deployed Services:

| Mode | Services |
|------|----------|
| Solo | Orchestrator, CoreDNS, Zot (OCI registry) |
| Multi-User | Solo services + Gitea, PostgreSQL |
| CI/CD | Multi-User services + API server, Prometheus |
| Enterprise | CI/CD services + Harbor, KMS, Grafana, Elasticsearch |

Verify Deployment

# Check service health
nu provisioning/tests/integration/framework/test_helpers.nu check-service-health orchestrator

# View service logs
nu provisioning/tests/integration/framework/orbstack_helpers.nu orbstack-logs orchestrator

# List running containers
docker -H /var/run/docker.sock ps

Writing New Tests

Test File Structure

All test files follow this structure:

# Test Description
# Brief description of what this test validates

use std log
use ../framework/test_helpers.nu *
use ../framework/orbstack_helpers.nu *

# Main test suite
export def main [] {
    log info "Running <Test Suite Name>"

    let test_config = (load-test-config)

    mut results = []

    # Run all tests
    $results = ($results | append (test-case-1 $test_config))
    $results = ($results | append (test-case-2 $test_config))

    # Report results
    report-test-results $results
}

# Individual test case
def test-case-1 [test_config: record] {
    run-test "test-case-1-name" {
        log info "Testing specific functionality..."

        # Test logic
        let result = (some-operation)

        # Assertions
        assert-eq $result.status "success" "Operation should succeed"
        assert-not-empty $result.data "Result should contain data"

        log info "✓ Test case 1 passed"
    }
}

# Report test results
def report-test-results [results: list] {
    # ... reporting logic
}

Using Assertion Helpers

The test framework provides several assertion helpers:

# Equality assertion
assert-eq $actual $expected "Error message if assertion fails"

# Boolean assertions
assert-true $condition "Error message"
assert-false $condition "Error message"

# Collection assertions
assert-contains $list $item "Error message"
assert-not-contains $list $item "Error message"
assert-not-empty $value "Error message"

# HTTP assertions
assert-http-success $response "Error message"
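All helpers follow the same pattern: raise an error when the check fails so that run-test records the case as failed. A minimal sketch of an equality assertion, for illustration only (use the versions shipped in test_helpers.nu):

# Illustrative only; the framework's assert-eq may differ in detail
def assert-eq [actual: any, expected: any, message: string] {
    if $actual != $expected {
        error make { msg: $"Assertion failed: ($message). Expected ($expected), got ($actual)" }
    }
}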

Using Test Fixtures

Create reusable test fixtures:

# Create test workspace
let workspace = create-test-workspace "my-test-ws" {
    provider: "local"
    environment: "test"
}

# Create test server
let server = create-test-server "test-server" "local" {
    cores: 4
    memory: 8192
}

# Cleanup
cleanup-test-workspace $workspace
delete-test-server $server.id
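A workspace fixture is essentially a record describing a temporary directory (the Gitea example below relies on $workspace.path). A minimal sketch of what create-test-workspace might do, assuming workspaces live under /tmp (the framework's version may differ):

# Illustrative fixture; the framework's create-test-workspace may differ
def create-test-workspace [name: string, options: record] {
    let path = $"/tmp/provisioning-test-workspace-($name)"
    mkdir $path
    { name: $name, path: $path, options: $options }
}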

Using Retry Logic

For flaky operations, use retry helpers:

# Retry operation up to 3 times
let result = (with-retry --max-attempts 3 --delay 5 {
    # Operation that might fail
    http get "http://example.com/api"
})

# Wait for condition with timeout
wait-for-condition --timeout 60 --interval 5 {
    # Condition to check
    check-service-health "orchestrator"
} "orchestrator to be healthy"

Example: Writing a New Service Integration Test

# Test Gitea Integration
# Validates Gitea workspace git operations and extension publishing

use std log
use ../framework/test_helpers.nu *

def test-gitea-workspace-operations [test_config: record] {
    run-test "gitea-workspace-git-operations" {
        log info "Testing Gitea workspace operations..."

        # Create workspace
        let workspace = create-test-workspace "gitea-test" {
            provider: "local"
        }

        # Initialize git repo
        cd $workspace.path
        git init

        # Configure Gitea remote
        let gitea_url = $"http://($test_config.services.gitea.host):($test_config.services.gitea.port)"
        git remote add origin $"($gitea_url)/test-user/gitea-test.git"

        # Create test file (set a throwaway git identity so the commit succeeds in clean environments)
        git config user.email "integration-test@example.com"
        git config user.name "Integration Test"
        "test content" | save test.txt
        git add test.txt
        git commit -m "Test commit"

        # Ensure the branch is named main, then push to Gitea
        git branch -M main
        git push -u origin main

        # Verify push succeeded
        let remote_log = (git ls-remote origin)
        assert-not-empty $remote_log "Remote should have commits"

        log info "✓ Gitea workspace operations work"

        # Cleanup
        cleanup-test-workspace $workspace
    }
}

Test Organization

Directory Structure

provisioning/tests/integration/
├── test_config.yaml                    # Test configuration
├── setup_test_environment.nu           # Environment setup script
├── teardown_test_environment.nu        # Cleanup script
├── framework/                          # Test framework utilities
│   ├── test_helpers.nu                 # Common test helpers
│   ├── orbstack_helpers.nu             # OrbStack integration
│   └── test_runner.nu                  # Test orchestration
├── modes/                              # Mode-specific tests
│   ├── test_solo_mode.nu               # Solo mode tests
│   ├── test_multiuser_mode.nu          # Multi-user mode tests
│   ├── test_cicd_mode.nu               # CI/CD mode tests
│   └── test_enterprise_mode.nu         # Enterprise mode tests
├── services/                           # Service integration tests
│   ├── test_dns_integration.nu         # CoreDNS tests
│   ├── test_gitea_integration.nu       # Gitea tests
│   ├── test_oci_integration.nu         # OCI registry tests
│   └── test_service_orchestration.nu   # Service manager tests
├── workflows/                          # Workflow tests
│   ├── test_extension_loading.nu       # Extension loading tests
│   └── test_batch_workflows.nu         # Batch workflow tests
├── e2e/                                # End-to-end tests
│   ├── test_complete_deployment.nu     # Full deployment workflow
│   └── test_disaster_recovery.nu       # Backup/restore tests
├── performance/                        # Performance tests
│   ├── test_concurrency.nu             # Concurrency tests
│   └── test_scalability.nu             # Scalability tests
├── security/                           # Security tests
│   ├── test_rbac_enforcement.nu        # RBAC tests
│   └── test_kms_integration.nu         # KMS tests
└── docs/                               # Documentation
    ├── TESTING_GUIDE.md                # This guide
    ├── TEST_COVERAGE.md                # Coverage report
    └── ORBSTACK_SETUP.md               # OrbStack setup guide

Test Naming Conventions

  • Test Files: test_<feature>_<category>.nu
  • Test Functions: test-<specific-scenario>
  • Test Names: <mode>-<category>-<specific-scenario>

Examples:

  • File: test_dns_integration.nu
  • Function: test-dns-registration
  • Test Name: solo-mode-dns-registration
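Because --filter accepts a regex, these names can be used to select specific scenarios. For example, to run only the DNS registration tests in solo mode:

nu provisioning/tests/integration/framework/test_runner.nu --mode solo --filter "dns-registration"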

CI/CD Integration

GitHub Actions

Create .github/workflows/integration-tests.yml:

name: Integration Tests

on:
  pull_request:
  push:
    branches: [main]
  schedule:
    - cron: '0 2 * * *'  # Nightly at 2 AM

jobs:
  integration-tests:
    runs-on: macos-latest

    strategy:
      matrix:
        mode: [solo, multiuser, cicd, enterprise]

    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Install OrbStack
        run: brew install --cask orbstack

      - name: Create OrbStack machine
        run: orb create provisioning

      - name: Install Nushell
        run: brew install nushell

      - name: Setup test environment
        run: |
          nu provisioning/tests/integration/setup_test_environment.nu \
            --mode ${{ matrix.mode }}

      - name: Run integration tests
        run: |
          nu provisioning/tests/integration/framework/test_runner.nu \
            --mode ${{ matrix.mode }} \
            --report test-report.html

      - name: Upload test results
        if: always()
        uses: actions/upload-artifact@v3
        with:
          name: test-results-${{ matrix.mode }}
          path: |
            /tmp/provisioning-test-reports/
            test-report.html

      - name: Teardown test environment
        if: always()
        run: |
          nu provisioning/tests/integration/teardown_test_environment.nu --force

GitLab CI

Create .gitlab-ci.yml:

stages:
  - test

integration-tests:
  stage: test
  image: ubuntu:22.04

  parallel:
    matrix:
      - MODE: [solo, multiuser, cicd, enterprise]

  before_script:
    # Install the Docker CLI and curl; Nushell is not packaged for Ubuntu 22.04,
    # so install it from the official release binaries (adjust version/arch to
    # match the release asset you actually use)
    - apt-get update && apt-get install -y docker.io curl
    - curl -fsSL https://github.com/nushell/nushell/releases/download/0.107.1/nu-0.107.1-x86_64-unknown-linux-gnu.tar.gz | tar -xz -C /opt
    - ln -sf /opt/nu-0.107.1-x86_64-unknown-linux-gnu/nu /usr/local/bin/nu

  script:
    # Setup test environment
    - nu provisioning/tests/integration/setup_test_environment.nu --mode $MODE

    # Run tests
    - nu provisioning/tests/integration/framework/test_runner.nu --mode $MODE --report test-report.html

  after_script:
    # Cleanup
    - nu provisioning/tests/integration/teardown_test_environment.nu --force

  artifacts:
    when: always
    paths:
      - /tmp/provisioning-test-reports/
      - test-report.html
    reports:
      junit: /tmp/provisioning-test-reports/junit-results.xml

Troubleshooting

Common Issues

1. OrbStack Machine Not Found

Error: OrbStack machine 'provisioning' not found

Solution:

# Create OrbStack machine
orb create provisioning

# Verify creation
orb list

2. Docker Connection Failed

Error: Cannot connect to Docker daemon

Solution:

# Verify OrbStack is running
orb status provisioning

# Restart OrbStack
orb restart provisioning

3. Service Health Check Timeout

Error: Timeout waiting for service orchestrator to be healthy

Solution:

# Check service logs
nu provisioning/tests/integration/framework/orbstack_helpers.nu orbstack-logs orchestrator

# Verify service is running
docker -H /var/run/docker.sock ps | grep orchestrator

# Increase timeout in test_config.yaml
# test_execution.timeouts.test_timeout_seconds: 600

4. Test Environment Cleanup Failed

Error: Failed to remove test workspace

Solution:

# Manual cleanup
rm -rf /tmp/provisioning-test-workspace*

# Cleanup OrbStack resources
nu provisioning/tests/integration/framework/orbstack_helpers.nu orbstack-cleanup

5. DNS Resolution Failed

Error: DNS record should exist for server

Solution:

# Check CoreDNS logs
nu provisioning/tests/integration/framework/orbstack_helpers.nu orbstack-logs coredns

# Verify CoreDNS is running
docker -H /var/run/docker.sock ps | grep coredns

# Test DNS manually
dig @172.20.0.2 test-server.local

Debug Mode

Run tests with verbose logging:

# Enable verbose output
nu provisioning/tests/integration/framework/test_runner.nu --verbose --mode solo

# Keep environment after tests for debugging
nu provisioning/tests/integration/framework/test_runner.nu --skip-teardown --mode solo

# Inspect environment manually
docker -H /var/run/docker.sock ps
docker -H /var/run/docker.sock logs orchestrator

Viewing Test Logs

# View test execution logs
cat /tmp/provisioning-test.log

# View service logs
ls /tmp/provisioning-test-reports/logs/

# View HTML report
open /tmp/provisioning-test-reports/test-report.html

Performance Benchmarks

Expected test execution times:

| Test Suite | Duration (Solo) | Duration (Enterprise) |
|------------|-----------------|-----------------------|
| Mode Tests | 5-10 min | 15-20 min |
| Service Tests | 3-5 min | 10-15 min |
| Workflow Tests | 5-10 min | 15-20 min |
| E2E Tests | 10-15 min | 30-40 min |
| Total | 25-40 min | 70-95 min |

Parallel Execution (4 workers):

  • Solo mode: ~10-15 min
  • Enterprise mode: ~25-35 min

Best Practices

  1. Idempotent Tests: Tests should be repeatable without side effects
  2. Isolated Tests: Each test should be independent
  3. Clear Assertions: Use descriptive error messages
  4. Cleanup: Always clean up resources, even on failure (see the sketch after this list)
  5. Retry Flaky Operations: Use with-retry for network operations
  6. Meaningful Names: Use descriptive test names
  7. Fast Feedback: Run quick tests first, slow tests later
  8. Log Important Steps: Log key operations for debugging
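For practice 4, wrapping the test body in try/catch guarantees that fixtures are removed even when an assertion throws. A minimal sketch of the pattern, reusing the fixture helpers shown earlier (illustrative only):

# Illustrative pattern: clean up fixtures even when an assertion fails
def test-with-guaranteed-cleanup [test_config: record] {
    run-test "example-guaranteed-cleanup" {
        let workspace = create-test-workspace "cleanup-demo" { provider: "local" }
        try {
            # ... test logic and assertions ...
            assert-not-empty $workspace.path "Workspace should have a path"
        } catch {|err|
            cleanup-test-workspace $workspace
            error make { msg: $err.msg }   # re-raise so the failure is still reported
        }
        cleanup-test-workspace $workspace
    }
}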

References

  • TEST_COVERAGE.md: Test coverage report for the integration suite
  • ORBSTACK_SETUP.md: OrbStack machine setup guide

Maintained By: Platform Team | Last Updated: 2025-10-06