[Optional] Load Generation


Testing Expansion & Shrink — Load Generation Scripts

You can optionally validate Lucidity AutoScaler behavior under real-world workload patterns using the load-generation scripts below for Windows and Linux. These scripts simulate a typical storage usage scenario in which data gradually grows and is then cleaned up — triggering both expansion and shrink operations.

What These Scripts Do

  • Run continuously in the background

  • Write 100 MB blocks of random data at regular intervals (every 5 seconds)

  • Accumulate enough data to cause the disk to expand

  • Remove the created files to reduce utilization

  • Trigger a shrink once data drops below thresholds

  • Repeat this for multiple cycles (configurable)
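The grow-then-shrink cycle described above can be sketched in miniature. The snippet below is illustrative only — the path and file sizes are placeholder values, scaled down from the 100 MB blocks the real scripts write:

```shell
# Minimal sketch of one grow-then-shrink cycle (illustrative path and sizes)
DATA_DIR="/tmp/lucidity-demo"   # placeholder scratch location
mkdir -p "$DATA_DIR"

# Growth phase: write a few small files (the real scripts write 100 MB each)
for i in 1 2 3; do
    dd if=/dev/zero of="$DATA_DIR/testfile_$i.dat" bs=1M count=1 status=none
done
echo "files after growth: $(ls "$DATA_DIR" | wc -l)"

# Clean-up phase: remove the files to bring utilization back down,
# which is what allows a shrink to trigger
rm -f "$DATA_DIR"/testfile_*.dat
echo "files after cleanup: $(ls "$DATA_DIR" | wc -l)"
```

The full scripts below wrap this same pattern in a loop with configurable cycle counts and durations.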

Why This is Representative

The pattern of moderate but sustained growth followed by clean-up closely mirrors common enterprise workloads, such as:

  • Logging and telemetry pipelines

  • Database and analytics temporary storage

  • Scratch space for batch jobs

  • Application staging areas

By adjusting the write interval and the number of cycles, you can model:

  • Slow/steady growth

  • Spiky or burst workloads

  • Normal business-hour usage patterns
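For example, the two knobs exposed by the Linux script can be set to approximate each pattern. The values below are illustrative starting points, not recommendations:

```shell
# Slow/steady growth: one long write phase
# DURATION_HOURS=6; CYCLES=1

# Spiky or burst workloads: many short cycles
DURATION_HOURS=1; CYCLES=6

# Normal business-hour usage: one working-day-length cycle
# DURATION_HOURS=8; CYCLES=1
```

The Windows script exposes the same knobs as `DurationHours` and `Cycles` in its `$config` block.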

This enables a realistic demonstration of Lucidity AutoScaler’s ability to expand and shrink storage — safely and automatically — without downtime.

To run the load generation, copy and paste the script below that matches your OS:

  • Windows

    # fill this section or leave default
    $config = @{
        Path          = "D:\TestPath"       # Where to create files
        DurationHours = 2                   # Hours per cycle (1-6)
        Cycles        = 3                   # Number of cycles to run
    }
    # Run the job (don't modify below)
    Start-Job -ScriptBlock {
        param($Path, $DurationHours, $Cycles)
        if (-not (Test-Path $Path)) { New-Item -Path $Path -ItemType Directory -Force | Out-Null }
        $rand = New-Object Random           # reuse one generator to avoid repeated seeding
        $endTime = (Get-Date).AddHours($DurationHours)
        $fileNum = 0
        for ($i = 1; $i -le $Cycles; $i++) {
            $files = @()
            # Growth phase: write a 100 MB file every 5 seconds until the cycle ends
            while ((Get-Date) -lt $endTime) {
                $fileNum++
                $fp = Join-Path $Path "testfile_$fileNum.dat"
                $b = New-Object byte[] (100MB)
                $rand.NextBytes($b)
                [IO.File]::WriteAllBytes($fp, $b)
                $files += $fp
                Start-Sleep 5
            }
            # Clean-up phase: delete the files to bring utilization back down
            foreach ($f in $files) {
                if (Test-Path $f) { Remove-Item $f -Force; Start-Sleep 7 }
            }
            if ($i -lt $Cycles) { $endTime = (Get-Date).AddHours($DurationHours) }
        }
    } -ArgumentList $config.Path, $config.DurationHours, $config.Cycles
  • Linux

    # Run directly in background
    (DATA_PATH="/data/testpath"; DURATION_HOURS=2; CYCLES=3; mkdir -p "$DATA_PATH"; filenum=0; \
    for ((cycle=1; cycle<=CYCLES; cycle++)); do \
        files=(); end_time=$(($(date +%s) + DURATION_HOURS * 3600)); \
        while [ $(date +%s) -lt $end_time ]; do \
            ((filenum++)); filepath="$DATA_PATH/testfile_$filenum.dat"; \
            dd if=/dev/urandom of="$filepath" bs=1M count=100 status=none; \
            files+=("$filepath"); sleep 5; \
        done; \
        for file in "${files[@]}"; do [ -f "$file" ] && rm -f "$file" && sleep 7; done; \
    done) > /tmp/file-cycle.log 2>&1 &
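While a run is in progress, you can check on the generator and stop it early if needed. The snippet below is a sketch: the log path matches the Linux command above, but the `df` path is a placeholder you should point at your test mount:

```shell
# Follow the generator's log (path from the Linux command above)
LOG=/tmp/file-cycle.log
[ -f "$LOG" ] && tail -n 20 "$LOG"

# Check utilization of the test mount (replace /tmp with your data path)
df -h /tmp

# List background jobs started from this shell; uncomment kill to stop early
jobs -l
# kill %1
```

On Windows, the equivalent cmdlets for the `Start-Job` command above are `Get-Job` to list the job, `Receive-Job` to read its output, and `Stop-Job` to end it early.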