Batch Processing Use Case

Carbon-Aware Batch Processing & Data Pipelines

Schedule ETL jobs, data processing, and batch workloads during low-carbon, off-peak hours to reduce both emissions and electricity costs.

The Opportunity in Batch Workloads

Batch processing jobs—ETL pipelines, data aggregations, report generation, backups—often run on fixed schedules (midnight, hourly, daily) regardless of grid conditions or electricity pricing. These deferrable workloads are perfect candidates for carbon and cost optimization through intelligent scheduling.

ETL Pipelines

Overnight data transformations

Scheduled Jobs

Cron-based processing tasks

Cost Savings

Time-of-use rate optimization

Dual Optimization: Carbon + Cost

Carbon-Aware Scheduling

Schedule batch jobs to run when grid carbon intensity is lowest, automatically shifting workloads to cleaner energy windows.

  • Real-time carbon intensity data
  • Configurable thresholds per job
  • Maximum delay guarantees (SLA protection)
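As a concrete sketch, a carbon-aware job might opt in through the scheduler name plus per-job annotations. The annotation keys below are illustrative assumptions (the source only confirms that thresholds and delay bounds are configurable per job), so check the project's documentation for the real names:

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: nightly-aggregation
  annotations:
    # Hypothetical annotation keys, shown for illustration only --
    # consult the scheduler's docs for the actual API.
    compute-gardener-scheduler/carbon-intensity-threshold: "200"  # gCO2eq/kWh
    compute-gardener-scheduler/max-scheduling-delay: "6h"         # SLA protection
spec:
  template:
    spec:
      schedulerName: compute-gardener-scheduler
      restartPolicy: Never
      containers:
        - name: aggregate
          image: registry.example.com/aggregate:latest  # placeholder image
```

The job becomes schedulable immediately, but placement is deferred until grid carbon intensity drops below the threshold or the maximum delay is reached.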

Price-Aware Scheduling

Optimize for time-of-use (TOU) electricity pricing by scheduling during off-peak hours, reducing infrastructure costs alongside carbon emissions.

  • Custom TOU pricing schedules
  • Combine carbon + cost optimization
  • Prometheus metrics for ROI tracking
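A custom TOU schedule could be expressed in Helm values along these lines. The key names and structure here are a sketch, not the chart's confirmed schema; verify against the chart's values.yaml before use:

```yaml
# Illustrative price-aware configuration -- key names are assumptions.
priceAware:
  enabled: true
  schedules:
    - days: "Mon-Fri"
      offPeakStart: "22:00"
      offPeakEnd: "06:00"
      peakRate: 0.28      # $/kWh, example utility rates
      offPeakRate: 0.11   # $/kWh
```

With both carbon and price signals configured, the scheduler can weigh cleaner-energy windows against cheaper off-peak windows for the same deferrable job.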

Perfect For These Workloads

ETL & Data Transformation

Overnight data pipelines that process logs, aggregate metrics, or transform data for analytics. These jobs often have flexible completion windows (e.g., "must complete by 6 AM").

Example: Daily data warehouse refresh, log aggregation, reporting pipelines
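A daily warehouse refresh like the one above can stay an ordinary CronJob that simply opts into the scheduler: the job becomes eligible at midnight, and the scheduler chooses the actual start time within the flexible window. The image name is a placeholder:

```yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: warehouse-refresh
spec:
  schedule: "0 0 * * *"    # job becomes eligible at midnight...
  jobTemplate:
    spec:
      template:
        spec:
          # ...but the carbon-aware scheduler decides when it actually starts
          schedulerName: compute-gardener-scheduler
          restartPolicy: OnFailure
          containers:
            - name: refresh
              image: registry.example.com/etl/refresh:latest  # placeholder image
```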

Scheduled Maintenance & Backups

Database backups, index rebuilds, cache warming, and other maintenance tasks that run on cron schedules but don't need precise timing.

Example: Database backups, index optimization, cache pre-population

Periodic Report Generation

Weekly/monthly reports, analytics dashboards, or data exports that can be generated within a time window rather than at an exact moment.

Example: Weekly business reports, analytics exports, compliance reports

Quick Setup

1. Install (2 minutes)

helm repo add compute-gardener https://elevated-systems.github.io/compute-gardener-scheduler
helm install compute-gardener-scheduler compute-gardener/compute-gardener-scheduler \
  --set carbonAware.electricityMap.apiKey=YOUR_API_KEY
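If you prefer a values file over `--set` flags, the equivalent structure (inferred from the flag above; other chart options are not shown) is:

```yaml
# values.yaml -- equivalent to the --set flag in the install command.
carbonAware:
  electricityMap:
    apiKey: YOUR_API_KEY
```

Then install with `helm install compute-gardener-scheduler compute-gardener/compute-gardener-scheduler -f values.yaml`.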

2. Use It (add one line)

spec:
  schedulerName: compute-gardener-scheduler  # That's it!
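In context, that one line sits in the pod template of an otherwise ordinary manifest; names and image below are placeholders:

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: example-batch-job
spec:
  template:
    spec:
      schedulerName: compute-gardener-scheduler  # the one-line change
      restartPolicy: Never
      containers:
        - name: worker
          image: busybox:1.36
          command: ["sh", "-c", "echo processing"]
```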

Optimize Your Batch Workloads Today

Free, open-source scheduler for Kubernetes. Start cutting carbon emissions and electricity costs in minutes.
