AWS Batch lets you easily and efficiently run hundreds of thousands of batch computing jobs on AWS.
AWS Batch
- AWS Batch processes large workloads in smaller chunks.
- AWS Batch uses Docker images as an environment to run a job.
- Simple Management
- No need to manage the infrastructure required for computing
- Automatic Provisioning and Scaling
- AWS Batch dynamically provisions the optimal quantity and type of compute resources (e.g., CPU or memory optimized instances) based on the volume and specific resource requirements of the batch jobs submitted.
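As a sketch of the auto-provisioning above, a managed Fargate compute environment can be declared with a JSON spec of roughly this shape (e.g. as input to `aws batch create-compute-environment`; the name, subnet, and security-group IDs are placeholders):

```json
{
  "computeEnvironmentName": "demo-fargate-env",
  "type": "MANAGED",
  "computeResources": {
    "type": "FARGATE",
    "maxvCpus": 16,
    "subnets": ["subnet-0example"],
    "securityGroupIds": ["sg-0example"]
  }
}
```

With `type: MANAGED`, AWS Batch itself decides how much capacity to launch within the `maxvCpus` ceiling, which is what "dynamically provisions the optimal quantity and type of compute resources" refers to.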
Components
- Jobs
- A unit of work
- Job Definitions
- Instructions on how jobs are to be run
- Job Queues
- A job is submitted to a specific queue, where it waits until it is scheduled onto a compute environment.
- Compute Environment
- A set of managed or unmanaged compute resources required to run your job
- Fargate
- Recommended for most cases
- EC2
- Used when you need to use custom AMIs
- Used when you need more vCPUs (more than 4) or memory (more than 30 GiB)
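A minimal sketch of how these components fit together, written as the request payloads you would pass to boto3's `register_job_definition` and `submit_job`. All names, the image, and the queue here are illustrative assumptions, and no API call is made:

```python
# Sketch: payloads tying the AWS Batch components together.
# All names below are hypothetical placeholders, not real resources.

# Job definition: instructions for how jobs run (Docker image, command, resources).
job_definition = {
    "jobDefinitionName": "demo-job-def",     # hypothetical name
    "type": "container",
    "platformCapabilities": ["FARGATE"],     # Fargate is recommended for most cases
    "containerProperties": {
        "image": "public.ecr.aws/docker/library/busybox:latest",
        "command": ["echo", "hello"],
        # Fargate sizes are capped (roughly 4 vCPU / 30 GiB); beyond that, use EC2.
        "resourceRequirements": [
            {"type": "VCPU", "value": "1"},
            {"type": "MEMORY", "value": "2048"},  # MiB
        ],
    },
}

# Job: a unit of work, submitted to a specific queue under a job definition.
job = {
    "jobName": "demo-job",
    "jobQueue": "demo-queue",                # hypothetical queue name
    "jobDefinition": "demo-job-def",
}

# With AWS credentials configured, these would be passed to boto3, e.g.:
#   batch = boto3.client("batch")
#   batch.register_job_definition(**job_definition)
#   batch.submit_job(**job)
```

The job references the job definition by name, and the queue routes it to a compute environment, which is the chain of components described above.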