This work is licensed under a Creative Commons Attribution 4.0 International License.
AWS BATCH FOR BATCH PROCESSING OF DOCKER CONTAINER
P. Kishan, Dr. S. Thavamani
DOI: 10.17148/IJARCCE.2026.15328
Abstract: Cloud computing has significantly transformed the way large-scale computational workloads are processed and managed. Traditional systems often struggle to handle high-volume batch jobs efficiently because of limitations in scalability, resource allocation, and automation. This paper presents a scalable solution for batch processing using AWS Batch integrated with containerized applications built with Docker.
The proposed system enables developers to run large numbers of batch computing jobs without manually provisioning or managing infrastructure. AWS Batch dynamically provisions the optimal quantity and type of compute resources based on the volume and requirements of submitted jobs. By packaging applications and their dependencies into Docker containers, the system ensures consistent execution environments, improved portability, and simplified deployment across cloud infrastructures.
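As a minimal illustration of this provisioning model (a sketch, not the paper's exact configuration), the following Python snippet uses boto3 to create a managed compute environment and an associated job queue. The environment and queue names, subnet, security group, and IAM role ARNs are placeholder assumptions.

import boto3

# Placeholder identifiers: replace with real subnet, security group,
# and IAM role ARNs from your AWS account.
batch = boto3.client("batch", region_name="us-east-1")

# A managed compute environment lets AWS Batch choose and scale EC2 capacity
# between minvCpus and maxvCpus based on the queued job load.
compute_env = batch.create_compute_environment(
    computeEnvironmentName="batch-demo-env",
    type="MANAGED",
    computeResources={
        "type": "EC2",
        "minvCpus": 0,                 # scale to zero when no jobs are queued
        "maxvCpus": 16,                # upper bound on provisioned capacity
        "instanceTypes": ["optimal"],  # let AWS Batch pick suitable instance types
        "subnets": ["subnet-xxxxxxxx"],
        "securityGroupIds": ["sg-xxxxxxxx"],
        "instanceRole": "arn:aws:iam::123456789012:instance-profile/ecsInstanceRole",
    },
    serviceRole="arn:aws:iam::123456789012:role/AWSBatchServiceRole",
)

# In practice, wait until the compute environment reaches the VALID state
# before attaching a job queue to it.
job_queue = batch.create_job_queue(
    jobQueueName="batch-demo-queue",
    priority=1,
    computeEnvironmentOrder=[
        {"order": 1, "computeEnvironment": compute_env["computeEnvironmentArn"]}
    ],
)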
In this project, Docker containers are stored and managed through Amazon Elastic Container Registry (ECR) and executed on scalable compute resources such as Amazon EC2 instances. AWS Batch automatically handles job scheduling, queue management, resource allocation, and monitoring, enabling efficient batch execution for workloads such as data processing, scientific simulations, and large-scale analytics.
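To illustrate how an ECR-hosted image is wired into AWS Batch, the sketch below registers a container job definition and submits a job to the queue. It is an assumption-laden example rather than the project's actual job definition; the ECR repository URI, queue name, command, and parameter names are placeholders.

import boto3

batch = boto3.client("batch", region_name="us-east-1")

# Placeholder ECR image URI; the image is assumed to have been built with
# Docker and pushed to an Amazon ECR repository beforehand.
ECR_IMAGE = "123456789012.dkr.ecr.us-east-1.amazonaws.com/batch-demo:latest"

# The job definition ties the container image to its CPU/memory requirements
# and default command.
job_def = batch.register_job_definition(
    jobDefinitionName="batch-demo-job-def",
    type="container",
    containerProperties={
        "image": ECR_IMAGE,
        "command": ["python", "process.py", "Ref::input_key"],
        "resourceRequirements": [
            {"type": "VCPU", "value": "1"},
            {"type": "MEMORY", "value": "2048"},  # MiB
        ],
    },
)

# Submitting a job only enqueues it; AWS Batch handles scheduling, placement,
# and resource allocation on the managed compute environment.
response = batch.submit_job(
    jobName="demo-data-processing",
    jobQueue="batch-demo-queue",
    jobDefinition=job_def["jobDefinitionArn"],
    parameters={"input_key": "s3://example-bucket/input/data.csv"},
)
print("Submitted job:", response["jobId"])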
The implemented architecture demonstrates how containerized workloads can be efficiently orchestrated using AWS Batch to improve performance, reduce operational complexity, and optimize resource utilization. The system also supports automated scaling and fault-tolerant execution, ensuring reliable job completion even under heavy workloads.
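The snippet below is a hedged sketch of how this fault tolerance and monitoring can be expressed: a retry policy and timeout are attached at submission time, and job state is polled with describe_jobs. The queue and job definition names carry over from the previous sketch and remain assumptions.

import time
import boto3

batch = boto3.client("batch", region_name="us-east-1")

# Retry up to 3 attempts; AWS Batch re-runs the container if an attempt fails,
# giving basic fault tolerance against transient errors.
job = batch.submit_job(
    jobName="demo-resilient-job",
    jobQueue="batch-demo-queue",         # assumed queue name
    jobDefinition="batch-demo-job-def",  # assumed job definition name
    retryStrategy={"attempts": 3},
    timeout={"attemptDurationSeconds": 3600},  # terminate attempts that hang
)

# Poll the job until it reaches a terminal state (SUCCEEDED or FAILED).
job_id = job["jobId"]
while True:
    description = batch.describe_jobs(jobs=[job_id])["jobs"][0]
    status = description["status"]
    print("Job status:", status)
    if status in ("SUCCEEDED", "FAILED"):
        break
    time.sleep(30)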
Overall, this research highlights the advantages of integrating AWS Batch with Docker-based containerization to build a robust, scalable, and cost-effective batch processing framework suitable for modern cloud-based applications.
How to Cite:
[1] P. Kishan, Dr. S. Thavamani, “AWS BATCH FOR BATCH PROCESSING OF DOCKER CONTAINER,” International Journal of Advanced Research in Computer and Communication Engineering (IJARCCE), vol. 15, no. 3, March 2026, DOI: 10.17148/IJARCCE.2026.15328.
