An Efficient and Scalable Traffic Load Balancing Based on Webserver Container Resource Utilization using Kubernetes Cluster


Authors : Ashok L Pomnar; Dr. S.K. Sonkar

Volume/Issue : Volume 7 - 2022, Issue 5 - May

Google Scholar : https://bit.ly/3IIfn9N

Scribd : https://bit.ly/3Hzncjf

DOI : https://doi.org/10.5281/zenodo.6640228

Abstract : In today's digital transformation era, traditional web-based systems and application architectures have become a bottleneck for scaling, high availability, and portability, and suffer from downtime and other performance-related challenges, so the world has started adopting the cloud and cloud-based services to obtain their benefits. Virtualization technology played a crucial role in migrating applications from traditional physical systems to virtualized systems, but because of the scalability, high-availability, and application-portability limitations of virtualized systems, the container technique has gained increasing attention in recent years and has become an alternative to traditional virtual machines for running applications. The primary motivations for enterprises to adopt container technology include the convenience of encapsulating and deploying applications, their light weight, and their efficiency and flexibility in resource sharing. Considering the current vertical and horizontal scaling challenges of high-traffic websites, this paper explains the benefits of cloud technology, virtualization, containers, and Kubernetes clustering features. It also discusses an algorithm for building dynamically scaling, high-traffic platforms and applications that are scaled up and down at runtime according to user requests and traffic demand. E-business, online web applications, and mobile user accessibility have grown exponentially with the spread of the Internet of Things, data processing, data analytics, and many other online portals. To improve web application performance, availability, scalability, and resource utilization, many service providers have started designing services as microservices and deploying them in containers on a Kubernetes cluster. However, existing approaches fail to address service availability and the handling of high traffic loads, resulting in dissatisfied end users and business impact. To address these issues, the proposed work evaluates many interesting aspects of running high-traffic website applications on a containerized, dynamically scaling, highly available cluster. The proposed setup runs on the cloud, and the evaluation considers how conveniently the execution environment can be set up, the makespans of different workloads running in each setup, how efficiently hardware resources such as CPU and memory are utilized, and how well each environment scales. The results show that, compared with virtual machines, containers provide an easier-to-deploy and more scalable environment for high-traffic workloads.

Keywords : Docker, Cloud Computing, Kubernetes Cluster, K8S, Microservice, Web Server, HAProxy Load Balancer, NFS Storage
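To illustrate the dynamic scaling behaviour described in the abstract, the sketch below shows a minimal Python control loop that applies the proportional rule used by Kubernetes' Horizontal Pod Autoscaler, desiredReplicas = ceil(currentReplicas × currentUtilization / targetUtilization). The helper functions, target utilization, and replica bounds are illustrative placeholders, not the paper's implementation; in a real cluster the utilization would come from the metrics server and scaling would be applied through the Kubernetes API or kubectl.

```python
import math
import time

# Placeholder: in a real cluster this value would come from the Kubernetes
# metrics server (average CPU usage of the webserver pods).
def get_average_cpu_utilization() -> float:
    return 75.0  # illustrative observed CPU %

# Placeholder: in a real cluster this would call the Kubernetes API
# (or `kubectl scale deployment webserver --replicas=N`).
def scale_webserver_deployment(replicas: int) -> None:
    print(f"scaling webserver deployment to {replicas} replicas")

TARGET_UTILIZATION = 60.0        # desired average CPU % per container (assumed)
MIN_REPLICAS, MAX_REPLICAS = 2, 10  # assumed bounds

def desired_replicas(current_replicas: int, current_utilization: float) -> int:
    # Proportional rule: grow or shrink the replica count by the ratio of
    # observed to target utilization, clamped to the configured bounds.
    wanted = math.ceil(current_replicas * current_utilization / TARGET_UTILIZATION)
    return max(MIN_REPLICAS, min(MAX_REPLICAS, wanted))

def control_loop(poll_seconds: int = 30, iterations: int = 3) -> None:
    replicas = MIN_REPLICAS
    for _ in range(iterations):
        utilization = get_average_cpu_utilization()
        new_replicas = desired_replicas(replicas, utilization)
        if new_replicas != replicas:
            scale_webserver_deployment(new_replicas)
            replicas = new_replicas
        time.sleep(poll_seconds)

if __name__ == "__main__":
    control_loop(poll_seconds=1)
```

With the assumed values above, two replicas observed at 75% CPU against a 60% target would be scaled to ceil(2 × 75 / 60) = 3 replicas, and the count would fall back as traffic and utilization drop.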

