
Work History:
Cloud Operations Engineer - Medline India Pvt Ltd, Pune
Cloud DevOps Engineer - Medline India Pvt Ltd
Sr. Support Engineer - Tech Mahindra Ltd
Linux System Administrator - Afforeserve Ltd
System Analyst - Afforeserve Ltd (Reliance Jio)
System Analyst - Afforeserve Ltd (Reliance Jio), New Mumbai
Linux System Administrator - Afforeserve Ltd (Konkan Railway)
System Administrator - Softenger Technology Ltd., New Mumbai
System Administrator - Softenger Technology Ltd.
System Administrator - Serco Global Services

Skills: Azure, Kubernetes, Terraform, Linux Admin, GitHub, Bitbucket, Docker, Linux, Shell Script
Hi, my name is Rahul. I have a total of 9+ years of experience. I started my journey in Linux and slowly, steadily moved into DevOps technologies. In my current organization I am working on Azure Cloud, along with Bitbucket as the version control system and Terraform Cloud, the enterprise version, for managing resources on the cloud. I am also using Kubernetes through the managed service from Azure, that is, Azure Kubernetes Service. I have been working in my current organization for the last 4 years. We have 6 clusters in our environment which we are managing on AKS, and apart from that we have PaaS services which are deployed on Azure. That's all about my introduction.
So, migrating the on-premises application to AWS using Docker containers. Containerization is about packaging the application so it uses far fewer resources than it does on on-premises virtual machines or physical servers. In that case you have to optimize the overall application in terms of everything: what the resource utilization is, and what dependencies the application has. Accordingly, you create a bundle of those things and then plan the strategy to migrate that bundled application to the cloud environment. On AKS as well, you have to think about why you are migrating the application to containers, that is, to a microservices architecture, before moving there. On the cloud provider side the advantages are that there is effectively no downtime for the application, it is very easy to apply regular upgrades and patches, and the environment is more secure. Those are the things we have to think about. Then we follow a deployment strategy for the application being migrated onto the microservices architecture. You can go for the canary deployment strategy, where the new version is first released to a small set of users; it is very popular for microservices architectures. We can also consider blue-green, but in the blue-green strategy we divide the production environment into two stages, blue and green: one is actually serving traffic while the other is where you upgrade the application. So, accordingly, we have to think about how we are deploying the application with a microservices architecture on the cloud.
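As a rough illustration of the canary idea mentioned above, here is a minimal sketch using the Terraform Kubernetes provider against an AKS cluster. The deployment names, labels, image tags, and the 9:1 replica split are assumptions for illustration only: a stable Deployment and a much smaller canary Deployment share the app label, so a Service selecting only that label sends roughly a tenth of the traffic to the new version.

```hcl
# Hypothetical canary split: both Deployments carry app = "web", so a Service
# selecting app = "web" spreads traffic roughly 9:1 between stable and canary.
resource "kubernetes_deployment" "web_stable" {
  metadata {
    name   = "web-stable"
    labels = { app = "web", track = "stable" }
  }
  spec {
    replicas = 9
    selector {
      match_labels = { app = "web", track = "stable" }
    }
    template {
      metadata {
        labels = { app = "web", track = "stable" }
      }
      spec {
        container {
          name  = "web"
          image = "myregistry.azurecr.io/web:v1" # current version (assumed tag)
        }
      }
    }
  }
}

resource "kubernetes_deployment" "web_canary" {
  metadata {
    name   = "web-canary"
    labels = { app = "web", track = "canary" }
  }
  spec {
    replicas = 1 # ~10% of traffic goes to the candidate version
    selector {
      match_labels = { app = "web", track = "canary" }
    }
    template {
      metadata {
        labels = { app = "web", track = "canary" }
      }
      spec {
        container {
          name  = "web"
          image = "myregistry.azurecr.io/web:v2" # candidate version (assumed tag)
        }
      }
    }
  }
}
```

If the canary looks healthy, the stable Deployment's image is rolled forward and the canary is scaled back down; blue-green would instead keep two full-size environments and switch the Service selector in one step.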
What will you do to secure Azure Logic App workflows? A Logic App is something you use to automate all your operations in your application environment, and in the background the Logic App needs an App Service plan. You secure the Logic App workflows by enabling VNet integration for the outbound traffic; Azure also provides the outbound IPs, which you have to allow on the destination resource that your Logic App connects to. So you secure your Logic App by enabling VNet integration, which gives it a private connection. Apart from that, you can restrict users by applying RBAC, that is, role-based access control, on your Logic App. So in addition to the secured network environment, you also have access restrictions on your Logic App.
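A very rough sketch of those two controls with the azurerm provider is below, assuming a Standard (App Service based) Logic App; the Logic App resource, subnet, and principal ID are hypothetical placeholders, and the exact resources differ for a Consumption Logic App.

```hcl
# Regional VNet integration for a Standard Logic App (which runs on an App
# Service plan), so outbound calls leave through the delegated subnet.
resource "azurerm_app_service_virtual_network_swift_connection" "logicapp_vnet" {
  app_service_id = azurerm_logic_app_standard.example.id # hypothetical Logic App
  subnet_id      = azurerm_subnet.integration.id         # hypothetical delegated subnet
}

# RBAC: only one assumed AAD group gets contributor rights on the Logic App.
resource "azurerm_role_assignment" "logicapp_contributors" {
  scope                = azurerm_logic_app_standard.example.id
  role_definition_name = "Logic App Contributor"          # built-in Azure role
  principal_id         = var.logic_app_team_object_id     # hypothetical group object ID
}
```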
How does containerization with Docker enhance application deployment? Containerization with Docker enhances application deployment in a cloud environment because, when you deploy your application as containers, reliable managed services are available on the cloud; in terms of Azure, that is Azure Kubernetes Service. Apart from that, we have the container instance offering. So you can go ahead and deploy your application workload on the Azure cloud, where all the facilities to run containers are already there. A container instance is dedicated to a single application, but if you have multiple different services, then you definitely have to consider AKS.
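For the single-application case mentioned above, a hedged sketch of an Azure Container Instance in Terraform might look like this; the resource group, names, and image are placeholders, and for multiple cooperating services the workload would go to AKS instead.

```hcl
# Single containerized application on Azure Container Instances (hypothetical names).
resource "azurerm_container_group" "app" {
  name                = "demo-aci"
  resource_group_name = azurerm_resource_group.rg.name     # assumed existing resource group
  location            = azurerm_resource_group.rg.location
  os_type             = "Linux"
  ip_address_type     = "Public"
  dns_name_label      = "demo-aci-example"

  container {
    name   = "web"
    image  = "myregistry.azurecr.io/web:v1" # assumed image in ACR
    cpu    = 0.5
    memory = 1.0

    ports {
      port     = 80
      protocol = "TCP"
    }
  }
}
```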
How would you secure sensitive data in Terraform code without committing it to VCS? In our environment we are using Terraform Cloud, and Terraform Cloud itself provides a variables section where you can add your variable and its value. While adding the value, you select what kind of value you are providing; if it is sensitive data, you can mark it as sensitive right there, so the value is stored securely in the workspace and never committed to the VCS.
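Alongside marking the workspace variable as sensitive in Terraform Cloud, the variable can also be declared sensitive in the code itself so its value is redacted from plan output; a minimal sketch, with an illustrative variable name and an assumed existing Key Vault:

```hcl
# The value comes from a Terraform Cloud workspace variable marked "sensitive"
# (or an environment variable such as TF_VAR_db_password), never from the repo.
variable "db_password" {
  type      = string
  sensitive = true
}

# Example consumer: store the secret in Key Vault rather than in state-adjacent files.
resource "azurerm_key_vault_secret" "db_password" {
  name         = "db-password"
  value        = var.db_password
  key_vault_id = azurerm_key_vault.main.id # assumed existing Key Vault
}
```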
Automating scaling of AWS resources based on demand using Terraform. For that, with Terraform you have to enable the pipeline for it when you are creating the resources. So whenever the demand is there and more resources are needed, that will be handled by what your pipeline has already provisioned.
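The answer above leaves the mechanics to the pipeline; one common way to express demand-based scaling directly in Terraform (not spelled out above) is an EC2 Auto Scaling group with a target-tracking policy. A minimal sketch, assuming a launch template and subnet IDs already exist:

```hcl
# Auto Scaling group sized by demand (hypothetical launch template and subnets).
resource "aws_autoscaling_group" "web" {
  name                = "web-asg"
  min_size            = 2
  max_size            = 10
  desired_capacity    = 2
  vpc_zone_identifier = var.private_subnet_ids # assumed list of subnet IDs

  launch_template {
    id      = aws_launch_template.web.id # assumed existing launch template
    version = "$Latest"
  }
}

# Target-tracking policy: AWS adds/removes instances to keep average CPU near 60%.
resource "aws_autoscaling_policy" "cpu_target" {
  name                   = "keep-cpu-at-60"
  autoscaling_group_name = aws_autoscaling_group.web.name
  policy_type            = "TargetTrackingScaling"

  target_tracking_configuration {
    predefined_metric_specification {
      predefined_metric_type = "ASGAverageCPUUtilization"
    }
    target_value = 60
  }
}
```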
In the Dockerfile, yes. With the Dockerfile as defined, while building it, it will not be able to reach github.com, because this is just the image creation; it is not a container deployed in your environment, so the network will not be there. You are building from a Dockerfile, so it will not be able to run the git clone https://github.com/example/repo.git command, because no network is attached during the image creation.
Given this Terraform snippet that initializes a new AWS EC2 instance, identify and explain what's wrong with the variable interpolation and how it could affect the infrastructure provisioning. Yes, so the given Terraform code is for an AWS EC2 instance, and it is a valid aws_instance. But while creating the tags for it, it interpolates var.environment, and that environment variable is not defined anywhere in this snippet. If it is not declared in your variables.tf, Terraform will ask you to define the variable, or the plan will fail, before anything is provisioned.
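A hedged reconstruction of the fix being described: declare the variable (for example in variables.tf) and reference it in the tags; without the declaration, terraform plan stops with an undeclared-variable error. The default value, AMI variable, and instance type here are assumptions.

```hcl
# variables.tf: declare the variable that the tags reference.
variable "environment" {
  type        = string
  description = "Deployment environment tag (e.g. dev, staging, prod)"
  default     = "dev" # assumed default; omit it to force the caller to supply a value
}

# main.tf: the EC2 instance can now safely interpolate the variable.
resource "aws_instance" "web" {
  ami           = var.ami_id   # assumed variable holding the AMI ID
  instance_type = "t3.micro"

  tags = {
    Name        = "web-${var.environment}"
    Environment = var.environment
  }
}
```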
Design a Terraform module to deploy a multi-tier web application using AWS services. Yes, in this case, why are we creating the module? Because a module is the resource definition you reuse whenever you create any resource, and it has to be written with the organizational standards and security standards in mind. So, no doubt, while building the multi-tier web application you have to define your modules considering those organizational and security standards. That is very important, because when you define a module for your web application you are definitely going to reuse it, and you have to build in parameters like security and the organizational standards. Those are the important things to keep in mind while defining a Terraform module.
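A rough sketch of how a root configuration might compose such modules; the module names, local sources, and inputs are illustrative assumptions, not a definitive design.

```hcl
# Root module wiring three hypothetical tiers together; each child module
# encapsulates the organizational and security standards mentioned above.
module "network" {
  source   = "./modules/network"
  vpc_cidr = "10.0.0.0/16"
}

module "app" {
  source        = "./modules/app"
  subnet_ids    = module.network.private_subnet_ids
  instance_type = "t3.medium"
  min_size      = 2
  max_size      = 6
}

module "database" {
  source     = "./modules/database"
  subnet_ids = module.network.db_subnet_ids
  engine     = "postgres"
  multi_az   = true
}
```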
CI/CD pipeline, yes. We can go for GitHub; or rather, not GitHub but GitLab; or else we can go for the legacy Jenkins; otherwise we can go for Azure DevOps if you have it in your organization, and define your pipeline there. You also have to think about the deployment strategy for your Kubernetes ecosystem. For a Kubernetes ecosystem I think canary deployment will be the most suitable way to deploy the application, because whenever you upgrade the application in the future, depending on your need, that strategy helps you by testing it with a small set of users first.
Coding standards. For coding standards, I think we can go for a checking mechanism like Checkov, which is also available as an integration with your editor, such as Visual Studio Code; you can integrate Checkov there to scan your code. Apart from that, you can use terraform validate while you are writing the code. Versioning you can maintain in your version control system, like Bitbucket or GitHub; or else, if you are using Terraform Cloud, they also provide the Terraform private registry to store your modules, and there you can manage the versioning of your Terraform code.
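On the versioning point, a minimal sketch of pinning the Terraform, provider, and private-registry module versions is below; the organization and module names are placeholders, and the constraints shown are just one reasonable choice.

```hcl
# Pin the Terraform and provider versions so every run of the pipeline is reproducible.
terraform {
  required_version = ">= 1.5.0"

  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 3.0"
    }
  }
}

# Consume a versioned module from the Terraform Cloud private registry
# (organization and module names are hypothetical).
module "network" {
  source  = "app.terraform.io/example-org/network/azurerm"
  version = "1.2.0"

  address_space = ["10.10.0.0/16"]
}
```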