Amogh Chavan

Vetted Talent

Software Engineer/Architect. I have experience in building ecommerce platforms, delivery services, and analytics dashboards.

Technologies I have used include, but are not limited to:

- Node.js, Express, JavaScript, TypeScript, Rust, Python, C++, C, Flutter, Android

- Docker, Firebase FCM, AWS (SQS, S3, SNS), Kubernetes, Git, CI/CD

- PostgreSQL, MySQL, MongoDB, DynamoDB, Redis

- Routing services like OSRM and Google Maps

I have a strong passion for algorithms and a keen interest in crafting complex backend applications at large scale. I hold a Bachelor of Computer Science from Mumbai University, with a CGPI of 9.5/10.

  • Role

    Senior Software Engineer

  • Years of Experience

    4 years

  • Professional Portfolio

    View here

Skillsets

  • Node.js - 4 Years
  • Git
  • JavaScript
  • TypeScript
  • SQL - 3 Years
  • React.js - 1 Year
  • Full Stack - 1 Year
  • Algorithms
  • AWS Services
  • Database management
  • Networking Concepts
  • Terraform
  • Driving proposals
  • Presenting to stakeholders

Vetted For

3 Skills
  • Backend Developer (Node.js) AI Screening - 82%
  • Skills assessed: AI, ML, RabbitMQ
  • Score: 74/90

Professional Summary

4 Years
  • Apr, 2024 - Present · 1 yr 6 months

    Senior Software Engineer

    DAZN Software Pvt. Ltd.
  • Mar, 2022 - Feb, 2024 · 1 yr 11 months

    Software Engineer

    Techshack Pvt. Ltd.
  • Jun, 2021 - Mar, 2022 · 9 months

    Full Stack Developer

    E2E Research Services Pvt. Ltd.

Applications & Tools Known

  • Node.js
  • Express
  • JavaScript
  • TypeScript
  • REST API
  • MySQL
  • PostgreSQL
  • MongoDB
  • Amazon DynamoDB
  • Redis
  • AWS Cloud
  • AWS SQS
  • SNS
  • Docker
  • Kubernetes
  • Terraform
  • GitHub Actions
  • Git
  • Kibana
  • Google Maps
  • OSRM
  • Postman
  • Jira
  • Confluence
  • Slack
  • Microsoft Teams

Work History

4 Years

Senior Software Engineer

DAZN Software Pvt. Ltd.
Apr, 2024 - Present · 1 yr 6 months
    Contributed to projects that prevent illegal account sharing. Designed and implemented key features, conducted data analysis, and optimized SQL queries. Assessed AWS service limitations, implemented Amazon ElastiCache, and created velocity-based algorithms. Used Terraform for backend architecture updates and maintained CI/CD pipelines.

Software Engineer

Techshack Pvt. Ltd.
Mar, 2022 - Feb, 2024 · 1 yr 11 months
    Designed and built core backend systems for quick-commerce services. Developed and maintained backend services, designed key features, and integrated real-time calculations and secure payments. Built customer/vendor management APIs and a tracking system, integrated Firebase Cloud Messaging and Amazon SES, implemented WebSockets, designed database schemas, and integrated third-party services.

Full Stack Developer

E2E Research Services Pvt. Ltd.
Jun, 2021 - Mar, 2022 · 9 months
    Developed a data analytics dashboard and parser engine, with a user-friendly UI built in React and TypeScript, RESTful APIs in NestJS, and MongoDB for data storage. Integrated Highcharts and PptGenjx, and used Docker to containerize the applications.

Achievements

  • Played a leading role in developing software dashboards that have generated an average revenue of $10,000 from every E2E Research client.
  • Awarded for exceptional performance as a backend developer on the Raven Dashboard project.
  • Special Appreciation Award (Sep 2021), E2E Research Services Pvt. Ltd., awarded for exceptional performance as a backend developer on the project.
  • Contributed 60% of the entire codebase across all Speedyy backend services.

Testimonial

Techshack Private Limited

Arif Chan, Business Head

It is with great pleasure that I recommend Amogh Chavan for any back end development role. Amogh worked with us at Speedyy for 2 years, during which he consistently showcased his remarkable talent and speed in coding. His ability to swiftly grasp the ideas of our product managers and deliver on requirements was truly impressive.

One of Amogh's standout qualities was his seamless collaboration with our front end team, resulting in a cohesive and efficient workflow. His contributions were instrumental in the successful development and deployment of several key products, including food delivery, grocery delivery, pharmacy delivery, and pick-up and drop services.

Beyond his technical skills, Amogh's dedication and work ethic were exemplary. He consistently went above and beyond, often putting in extra hours to ensure project success, without hesitation. His commitment to excellence and willingness to go the extra mile made him an invaluable asset to our team.

In summary, Amogh Chavan is an exceptionally talented and diligent back end developer who would be a valuable addition to any team. I highly recommend him without reservation.

Major Projects

7 Projects

Speedyy Delivery Service

Speedyy
Mar, 2022 - Present · 3 yr 7 months
    The Speedyy Delivery Service is an independent delivery service that uses dedicated in-house staff to deliver products or services directly to customers. It is used by all of Speedyy's internal services, including food, grocery, pharmacy, and pickup and drop. I created a tracking service through which we could track the exact location of delivery agents, as well as their uptime and downtime during defined shift hours. I also worked on the algorithm that finds the nearest delivery agent within a given radius, considering parameters such as distance, time, ongoing orders on that rider, and the rider's overall rating; a sketch of the idea follows below. The technology stack for this project includes Express, TypeScript, PostgreSQL, Redis, Open Source Routing Machine (OSRM), and Google Maps.
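    As an illustration, here is a minimal TypeScript sketch of that rider-selection idea: score each candidate within the radius on distance, ETA, current load, and rating. The Rider shape and the weights are hypothetical, not the production algorithm:

        interface Rider {
          id: string;
          distanceKm: number;    // distance from the pickup point
          etaMinutes: number;    // routing-service estimate (e.g. OSRM)
          ongoingOrders: number; // orders already assigned to this rider
          rating: number;        // overall rating, 1..5
        }

        // Lower score is better: distance, ETA, and load penalize; rating rewards.
        function pickNearestRider(candidates: Rider[], radiusKm: number): Rider | undefined {
          return candidates
            .filter((r) => r.distanceKm <= radiusKm)
            .map((r) => ({
              rider: r,
              score: r.distanceKm + 0.5 * r.etaMinutes + 2 * r.ongoingOrders - r.rating,
            }))
            .sort((a, b) => a.score - b.score)[0]?.rider;
        }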

Speedyy Superapp Food and Grocery Delivery

Mar, 2022 - Present · 3 yr 7 months
    The Speedyy superapp provides services like food and grocery delivery at the customer's doorstep. As a key contributor to the Speedyy Superapp, I played a crucial role in various areas, including cart and order management, where I optimized and implemented order-tracking functionality. I also worked on outlet and customer serviceability, and developed the outlet rating system, allowing users to provide feedback and enhancing the overall user experience. I worked on payments and refunds, ensuring secure transactions and seamless refund processes. I successfully integrated the app with vendor systems, POS systems, and the delivery system to facilitate efficient order processing and delivery, and implemented vendor payouts, subscriptions, and coupon features. The technology stack for this project includes Express, TypeScript, PostgreSQL, Redis, AWS DynamoDB, AWS SQS, and Elasticsearch.

Speedyy Notification Service

Mar, 2022 - Present · 3 yr 7 months
    The Speedyy Notification Service is responsible for sending customized push notifications and emails to end users' devices. As part of this project, I integrated Firebase Cloud Messaging for push notifications and Amazon Simple Email Service (SES) for emails. Additionally, I created services that let us track notification failures, view users' recent notifications, send the same notification to multiple user devices, and create notifications with sound. I also implemented dynamic email and push notification template generation. The technology stack for this project includes Express, TypeScript, PostgreSQL, Redis, FCM, SES, and SQS.

Speedyy Superapp Pickup and Drop Service

Mar, 2022 - Present · 3 yr 7 months
    As part of this project, I developed a door-to-door package delivery service for a variety of items, such as clothes, documents, accessories, and electronics. To integrate real-time distance and time calculations, I used the Speedyy delivery service and its delivery fleet API to conduct serviceability checks. Additionally, I created order, cart, payment, and coupon services within the pickup and drop service. The technology stack for this project includes Express, TypeScript, PostgreSQL, Redis, and DynamoDB.

Speedyy Superapp

Techshack pvt ltd
Mar, 2022 - Feb, 2024 · 1 yr 11 months
    • Speedyy Superapp offers a wide array of services, including food, grocery, cake, medicine, and electronics delivery directly to customers' doorsteps. Additionally, it facilitates product pickup and drop-off, as well as cab booking for transportation needs.
    • As a key contributor to the Speedyy Superapp, I played a crucial role in various areas, including cart and order management, where I optimized and implemented order-tracking functionality. I also worked on outlet and customer serviceability, and developed the outlet rating system, allowing users to provide feedback and enhancing the overall user experience.
    • I worked on payments and refunds, ensuring secure transactions and seamless refund processes. I successfully integrated the app with vendor systems, POS systems, and the delivery system to facilitate efficient order processing and delivery, and implemented vendor payouts, subscriptions, and coupon features.
    • As part of this project, I integrated Firebase Cloud Messaging for push notifications, Amazon Simple Email Service (SES) for emails, and an SMS service provider for messages to end users.
    • The technology stack for this project includes Node.js, Express, TypeScript, Rust, PostgreSQL, Redis, AWS (DynamoDB, S3, SQS, SNS, EC2), Elasticsearch, and FCM (Firebase Cloud Messaging).
    • Links

    https://drive.google.com/file/d/1R0vWgOuRGEU0tohmLYZdR-KYyrY7S2aC/view

    https://drive.google.com/file/d/1eLSqx-IbuCC-9E1uPfOqv7qk81BrRZPS/view?usp=drive_link

HFS Pulse Dashboard

E2E Research Services
Jun, 2021 - Mar, 2022 · 9 months
    The HFS Pulse Dashboard showcases HFS's latest survey-based research about current and future demand trends for technology and business services and related emerging technologies. I was responsible for developing RESTful APIs using the NestJS framework, designing the UI in React with TypeScript, integrating Highcharts and PptGenjx functionality, and designing the schema for the NoSQL database. The technology stack for this project includes NestJS, TypeScript, MongoDB, Redis, and Docker.

Raven

Jun, 2021 - Mar, 2022 · 9 months
    Raven is a SaaS platform that offers data visualization for survey data. It accepts survey data from tools like Confirmit and Decipher and provides user-friendly charts, PDFs, editable PPTs, and Excel sheets for data visualization. My responsibilities included developing RESTful APIs using the NestJS framework, designing the UI in React (TypeScript), implementing Highcharts and PptGenjx, and designing the NoSQL database schema. I also introduced and implemented Docker to containerize the application. The technology stack for this project includes NestJS, TypeScript, MongoDB, Redis, and Docker.

Education

  • Bachelor's Degree in Computer Science

    D. G. Ruparel College of Arts, Science and Commerce

Certifications

  • The Last Algorithms Course You'll Need

    (Nov, 2023)
  • Special Appreciation Award

  • Avishkar Research Award

Interests

  • Chess

AI-Interview Questions & Answers

    Will you help me understand more about your background? I am a software engineer who builds robust and scalable backend systems, and I have been working with Node.js for the last 4 years. My passion for Node.js started during my graduation. I began with freelancing work, then joined E2E Research Services in Delhi, where I created data analytics dashboards. It was a survey company, so they had a lot of data coming into their databases and needed a tool to visualize it. I built that tool with React on the front end, NestJS on the back end, and MongoDB as the database; it supported operations like transposing data, rendering different kinds of graphs, and combining two questions using dynamic filters. Then I joined Speedyy, a product-based startup building India's first super app: instead of installing ten different apps, you install one and access all of those e-commerce platforms. I was with Speedyy from day one, so I had the opportunity to design the backend distributed systems, do all the database schema work, and build the core features. As a result, I have a good understanding of how e-commerce platforms work: we did around 1 lakh orders in one year, reached a lifetime of 30,000 users, and ran our own in-house logistics service.

    What approach would you take to manage transactions in a Node.js application? We can use Postgres transactions. Take a banking application as an example: if we want to record a deposit in two tables, we create a transaction object, pass it into the insert or update queries on both tables, and commit at the end. The transaction guarantees the data stays synchronized across all tables and never ends up partially written if the system fails mid-process. In my Node.js application, I would create a database module containing a database class with a transaction-generation function that returns a new transaction object each time it is called. I can then pass that transaction object to my custom model functions and commit or roll back depending on the business logic, as sketched below.
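    A minimal sketch of that helper using the pg library; the Database class and withTransaction names are illustrative, not from an actual codebase:

        import { Pool, PoolClient } from "pg";

        class Database {
          private pool = new Pool({ connectionString: process.env.DATABASE_URL });

          // Run `work` inside one transaction: BEGIN, then COMMIT on success
          // or ROLLBACK on any error, always releasing the client afterwards.
          async withTransaction<T>(work: (tx: PoolClient) => Promise<T>): Promise<T> {
            const client = await this.pool.connect();
            try {
              await client.query("BEGIN");
              const result = await work(client);
              await client.query("COMMIT");
              return result;
            } catch (err) {
              await client.query("ROLLBACK");
              throw err;
            } finally {
              client.release();
            }
          }
        }

        // Usage: both updates commit together or not at all (hypothetical tables).
        const db = new Database();
        await db.withTransaction(async (tx) => {
          await tx.query("UPDATE accounts SET balance = balance - $1 WHERE id = $2", [100, 1]);
          await tx.query("UPDATE accounts SET balance = balance + $1 WHERE id = $2", [100, 2]);
        });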

    What would be your strategy for handling session management in scalable Node.js APIs? For session management, whenever the user logs in we generate a session ID and store it in a persistent store or a cache database like Redis. If there is already an active session for that user, we delete the previous session ID and generate a new one. Concretely: when a user hits the login API, we generate a session ID, store it in the database, put it into a JWT payload, and return the token to the user, who then uses that JWT to access all the APIs. On each request the backend decodes the JWT, checks that it contains a session ID, and checks that the session ID is valid and currently stored against that user in the database. If it matches, the user is authorized; otherwise we return 401. So the JWT carries the session ID and the database stores it, and because we regenerate the session ID on every login, the old one gets overwritten. This gives a good session management system; cookies are also an option alongside JWTs.
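    A minimal sketch of that flow using the jsonwebtoken and ioredis libraries; the key names and 24-hour TTL are illustrative assumptions:

        import jwt from "jsonwebtoken";
        import Redis from "ioredis";
        import { randomUUID } from "crypto";

        const redis = new Redis();
        const JWT_SECRET = process.env.JWT_SECRET ?? "dev-secret";

        // On login: overwrite any previous session so only one stays active.
        async function login(userId: string): Promise<string> {
          const sessionId = randomUUID();
          await redis.set(`session:${userId}`, sessionId, "EX", 60 * 60 * 24);
          return jwt.sign({ userId, sessionId }, JWT_SECRET, { expiresIn: "24h" });
        }

        // On every request: the token's session ID must match the stored one.
        async function authorize(token: string): Promise<string> {
          const payload = jwt.verify(token, JWT_SECRET) as { userId: string; sessionId: string };
          const current = await redis.get(`session:${payload.userId}`);
          if (current !== payload.sessionId) throw new Error("401 Unauthorized");
          return payload.userId;
        }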

    In what ways can you utilize the observer design pattern in Node.js involving RabbitMQ? Take a backend system for a food delivery application as an example. When a new order comes into the system, we need to inform the restaurant, find a delivery partner, update multiple tables, send push notifications to vendors and customers, and email the customer the bill and payment details. If we did all of this in the API itself, it would be very bulky, and there is a chance the work gets partially completed if the system fails. Instead, we push a message into RabbitMQ, which forwards it to different channels: the delivery-partner service listens on one channel, the restaurant service on another. Once the message is published, multiple entities receive it, process it, update their respective databases, and perform their respective operations. With RabbitMQ we do not have to do all this in one common API: we handle it asynchronously with queues, which is efficient. The API stays fast; we simply return 200, tell the customer the order is accepted, and process it at a later point in time, as in the sketch below.
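    A sketch of that fanout using the amqplib library: the order API publishes one event and each service observes it independently. The exchange and queue names are illustrative assumptions:

        import amqp from "amqplib";

        const EXCHANGE = "order.created"; // one event, many observers

        async function publishOrderCreated(order: { id: string }) {
          const conn = await amqp.connect("amqp://localhost");
          const ch = await conn.createChannel();
          await ch.assertExchange(EXCHANGE, "fanout", { durable: true });
          ch.publish(EXCHANGE, "", Buffer.from(JSON.stringify(order)));
          await ch.close();
          await conn.close();
        }

        // Each service binds its own queue to the exchange and reacts on its own.
        async function subscribe(service: string, handle: (order: { id: string }) => void) {
          const conn = await amqp.connect("amqp://localhost");
          const ch = await conn.createChannel();
          await ch.assertExchange(EXCHANGE, "fanout", { durable: true });
          const { queue } = await ch.assertQueue(`${service}.${EXCHANGE}`, { durable: true });
          await ch.bindQueue(queue, EXCHANGE, "");
          await ch.consume(queue, (msg) => {
            if (msg) {
              handle(JSON.parse(msg.content.toString()));
              ch.ack(msg);
            }
          });
        }

        await subscribe("restaurant", (o) => console.log("notify restaurant about", o.id));
        await subscribe("delivery", (o) => console.log("assign a rider for", o.id));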

    When designing a RESTful API in Node.js, how would you ensure it supports versioning for future updates? Versioning APIs is very important, and there are a few strategies. Off the top of my head, there is URL versioning, and we can also pass a version header parameter in the API. The first way puts the version in the API URL itself: if your URL is xyz.com/api/place-order and you want a new version of place-order, the API becomes xyz.com/v1/place-order. The drawback is that every version change needs a new route and a new controller, though it can also be beneficial because the old and new routes stay separated. The second way keeps the route the same and passes the version in the API headers: the route is always xyz.com/place-order, a version key and value go in the headers, and at the controller level the backend switches on that header value. There is also a third way, sending the version in a query parameter. There are other approaches too, but these are the widely used ones; the first two are sketched below.
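    A minimal sketch of both strategies using Express; the route names and the api-version header are illustrative assumptions:

        import express from "express";

        const app = express();

        // Strategy 1: version in the URL path, one route per version.
        app.post("/v1/place-order", (_req, res) => res.json({ version: 1 }));
        app.post("/v2/place-order", (_req, res) => res.json({ version: 2 }));

        // Strategy 2: one route, controller switches on a version header.
        app.post("/place-order", (req, res) => {
          const version = req.header("api-version") ?? "1";
          if (version === "2") return res.json({ version: 2 });
          return res.json({ version: 1 });
        });

        app.listen(3000);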

    What caching strategies can be employed in Node.js to enhance API performance? Caching should be the last resort for improving performance: maybe your database query is slow, maybe your logic is slow, so address those first. If you do want caching, there are multiple caching databases like Redis and Memcached in which we can cache results, so instead of hitting the database we hit the cache. Many tools already have built-in caches: databases and load balancers will cache a route that receives the same request repeatedly or a query that keeps producing the same output. But if you want a custom cache for your own logic, you can use Redis. For example, suppose we have a food delivery platform, and whenever a customer opens the app we send the list of restaurants, which can run to 500 or 1,000 entries. We cannot, for each customer on each API request, recalculate whether each of those restaurants is open and operational, or which restaurant is closest to that user. Instead we can precalculate every restaurant's opening and closing times and store them in the cache, so when a customer opens the app and fetches the nearby restaurants, we just check the cached value per restaurant instead of recomputing the logic. We can also cache by zone: two users living in the same building may look distinct from a purely geographical perspective, but in reality they share a location, so instead of rerunning the entire restaurant-fetching logic we can define a small zone, cache all the restaurants for that zone, and serve every user in that zone the cached result, as in the sketch below. Used this way, caching significantly improves performance and reduces cost.
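    A sketch of that zone-based, cache-aside idea using ioredis; zoneIdFor and fetchRestaurantsFromDb are hypothetical helpers, and the 5-minute TTL is an illustrative assumption:

        import Redis from "ioredis";

        const redis = new Redis();

        async function getNearbyRestaurants(lat: number, lng: number): Promise<string[]> {
          // Bucket nearby users into the same zone so they share one cache entry.
          const zone = zoneIdFor(lat, lng);
          const cached = await redis.get(`restaurants:${zone}`);
          if (cached) return JSON.parse(cached);

          const restaurants = await fetchRestaurantsFromDb(zone); // the expensive path
          await redis.set(`restaurants:${zone}`, JSON.stringify(restaurants), "EX", 300);
          return restaurants;
        }

        // Hypothetical zone bucketing: snap coordinates to a coarse grid.
        function zoneIdFor(lat: number, lng: number): string {
          return `${lat.toFixed(2)}:${lng.toFixed(2)}`;
        }

        async function fetchRestaurantsFromDb(_zone: string): Promise<string[]> {
          return []; // stand-in for the real database query
        }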

    Imagine you are reviewing Python code that fetches and processes data from a database using SQLAlchemy. Identify the potential performance bottleneck and recommend a strategy to improve it. The code creates a sessionmaker bound to the engine, opens a new session, queries all rows of the model, loops over the results to process them, and closes the session. What I can see is that every time this piece of code is called, we create a new session. That is not needed: we can have a singleton database object that returns a single session instance, and that session instance is reused throughout. Instead of creating and closing sessions on every call, we keep the session alive while the process runs and close it only when the whole process exits. Reusing the existing session object instead of creating a new one each time is the main performance improvement here; beyond that, there is not much else to change in this snippet.

    Examine the REST API written in FastAPI and identify the critical aspect of security that is missing, and how you would correct this oversight. The code imports FastAPI and defines a GET route /user/{user_id} whose handler takes the user ID as an integer and returns that user's details. What I can see is that there is no authentication: I could hit this API with random numbers and fetch random customers' details, which is not secure. First of all, we need an authentication mechanism such as JWT or sessions to authenticate the user, and we should not take the user ID from the route at all. Whichever mechanism we choose, the cookie or JWT token is attached to the request; we fetch the user ID from that payload, then load the user's details and return them, instead of accepting an arbitrary user ID. And in the returned user details, we should not include the user ID itself, only non-sensitive fields such as the username and phone number.

    How would you refactor a monolithic Node.js application to a microservices architecture, considering current usage? In our monolith we will have different modules, such as an order module, a payment module, and a cart module. If we want to separate the code into different microservices, we first have to group all the related code together: the models, APIs, and business logic for orders in one folder, everything payment-related in another. If it is already organized that way, all we have to do is split those folders out and run each as a separate microservice. For communication between the microservices we can use RabbitMQ: if the order service wants to communicate with the payment service, the order service pushes a message into RabbitMQ and the payment service consumes it and performs the necessary operations. We can also use RESTful APIs within the application's virtual private network. So all the modules run as individual microservices, with RabbitMQ handling the communication between them. How hard the refactor is depends entirely on the current code structure: if there is a single folder of all the controllers, it will be hard, because we have to pull out the controllers and services specific to each domain; if the code is already grouped by domain, only small modifications are needed before running each group as a separate microservice.

    How would you leverage TypeScript features in a Node.js backend project to enhance quality and maintainability? Choosing TypeScript over plain JavaScript is itself a major advantage, because we have to define types and interfaces for everything, so developers always know what kind of data is expected in a particular entity, and a whole class of runtime bugs disappears. With TypeScript we can catch many bugs that we would never catch in JavaScript: we define interfaces for each entity, add type validation checks, and the compiler helps catch type errors. We can use interfaces, enums, classes, access modifiers, and the rest of the type system to maintain good code quality, as illustrated below. Beyond TypeScript itself, we can apply SOLID principles, define rules for naming conventions, and add custom linter logic.
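    A small illustration of those features; the OrderStatus enum and Order interface are hypothetical examples of the kind of compile-time checks described:

        enum OrderStatus {
          Placed = "PLACED",
          Delivered = "DELIVERED",
        }

        interface Order {
          id: string;
          status: OrderStatus;
          totalPaise: number; // integer paise, so floats never creep in
        }

        class OrderService {
          private orders = new Map<string, Order>(); // access modifier hides state

          markDelivered(id: string): Order {
            const order = this.orders.get(id);
            if (!order) throw new Error(`unknown order ${id}`);
            // order.status = "done"; // would not compile: not an OrderStatus
            return { ...order, status: OrderStatus.Delivered };
          }
        }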

    How do you manage load balancing and auto-scaling in a cloud environment for Node.js? We can use the AWS load balancer and set up auto-scaling groups, or use Kubernetes clusters, where we define scaling policies that specify how far to scale up or scale down depending on the load, and configure the load balancer to point at those clusters. That is how we manage auto-scaling and load balancing; with AWS it is easily configured. If we are not using AWS, we can use something like PM2 and NGINX, so a combination of NGINX for load balancing and PM2 for managing the Node.js processes.