Amith D

Vetted Talent

Engineering graduate in Information Science and Engineering with over 8 years of experience in the software industry: 2 years of expertise in ERP (SAP Business One), SQL Server, and Crystal Reports, plus 6 years of extensive experience with Business Intelligence (BI) concepts such as data warehousing, data modelling, data reporting, and data marts, and with technologies including Power BI, SQL Server, Azure SQL DB, Azure Synapse / MS Fabric, Databricks, and Azure Data Factory, using languages such as T-SQL, DAX, M Query, PySpark, and YAML. Also proficient in database development, multidimensional analysis, and CI/CD pipeline deployment using Azure DevOps.

Pulled data from business tools such as SAP ERP, Salesforce (CRM), SuccessFactors (HR), and Oracle EPM (Financial Planning).

  • Role

    Business Intelligence Engineer

  • Years of Experience

    8 years

Skillsets

  • SQL
  • Data Modelling
  • PySpark
  • Data Warehousing
  • CI/CD
  • ERP
  • YAML
  • Business Intelligence
  • Data Reporting
  • DAX
  • Data Mart
  • M Query
  • Database Development
  • Multidimensional Analysis

Vetted For

12 Skills
  • Power BI - Team Lead (AI Screening): 58%
  • Skills assessed: Oracle, Performance Tuning, Queries, Stored Procedures, Data Warehouse, Database Structure, DAX, Indexing, Power BI, Data Modelling, PostgreSQL, SQL
  • Score: 52/90

Professional Summary

8 Years
  • Feb 2022 - Present · 3 yr 8 months

    Business Intelligence Engineer

    Elastacloud
  • Jun 2018 - Feb 2022 · 3 yr 8 months

    Business Intelligence Analyst

    Sapiens Technologies India Pvt. Ltd.
  • Jun 2016 - May 2018 · 1 yr 11 months

    SAP B1 Consultant & BI Analyst

    SM Squares Technologies
  • Jun 2015 - Jun 2016 · 1 yr

    SAP B1 Consultant

    Dynamo Infotech

Applications & Tools Known

  • SAP Business One
  • SQL Server
  • Crystal Reports
  • Power BI
  • Azure Synapse
  • Databricks
  • Azure Data Factory
  • T-SQL
  • DAX
  • YAML
  • MS Fabric
  • SSIS
  • SSAS
  • SSRS
  • Azure Boards

Work History

8 Years

Business Intelligence Engineer

Elastacloud
Feb 2022 - Present · 3 yr 8 months
  • Involved in requirement gathering, analysis, design, development, and deployment.
  • Collaborated on ETL tasks, maintaining data integrity and verifying ADF pipeline stability.
  • Designed and developed ADF pipelines using activities to facilitate the incremental load process.
  • Worked on Databricks: created notebooks for data cleansing and transformation, copying results to Azure Data Lake Storage, Azure Synapse (DW), or Azure SQL DB.
  • Integrated data between various systems using REST APIs.
  • Generated large, complex data extracts and queries for the analytical leads, working across Microsoft SQL Server and Azure services database schemas.
  • Deployed CI/CD pipelines using Azure DevOps; used Azure Repos and Azure Boards to plan work, collaborate on code development, and build and deploy applications.
  • Automated deployment pipelines for Power BI items using the Power BI REST APIs (see the sketch below).
  • Developed and deployed dashboards, visualizations, and autonomous, dynamic reporting interfaces distributed to stakeholders via the Power BI Service.
  • Optimized report distribution, execution, and scalability so all reporting solutions remain autonomous and dynamic.
  • Worked with team members, internal departments, and third-party agencies on reporting business requirements, documentation, timelines, testing, and technical delivery.
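For illustration, a minimal sketch of the kind of Power BI deployment automation described above, using the documented "Deploy All" operation of the Power BI deployment pipelines REST API; the pipeline id and the Azure AD token are placeholders, and error handling is reduced to a status check:

    import requests

    # Promote content through a Power BI deployment pipeline via the REST API.
    # pipeline_id and aad_token are placeholders; stage 0 is the development stage.
    def deploy_all(aad_token: str, pipeline_id: str) -> None:
        url = f"https://api.powerbi.com/v1.0/myorg/pipelines/{pipeline_id}/deployAll"
        body = {
            "sourceStageOrder": 0,  # deploy from dev to the next stage
            "options": {
                "allowCreateArtifact": True,
                "allowOverwriteArtifact": True,
            },
        }
        resp = requests.post(
            url,
            headers={"Authorization": f"Bearer {aad_token}"},
            json=body,
        )
        resp.raise_for_status()  # surface HTTP errors from the service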

Business Intelligence Analyst

Sapiens Technologies India Pvt. Ltd.
Jun 2018 - Feb 2022 · 3 yr 8 months
  • Gathered, analyzed, and presented financial data from SAP ERP, Salesforce CRM, and FP&A systems.
  • Experienced on the Azure cloud platform with ADF, ADLS, Azure SQL DB, and data warehouse components (Synapse / Fabric).
  • Copied data from source systems to the landing stage, transformed it in staging, and created the fact and dimension tables in the data warehouse (Synapse / Fabric / Databricks).
  • Worked with star and snowflake schemas; used fact and dimension tables to build cubes, performed processing, and deployed them to the data warehouse.
  • Designed and developed ETL packages to facilitate the incremental load process for transactional data.
  • Developed complex SQL queries and stored procedures to support Power BI and SSRS reports.
  • Created calculated columns, calculation groups, and DAX measures in Power BI Desktop / Tabular Editor.
  • Implemented dynamic row-level security to restrict data access based on user roles (see the sketch below).
  • Designed finance reports such as Balance Sheet, Income Statement, and Budget vs Actual.
  • Designed sales and expense reports such as sales/purchase analysis, aging, travel expenses, and recurring expenses.
  • Developed and deployed dashboards and dynamic reporting interfaces distributed to stakeholders via the Power BI Service, Power BI Embedded, mobile and tablet devices, widgets, and email.
  • Worked with team members, internal departments, and third-party agencies on report business requirements, documentation, timelines, testing, and technical delivery.
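The dynamic row-level security mentioned above is typically a DAX filter expression attached to an RLS role; a minimal sketch, assuming a hypothetical DimUser table related to the fact tables:

    -- DAX filter expression on DimUser for a dynamic RLS role: each signed-in
    -- user sees only rows whose Email matches their own user principal name
    [Email] = USERPRINCIPALNAME()

With a relationship from DimUser to the fact tables, the filter propagates to the facts automatically.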

SAP B1 Consultant & BI Analyst

SM Squares Technologies
Jun 2016 - May 2018 · 1 yr 11 months
  • Involved in technical and business decisions for business requirements, interacting with business analysts, the client coordinator, and the development team through an Agile Kanban process.
  • Created Copy Data activities in ADF to load data from on-premises SQL Server / Excel files into Azure SQL Database or ADLS.
  • Involved in ETL transformations, data validations, and stored procedures.
  • Worked with star and snowflake schemas; used fact and dimension tables to build cubes, performed processing, and deployed them to the data warehouse.
  • Developed Power BI reports and dashboards from multiple data sources, including Excel files, ADLS, REST APIs, Azure Synapse, and Databricks (see the M Query sketch below).
  • Created and changed visualizations in Power BI reports and dashboards on change requests from the Scrum Master.
  • Created calculated columns and measures in Power BI using DAX as required.
  • Created dashboards and interactive visual reports in Power BI, choosing the right visual for each kind of report.
  • Worked with Import, DirectQuery, and composite model connectivity modes in Power BI.
  • Managed Power BI Premium capacities and access to standard monitoring and usage reporting.
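A minimal Power Query (M) sketch of the kind of source query behind such reports, pulling a table from Azure SQL and filtering rows before load; server, database, and column names are illustrative:

    let
        // connect to an (illustrative) Azure SQL database
        Source = Sql.Database("myserver.database.windows.net", "SalesDb"),
        Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
        // keep only recent orders before loading into the model
        Recent = Table.SelectRows(Orders, each [OrderDate] >= #date(2022, 1, 1))
    in
        Recent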

SAP B1 Consultant

Dynamo Infotech
Jun 2015 - Jun 2016 · 1 yr
  • Prepared blueprints, project plans, and documentation for discussions.
  • Mapped clients' complex business processes into SAP B1.
  • Handled requirements, design, build, and testing of the product.
  • Wrote SQL queries, views, and stored procedures.
  • Managed SQL backups, SBO backups, and backup restores.
  • Migrated master data and transactional data using the Data Transfer Workbench (DTW).
  • Created UDFs, UDTs, and UDOs per business requirements in SAP B1.
  • Wrote SQL queries for business reports, formatted searches (FMS), and the transaction notification stored procedure (see the sketch below).
  • Prepared functional specs for the technical team to customize screens.
  • Implementation experience and knowledge of SAP Business One reporting for different functional areas.
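A sketch of the kind of validation typically placed inside SAP B1's SBO_SP_TransactionNotification stored procedure; object type '13' is the A/R Invoice, and the rule shown (mandatory remarks) is an invented example:

    -- Inside SBO_SP_TransactionNotification: block adding/updating an A/R
    -- Invoice (object type '13') when its Comments field is empty.
    IF @object_type = '13' AND @transaction_type IN ('A', 'U')
    BEGIN
        IF EXISTS (SELECT 1 FROM OINV
                   WHERE DocEntry = @list_of_cols_val_tab_del
                     AND ISNULL(Comments, '') = '')
        BEGIN
            SET @error = 1   -- any non-zero value blocks the transaction
            SET @error_message = 'Remarks are mandatory on A/R Invoices.'
        END
    END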

Major Projects

10 Projects

British Petroleum (Digital Twin Project)

Mar 2022 - Present · 3 yr 7 months

Sapiens Technologies 1982 Pvt. Ltd

Jun 2018 - Feb 2022 · 3 yr 8 months

Triveni Aeronautics Pvt. Ltd.

Mar 2018 - May 2018 · 2 months

Citadel Intelligent Systems

Jan 2018 - Feb 2018 · 1 month

Vibhava Marketing Corporation

Aug 2017 - Dec 2017 · 4 months

AKE Infrastructure Pvt. Ltd

Feb 2017 - Jul 2017 · 5 months

Fine Components and Tools Pvt. Ltd.

Nov 2016 - Jan 2017 · 2 months

Edutel Technologies

Jul 2016 - Sep 2016 · 2 months

Micro Sensors & Chips Manufacturer

Dec 2015 - May 2016 · 5 months

The Purple Turtles - Home Decor & Lighting

Jun 2015 - Nov 2015 · 5 months

AI Interview Questions & Answers

Hi, I'm Amith. I have around 8 years of experience in the IT industry, with 6 years in Power BI, data modelling, data analysis, and data warehousing. I started my career creating reports with SQL Server: I worked on stored procedures, functions, and views, and used T-SQL scripts to build reports such as sales reports. Later I got interested in Power BI, so I moved to creating reports and dashboards there while also working on the back-end data for the data warehouse. I have mainly worked on projects in the finance and HR domains. Typically we collect data from the multiple source systems the company uses and load it to the landing zone; then we transform and clean the data, improving quality, removing nulls, and indexing, and move it to the staging area. There we create the facts and dimensions and build the model, then promote it to the curated zone, and from the curated zone we create the reports and dashboards. In this process I used ADF for orchestrating the data, and SQL Server for the incremental and full load processes and for manipulating and transforming the data. I also used Databricks for the same kind of transformation work alongside ADF. For reporting I used DAX and M Query for the ETL steps in Power BI. For storage we used Azure SQL DB and Azure Synapse, and recently, after getting Microsoft certified, we use Microsoft Fabric for a few things. We also follow an Agile methodology with two-week sprints, with planning, retrospectives, and demos every two weeks; based on the input from the demo, we make improvements in the next sprint. That's how we run the project. Thank you.
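A sketch of the landing-to-staging cleansing step described above, in PySpark on Databricks; storage paths and column names are placeholders:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # read the raw extract from the landing zone (path is illustrative)
    landing = spark.read.parquet(
        "abfss://landing@storageacct.dfs.core.windows.net/sales/"
    )

    staged = (
        landing
        .dropDuplicates()                          # remove exact duplicate rows
        .na.drop(subset=["OrderID"])               # drop rows missing the business key
        .withColumn("LoadDate", F.current_date())  # audit column for this load
    )

    # write the cleansed data to staging for modelling into facts and dimensions
    staged.write.mode("overwrite").parquet(
        "abfss://staging@storageacct.dfs.core.windows.net/sales/"
    )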

First, I'd check the volume of the data, what the data relates to, and how quickly we can connect to it from Power BI. I'd also check what type of data it is: streaming data, or data that updates day to day. Then we'd decide. The main thing is that only the data model will affect performance; other than that, I think we can easily copy the data to the database and create the report.

On what to check when the data won't load in the intended application: for an external application, we would check the XMLA endpoint and verify that it is correct. I think the XMLA endpoint is the main thing to check when the data is not loading into the external application.

We surface the dashboards in external applications through embedded analytics. We use the XMLA endpoint, and that endpoint is used in the external application so that users can see the Power BI reports in the Power BI Embedded space.
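For reference, a minimal sketch of the token step commonly behind Power BI embedded analytics: the external app asks the Power BI REST API for an embed token for a report, then hands the token and embed URL to the client-side SDK. The group id, report id, and AAD token here are placeholders:

    import requests

    def get_embed_token(aad_token: str, group_id: str, report_id: str) -> str:
        # "Reports - Generate Token In Group" endpoint of the Power BI REST API
        url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
               f"/reports/{report_id}/GenerateToken")
        resp = requests.post(
            url,
            headers={"Authorization": f"Bearer {aad_token}"},
            json={"accessLevel": "View"},  # read-only embedding
        )
        resp.raise_for_status()
        return resp.json()["token"]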

Would you choose to build a new table or a view in the database for a report? If we have data coming from different tables, we can write a query that joins the tables and create it as a view, which is virtual. That way the processing happens in the data source itself, not in the Power BI engine; those are the times we use views. We only create a table when data entry is actually happening in that particular table, i.e. when users are inserting rows and it needs physical storage; otherwise we create it as a view.
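A small T-SQL sketch of the view option described above, with illustrative table and column names; because the view is virtual, the join is evaluated in the database rather than in the Power BI engine:

    -- expose a joined result as a view so the join runs in the data source
    CREATE VIEW dbo.vw_SalesByCustomer
    AS
    SELECT c.CustomerID,
           c.Country,
           o.OrderDate,
           o.Amount
    FROM dbo.Orders    AS o
    JOIN dbo.Customers AS c
      ON c.CustomerID = o.CustomerID;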

On version control: for Power BI, we can integrate with Git. We create an Azure DevOps organization and a project and link it to Power BI. We can set up three or four stages, such as dev/test/UAT/prod or dev/test/prod, depending on the company's hierarchy and requirements. We create a deployment pipeline with three workspaces, one each for dev, test, and prod, assign those workspaces to the three stages of the deployment pipeline, and then integrate the project with Git. After that, every time you publish to the Power BI Service, if you map the commit to a branch, the version is managed and you can see the history of all previous changes. That is one approach. The other is to integrate with Azure DevOps directly: clone the Azure repo to your local system, create a service principal for Power BI, and make changes locally in your branch; once you commit and push to the main branch, you can see the changes reflected in Power BI as well. Once the changes have moved through development and test and the users approve them, you merge to the master branch. So there are two options: Azure DevOps, or the Git integration inside the Power BI Service. Thank you.
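Since the profile lists YAML pipelines, here is a minimal sketch of an Azure DevOps pipeline that runs a deployment step on every push to main; the inline script is a placeholder for the Power BI REST calls described above:

    # azure-pipelines.yml (sketch): promote Power BI content on pushes to main
    trigger:
      branches:
        include:
          - main

    pool:
      vmImage: ubuntu-latest

    steps:
      - task: PowerShell@2
        displayName: Deploy Power BI items via REST API
        inputs:
          targetType: inline
          script: |
            # placeholder: acquire a service-principal token, then call e.g.
            # POST https://api.powerbi.com/v1.0/myorg/pipelines/<pipelineId>/deployAll
            Write-Host "Deploying Power BI items..."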

On the measure that sums sales filtered to year 2022: in this formula, FILTER is applied to one particular table. Since you are only restricting to sales year 2022, we don't need the FILTER function here; we can pass ALL(Sales) and the Sales[Year] = 2022 condition directly, because FILTER only concentrates on that particular table, not on all the tables feeding the visual. So you can use SUM of the sales amount with ALL(Sales) and the year 2022 condition.
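A sketch of the rewrite being described, with illustrative table and column names since the original formula isn't shown in full; the FILTER wrapper is dropped and the year predicate is passed to CALCULATE directly:

    -- before (the pattern being critiqued):
    Sales 2022 :=
    CALCULATE ( SUM ( Sales[Amount] ), FILTER ( ALL ( Sales ), Sales[Year] = 2022 ) )

    -- after: apply the predicate directly, no row-by-row FILTER over the table
    Sales 2022 :=
    CALCULATE ( SUM ( Sales[Amount] ), REMOVEFILTERS ( Sales ), Sales[Year] = 2022 )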

On the query that starts from Orders, joins Customer, and filters Customer.Country = 'Germany': the issue here is the star. Instead of SELECT *, we should select only the required columns inside the query; then performance will improve, because SELECT * fetches all the columns from both Orders and Customer. So take only the required columns in the SELECT statement. Thank you.
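The suggested rewrite, sketched with illustrative column names:

    -- instead of SELECT * ... pick only the columns the report needs
    SELECT o.OrderID,
           o.OrderDate,
           o.Amount,
           c.CustomerName
    FROM Orders AS o
    JOIN Customer AS c
      ON c.CustomerID = o.CustomerID
    WHERE c.Country = 'Germany';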

On designing a performant reporting solution that handles complex business logic: in DAX especially, we can use variables. Variables play an important role in DAX; using them in a formula improves performance. You can also use Tabular Editor, which is a very good option for checking query performance and improving it through trial and error. The data model matters too: the relationships between tables should be properly designed, with referential integrity, so they don't hurt report performance. At the visual level, you can use the Performance Analyzer in Power BI Desktop to see how long each visual takes to refresh, and improve the DAX formula based on that. So there are three levers: the data model, variables in the DAX queries, and visual performance via Performance Analyzer. We can also use DAX Studio to inspect the schema and see which columns are actually required, and improve Power BI performance based on that as well.
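A small sketch of the variable pattern mentioned above, with illustrative measure and column names; each VAR is evaluated once and reused instead of repeating the sub-expression:

    Margin % :=
    VAR TotalSales = SUM ( Sales[Amount] )
    VAR TotalCost  = SUM ( Sales[Cost] )
    RETURN
        DIVIDE ( TotalSales - TotalCost, TotalSales )  -- DIVIDE guards against divide-by-zero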

What strategies would you apply to ensure Power BI reports remain accessible during database maintenance activities? First, we should never link Power BI directly to the operational database; there should be a staging area and curated data, and Power BI connects to that. If you need to do maintenance, Import mode is the best option while maintenance activities run, though it only really works for small data, not big data. For a data warehouse, you can create replicated data and point the connection at that, or use a separate environment: with dev, test, and production environments, you can keep the data in one branch while maintenance runs on another. Since we use a separate database for reporting, maintenance on the operational data shouldn't affect the analytical data. If you are doing maintenance on the analytical data itself, then Import mode is one option, because the data is cached inside Power BI so the reports keep running; the other is to stand up a duplicate environment with the same name and connect to that data source until the maintenance activities finish.

How do you incentivize your team to maintain good quality in the development of DAX expressions and SQL queries? SQL queries and DAX expressions often implement the same report logic with different syntax. We use Tabular Editor for DAX the way we use SSMS for SQL Server; newer versions of Power BI let you do some of this inside Power BI itself, but maintaining and managing DAX expressions in Tabular Editor is much better than in Power BI. We check what formula is actually used in the back-end SQL queries and how it translates to DAX, because in DAX the functions and syntax are different even though the underlying logic is the same: first we filter the particular data, then we apply the aggregate functions on it for the report. The same scenario applies in DAX. We check the requirement first, capture the filter in a variable, and then use that variable in the calculation function. That is how we do things.