
Aditi Ghosh

Vetted Talent

At Dell Technologies, we transform complex data into actionable insights, driving operational growth and marketplace efficiency. A Master of Science in Instrumentation Science from Jadavpur University equips me with a robust analytical foundation, maximizing the potential of BI tools like Tableau to enhance business reporting processes.

Our team prioritizes strategic decision-making, ensuring each insight delivers tangible business value. With a skillset that includes Power BI and stakeholder communications, my goal is to continuously innovate and contribute to Dell's data-driven success.

  • Role

    Senior BI Analyst

  • Years of Experience

    7.5 years

Skillsets

  • Training
  • Data Analysis
  • Tableau
  • SQL
  • Hadoop
  • Power BI
  • Client handling
  • Data Validation
  • Analytical Skills
  • Data Visualization
  • Business Intelligence
  • Mentoring
  • Teradata SQL
  • Problem solving
  • MSSQL queries
  • Business Analytics

Vetted For

12 Skills
  • Roles & Skills
  • Results
  • Details
  • icon-skill_image
Power BI - Team Lead (AI Screening)

  • Score: 45/90 (50%)
  • Skills assessed: Oracle, Performance Tuning, Queries, Stored Procedures, Data warehouse, Database structure, DAX, Indexing, Power BI, Data Modelling, PostgreSQL, SQL

Professional Summary

7.5 Years
  • Senior BI Analyst

    Dell
  • Visualization Practitioner

    Brillio Technologies
  • Technical Analyst

    Infosys
  • Pre-sales Consultant

    Comtech IT solutions Pvt. Ltd.
  • Pre-sales Consultant

    Goldstone Technologies

Applications & Tools Known

  • Tableau
  • Power BI
  • MSSQL
  • Hadoop
  • Ubuntu
  • Hive
  • AWS Quicksight
  • Domo

Work History

7.5 Years

Senior BI Analyst

Dell
    Generated reports from multiple databases using SQL Server Reporting Services. Developed and maintained data models for BI projects. Designed dashboards with interactive visuals in Tableau and Power BI for better decision-making. Performed data analysis using Excel, Tableau, and Power BI to provide insights into customer trends. Created SQL queries and views to support reporting needs. Collaborated with IT teams on developing strategies for managing large datasets efficiently. Identified opportunities for improvement in existing database structures and querying techniques. Implemented best practices for data collection, storage, retrieval, and analysis across multiple platforms. Worked collaboratively with developers and other stakeholders to ensure successful delivery of BI solutions.

Visualization Practitioner

Brillio Technologies
    Worked on debugging and rebuilding existing Tableau reports. Participated in Data Governance activities for data cleaning on the tools and databases. Provided layout designs for new and existing reports. Explored new BI tools such as AWS Quicksight and Domo.

Technical Analyst

Infosys
    Developed and maintained Tableau dashboards with interactive visualizations. Created custom views, charts, graphs, filters, parameters, and data sources to meet specific user requirements. Developed complex calculations using Tableau functions such as LOD expressions and table calculations. Implemented best practices for optimizing the performance of Tableau reports and dashboards. Assisted in designing and developing ETL processes to extract data from multiple sources into a single repository for analysis in Tableau. Analyzed large datasets to identify trends, patterns, and correlations using advanced analytics techniques in Tableau. Deployed Tableau Server security settings, including authentication methods, permissions, and row-level security. Optimized SQL query execution times by using indexes and partitioning techniques to improve report performance. Developed interactive drill-down features across multiple levels of detail within dashboards.

Pre-sales Consultant

Goldstone Technologies
    Provided technical expertise on Tableau products through POCs and demos. Deployed for Tableau development work at the client location. Worked closely with the Business Development team to identify genuine customers and prospects looking for Tableau products or services.

Pre-sales Consultant

Comtech IT solutions Pvt. Ltd.
    Conducted market research and analysis on the Big Data ecosystem for upcoming Big Data project needs. Provided solution architecture using IoT and Big Data tools as per client requirements. Worked closely with the BD team on RFPs, RFQs, and other sales activities for the US market.

Major Projects

4 Projects

Customer Feedback Reports

    I worked for one of the clients of Infosys, a well-known cosmetics company in the USA. The database used was MSSQL. We had to fetch the view from the DB and build customer feedback reports in Tableau. We were a team of three members, and my role was to build the Tableau reports, showcase them to clients for their feedback, and implement the changes required, as well as publish the reports to Tableau Server.

YTD and MTD COO sales report

    I worked as a Tableau Developer for one of the clients of Goldstone Technologies, at the client location. As a Tableau Developer I had to prepare the YTD and MTD reports for the COO, using SAP BW as the primary database. After prior discussion with the client, I had to include all the sales KPIs that could help them track current sales against their targets from the start of the fiscal year to date. I wrote complex date calculations in Tableau and troubleshot the issues faced while uploading the dashboard to the client's server.

Consumer Offers

    The dashboard was built in Tableau, and the audience for these reports was customers from telecom and retail, as well as the associated business heads. It comprised summary charts in tabular form and trend charts showing peak variations in sales over weeks and years. Used LODs and parameters to switch between a rolling 13 weeks and a yearly comparison on the same dashboard. Worked on row-level security and dashboard optimization. Built parameter-driven Tableau charts, as well as calendar charts to show the highest scorer of the day on a calendar. Teradata was the main source of data. Also engaged a team of four data engineers to work on a specific need: generating filter-based PDFs of the published report using a Python script.

Migration of Tableau to Power BI

    Currently working on the conversion of Tableau reports to Power BI reports; an example of the LOD-to-DAX mapping is sketched below. Created data models in Power BI for the report needs. Converted LOD calculations to DAX queries. Compared the data between Power BI charts and Tableau to confirm the same data at the same granularity in the modified charts. Maintained UI standards by using JSON themes in Power BI.
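
    As an illustration of the LOD-to-DAX conversion described above (a hedged sketch with invented table and column names, not taken from the actual project): a Tableau fixed LOD such as { FIXED [Region] : SUM([Sales]) } is commonly translated into a DAX measure built with CALCULATE and ALLEXCEPT.

        -- Hypothetical DAX counterpart of a Tableau FIXED LOD (names invented):
        -- total Sales[Amount] per Region, ignoring other filters on Sales.
        Sales by Region :=
        CALCULATE (
            SUM ( Sales[Amount] ),
            ALLEXCEPT ( Sales, Sales[Region] )
        )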

Education

  • MS/M.Sc (Science)

    Jadavpur University
  • B.Sc

    D H S K Science and Arts College

Certifications

  • Hadoop training from Petabytes, Mumbai, Maharashtra

AI-interview Questions & Answers

Hello, this is Aditi Ghosh, and I'm from Bangalore, India. I have been working for Dell Technologies for the past 2.4 years, and in this role I have been working on Tableau, which is my primary skill. I have worked for companies like Infosys, and previously I worked as a pre-sales consultant for some start-up and mid-level companies. Overall, I have around 7.5 years of experience. I'm looking to engage myself more in business intelligence as well as upcoming technologies, since AI is booming in the market, so I'm also planning to take an internal course or something to upgrade myself. In my work at Dell, I have been dealing mostly with stakeholders, comprising senior managers, directors, VPs, and other executives. The reports I have built have good exposure, and I have done a lot of information gathering and data-model building, and I also have team-handling experience with around 2 to 3 people, including people working in technical areas like principal consultant and solution architect. So that is all about me and my background. I have done my master's in Instrumentation Science and my graduation in Physics, from Dibrugarh University and Jadavpur University. I am originally from Assam, but currently I am in Bangalore, and I'm looking forward to having more interaction with MRI. That's all. Thank you so much.

The question is about choosing to build a new table or view in the database for a Power BI report. Mostly, whatever we upload into a report comes from a table or a view, and whatever we choose to represent in a Power BI report, we need to fetch the same information from that table or view. In this respect, data source connectivity is very important, and to build such reports with all the information, you have to have a table or a view created in the data source. Alternatively, if you are trying to build something like a disconnected table, through which you plan to toggle between, say, different measures, that can be done as a visualization-side construct in the Power BI report itself. Otherwise, whatever you need for the report can be done from the data source side: instead of burdening the report with complex calculations in Power BI, it's best to bring such complex calculations from the back end itself. Of course, there are certain requirements that need to be handled on the Power BI side using DAX formulas, and if that is really required, then we definitely have to go for it. That's the whole scenario. Thank you.
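
A minimal back-end sketch of the pattern described above, assuming a hypothetical dbo.Orders table (the table and column names are illustrative, not from the profile): a report-specific view that pre-computes a derived column in SQL so the Power BI report does not need a DAX calculated column for it.

    -- Hypothetical example: dbo.Orders, Amount, and Cost are assumptions.
    -- The view pre-computes a derived Margin column in the database so the
    -- Power BI report can consume it directly.
    CREATE VIEW dbo.vw_OrderReport
    AS
    SELECT  OrderID,
            OrderDate,
            CustomerID,
            Amount,
            Amount - Cost AS Margin   -- complex logic pushed to the back end
    FROM    dbo.Orders;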

List the factors you would consider when optimizing SQL Server indexes used by a Power BI report. Well, that's a very good question. First of all, I would try to reduce the volume of data behind the reports, because with Power BI, or any kind of reporting, it is essential that the volume of data is kept under control; otherwise it takes a lot of time to refresh, and whether you publish the report with an extract or as a live connection, a large volume creates a lot of difficulty in uploading and viewing it, with a direct impact on performance. So the first thing to keep in mind, whenever the volume is very big: create an extract, go with an import model, and reduce the volume of information, restricting it to only what the report you are building needs. Next, create as few DAX calculations or complex queries as possible, and use very limited filters, which can heavily impact performance. From the SQL side, you can create a view specific to the report that carries a minimized, or only the required, set of information; that is the best thing you can do. And instead of creating one complete denormalized table, you can create a star schema, with one fact table and multiple dimension tables, which means normalizing it. That is the kind of schema you can opt for to increase the performance of the SQL Server queries behind Power BI. Thank you.
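
A hedged sketch of the database-side measures mentioned in this answer; dbo.FactSales and all column names are invented for illustration, not taken from the profile.

    -- A narrow covering index supporting a report query that filters on
    -- OrderDate and returns only a few columns (all names are assumptions).
    CREATE NONCLUSTERED INDEX IX_FactSales_OrderDate
        ON dbo.FactSales (OrderDate)
        INCLUDE (CustomerKey, Amount);
    GO

    -- A report-specific view exposing only the columns the report needs.
    CREATE VIEW dbo.vw_SalesForReport
    AS
    SELECT  OrderDate, CustomerKey, Amount
    FROM    dbo.FactSales;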

What is your approach to managing large datasets in Power BI, with consideration for both performance and accuracy? When it comes to large datasets, the size is sometimes something we cannot help: if the dataset is really large and all of it is required, the best practice is to create an extract, that is, an import connection. That creates a snapshot of the information and does not impact performance as much. That is the first thing from the performance perspective. Accuracy is something that needs to be dealt with at the very beginning, when you are fetching the information into your Power BI report, and you have to keep validating it, because with a very large dataset carrying a lot of granularity, it becomes very difficult to understand what the exact results should be. So the best thing you can do is validate the results before the data even gets into Power BI: if the data is accurate and your model is correct, then your report will be accurate. And, as I said before, try to use a star schema instead of a snowflake schema, or build separate dimension tables from which you fetch the different types of information and then put them together in a single model in your Power BI report. That way accuracy is maintained, validation is a little easier to do, and you maintain the integrity of the data without any loss of information. That is what I believe helps in maintaining both performance and accuracy. Thank you.
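
One concrete way to do the source-side validation described above (a sketch; dbo.FactSales and its columns are assumed names): aggregate at the report's grain in SQL and reconcile the totals against the Power BI visual after each import.

    -- Reconciliation totals at the report's grain; compare these numbers
    -- against the corresponding Power BI visual after each refresh.
    SELECT  YEAR(OrderDate) AS SalesYear,
            COUNT_BIG(*)    AS RowCnt,
            SUM(Amount)     AS TotalAmount
    FROM    dbo.FactSales
    GROUP BY YEAR(OrderDate)
    ORDER BY SalesYear;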

Explain how you would use indexed views to enhance the performance of a Power BI report? Well, I'm not sure about this particular answer, because I do not have much experience with using indexed views to enhance performance. But I will certainly look into it; since I'm working on Power BI with a really large amount of data, this could be helpful in my work as well as a learning opportunity. Right now, though, I do not have an answer for this question.
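
For reference, a minimal SQL Server indexed-view sketch (table and column names are illustrative, not from the profile): materializing an aggregate with SCHEMABINDING and a unique clustered index lets the optimizer read pre-computed totals instead of re-aggregating the base table, which can speed up the queries a Power BI report issues against the database.

    -- Indexed view sketch: WITH SCHEMABINDING and COUNT_BIG(*) are required
    -- before the unique clustered index can be created; Amount is assumed
    -- NOT NULL (indexed views disallow aggregates over nullable columns).
    CREATE VIEW dbo.vw_SalesByYear
    WITH SCHEMABINDING
    AS
    SELECT  SalesYear,
            SUM(Amount)   AS TotalAmount,
            COUNT_BIG(*)  AS RowCnt
    FROM    dbo.FactSales
    GROUP BY SalesYear;
    GO

    -- Materializes the view: matching aggregate queries can now be answered
    -- from this index instead of re-aggregating the base table.
    CREATE UNIQUE CLUSTERED INDEX IX_vw_SalesByYear
        ON dbo.vw_SalesByYear (SalesYear);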

Given this DAX formula snippet used in a Power BI report, can you explain why the performance might be impacted and how you would optimize it? The snippet is a CALCULATE of a sum of sales with a FILTER. Basically, the use of this formula is to get the sum total over the entire set of information fed into the Power BI report, and there is a filter condition restricting the year to 2022 that uses the ALL function. ALL removes any of the filters on the fields present in the view, so this is a kind of table filter function in which ALL is used to bypass that context. If this condition is implemented, you have pinned just one particular year out of it, but what if there are 10 years of data? That is definitely not going to be helpful, and it is going to take a lot of time, because the SUM is doing a complete scan of the entire table, whatever the granularity of the report, which impacts the performance of the report. If you really want to restrict the calculation to that particular year, then instead of using that filter condition inside the calculation, apply it outside, so you restrict the information on the view itself. There is no point in using the FILTER function this way within the calculation: it is excluding based on just one field, and that too on top of the whole table. If that is the result you want, put it on the filter shelf instead of inside the query. That is how I would optimize it. Thank you.
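
The snippet itself is not reproduced in the transcript; based on the description (a CALCULATE over a SUM, with FILTER over ALL pinning the year to 2022), a hypothetical reconstruction and the usual optimization might look like the following. All table, column, and measure names are invented for illustration.

    -- Hypothetical reconstruction of the snippet described (names invented).
    -- FILTER ( ALL ( Sales ), ... ) materializes and scans the entire Sales
    -- table row by row, discarding every existing filter on it.
    Sales2022_Slow :=
    CALCULATE (
        SUM ( Sales[Amount] ),
        FILTER ( ALL ( Sales ), Sales[Year] = 2022 )
    )

    -- Common optimization: filter a single column instead of the whole table.
    -- (This clears filters only on Sales[Year]; if clearing all filters on
    -- Sales was intentional, add ALL ( Sales ) as a separate argument.)
    Sales2022_Fast :=
    CALCULATE (
        SUM ( Sales[Amount] ),
        Sales[Year] = 2022
    )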

Review the SQL query below; there is a potential performance issue with it. Can you identify the problem and suggest how you would rewrite it for better performance? The query starts from Orders with an inner join. Well, in this case, it is an inner join, and the filter condition is applied at the end of the query. My first approach would be to get the information where Germany is excluded from the Customers table at the very beginning itself, and then create an inner join with the Orders table to fetch the information needed. That would be my way of getting better performance: even before the actual execution starts, whatever the query needs is already present. You could also create a function for this, because functions can perform better than views for this kind of data manipulation, or create a separate query or view that does not include Germany and then use that in the query for the inner join. That way it can be optimized. That is what I would have done. Thank you.
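
The reviewed query is not shown in the transcript; a hypothetical shape of it, and a rewrite along the lines described in the answer (filtering Customers before the join), might look like this. All table and column names are assumptions.

    -- Hypothetical original: the exclusion is applied after the join.
    --   SELECT o.OrderID, o.OrderDate
    --   FROM   Orders AS o
    --   INNER JOIN Customers AS c ON c.CustomerID = o.CustomerID
    --   WHERE  c.Country <> 'Germany';

    -- Rewrite: restrict Customers first, then join only the remaining rows.
    WITH NonGermanCustomers AS (
        SELECT CustomerID
        FROM   Customers
        WHERE  Country <> 'Germany'
    )
    SELECT  o.OrderID,
            o.OrderDate
    FROM    Orders AS o
    INNER JOIN NonGermanCustomers AS c
            ON c.CustomerID = o.CustomerID;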

What approach do you take for enabling self-service reporting that aligns with corporate governance and security policies? Well, I'm not really sure about the answer to this, that is, what exactly it would take to enable self-service reporting. But definitely, if we need that kind of security, then cybersecurity has to be taken into account: we have to mask data and establish certain firewalls before we release it to users, in case there is any kind of threat. Of course, every company will have its own documented security policies and governance rules, and if we follow them, check the data we have and the process we are implementing, and have single sign-on established for it, that would be great. Access should be given based on who the person is within the corporation, and that kind of security can be established on the report, so that it not only helps the intended user look at the information but also masks it from anybody else or any third party. That way we can protect the report from the security perspective while following the governance rules. That would be my approach. Thank you.

What strategies would you apply to ensure Power BI reports remain accessible during database maintenance activities? In this kind of scenario, first make sure the refresh schedule on the Power BI reports does not conflict with the database maintenance activities. If the report uses an import connection, you can pause the schedule for that particular window of time, so that if anything goes wrong with the database during maintenance, it will not directly impact the dashboard and show wrong or blank information on the Power BI report. That is the first strategy I would take. If it is a live connection, then for the time being, if there is a way of disconnecting it from the database, or of not showing that particular information, use it; or, as a best practice, inform the end users not to expect any new information on the report while the data maintenance activities are going on. That could be one scenario where you have to stop accessibility. But the best strategy is to use an import connection for the Power BI report, because you can pause the refresh schedule until the maintenance activity is over, and that will not impact the information already on the report. That is the strategy I would take into account.

Assuming historical reporting data is needed, describe your approach to setting up and maintaining an efficient data archiving strategy that works well with Power BI. In this kind of situation, it is always best to have a separate archival workspace altogether, one that does not get overwritten. If there is a way of using Power Automate to upload a new file, stamped with the current date, into that particular workspace, that is the best strategy I can think of for Power BI to retain and archive historical reports, so that at any point in time a user can go back and check what was happening in a previous version, the previous month, the previous quarter, or the previous year. But for that, you need a completely separate space, and there has to be a way, maybe through Power Automate, to archive or generate a particular file that can be used as the historical file. Since I have not worked on Power Automate yet, I won't be able to tell you more about this strategy, but what I have in mind is that there should be some automated way of archiving this information in Power BI, in a different workspace altogether, with the file named along with the date; otherwise, how would you know that it is historical information? This is what I would do for the archiving. Thank you.