
Parimal Panda

Vetted Talent
Innovation-driven experience in Product Engineering & Delivery Management, spanning Solution Architecture, Program Management, Data Engineering & Practice Management across Fintech, Banking & Regulatory Compliance products and services.
  • Role: Director Product Development
  • Years of Experience: 19 years

Skillsets

  • DevOps
  • Salesforce Sales and Marketing Cloud implementations
  • Risk and fraud management systems
  • Technical vendor management
  • Full-stack development
  • Full mobile development life cycle
  • Wealth tech
  • Technical Leadership
  • Snowflake
  • SaaS product development
  • React
  • Product development
  • Payments
  • Java
  • Product Management
  • Data Pipeline
  • Customer Data Platform
  • CRM Implementation
  • Compliance
  • CMS Development
  • Cloud architecture
  • Data Analytics
  • AWS (10 years)
  • CI/CD
  • Big Data
  • Python
  • Data Warehousing
  • Data Engineering

Vetted For

19 Skills
  • Role: Big Data Engineer with Streaming Experience (Remote), AI Screening
  • Result: 41%
  • Skills assessed: Spark, CI/CD, Data Architect, Data Visualization, EAI, ETL, Hive, PowerBI, PySpark, Talend, AWS, Hadoop, JavaScript, Embedded Linux, PHP, Problem Solving Attitude, Shell Scripting, SQL, Tableau
  • Score: 37/90

Professional Summary

19 Years
  • Jun, 2024 - Present (1 yr 4 months)

    Director Product Development

    Keenai Global
  • Jul, 2022 - Jun, 2024 (1 yr 11 months)

    Principal Technical Product Manager

    Juspay Technologies
  • Jun, 2015 - Jul, 2022 (7 yr 1 month)

    Practice Lead

    FICO INDIA PVT LTD
  • Nov, 2013 - Jun, 2015 (1 yr 7 months)

    Data Architect

    TEK SYSTEMS INDIA (CISCO INDIA)
  • Jun, 2005 - Nov, 2013 (8 yr 5 months)

    Technical Manager

    Indecomm Technology Pvt Ltd

Applications & Tools Known

  • Apache Spark
  • PySpark
  • Kubernetes
  • AWS
  • Redis
  • Informatica
  • Prometheus
  • ClickHouse
  • Puppet
  • Jenkins
  • Kibana

Work History

19 Years

Director Product Development

Keenai Global
Jun, 2024 - Present (1 yr 4 months)
    Drive the technical vision, strategy, and execution of Wealth Tech product development, manage Salesforce CRM integration, lead web and mobile platforms, own DevOps practices, and define data architecture strategy.

Principal Technical Product Manager

Juspay Technologies
Jul, 2022 - Jun, 2024 (1 yr 11 months)
    Optimized cost and performance of the Juspay payment platform, managed data engineering and cloud architecture, improved the CI/CD pipeline, and defined product metrics.

Practice Lead

FICO INDIA PVT LTD
Jun, 2015 - Jul, 2022 (7 yr 1 month)
    Built and managed Compliance practice, delivered AML/KYC compliance solutions, developed data pipeline platform, and drove full-stack application development.

Data Architect

TEK SYSTEMS INDIA (CISCO INDIA)
Nov, 2013 - Jun, 2015 (1 yr 7 months)
    Designed and developed data warehousing platform and optimized ETL platform for business intelligence.

Technical Manager

Indecomm Technology Pvt Ltd
Jun, 2005 - Nov, 2013 (8 yr 5 months)
    Developed data pipeline platform, implemented BI and data warehousing projects, and contributed to Java backend development.

Achievements

  • Designing platforms using microservices, client-server, and multi-tier internet technologies
  • Enhancing CI/CD pipelines
  • Optimization and performance improvement for products
  • Building and managing data engineering and ETL projects
  • Cost optimizations in data warehousing

Major Projects

3 Projects

Cost Optimization & Performance Improvement of Juspay's SaaS-based Payment Processing Platform

    Responsible for optimizing the cost and performance of the Juspay platform; collaborated with cross-functional teams and enhanced product metrics.

Transactional & Fraud Risk Management System, AML & KYC Compliance

    Managed delivery and product development of FICO’s Compliance suite solutions. Defined KPIs and drove customer success.

FOCASS

    Futures & Options Clearing and Settlement System for the National Stock Exchange (NSE) derivative market segment.

Education

  • B.Tech/B.E.

    NIT, Orissa (2004)

AI-interview Questions & Answers

Hi. I have 19 years of experience in data architecture, building big data pipelines using Spark and Hadoop technologies. I have primarily worked in a technical leadership capacity, building teams from scratch, and I have also worked across SaaS platforms, payment systems, full-stack application management, and big data technologies.

Hi. I have 19 years of experience, primarily in technical leadership roles managing medium to large teams. I have worked across various domains, from managing end-to-end SaaS applications and full-stack technologies to managing big data platforms using Spark and Hadoop. I have worked on payment systems, in the BFSI domain, and I have also managed compliance projects.

You can collect all the logs generated from multiple Talend jobs into a Kafka queue. From Kafka, different consumer services can read those logs and upload them into Elasticsearch. You can then build Kibana on top of Elasticsearch to view all the log metrics.
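
A minimal sketch of that consumer stage, assuming the kafka-python and elasticsearch client libraries; the topic name "talend-job-logs" and index name "job-logs" are hypothetical placeholders:

    # Read job logs from Kafka and index them into Elasticsearch so
    # Kibana dashboards can be built on the "job-logs" index.
    import json
    from kafka import KafkaConsumer          # assumes kafka-python
    from elasticsearch import Elasticsearch  # assumes elasticsearch-py

    es = Elasticsearch("http://localhost:9200")
    consumer = KafkaConsumer(
        "talend-job-logs",                   # hypothetical topic name
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    for message in consumer:
        # Each message is one log record emitted by a job.
        es.index(index="job-logs", document=message.value)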

If you have a requirement to process both batch and streaming data, that is a case where you should go with a Lambda-style architecture, which can handle both batch data processing and streaming applications.
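
A short PySpark sketch of that Lambda-style split, assuming the same transformation should serve both layers; the S3 paths and Kafka topic are hypothetical:

    from pyspark.sql import SparkSession, DataFrame
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("lambda-sketch").getOrCreate()

    def enrich(df: DataFrame) -> DataFrame:
        # One shared business transformation for both layers.
        return df.withColumn("processed_at", F.current_timestamp())

    # Batch layer: periodic full recompute over the historical store.
    batch = enrich(spark.read.parquet("s3://bucket/events/"))
    batch.write.mode("overwrite").parquet("s3://bucket/batch-views/")

    # Speed layer: incremental view over the live stream.
    stream = (spark.readStream.format("kafka")
              .option("kafka.bootstrap.servers", "localhost:9092")
              .option("subscribe", "events")
              .load())
    (enrich(stream).writeStream
     .format("parquet")
     .option("path", "s3://bucket/realtime-views/")
     .option("checkpointLocation", "s3://bucket/checkpoints/")
     .start())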

Spark ships with a machine learning library, MLlib, which can be integrated into a Spark pipeline on AWS.
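
For illustration, a minimal MLlib pipeline on a Spark DataFrame; the column names and toy data are hypothetical:

    from pyspark.sql import SparkSession
    from pyspark.ml import Pipeline
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("mllib-sketch").getOrCreate()
    df = spark.createDataFrame(
        [(0.0, 1.1, 0), (2.0, 1.0, 1), (2.2, 1.3, 1), (0.1, 1.2, 0)],
        ["f1", "f2", "label"],
    )

    # Assemble feature columns into a vector, then fit a classifier.
    assembler = VectorAssembler(inputCols=["f1", "f2"], outputCol="features")
    lr = LogisticRegression(featuresCol="features", labelCol="label")
    model = Pipeline(stages=[assembler, lr]).fit(df)
    model.transform(df).select("features", "prediction").show()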

You can have a job that compares the data in RDS against the data in S3: a CDC (change data capture) job with a threshold. You collect the record counts on both sides and compare them; if the count in RDS is higher than the count in S3, the data are not in sync, so you run a CDC job to sync them up. While this happens, you also need to generate logs and metrics through which you can monitor the entire system.
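
A sketch of that count-based reconciliation in PySpark; the JDBC connection details, table name, S3 path, and threshold are all hypothetical:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("rds-s3-recon").getOrCreate()

    # Count rows in the RDS source table via JDBC.
    rds_count = (spark.read.format("jdbc")
                 .option("url", "jdbc:postgresql://rds-host:5432/appdb")
                 .option("dbtable", "public.orders")
                 .option("user", "reporter")
                 .option("password", "...")
                 .load()
                 .count())

    # Count rows in the S3 copy of the same table.
    s3_count = spark.read.parquet("s3://bucket/orders/").count()

    THRESHOLD = 100  # tolerated lag between source and lake
    drift = rds_count - s3_count
    print(f"rds={rds_count} s3={s3_count} drift={drift}")  # emit as metrics
    if drift > THRESHOLD:
        # Source is ahead of the lake: trigger the CDC catch-up job here.
        print("Datasets out of sync; launching CDC sync job")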

You can run an integrity check to ensure that the CSV file sitting in the source system is complete before you write the data into HDFS.
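
One way to implement that check, assuming the source system ships a small control file with the expected checksum and byte size; the paths and control-file format are hypothetical:

    import hashlib
    import subprocess
    from pathlib import Path

    csv_path = Path("/landing/orders.csv")
    ctl_path = Path("/landing/orders.ctl")  # "<md5> <byte_size>" from source

    expected_md5, expected_size = ctl_path.read_text().split()
    actual_md5 = hashlib.md5(csv_path.read_bytes()).hexdigest()

    if actual_md5 == expected_md5 and csv_path.stat().st_size == int(expected_size):
        # File arrived completely; safe to push into HDFS.
        subprocess.run(["hdfs", "dfs", "-put", str(csv_path), "/data/orders/"],
                       check=True)
    else:
        print("File incomplete or corrupt; skipping HDFS load")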

You can design the CI/CD pipeline as a Jenkins pipeline on a Docker-based system. After a change is committed to Git, the source code repository, you trigger the pipeline to build, run unit tests, and create a Docker image. You upload that Docker image to a Docker registry, from which it can be deployed to a target system.
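
A minimal declarative Jenkinsfile sketch of that flow; the build command, image name, registry, and deployment target are hypothetical placeholders:

    pipeline {
        agent any
        stages {
            stage('Build & Unit Test') {
                steps {
                    sh 'mvn clean verify'  // compile and run unit tests
                }
            }
            stage('Build Docker Image') {
                steps {
                    sh 'docker build -t registry.example.com/app:${GIT_COMMIT} .'
                }
            }
            stage('Push & Deploy') {
                steps {
                    sh 'docker push registry.example.com/app:${GIT_COMMIT}'
                    // The target system pulls the pushed image from the registry.
                    sh 'kubectl set image deployment/app app=registry.example.com/app:${GIT_COMMIT}'
                }
            }
        }
    }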

You can use partition pruning so that you are not scanning the entire dataset: you read only the partitions that contain the data you are interested in.
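
A PySpark sketch of partition pruning over a date-partitioned dataset; the path and column names are hypothetical:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("pruning-sketch").getOrCreate()

    # Dataset laid out as .../events/event_date=YYYY-MM-DD/part-*.parquet
    events = spark.read.parquet("s3://bucket/events/")

    # Filtering on the partition column is pushed down, so Spark lists
    # and scans only the files under the matching partition directory.
    day = events.filter(events.event_date == "2024-06-01")
    day.explain()  # plan shows PartitionFilters: [event_date = 2024-06-01]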