About Me
- Data Analytics professional with more than 6.5 years in the IT industry, experienced in the Banking, Credit Services, Insurance, Restaurant, and Retail Analytics domains
- In technology, have rich hands-on experience with –
- On-premise databases – SQL Server 2008 R2 / 2012 / 2014 / 2016 / 2017, Oracle, MySQL
- Cloud databases/data warehouses – AzureDB, Hive, Redshift, Kusto (Azure Data Explorer)
- ETL tools – SSIS, ADF (Azure Data Factory)
- Reporting tools – SSRS, Power BI
- Processing engine – Spark
- File system – HDFS
- Experienced in the configuration, implementation, and testing of Transact, SDS, Tallyman, and CRMNext. These are Experian Decision Analytics products used to manage business flows, credit risk, fraud prevention, commercial banking collections, debt recovery, and insurance flows.
- Involved in developing requirements/change requests, coding, unit testing, maintenance, and support.
- Expert in architecting databases/data warehouses, proposing solution models, and performance tuning.
- Developed end-to-end enterprise analytics solutions for clients such as Microsoft, Barclays, Experian, Burger King, and TGI Fridays.
- Hardworking, customer-facing, detail-oriented, a fast learner, and a team player.
Skills
Web Development
Software Engineering
Data & Analytics
Database
Programming Language
Operating System
Others
Portfolio Projects
Company
Microsoft Marketplace
Description
Project Title : Microsoft Marketplace
Client : Microsoft (Redmond, U.S.)
Products Used : SSMS, SSIS, ADF v2 (Azure Data Factory), Azure platform, Spark, Power BI
Databases Used : SQL Server 2016, AzureDB, Hive
Languages Used : SQL, T-SQL, Scala, DAX, HQL
Project Description :
Microsoft Marketplace is a well-known Microsoft product. At the client's request, we migrated the complete system, originally built on SQL Server (legacy), to Hadoop, and then pushed the data into AzureDB (PaaS) for client visibility. Hadoop was chosen for warehousing and data processing because of the data volumes it can handle. ADF v2 (Azure Data Factory) and SSIS were used to create pipelines, Spark-Scala and T-SQL for transformation logic, Hive for warehousing, and Parquet/ORC formats for blobs. QA dashboards showing the various KPI/QA results were built in Power BI.
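For illustration only (not the actual project code), a minimal Spark-Scala sketch of one such extract-transform-load step: a legacy SQL Server table is read over JDBC, aggregated in Spark, and landed as partitioned Parquet in the Hive warehouse. All connection details, table names, and column names are hypothetical placeholders.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.functions._

object MarketplaceExtract {
  def main(args: Array[String]): Unit = {
    // Hive support so the transformed data is queryable from the warehouse
    val spark = SparkSession.builder()
      .appName("marketplace-extract")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical JDBC source: a table in the legacy SQL Server system
    val orders = spark.read
      .format("jdbc")
      .option("url", "jdbc:sqlserver://legacy-host:1433;databaseName=Marketplace")
      .option("dbtable", "dbo.Orders")
      .option("user", sys.env("SQL_USER"))        // credentials from environment / Key Vault
      .option("password", sys.env("SQL_PASSWORD"))
      .load()

    // Example transformation: daily aggregates per product
    val daily = orders
      .withColumn("order_date", to_date(col("OrderTimestamp")))
      .groupBy("order_date", "ProductId")
      .agg(count("*").as("order_count"), sum("Amount").as("revenue"))

    // Land the result as partitioned Parquet and expose it as a Hive table
    daily.write
      .mode(SaveMode.Overwrite)
      .partitionBy("order_date")
      .format("parquet")
      .saveAsTable("marketplace_dw.daily_product_sales")

    spark.stop()
  }
}
```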
Roles and Responsibilities -
- Understood the business requirements and worked out the mechanism to implement the business strategy.
- Created ETL/ELT pipelines using Azure Data Factory (ADF v2) and SSIS.
- Created various pipelines to extract data based on logic written in Spark-Scala or SQL stored procedures, and pushed the output as blobs into HDFS.
- Performed indexing, performance tuning, and query/code optimization for different jobs.
- Created various Power BI reports as per the business requirements (one such QA check is sketched after this list).
- Automated different ETL processes and reports to avoid manual intervention.
- Prepared documents prior to the development phase, such as the estimation sheet, query log, and low-level design document (LLD).
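A hedged sketch of the kind of QA check that feeds the Power BI QA dashboard mentioned above: it reconciles row counts between a staging table and its warehouse counterpart. The table and column names are hypothetical, not the project's actual objects.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object QaRowCountCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("qa-rowcount-check")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical staging and warehouse tables loaded by the pipelines
    val staging   = spark.table("marketplace_stg.orders")
    val warehouse = spark.table("marketplace_dw.orders")

    // Reconcile row counts per load date; any difference becomes a QA KPI
    val qa = staging.groupBy("load_date").agg(count("*").as("staging_rows"))
      .join(warehouse.groupBy("load_date").agg(count("*").as("dw_rows")),
            Seq("load_date"), "full_outer")
      .na.fill(0, Seq("staging_rows", "dw_rows"))
      .withColumn("diff", col("staging_rows") - col("dw_rows"))

    // Persist the result for the Power BI QA dashboard to read
    qa.write.mode("overwrite").saveAsTable("marketplace_qa.rowcount_check")

    spark.stop()
  }
}
```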
Skills
Azure Blob, Azure Data Factory, Azure Key Vault, Hive, Power BI, Scala, Apache Spark, SQL, SQL Server Integration Services (SSIS)
Tools
GitHub, IntelliJ IDEA, SSMS
Company
TMM (Total Menu Management)
Role
Backend Developer
Description
Project Description :
Perkins and Bojangles are TMM clients that received complete pricing/financial recommendations through Fishbowl. These clients provided their data as various zip and .csv files. We loaded the data from these files into the database and processed it into different POS and fact tables, which the analytics team used for its analysis before finally recommending product pricing and other services to the customer.
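For illustration only: the production load was built with SSIS packages and T-SQL, but the general shape of the CSV-to-fact-table flow can be sketched in Spark-Scala as below. The file paths, column names, and table names are hypothetical placeholders.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.functions._

object TmmCsvLoad {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("tmm-csv-load").getOrCreate()

    // Hypothetical drop folder holding the client's extracted .csv files
    val sales = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/data/tmm/incoming/pos_sales_*.csv")

    // Shape the raw POS rows into a fact-style aggregate per store, item, and day
    val factSales = sales
      .withColumn("business_date", to_date(col("BusinessDate"), "MM/dd/yyyy"))
      .groupBy("StoreId", "ItemId", "business_date")
      .agg(sum("Quantity").as("units_sold"), sum("NetSales").as("net_sales"))

    // Load into the reporting database over JDBC (connection details are placeholders)
    factSales.write
      .mode(SaveMode.Append)
      .format("jdbc")
      .option("url", "jdbc:sqlserver://tmm-db:1433;databaseName=TMM")
      .option("dbtable", "dbo.FactPosSales")
      .option("user", sys.env("SQL_USER"))
      .option("password", sys.env("SQL_PASSWORD"))
      .save()

    spark.stop()
  }
}
```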
Roles and Responsibilities -
- Understood the business requirements and worked out the mechanism to implement the business strategy.
- Performed daily ETL tasks and implemented complex SSIS packages, such as reading data from zip files and processing it as per the business requirements.
- Developed various database jobs to import data from different sources.
- Performed indexing, tuning, and query optimization for different jobs.
- Responsible for creating various reports as per the business requirements.
- Created various SSRS (SQL Server Reporting Services) reports to check data quality and ensure the information is consistent.
- Automated different ETL processes and reports to avoid manual intervention.
- Prepared documents prior to the development phase, such as the estimation sheet, query log, and low-level design document (LLD).
Company
UNO – One System
Role
Backend Developer
Description
Project Description :
UNO – One System belongs to the insurance domain. MLI had 18 different systems, such as MyMoney, Omnidocs, and MyFlow, that they wanted integrated with each other, with the business flows between them automated. The CRMNext product acted as a single roof that helped integrate these 18 systems. As a result, all the manual and asynchronous work was replaced with various real-time and batch integrations, which led to a significant increase in business and to cost savings through higher throughput.
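The batch side of such integrations can be pictured with the illustrative Scala/JDBC sketch below: an incremental (watermark-driven) import of changed policy records into a staging table. The production work was done through CRMNext and SQL Server jobs; every connection string, table, and column name here is a hypothetical placeholder.

```scala
import java.sql.{DriverManager, Timestamp}

object PolicyIncrementalImport {
  def main(args: Array[String]): Unit = {
    val sourceUrl = "jdbc:sqlserver://policy-admin;databaseName=PolicyAdmin"
    val targetUrl = "jdbc:sqlserver://crm-db;databaseName=CRM"

    val src = DriverManager.getConnection(sourceUrl, sys.env("SQL_USER"), sys.env("SQL_PASSWORD"))
    val tgt = DriverManager.getConnection(targetUrl, sys.env("SQL_USER"), sys.env("SQL_PASSWORD"))
    try {
      // 1. Read the high-water mark left by the previous batch run
      val wmRs = tgt.createStatement()
        .executeQuery("SELECT MAX(LoadedUpTo) FROM dbo.LoadControl WHERE Feed = 'Policy'")
      wmRs.next()
      val watermark = Option(wmRs.getTimestamp(1)).getOrElse(new Timestamp(0L))

      // 2. Pull only rows modified since the watermark from the source system
      val select = src.prepareStatement(
        "SELECT PolicyId, CustomerId, AgentId, Status, ModifiedOn FROM dbo.Policy WHERE ModifiedOn > ?")
      select.setTimestamp(1, watermark)
      val rows = select.executeQuery()

      // 3. Upsert each changed row into the target staging table
      val upsert = tgt.prepareStatement(
        """MERGE dbo.PolicyStage AS t
          |USING (SELECT ? AS PolicyId, ? AS CustomerId, ? AS AgentId, ? AS Status, ? AS ModifiedOn) AS s
          |ON t.PolicyId = s.PolicyId
          |WHEN MATCHED THEN UPDATE SET CustomerId = s.CustomerId, AgentId = s.AgentId,
          |                             Status = s.Status, ModifiedOn = s.ModifiedOn
          |WHEN NOT MATCHED THEN INSERT (PolicyId, CustomerId, AgentId, Status, ModifiedOn)
          |                      VALUES (s.PolicyId, s.CustomerId, s.AgentId, s.Status, s.ModifiedOn);""".stripMargin)
      var latest = watermark
      while (rows.next()) {
        upsert.setLong(1, rows.getLong("PolicyId"))
        upsert.setLong(2, rows.getLong("CustomerId"))
        upsert.setLong(3, rows.getLong("AgentId"))
        upsert.setString(4, rows.getString("Status"))
        upsert.setTimestamp(5, rows.getTimestamp("ModifiedOn"))
        upsert.executeUpdate()
        if (rows.getTimestamp("ModifiedOn").after(latest)) latest = rows.getTimestamp("ModifiedOn")
      }

      // 4. Advance the watermark so the next run only picks up newer changes
      val advance = tgt.prepareStatement("UPDATE dbo.LoadControl SET LoadedUpTo = ? WHERE Feed = 'Policy'")
      advance.setTimestamp(1, latest)
      advance.executeUpdate()
    } finally { src.close(); tgt.close() }
  }
}
```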
Roles and Responsibilities -
- Understood the business requirements and worked out the mechanism to implement the business strategy.
- Performed real-time and batch integrations with different systems.
- Developed various database jobs to import policy, customer, and agent data into the system.
- Performed indexing, tuning, and query optimization for different jobs.
- Prepared documents prior to the development phase, such as the estimation sheet, query log, and low-level design document (LLD).
- Responsible for creating various reports as per the business requirements.
- Provided onsite support to MLI after delivery.
- Created various ETL packages to import initial (boot-up) and incremental data from different systems.