Adarsh N.

Expert Data Engineer with 6.5+ years of experience in Data Modelling/Warehousing, ETL/ELT & Big Data

Noida, India

Experience: 6 Years

60055.3 USD / Year

  • Notice Period: 60 Days

About Me

  • Working in Data Analytics with more than 6.5 years in the IT industry; experienced in the Banking, Credit Services, Insurance, Restaurant, and Retail Analytics domains
  • In technology, rich experience working in –
      ...
    • Possess experience in the configuration, implementation, and testing of Transact, SDS, Tallyman, and CRMNext. These are Experian Decision Analytics products used to manage business flows, credit risk, fraud prevention, commercial banking collections, debt recovery, and insurance flows.
    • Involved in developing requirements/change requests, coding, unit testing, maintenance, and support.
    • Expert in architecting databases/data warehouses, proposing solution models, and performance tuning.
    • Developed end-to-end enterprise analytical solutions for clients such as Microsoft, Barclays, Experian, Burger King, and TGI Fridays.
    • Hardworking, customer-facing, detail-oriented, a fast learner, and a team player.

Portfolio Projects

Microsoft Marketplace

Company

Microsoft Marketplace

Description

Project Title       : Microsoft Marketplace

Client              : Microsoft (Redmond, U.S.)

Products Used       : SSMS, SSIS, ADF v2 (Azure Data Factory), Azure platform, Spark, PowerBI

Database Used       : SQL Server 2016, AzureDB, Hive

Languages Used      : SQL, T-SQL, Scala, DAX, HQL

Project Description :

Microsoft Marketplace is a well-known Microsoft product. On the client's requirement, we migrated the complete system, originally developed on legacy SQL Server, to Hadoop, and then pushed the data into AzureDB (PaaS) for client visibility. Hadoop was chosen for warehousing and data processing because of the data volume it can handle. ADF v2 (Azure Data Factory) and SSIS were used for pipeline creation, Spark-Scala and T-SQL for the transformation logic, Hive for warehousing, and Parquet/ORC formats for the blobs. QA dashboards showing the different KPI/QA results were created using PowerBI.
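
As an illustration of the migration flow above, here is a minimal Spark-Scala sketch of one extract-transform-load step: pulling a legacy SQL Server table over JDBC, applying a Spark SQL cleanup, and landing the result as Parquet in a Hive table. The host, database, table, and column names are hypothetical placeholders, not the project's actual objects.

    import org.apache.spark.sql.SparkSession

    object MarketplaceMigrationSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("MarketplaceMigrationSketch")
          .enableHiveSupport()
          .getOrCreate()

        // Extract: read one legacy SQL Server table over JDBC
        // (host, database, and table names are placeholders).
        val orders = spark.read
          .format("jdbc")
          .option("url", "jdbc:sqlserver://legacy-host:1433;databaseName=Marketplace")
          .option("dbtable", "dbo.Orders")
          .option("user", sys.env("SQL_USER"))
          .option("password", sys.env("SQL_PASSWORD"))
          .load()

        // Transform: an example cleanup step expressed in Spark SQL.
        orders.createOrReplaceTempView("orders")
        val curated = spark.sql(
          """SELECT OrderId, CustomerId, CAST(OrderDate AS DATE) AS order_date, Amount
            |FROM orders
            |WHERE Amount IS NOT NULL""".stripMargin)

        // Load: land the result as Parquet blobs and expose it as a Hive table.
        curated.write
          .mode("overwrite")
          .format("parquet")
          .saveAsTable("marketplace.orders_curated")

        spark.stop()
      }
    }

In the actual project, steps like this were orchestrated through ADF v2 and SSIS pipelines rather than run standalone.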

Roles and Responsibilities -

  • Understanding the business requirements and extracting the mechanism to implement the business strategy.
  • Created ETL/ELT pipelines using Azure Data Factory (ADF v2) and SSIS.
  • Created different pipelines to extract data based on different logics written in Spark-Scala or SQL stored procedures, and pushed the results as blobs into HDFS.
  • Performed indexing, performance tuning, and query/code optimization for different jobs.
  • Responsible for creating different reports in PowerBI as per the business requirements.
  • Automated different ETL processes and reports to avoid manual intervention.
  • Preparing documents prior to the development phase, such as the estimation sheet, query log, and low-level design document (LLD).

TMM (Total Menu Management)

Company

TMM (Total Menu Management)

Role

Backend Developer

Description

Project Description :

Perkins and Bojangles are the TMM clients that took complete pricing/financial recommendations produced via Fishbowl. These clients provided data in the form of different zip and .csv files. We loaded the data from these files into the database and then processed it into different POS and Fact tables, on which the analytics team performed their analysis and, in the end, recommended the pricing of the customer's products and other services.
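
The project's ingestion itself was built with SSIS packages; purely as an illustration of the unpack-and-stage step described above, the following Scala sketch extracts the .csv entries of a delivered zip archive into a staging directory. The file paths are hypothetical placeholders.

    import java.io.{File, FileInputStream, FileOutputStream}
    import java.util.zip.ZipInputStream

    object TmmFeedExtractorSketch {
      // Unpack every .csv entry of `archive` into `targetDir`; returns the extracted files.
      def extractCsvs(archive: File, targetDir: File): List[File] = {
        targetDir.mkdirs()
        val zis = new ZipInputStream(new FileInputStream(archive))
        try {
          Iterator
            .continually(zis.getNextEntry)
            .takeWhile(_ != null)
            .filter(e => !e.isDirectory && e.getName.toLowerCase.endsWith(".csv"))
            .map { entry =>
              // Flatten any folder structure inside the archive.
              val out = new File(targetDir, new File(entry.getName).getName)
              val fos = new FileOutputStream(out)
              try {
                val buf = new Array[Byte](8192)
                Iterator
                  .continually(zis.read(buf))
                  .takeWhile(_ != -1)
                  .foreach(n => fos.write(buf, 0, n))
              } finally fos.close()
              out
            }
            .toList
        } finally zis.close()
      }

      def main(args: Array[String]): Unit = {
        // Placeholder paths for the client drop location and the staging area.
        val staged = extractCsvs(new File("/data/tmm/feed.zip"), new File("/data/tmm/staging"))
        staged.foreach(f => println(s"staged ${f.getName}"))
        // From here the staged files would be bulk-loaded into staging tables
        // and processed into the POS and Fact tables described above.
      }
    }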

   Roles and Responsibilities -

  • Understanding the business requirements and extracting the mechanism to implement the business strategy.
  • Performing daily ETL tasks and implementing complex SSIS packages, such as reading data from zip files and processing it as per the business requirements.
  • Developed different database jobs to import data from different sources.
  • Performed indexing, tuning, and query optimization for different jobs.
  • Responsible for creating different reports as per the business requirements.
  • Created different SSRS (SQL Server Reporting Services) reports to check data quality and make sure the information is consistent.
  • Automated different ETL processes and reports to avoid manual intervention.
  • Preparing documents prior to the development phase, such as the estimation sheet, query log, and low-level design document (LLD).

Tools

JIRA, SSMS

UNO – One System

Company

UNO – One System

Role

Backend Developer

Description

Project Description :

UNO – One System belongs to the Insurance domain. MLI has 18 different systems, such as MyMoney, Omnidocs, and MyFlow, that they wanted integrated with each other, with their business flows automated. The CRMNext product acted as one roof that helped integrate these 18 systems. As a result, all the manual and asynchronous work was replaced with different real-time and batch integrations, which led to a great increase in their business and to cost savings by increasing throughput.

Roles and Responsibilities -

  • Understanding the business requirements and extracting the mechanism to implement the business strategy.
  • Performing real-time and batch integrations with different systems.
  • Developed different database jobs to import policy, customer, and agent data into the system.
  • Performed indexing, tuning, and query optimization for different jobs.
  • Preparing documents prior to the development phase, such as the estimation sheet, query log, and low-level design document (LLD).
  • Responsible for creating different reports as per the business requirements.
  • Provided onsite support to MLI after delivery.
  • Created different ETL packages to successfully import boot-up/incremental data from different systems (see the sketch after this list).
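
The boot-up/incremental imports mentioned above were implemented as database jobs and ETL packages; the sketch below only illustrates the generic high-water-mark pattern behind an incremental load, written in Spark-Scala for consistency with the rest of this profile. All table, column, and connection names are hypothetical.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, max}

    object PolicyIncrementalLoadSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("PolicyIncrementalLoadSketch")
          .enableHiveSupport()
          .getOrCreate()

        // High-water mark: the latest modification timestamp already in the target.
        // Assumes a boot-up (full) load has already populated uno.policies.
        val watermark = spark.table("uno.policies")
          .agg(max(col("modified_at")))
          .head()
          .getTimestamp(0)

        // Pull only the rows changed in the source system since the last run.
        val delta = spark.read
          .format("jdbc")
          .option("url", "jdbc:sqlserver://source-host:1433;databaseName=MyMoney")
          .option("dbtable", "dbo.Policy")
          .option("user", sys.env("SQL_USER"))
          .option("password", sys.env("SQL_PASSWORD"))
          .load()
          .filter(col("modified_at") > watermark)

        // Append the delta to the integrated target table.
        delta.write.mode("append").saveAsTable("uno.policies")
        spark.stop()
      }
    }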

Tools

JIRA, SSMS