Kapil Y.

Data Engineer

Jaipur, India

Experience: 2 Years

20018.4 USD / Year

  • Notice Period: Days


About Me

Data Warehousing Expert with 1.5+ years of experience in architecting, developing, and optimizing Data Warehousing and Modelling projects. Well-versed with Azure Synapse and Azure Data Factory...


Portfolio Projects

Description

Role: ETL Developer

Environment: Azure Data Lake, Azure Synapse Analytics, Power BI, and SQL

Key Responsibilities:

  • Involved in requirement gathering and data analysis.
  • Worked with the infra team to set up a Self-Hosted Integration Runtime on the client's machine.
  • Created Azure Data Factory pipelines implementing multiple load patterns (truncate-and-load, incremental load, and insert/update load) for 250+ tables, ingesting data from the on-premises SQL Server into ADLS and Azure Synapse.
  • Transformed the client's Qlik scripts into SQL.
  • Created views and stored procedures implementing the supplied business logic, and provided the views to the Power BI team to build the required dashboards.
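The three load patterns named above (truncate-and-load, incremental load, and insert/update load) can be sketched in plain Python against in-memory tables; the function names, the `modified` watermark column, and the dict-based "tables" are illustrative assumptions, not the actual project code.

```python
# Minimal sketches of three common warehouse load patterns, using
# in-memory dicts keyed by primary key. All names are illustrative.

def truncate_and_load(target: dict, source: dict) -> None:
    """Full refresh: wipe the target and reload everything."""
    target.clear()
    target.update(source)

def incremental_load(target: dict, source: dict, watermark: int) -> int:
    """Append only rows newer than the last processed watermark.
    Each row carries a monotonically increasing 'modified' stamp;
    returns the new watermark to persist for the next run."""
    new_watermark = watermark
    for key, row in source.items():
        if row["modified"] > watermark:
            target[key] = row
            new_watermark = max(new_watermark, row["modified"])
    return new_watermark

def upsert_load(target: dict, source: dict) -> None:
    """Insert new keys, update existing ones (the insert/update pattern)."""
    for key, row in source.items():
        target[key] = dict(target.get(key, {}), **row)
```

In the real pipelines these patterns would map to a truncate-then-copy activity, a watermark-filtered copy, and a staged MERGE respectively; the sketch only shows the row-level logic.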


Description

Role: Data Engineer

Environment: Azure Data Lake, Azure Synapse Analytics, Azure Cosmos DB, and SQL

Key Responsibilities:

  • Created Azure Data Factory pipelines implementing multiple load patterns (truncate-and-load, incremental load, and insert/update load) for 200+ tables acquired from Salesforce.
  • Created stored procedures to compute KPI logic in Synapse and store the results in tables, which load into Cosmos DB once data loading and aggregation complete.
  • Validated every KPI with the business team across all business units.
  • Built the final table containing all KPIs for a given business unit, feeding the Cosmos DB containers.
  • Automated monitoring and troubleshooting of failed pipelines using a Logic App and SQL stored procedures.
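The failed-pipeline automation in the last bullet can be sketched as a simple poll-and-retry loop; the run statuses, retry limit, and `rerun` callback below are assumptions for illustration, not the actual Logic App configuration.

```python
# Hypothetical sketch of failed-pipeline monitoring: retry failed runs
# up to a limit and collect alerts for anything that stays failed.

MAX_RETRIES = 2  # assumed retry budget, not from the original project

def monitor(runs: list, rerun) -> list:
    """runs: [{'name': str, 'status': 'Succeeded'|'Failed', 'retries': int}].
    rerun(name) -> bool re-triggers a pipeline run. Returns alert
    messages for runs still failed after MAX_RETRIES attempts."""
    alerts = []
    for run in runs:
        while run["status"] == "Failed" and run["retries"] < MAX_RETRIES:
            run["retries"] += 1
            if rerun(run["name"]):
                run["status"] = "Succeeded"
        if run["status"] == "Failed":
            alerts.append(
                f"pipeline {run['name']} failed after {run['retries']} retries"
            )
    return alerts
```

In practice the alert list would be handed to a Logic App (email/Teams notification) and the retry would call the ADF run API; the sketch keeps only the control flow.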


