Ketan K.

Data Engineer on cloud with BI/ETL experience

Mumbai, India

Experience: 1 Year


30857.2 USD / Year

  • Start Date / Notice Period end date: 2020-02-24


About Me

  • 15+ years of IT experience across multiple technologies.
  • Hands-on experience with Azure Data Factory, developing end-to-end pipelines.
  • Experience with Apache Beam and Google Dataflow pipelines loading data into Bigtable.
  • Spark developer able to process data in real time or in batch.
  • Designed reports in Tableau, Qlik Sense, and Python for customers/business users.
  • Worked as a performance-optimization consultant on the Hadoop platform, tuning Spark programs and Hive and Impala queries.
  • Understands ETL and warehousing design; experienced with ETL tools such as Ab Initio and Informatica BDM.
  • Worked with DevOps tools such as Jenkins and Git, and with workflow coordinators such as Oozie, Autosys, and Airflow.
  • Worked in an Agile environment.


Portfolio Projects

Google Cloud engineer

Company

Google Cloud engineer

Role

DevOps Engineer

Description

Our project involved capturing bonus points for all our users from various discrete sources and third-party channels.

This data arrives as flat files, JSON files, web pages, etc.

We capture, cleanse, process, transform, and load the data on the cloud, and then generate analytics for the business users.

  • Working on Dataflow, BigQuery, Cloud Spanner, Cloud SQL, Bigtable, and Datastore.

  • Implemented a complete migration project with an ETL pipeline from an on-premise Hadoop setup to GCP.

  • Working on setting up Apache Beam pipelines on Google Cloud.
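As a minimal sketch of the ingestion step such a pipeline performs (field names, the composite row key, and the sample schema are illustrative assumptions, not the actual project code), this is the kind of per-record transform a Beam `Map`/`DoFn` step would apply before writing to Bigtable:

```python
import json


def parse_bonus_record(line):
    """Normalize one JSON line from a source feed into a Bigtable-style row.

    Illustrative only: the field names (user_id, channel, points) and the
    user_id#channel row-key convention are assumptions, not project specifics.
    In a Beam pipeline this would run inside something like
    ``beam.Map(parse_bonus_record)`` after a file/topic read.
    """
    rec = json.loads(line)
    return {
        # Composite row key, a common Bigtable pattern for per-user lookups
        "row_key": f"{rec['user_id']}#{rec['channel']}",
        "points": int(rec["points"]),
    }
```

A usage example: `parse_bonus_record('{"user_id": "u1", "channel": "web", "points": "30"}')` yields a dict keyed by `u1#web` with `points` coerced to an integer, so heterogeneous feeds (flat files, JSON, scraped pages) converge on one schema before the load step.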


Azure Data Factory developer

Company

Azure Data Factory developer

Description

Our project involved capturing ServiceNow data.

This data helped reflect the types of incidents, requests, and changes within the organisation.

We had to capture, load, and process the data for analytics.

Our tasks included:

  • Setting up the data pipeline in the Azure environment.
  • Worked with the Azure Data Factory team to help set up the pipeline on Azure.
  • Also worked with Spark, Spark Streaming, Kafka, and Python to process the data on a Hadoop cluster on Azure.
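The aggregation at the heart of such a streaming job can be sketched in plain Python (the `type` field and the incident/request/change values are assumptions based on the ServiceNow context, not actual project code); in the real pipeline this logic would run per micro-batch over a Kafka topic via Spark Streaming:

```python
import json
from collections import Counter


def count_ticket_types(messages):
    """Count ServiceNow-style events by record type for one micro-batch.

    Mirrors the map/reduce-by-key step a Spark Streaming job would apply to
    each batch of Kafka messages; the message schema is illustrative.
    """
    counts = Counter()
    for msg in messages:
        event = json.loads(msg)
        counts[event.get("type", "unknown")] += 1
    return dict(counts)
```

For example, a batch of two incident events and one change event reduces to `{"incident": 2, "change": 1}`, which is the shape the downstream analytics layer would consume.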


Hadoop Performance Specialist

Company

Hadoop Performance Specialist

Description

  • Tuning Hadoop and Spark jobs and Hive queries.
  • Applying best practices to tune the jobs.
  • Working closely with developers to fix the performance of their Spark jobs in the Hadoop environment.
  • Working with end users to tune their Impala queries so that their dashboards load reliably.
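The Spark-side tuning typically comes down to a handful of `spark-submit` levers. This fragment shows the common ones; the values and the job name are placeholders, not the settings used on any particular cluster:

```shell
# Illustrative tuning knobs for a Spark job on YARN; every value below is a
# placeholder to be sized against the actual cluster and data volume.
spark-submit \
  --master yarn \
  --num-executors 10 \
  --executor-cores 4 \
  --executor-memory 8g \
  --conf spark.sql.shuffle.partitions=200 \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  my_job.py
```

Right-sizing executors and shuffle partitions, and switching to Kryo serialization, are standard first steps before deeper fixes such as repartitioning hot stages or rewriting skewed joins.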


Hadoop admin

Company

Hadoop admin

Description

Managing the Hadoop cluster.

Helping users and applications deploy and run their jobs.

Ensuring that the cluster is available and stable.


Tools

Hadoop

ETL developer

Company

ETL developer

Description

  • Working as an ETL developer (Ab Initio).
  • Capturing data from various sources and loading it into the database.
  • Working with end users on their analytical queries.


Tools

Git, Autosys

ETL production support specialist

Company

ETL production support specialist

Description

  • Worked on shell scripting, developing various applications.
  • Supported these jobs in production.
  • 24x7x365 production-support environment.
  • Also developed ETL scripts in Ab Initio.
  • Supported the project from Richmond, VA.
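A minimal sketch of the kind of wrapper script used in round-the-clock job support (the function, job names, and commands here are illustrative, not the actual production scripts): run a job, capture its exit status, and emit a line the on-call rotation can alert on.

```shell
#!/bin/sh
# Illustrative job wrapper for production support: runs the given command
# and reports success or failure with the exit code. Job names and commands
# are placeholders.
run_job() {
  job_name=$1
  shift
  if "$@"; then
    echo "$job_name OK"
  else
    echo "$job_name FAILED (exit $?)"
  fi
}

run_job sample_load true      # stands in for a real ETL command
run_job sample_extract false  # stands in for a failing job
```

In a real setup the `echo` lines would feed a log that the scheduler (e.g. Autosys) or a monitoring tail watches for `FAILED` markers.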
