About Me
Experienced Cloud Data Engineer with Fortune 500 banking and telecom clients, startup organizations, and freelance work. Contributed to open-source development projects.
Skills
[58 skill entries, each listed with 1–7 years of experience and a proficiency rating from Beginner to Expert; the skill names were not captured in this extract]
Positions
Portfolio Projects
Description
• Implemented new ingestion pipelines and features in Spark to extract meaningful value from semi-structured data
• Tuned and optimized performance of existing Spark data pipelines
• Migrated multiple data sources via AWS DMS
• Set up a data warehouse on Amazon Redshift, creating Redshift clusters and running data-analysis queries
• Migrated previous on-premises RDBMS data sources into AWS storage (S3)
• Performed analysis on data by implementing various machine learning and deep learning algorithms
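The first bullet above concerns extracting value from semi-structured data. As an illustrative sketch only (the actual pipelines ran in Spark, and the record shape below is invented), the core step of flattening nested records into tabular columns can be shown in plain Python:

```python
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested dict into a single-level dict,
    joining nested keys with `sep` (e.g. {"user": {"id": 1}} -> {"user_id": 1})."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

# Hypothetical semi-structured input; not from the actual project.
raw = '{"user": {"id": 7, "plan": "postpaid"}, "bytes": 1024}'
flat = flatten(json.loads(raw))
# flat == {"user_id": 7, "user_plan": "postpaid", "bytes": 1024}
```

In Spark itself the equivalent would typically be expressed with nested-column selects rather than Python recursion, but the mapping from nested keys to flat column names is the same idea.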
Description
Description: Automate the current IVR process to remove the dependency on customer-care executives
Tech Stack: Confluent Kafka, Hive, Spark, Python, Machine Learning, IBM RTC, Jenkins
• Used Kafka for real-time data ingestion, paired with Spark Streaming for large-scale data processing
• Integrated multiple data sources via Kafka
• Played a key role in transformation, writing Spark scripts for data transformation, mapping, standardization, and characterization
• Deployed various Spark transformation scripts to extract meaning and value from structured and semi-structured data
• Performed analysis on data by implementing various machine learning algorithms in Python
• Improved model performance through hyperparameter tuning with GridSearchCV
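The last bullet names GridSearchCV for hyperparameter tuning. A minimal sketch of that scikit-learn pattern (the estimator, grid, and dataset here are stand-ins, not the project's actual setup):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)  # toy dataset standing in for project data

# Candidate hyperparameters; the real project's model and grid are not
# documented, so this grid is illustrative only.
param_grid = {"C": [0.1, 1.0, 10.0]}

search = GridSearchCV(
    LogisticRegression(max_iter=1000),  # base estimator to tune
    param_grid,
    cv=5,                 # 5-fold cross-validation per candidate
    scoring="accuracy",   # metric used to pick the best candidate
)
search.fit(X, y)
best_c = search.best_params_["C"]  # best hyperparameter found by the grid search
```

GridSearchCV exhaustively cross-validates every grid combination, which is exactly the "deploy the best hyperparameter" workflow the bullet describes.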
Description
Description: Create a data lake to serve as the single, comprehensive source of information for improved decision making
Tech Stack: Sqoop, PySpark, Python, AWS(EMR)
• Set up the development environment on AWS EMR
• Oversaw data extraction from Charging System nodes (Oracle Exadata) to HDFS using Sqoop
• Played a key role in writing the Python scripts
• Performed data analysis with Spark scripts to extract meaning and value from structured data
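The Sqoop extraction step above is normally driven by a CLI invocation. As a hedged sketch, a Python helper can assemble such a command (the JDBC host, table, and target directory below are placeholders, not the project's real values; the command is built but not executed):

```python
# Build (but do not run) a Sqoop import command of the kind used to pull
# RDBMS tables into HDFS. All connection details are placeholders.
def sqoop_import_cmd(jdbc_url, table, target_dir, mappers=4):
    """Return a Sqoop import command as an argv list suitable for subprocess.run."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,       # JDBC connection string for the source DB
        "--table", table,            # source table to import
        "--target-dir", target_dir,  # HDFS destination directory
        "--num-mappers", str(mappers),  # parallel map tasks for the import
    ]

cmd = sqoop_import_cmd(
    "jdbc:oracle:thin:@exadata-host:1521/ORCL",  # placeholder Exadata endpoint
    "CHARGING_EVENTS",                           # placeholder table name
    "/data/raw/charging_events",                 # placeholder HDFS path
)
```

Wrapping the invocation this way keeps connection details in one place and makes the import step easy to call from the orchestration scripts the bullets mention.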
Description
Description: Constant tariff updates create a persistent need to track their impact on ROI, which determines end-user satisfaction
Tech Stack: Docker, Sqoop, Hive, PySpark, Airflow, Tableau, AWS Redshift
• Set up the development environment on Docker
• Played a key role in writing Sqoop ingestion scripts from a MySQL database to HDFS
• Performed analysis with Spark scripts to calculate traditional and ad-hoc KPIs from structured and unstructured data
• Managed data ingestion using Sqoop, along with maintenance, cleaning, and manipulation of data using Spark scripts
• Exported processed data to AWS Redshift and prepared Tableau dashboards for management
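The KPI bullets above can be illustrated with a toy computation (the real pipelines ran these aggregations in Spark over Redshift-bound data; the records and the ARPU metric below are illustrative placeholders, not the project's actual KPIs):

```python
# Hypothetical usage records standing in for the real telecom data.
records = [
    {"user": "a", "plan": "prepaid", "revenue": 120.0},
    {"user": "b", "plan": "prepaid", "revenue": 80.0},
    {"user": "c", "plan": "postpaid", "revenue": 300.0},
]

def arpu(rows):
    """Average revenue per user over a list of usage records."""
    return sum(r["revenue"] for r in rows) / len(rows)

overall = arpu(records)                                         # all plans
prepaid = arpu([r for r in records if r["plan"] == "prepaid"])  # per-segment KPI
```

In the Spark version the same logic is a `groupBy("plan").agg(avg("revenue"))`, with the results landing in Redshift for the Tableau dashboards.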
Verifications
- Profile Verified
- Phone Verified
Preferred Language
- English - Fluent
- Hindi - Native/Bilingual
Available Timezones