About Me
Seeking a position that offers professional growth and lets me apply my skills while being resourceful, innovative, and flexible, always keeping the organization's goals as the priority. 9+ years of total experience in Big Data and ETL. -> 3 year...
Skills
Positions
Portfolio Projects
Description
· Involved in transferring files from RDBMS to the Hadoop filesystem using Sqoop.
· Involved in writing queries in HiveQL and Oracle SQL.
· Created Hive table schemas; handled data loading and report generation.
· Used windowing and analytical functions, partitioning, and bucketing in Hive for better performance.
· Good understanding of importing and exporting data between HDFS and relational database systems using Sqoop.
· Involved in key design decisions for the data warehouse migration to the Hadoop ecosystem.
· DB languages: SQL.
· BI tools: Pentaho.
· Experience in requirement gathering, development, testing, deployment, and post-deployment support.
· Knowledge of AWS.
· Tuned existing Spark code from time to time based on business requirements.
· Wrote SQL queries to give the business insight into the data.
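The Hive windowing and partitioning work above follows a common analytic-query pattern. A minimal sketch of that pattern, run here against SQLite purely for illustration (the real queries were HiveQL over partitioned, bucketed tables; the `sales` table and its columns are hypothetical, not from the original project):

```python
import sqlite3

# In-memory database standing in for a Hive table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("east", 250.0), ("west", 75.0), ("west", 300.0)],
)

# Windowing/analytical function: rank rows within each region by amount,
# the same RANK() OVER (PARTITION BY ... ORDER BY ...) shape used in HiveQL.
rows = conn.execute(
    """
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    """
).fetchall()

for region, amount, rnk in rows:
    print(region, amount, rnk)
```

In Hive, `PARTITION BY` in the window clause is independent of the table's physical partitioning; physical partitioning and bucketing reduce the data each query scans, which is where the performance gain mentioned above comes from.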
Description
Led the Bow migration project.
• Automated volumetric analysis through the Pentaho ETL tool.
• Worked on the completely new TSY and DRD flows and delivered the projects on time; received appreciation from senior management.
• Prepared scripts to update OLA times for the DDA project to meet the downstream SLA.
• Worked closely with the business team and provided solutions to upstream and downstream stakeholders.
• Worked on procedures to send data to downstream systems.
• Documented all requirements and procedure code changes.
• Trained new team members on the project.
• Provided end-to-end production support.
Description
Earlier, the sales team used to download files from a particular website, add more attributes to them with various logic, and generate reports for management.
• I automated this entire process with ETL: a cron job downloads the file into a specific folder; a second job then picks up the file and processes it (adding new columns, filling them via table lookups, generating a new file in a particular location, emailing that file to management, sending logs to the ETL team, and recording run times in the database after each transformation finishes).
• The whole process is scheduled on the server and running reliably, saving two hours of manual work.
• Until then, the team was using PHP code (cURL), which took a long time to download the files and insert the static data for all the hotel suppliers.
• Management wanted the entire process automated to avoid dependency on PHP developers.
• An ETL job downloads/FTPs files to a particular location; another job then picks up each file, processes it, and emails management a summary of the count of changes in the tables. It also copies the relevant tables from one database to another, with error handling in place.
• I automated 60+ supplier jobs, all running without manual intervention.
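The core of the flow described above is: read the downloaded file, enrich it via table lookups, and write the result for distribution. A simplified stand-in in Python, assuming CSV input and a dictionary in place of the real database lookup tables (file layout, column names, and `SUPPLIER_LOOKUP` are hypothetical; the real pipeline was built with Pentaho jobs):

```python
import csv
import io

# Hypothetical lookup table standing in for the database tables the real
# job joined against (supplier_id -> supplier name).
SUPPLIER_LOOKUP = {"S1": "Acme Hotels", "S2": "Globex Resorts"}

def process_file(infile, outfile):
    """Read the downloaded file, add looked-up columns, write enriched output."""
    reader = csv.DictReader(infile)
    fieldnames = reader.fieldnames + ["supplier_name"]
    writer = csv.DictWriter(outfile, fieldnames=fieldnames)
    writer.writeheader()
    count = 0
    for row in reader:
        # Look up on tables and fill the new column.
        row["supplier_name"] = SUPPLIER_LOOKUP.get(row["supplier_id"], "UNKNOWN")
        writer.writerow(row)
        count += 1
    # Row count, reported in the summary email in the real job.
    return count

raw = "supplier_id,rate\nS1,120\nS2,95\n"
out = io.StringIO()
processed = process_file(io.StringIO(raw), out)
print(processed)
print(out.getvalue())
```

The scheduling side is the cron job mentioned above: one entry to fetch the file into the landing folder, another to run the processing job, with each step logging to the ETL team so failures surface immediately.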
Description
Worked on creating new stored procedures and functions as part of the report extract.
• Worked on performance tuning and query optimization.
• Made code changes to existing packages in the database.
• Wrote PL/SQL scripts to select, update, or delete data in database tables.
• Collaborated with various departments and performed research on data processing functions.
• Worked on ETL mapping changes for performance improvement.
• Worked closely with customers and business users to meet business requirements.
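The data-maintenance scripts above boil down to parameterized DML. A minimal sketch of the pattern, shown here in Python over SQLite for illustration (the real scripts were PL/SQL against Oracle; the `orders` table and statuses are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "NEW"), (2, "NEW"), (3, "SHIPPED")])

# Parameterized update, mirroring a script that flips status in bulk.
updated = conn.execute(
    "UPDATE orders SET status = ? WHERE status = ?", ("PROCESSED", "NEW")
).rowcount

# Parameterized delete for rows no longer needed downstream.
deleted = conn.execute(
    "DELETE FROM orders WHERE status = ?", ("SHIPPED",)
).rowcount

remaining = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(updated, deleted, remaining)
```

Binding values as parameters rather than concatenating them into the SQL is the same discipline bind variables give PL/SQL: plan reuse and protection against injection.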