Chinmayee N.

Senior Data Engineer

Pune, India

Experience: 5 Years

Rate: 11476.2 USD / Year

Availability: Immediate


About Me

An incisive professional with 4.5+ years of total experience across various domains, including 2+ years of Big Data testing experience with Hadoop, HDFS, MapReduce, Sqoop, Pig, Hive, HBase, Spark, and Scala. Good exposure to Big Data/Hadoop testing, with knowledge of Hive, HBase & Solr.


Portfolio Projects

Description

Responsibilities:

  • Test plan preparation and review.
  • Created estimations (L0/L1/L2), consolidated resource planning, test plans, and test closure reports.
  • Involved in frontend (application) as well as backend (ETL) testing.
  • Involved in ETL (Extract >> Transform >> Load) testing as well as E2E testing (see the sketch after this list).
  • Good exposure to Big Data/Hadoop testing, with knowledge of Hive, HBase & Solr.
  • Reviewed SRS and FRD documents and prepared test data as per requirements.
  • Prepared test cases against the System Requirement Specification or change requests.
  • Involved in test execution, sending daily and weekly status reports to clients.
  • Worked with components such as input file, output file, and partition/repartition components.
  • Performed defect-management activities such as preparing defect logs and updating them after regression testing.
  • Prepared daily and weekly status reports.
  • Reported test results and bugs found during testing effectively.
  • Checked the full data flow of the orders per the system test plan.
  • Cross-checked the database to verify the correct credentials of the orders.
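
A minimal Spark/Scala sketch of the backend (ETL) validation referenced above: reconciling a loaded target table against its source. This is illustrative only; the database and table names (src_db.orders, tgt_db.orders) are hypothetical placeholders, not taken from the project.

    import org.apache.spark.sql.SparkSession

    // ETL-testing sketch: reconcile a source table against its loaded target.
    // All table names below are hypothetical placeholders.
    object EtlReconcile {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("etl-reconcile")
          .enableHiveSupport()
          .getOrCreate()

        val src = spark.table("src_db.orders")
        val tgt = spark.table("tgt_db.orders")

        // Row-count check between source and target.
        val srcCount = src.count()
        val tgtCount = tgt.count()
        println(s"source=$srcCount target=$tgtCount match=${srcCount == tgtCount}")

        // Content check: rows present on one side but not the other
        // (exceptAll requires Spark 2.4+ and identical schemas).
        println(s"missingInTarget=${src.exceptAll(tgt).count()}")
        println(s"extraInTarget=${tgt.exceptAll(src).count()}")

        spark.stop()
      }
    }

The exceptAll checks catch content mismatches that a bare row count would miss, which is the usual gap in count-only ETL verification.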


Description

MetLife is a web-based software application used to process and settle claims raised by registered customers. The application allows faster processing, tracking, and settlement of claims. When a registered customer has an incident in which the product is damaged, they lodge a claim. If the product is damaged and can be repaired, details of the incident are entered in different screens of MetLife to verify the extent of the damage and the amount that can be paid. Once these details are entered, the product can be sent for repair to authorized garages from the MetLife network database.



Description

Responsibilities:

  • Wrote Pig scripts to clean up the ingested data and created partitions for the daily data.
  • Imported data using Sqoop into Hive and HBase from an existing SQL Server.
  • Supported code/design analysis, strategy development, and project planning.
  • Created reports for the BI team, using Sqoop to export data into HDFS and Hive.
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Involved in requirement analysis, design, and development.
  • Exported and imported data into HDFS, HBase, and Hive using Sqoop.
  • Involved in creating Hive tables, loading them with data, and writing Hive queries that run internally as MapReduce jobs.
  • Worked closely with the business and analytics teams in gathering system requirements.
  • Analyzed and developed Spark programs using Scala APIs to compare the performance of Spark with Hive and SQL (see the sketch after this list).
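
A hedged Spark/Scala sketch of the Spark-vs-Hive comparison in the last bullet: the same aggregation is expressed once as HiveQL through spark.sql and once with the DataFrame API, and both are timed. The table and columns (logs_db.weblogs, user_id) are illustrative assumptions, not project artifacts.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.count

    // Compare the same aggregation executed via HiveQL and via the
    // DataFrame API. logs_db.weblogs and user_id are placeholders.
    object SparkVsHive {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("spark-vs-hive")
          .enableHiveSupport()
          .getOrCreate()

        // Tiny timing helper; count() forces execution of each plan.
        def time[T](label: String)(f: => T): T = {
          val t0 = System.nanoTime()
          val result = f
          println(f"$label took ${(System.nanoTime() - t0) / 1e9}%.2f s")
          result
        }

        time("HiveQL") {
          spark.sql("SELECT user_id, COUNT(*) AS events " +
                    "FROM logs_db.weblogs GROUP BY user_id").count()
        }

        time("DataFrame API") {
          spark.table("logs_db.weblogs")
            .groupBy("user_id")
            .agg(count("*").as("events"))
            .count()
        }

        spark.stop()
      }
    }

Both paths compile to Spark execution plans here; a comparison against Hive-on-MapReduce itself would run the HiveQL through the Hive CLI or Beeline instead.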


Description

  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Migrated ETL processes from Oracle to Hive to test easier data manipulation.
  • Responsible for developing a data pipeline using Sqoop, MapReduce, and Hive to extract data from weblogs and store the results for downstream consumption.
  • Worked with HiveQL on big data logs to perform trend analysis of user behavior across various online modules (see the sketch after this list).
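
A sketch of the HiveQL trend analysis described in the last bullet, submitted through Spark for consistency with the earlier examples. The table logs_db.weblogs and its columns (event_date, module, user_id) are assumed for illustration only.

    import org.apache.spark.sql.SparkSession

    // HiveQL trend analysis over weblog data: daily distinct users per
    // online module. Table and column names are hypothetical.
    object WeblogTrend {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("weblog-trend")
          .enableHiveSupport()
          .getOrCreate()

        spark.sql("""
          SELECT event_date,
                 module,
                 COUNT(DISTINCT user_id) AS daily_users
          FROM logs_db.weblogs
          GROUP BY event_date, module
          ORDER BY event_date, module
        """).show(20, truncate = false)

        spark.stop()
      }
    }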
