Mithu W.

Big Data Developer at Softenger India Pvt Ltd

Pune, India

Experience: 3 Years


5142.85 USD / Year

  • Immediate: Available

About Me

  • 3+ years of IT programming experience across the complete software development life cycle, developing and supporting applications using Java technologies and Hadoop.

  • Currently working with Softenger India pvt ltd as Big Data Engineer.

  • Experience in the installation, development, and implementation of Hadoop.

  • Experience with Apache Hadoop ecosystem components such as HDFS, Hive, Sqoop, Apache Spark, Scala, Apache NiFi, SpagoBI, and the JasperSoft reporting tool.

  • Capable of processing large sets of structured and semi-structured data and supporting systems application architecture.

  • Experience in developing streaming applications using Spark and Scala.

  • Experience in fetching open-source API data and processing it using Apache NiFi (Google News API, MAS FX daily rates).

  • Flexible and adaptable with regard to new technologies and environments.

Portfolio Projects

Regulatory Big Data Reporting Project



Description: The Regulatory Big Data Reporting Project is a data warehouse project of Future Generali Insurance Company, Mumbai, for generating IRDA reports.


➢ This project consists of two primary components:

A) building a data warehouse

B) providing an end-user query/reporting front-end to the data warehouse.


➢ The data warehouse is built on a Hadoop cluster on the Hortonworks platform.

➢ Hive is used as the data warehouse for storing the OLAP data.

➢ Sqoop is used for a one-time full load of the data from the RDBMS system into Hive at month end.

➢ Hive uses Apache Tez as its processing engine.

➢ After loading the data from the RDBMS into Hive, we transform and process it using Spark, build a base table in Hive called BAP, and load the Spark SQL result set into the BAP table.

➢ Once the BAP is prepared, we query it with Presto to generate the reports; the query results are presented to the end user through the Jaspersoft BI tool.

➢ End users request a report for a particular period by entering the input, get the result, and download it as an Excel sheet.

Responsibilities:

➢ Import data from the RDBMS system into Hive using Sqoop.

➢ Develop ETL in Spark SQL for extracting the required data on Hive.

➢ Implement the logic, in Presto SQL, for building the IRDA report from the data stored in Hive.

➢ Develop the Presto queries for the IRDA report on Hive.

➢ Develop the Jaspersoft reports from that data.

➢ Run the ETL monthly, quarterly, and annually.
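The load-transform-report flow above (Sqoop landing raw data, Spark SQL building the BAP base table, Presto querying it for reports) can be sketched in miniature. This is an illustrative Python sketch, not the project code: it uses an in-memory sqlite3 database in place of Hive/Presto, and the `raw_policies` table and its columns are hypothetical.

```python
import sqlite3

# In-memory database stands in for the Hive warehouse (illustration only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stage 1: raw table as Sqoop would land it from the RDBMS (hypothetical schema).
cur.execute("CREATE TABLE raw_policies (policy_id TEXT, segment TEXT, premium REAL)")
cur.executemany(
    "INSERT INTO raw_policies VALUES (?, ?, ?)",
    [("P1", "motor", 1200.0), ("P2", "motor", 800.0), ("P3", "health", 1500.0)],
)

# Stage 2: build the base ("BAP") table from a transformed result set,
# analogous to loading a Spark SQL result back into Hive.
cur.execute(
    """CREATE TABLE bap AS
       SELECT segment, COUNT(*) AS policies, SUM(premium) AS total_premium
       FROM raw_policies GROUP BY segment"""
)

# Stage 3: a reporting query against the base table, as Presto would run it
# behind the Jaspersoft report.
rows = cur.execute("SELECT * FROM bap ORDER BY segment").fetchall()
print(rows)  # → [('health', 1, 1500.0), ('motor', 2, 2000.0)]
```

The point of the intermediate BAP table is that reporting queries hit a pre-aggregated base rather than the raw monthly load.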


Big Data


Apache Hive





Description: Moven is a platform-independent mobile application for banking.


➢ Designed a connector to transfer data securely from Amar Bank's systems to the Moven mobile application.

➢ Amar Bank's back-end system extracts the required data and transfers it as flat files (CSV with pipe delimiter) to Alibaba Cloud.

➢ A NiFi file-watcher job picks up the file from that location, processes it, and dumps the data into MongoDB.

➢ MySQL is used to maintain all configuration tables.

➢ A Python script posts the MongoDB records to the Moven API.

➢ Users receive an email notification with a statistics report after batch completion.

➢ All deployment is done using Docker on Alibaba Cloud.
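The flat-file format above (CSV with a pipe delimiter) can be parsed as in this minimal Python sketch. The field names are hypothetical, and the MongoDB insert and API post steps are represented here by a plain list of dicts.

```python
import csv
import io

# Sample pipe-delimited flat file as the bank's back-end might export it
# (field names are hypothetical).
flat_file = io.StringIO(
    "account_id|name|balance\n"
    "A001|Asha|2500.50\n"
    "A002|Budi|130.00\n"
)

# Parse with the pipe delimiter, as the file-watcher flow would,
# before the records are dumped into MongoDB.
reader = csv.DictReader(flat_file, delimiter="|")
records = [
    {"account_id": row["account_id"], "name": row["name"], "balance": float(row["balance"])}
    for row in reader
]
print(records[0])  # → {'account_id': 'A001', 'name': 'Asha', 'balance': 2500.5}
```

In the real pipeline, each parsed record would go to MongoDB and then be posted to the Moven API; the parsing step is the same either way.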


Big Data


  • Profile Verified

  • Phone Verified

Available Timezones

  • Eastern Daylight [UTC -4]

  • Central Daylight [UTC -5]

  • Mountain Daylight [UTC -6]

  • Pacific Daylight [UTC -7]

  • Eastern European [UTC +2]

  • Eastern European Summer [UTC +3]

  • Greenwich Mean [UTC ±0]

  • Further EET [UTC +3]

  • Australian EDT [UTC +11]

  • Australian CDT [UTC +10:30]

  • Dubai [UTC +4]

  • New Delhi [UTC +5]

  • China (West) [UTC +6]

  • Singapore [UTC +7]

  • Hong Kong (East China) [UTC +8]