About Me
Enthusiastic and well-organized Java and Big Data engineer with 6+ years of experience, practicing the agile business model and developing solutions for clients using Java, Python, SQL, Scala, shell, and Big Data technologies. I really enjoy programming and am always open to adopting technologies as they evolve. I've always been identified as a quick learner. Strong team player with good analytical, people-management, and communication skills and the ability to think and act independently.
Skills
Positions
Portfolio Projects
Description
Project Description:
- North bound: fetching raw JSON data from the home analytics tool Impact (RabbitMQ); the data is cached in an internal Kafka cluster.
- South bound: Kafka integration using Kafka Connect connectors to deliver data to the customer data lake (DL) and S3.
- Enabling a multi-tenant telemetry data cache and data visualization through Spark SQL in Zeppelin (see the sketch below).
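As a rough illustration of the visualization leg, the sketch below (Scala; broker address, topic, and view names are assumptions, not the project's real values) batch-reads the cached telemetry from Kafka and registers it as a temp view that a Zeppelin note can query with Spark SQL.

```scala
// Sketch only: broker address, topic, and view names are assumed.
import org.apache.spark.sql.SparkSession

object TelemetryCacheView {
  def main(args: Array[String]): Unit = {
    // In a Zeppelin note the Spark interpreter already provides `spark`.
    val spark = SparkSession.builder().appName("telemetry-view").getOrCreate()

    // Batch-read the internal Kafka telemetry cache.
    val raw = spark.read
      .format("kafka")
      .option("kafka.bootstrap.servers", "kafka:9092")
      .option("subscribe", "telemetry")
      .load()

    // Kafka values arrive as bytes; expose them as JSON strings plus timestamp.
    val events = raw.selectExpr("CAST(value AS STRING) AS json", "timestamp")

    // Register a temp view so Zeppelin paragraphs can query it with %sql.
    events.createOrReplaceTempView("telemetry_events")
    spark.sql("SELECT COUNT(*) AS recent_events FROM telemetry_events " +
      "WHERE timestamp > current_timestamp() - INTERVAL 1 HOUR").show()
  }
}
```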
Responsibilities:
- Requirement gathering: developing with a Java framework and following agile methodology to deliver and integrate user stories as Helm deployments in an OpenStack cloud environment
- Automation: automated the above requirements as system tests using the Radish framework with Python, run through the Jenkins nightly build
- Monitoring: log monitoring using Fluentd and Kibana (EFK stack)
Description
Project Description:
- Helogics assists nurses in the nursing management of each patient through ongoing monitoring and evaluation of the effectiveness of the patient's treatment plan.
- A paper report documents actions taken and/or results, compared against the 9 essential steps of wound healing.
- This new effort and design will provide an opportunity to more effectively provide patient care and adhere to compliance requirements for patient-related health care data.
- The tool will employ patient treatment order data captured in the Helogics source data architecture, as identified by BI, and downloaded to a separate Helogics IT-defined data architecture to be accessed within each clinic on a daily basis. Electronic mediums, such as an iPod or Android tablet, laptop, or desktop, will be used in order to eliminate the paper reporting currently in place.
- The tool will provide, for all instances of a patient's wound(s), a pictorial, visual data representation of weekly wound healing progress and treatments, where wounds are measured and compared against the Helogics 9 steps of healing and wound care treatments are guided and directed by the same, along with captured provider notes, enhancing case management and safeguarding patient data.
Responsibilities:
- BI integration layer: batch job downloading (.csv, .pdf, .json) files from an SFTP location using Spark with Scala and persisting the data to Apache Cassandra (see the sketch after this list)
- Service layer: Spring RESTful web services; understanding the business requirements and driving the application development of the project, sharing and documenting JSON request/response contracts, unit testing in the DEV environment, and moving builds to the QA AWS (Linux) machine
- Build: DEV and QA environments (AWS Linux server)
- Participated in agile ceremonies: sprint planning, daily scrum meetings, and sprint retrospectives
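A minimal sketch of the BI integration layer described above, assuming the files have already been staged from the SFTP location and that the DataStax spark-cassandra-connector is on the classpath; the host, path, keyspace, and table names are placeholders, not the project's real ones.

```scala
// Sketch only: host, staging path, keyspace, and table names are assumed.
import org.apache.spark.sql.SparkSession

object SftpCsvToCassandra {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sftp-csv-to-cassandra")
      .config("spark.cassandra.connection.host", "cassandra-host") // assumed host
      .getOrCreate()

    // CSV files are assumed to have been pulled from SFTP to this staging path.
    val orders = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/data/staging/treatment_orders/*.csv")

    // Persist to a Cassandra keyspace/table (names are placeholders).
    orders.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "helogics", "table" -> "treatment_orders"))
      .mode("append")
      .save()
  }
}
```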
Description
Project Description:
- Reads data from an SFTP server
- Parses the data and validates the business logic
- Posts the data to Kafka
- Spark reads the data from the Kafka topic and performs transformations (see the streaming sketch after this list)
- Saves the transformed data in Hive
- Spark batch job: reads data from Hive, aggregates it, and saves the results in HBase
- Reads data from the Phoenix client (HBase) and passes it to the UI through a REST call
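The streaming leg of this pipeline (Kafka topic -> transformations -> Hive) could look roughly like the Scala sketch below; the topic, schema, and table names are assumptions, not the actual project values.

```scala
// Sketch only: topic, event schema, and Hive table names are assumed.
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object KafkaToHiveStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-to-hive")
      .enableHiveSupport()
      .getOrCreate()

    // Assumed event schema; the real one comes from the SFTP feed.
    val schema = new StructType()
      .add("orderId", StringType)
      .add("amount", DoubleType)
      .add("eventTime", TimestampType)

    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "kafka:9092")
      .option("subscribe", "orders")                      // hypothetical topic
      .load()
      .select(from_json(col("value").cast("string"), schema).as("e"))
      .select("e.*")
      .withColumn("ingestDate", to_date(col("eventTime"))) // simple transformation

    // foreachBatch appends each micro-batch to a Hive table.
    events.writeStream
      .foreachBatch { (batch: DataFrame, _: Long) =>
        batch.write.mode("append").saveAsTable("staging.orders")
      }
      .option("checkpointLocation", "/tmp/checkpoints/orders")
      .start()
      .awaitTermination()
  }
}
```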
Role:
- Scheduling tasks in Jira, assigning them to junior developers, and guiding them accordingly
- Writing streaming and batch jobs using Spark
- Writing aggregation jobs in Spark (see the aggregation sketch below)
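And a sketch of the batch aggregation leg (Hive -> aggregate -> HBase), assuming the phoenix-spark connector handles the HBase write; the table, column, and ZooKeeper names are placeholders.

```scala
// Sketch only: table, column, and ZooKeeper quorum names are assumed,
// and the phoenix-spark connector is assumed to be on the classpath.
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.functions._

object HiveToHBaseAggregate {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("hive-to-hbase-aggregate")
      .enableHiveSupport()
      .getOrCreate()

    // Daily aggregate over the staging table written by the streaming job.
    val daily = spark.table("staging.orders")
      .groupBy(col("ingestDate"), col("orderId"))
      .agg(sum("amount").as("totalAmount"), count("*").as("eventCount"))

    // Write to a Phoenix-managed HBase table; the UI reads it back through
    // the Phoenix client behind a REST endpoint.
    daily.write
      .format("org.apache.phoenix.spark")
      .mode(SaveMode.Overwrite)
      .options(Map("table" -> "DAILY_ORDER_SUMMARY", "zkUrl" -> "zk-host:2181"))
      .save()
  }
}
```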
Description
We've developed innovations that help customers run their home even more efficiently, like Hive, an app which allows people to control their heating and hot water from their mobile, giving them greater convenience and control. These days, the distinctive blue British Gas vans are out doing much more than fixing boilers; they're providing innovative solutions to make our customers' homes more energy efficient, helping them save energy and money.
Description
This application is used to maintain data about the warehouse and the history of Inbound, Outbound, Service Order Management, and Repair-Maintenance. The project includes creation of Item Code, Alternative Item, Warehouse Detail, Sub-Inventory, Item Organization Mapping, Inbound, Outbound, Service Order Management, and Repair-Maintenance records for each order.