About Me
Overall 8+ years of experience in developing and delivering distributed applications, including experience working with large-scale Big Data platforms and solutions. 2+ years of experience in developing and supporting applications and helping or...
Skills
Portfolio Projects
Description
The DHL data lake maintains structured and unstructured data from various business units such as DGF, NORAM, PCT, GEOCODING, and IoT sources. There are over 80 MS SQL databases in MS Azure that need to be available on the data lake. The structure of each database is similar (only small specificities for some countries), and the data lake tables should mirror the MS SQL tables. There are significant differences in size and frequency of changes between the databases.
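To illustrate the mirroring approach, here is a minimal PySpark sketch that copies one table from one of the Azure SQL databases into the data lake as Parquet. The server name, credentials, table, and target path are placeholders; the real pipeline handled 80+ databases with differing sizes and change frequencies.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mssql-mirror").getOrCreate()

# Placeholder connection details for one of the Azure SQL databases.
jdbc_url = "jdbc:sqlserver://example.database.windows.net:1433;databaseName=dgf_db"
props = {
    "user": "etl_user",
    "password": "***",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

def mirror_table(table: str, target_path: str) -> None:
    """Read a SQL Server table over JDBC and overwrite its mirror in the data lake."""
    df = spark.read.jdbc(url=jdbc_url, table=table, properties=props)
    df.write.mode("overwrite").parquet(target_path)

# Example: mirror a single table; the real solution iterated over all databases and tables.
mirror_table("dbo.shipments", "/datalake/dgf_db/shipments")
```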
Description
Centralized Transaction Framework (CTF) is an interface that integrates various kinds of customers who acquire different sets of services with the agencies that provide those services. It is a controlling framework to process and monitor the transactional requests raised by several user groups, and it runs in a distributed environment that regulates every component involved in the framework.
Description
Ezy2Ship specializes in logistics and fulfillment services for businesses, helping them prepare and manage domestic and international shipments faster and more easily. Customers can print shipment labels and documents, schedule a collection, drop-off, or fixed schedule at any post office counter, create an online address book, view shipment history, track shipments online, and generate reports, all within a few clicks from the comfort of their desk or home office. Ezy2Ship also allows customers to book and pay for deliveries and schedule courier collections.
Description
- Worked on a VA (virtual assistant) platform that forwards requests from the desired channels to the bot application.
- Worked on creating Dialogflow agents that act as the Natural Language Processing (NLP) layer.
- Worked on the backend application, integrating it with the chatbot using NodeJS and Java.
- Integrated multiple channels such as Facebook Messenger, Slack, and Alexa.
- Developed an analytics tool in Python that reports chatbot-related statistical data and analysis (see the sketch after this list).
- Implemented human takeover, chat history download as PDF, and feedback responses.
- Deployed to the cloud using Google Cloud Functions and managed the applications on a separate Dockerized container platform.
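As an illustration of the kind of reporting the Python analytics tool produced, here is a minimal sketch that aggregates chatbot conversation logs into daily message counts, top intents, and an average feedback score. The column names (`timestamp`, `intent`, `feedback_score`) and the CSV input are assumptions for the example, not the project's actual schema.

```python
import pandas as pd

def summarize_chat_logs(csv_path: str) -> dict:
    """Aggregate chatbot logs into simple statistics for reporting."""
    # Assumed columns: timestamp, intent, feedback_score (1-5, may be missing).
    logs = pd.read_csv(csv_path, parse_dates=["timestamp"])

    daily_messages = logs.set_index("timestamp").resample("D").size()
    intent_counts = logs["intent"].value_counts()
    avg_feedback = logs["feedback_score"].dropna().mean()

    return {
        "daily_messages": daily_messages,
        "top_intents": intent_counts.head(10),
        "avg_feedback": avg_feedback,
    }

if __name__ == "__main__":
    report = summarize_chat_logs("chat_logs.csv")  # hypothetical input file
    print(report["top_intents"])
    print(f"Average feedback score: {report['avg_feedback']:.2f}")
```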
Description
- Worked on loading data from Aptean.
- Worked on the calculation logic, supporting multiple formulas and conditions.
- Implemented support for different units of measure across all fields.
- Worked on multi-currency support and conversion in the application (see the sketch after this list).
- Deployed many services in Docker based on a microservices architecture.
- Worked on performance, improving memory usage and reducing API response time to 3 milliseconds.
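As a rough illustration of the unit-of-measure and currency handling described above, here is a minimal Python sketch using static conversion tables. The factors, rates, and function names are illustrative assumptions only; in practice the rates would come from configuration or an external service rather than hard-coded tables.

```python
from decimal import Decimal

# Illustrative conversion tables; real factors and FX rates would be configurable.
UOM_FACTORS = {("kg", "g"): Decimal("1000"), ("g", "kg"): Decimal("0.001")}
FX_RATES = {("USD", "EUR"): Decimal("0.92"), ("EUR", "USD"): Decimal("1.09")}

def convert_quantity(value: Decimal, from_uom: str, to_uom: str) -> Decimal:
    """Convert a quantity between units of measure using a static factor table."""
    if from_uom == to_uom:
        return value
    return value * UOM_FACTORS[(from_uom, to_uom)]

def convert_amount(amount: Decimal, from_ccy: str, to_ccy: str) -> Decimal:
    """Convert a monetary amount between currencies using a static rate table."""
    if from_ccy == to_ccy:
        return amount
    return (amount * FX_RATES[(from_ccy, to_ccy)]).quantize(Decimal("0.01"))

if __name__ == "__main__":
    print(convert_quantity(Decimal("2.5"), "kg", "g"))   # 2500
    print(convert_amount(Decimal("100"), "USD", "EUR"))  # 92.00
```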
Description
EzyCommerce (EZC) is an automated e-commerce fulfillment offering aimed at Small and Medium Enterprises (Merchants) in Asia Pacific to help them sell online with increased productivity. As Merchants establish an online presence and increase the number of online sales channels, EZC helps them scale by outsourcing the end-to-end fulfillment process to a trusted, reliable, and cost-efficient service provider like SingPost. Using EZC, Merchants can manage orders from multiple online sales channels on a single platform and get access to world-class fulfillment capabilities.
Connect to online sales channels: Merchants can connect seamlessly with their existing order capture systems such as marketplaces and shopping carts, or use semi-automated methods such as file uploads to submit orders to the ezyCommerce platform.
Responsibilities
- Developed many Camel routes used to send messages to different systems.
- Handled inbound/outbound fulfilment orders received from online sales channels.
- Worked on eBay, Shopify, and Amazon API services to connect online sales channels (see the sketch after this list).
- Involved in creating Blueprint and Spring-style Camel routes.
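The channel integrations themselves were built as Camel routes; purely to illustrate what connecting to one marketplace can look like, here is a minimal Python sketch that pulls recently updated orders from Shopify's REST Admin API. The shop domain, access token, and API version are placeholders, not the project's actual configuration.

```python
import requests

SHOP = "example-shop"    # placeholder shop subdomain
TOKEN = "shpat_xxx"      # placeholder access token
API_VERSION = "2024-01"  # assumed API version

def fetch_open_orders(updated_since: str) -> list[dict]:
    """Pull open orders updated after a given ISO timestamp from the Shopify REST Admin API."""
    url = f"https://{SHOP}.myshopify.com/admin/api/{API_VERSION}/orders.json"
    resp = requests.get(
        url,
        headers={"X-Shopify-Access-Token": TOKEN},
        params={"status": "open", "updated_at_min": updated_since, "limit": 250},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["orders"]

if __name__ == "__main__":
    for order in fetch_open_orders("2024-01-01T00:00:00Z"):
        # Forward each order to the fulfillment pipeline (here we just print an identifier).
        print(order["id"], order.get("financial_status"))
```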
Description
Roles and Responsibilities:
- On call 24/7 as an L3 engineer.
- Worked with a team of 5 engineers to build in-house capabilities to administer and monitor thousands of Hadoop nodes.
- Instilled a DevOps culture across a 100+ engineer team and moved from a 3-month release cycle to 1-click releases.
- Identified data product use cases, data sources, and governance requirements by talking to business stakeholders.
- Developed data applications using Spark as the processing layer to deliver analytical Power BI reports to clients.
- Designed and implemented a robust data pipeline for real-time analytics, stream processing, and machine learning with the MapR platform and Spark Structured Streaming (see the streaming sketch after this list).
- Performed data ingestion to the DL platform from many data sources such as MySQL, Oracle, and MSSQL.
- Involved in both real-time and batch data processing.
- Worked with the data science team to implement churn prediction models with SparkML.
- Automated deployment workflows with Jenkins and Docker.
- Created and scheduled Spark jobs in Airflow (see the DAG sketch after this list).
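As an illustration of the Spark Structured Streaming pipeline mentioned above, here is a minimal PySpark sketch that reads events from a Kafka-compatible topic (MapR Event Store exposes the Kafka API), aggregates them in one-minute windows, and writes results to the data lake. The broker, topic, schema, and paths are placeholders, not the project's actual configuration.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("realtime-analytics").getOrCreate()

# Assumed event schema; real payloads differed per source.
event_schema = StructType([
    StructField("event_time", TimestampType()),
    StructField("device_id", StringType()),
    StructField("metric", DoubleType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "events")                      # placeholder topic
       .load())

events = (raw.select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
          .select("e.*"))

# Windowed aggregation: average metric per device per minute.
agg = (events
       .withWatermark("event_time", "5 minutes")
       .groupBy(F.window("event_time", "1 minute"), "device_id")
       .agg(F.avg("metric").alias("avg_metric")))

query = (agg.writeStream
         .outputMode("append")
         .format("parquet")
         .option("path", "/datalake/realtime/metrics")         # placeholder output path
         .option("checkpointLocation", "/checkpoints/metrics")  # placeholder checkpoint
         .start())

query.awaitTermination()
```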
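And as an illustration of scheduling Spark jobs in Airflow, here is a minimal DAG sketch using SparkSubmitOperator from the apache-airflow-providers-apache-spark package. The DAG id, schedule, connection id, and application path are placeholders rather than the production setup.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

default_args = {
    "owner": "data-engineering",
    "retries": 1,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="daily_batch_ingestion",     # placeholder DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",      # run nightly at 02:00
    catchup=False,
    default_args=default_args,
) as dag:

    ingest = SparkSubmitOperator(
        task_id="ingest_sql_sources",
        application="/opt/jobs/ingest_sql_sources.py",  # placeholder Spark application
        conn_id="spark_default",
        conf={"spark.executor.memory": "4g"},
    )
```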