Subhajit K.

Data Architect with 12 years of experience

Bengaluru, India

Experience: 12 Years

57600 USD / Year

  • Availability: Immediate


About Me

Hello Everyone,

I am currently working as a Cloud cum Big Data Architect in one of the leading CPG companies. I have worked on multiple cloud platforms such as Azure and Google Cloud. I am also a corporate trainer at Edureka, where I provide training on...

Please do contact me for more details.

Thanks,
Subhajit


Portfolio Projects

Description

It is an enterprise data warehouse project where Oracle Data Integrator (ODI) is used as the ELT tool for transporting data from multiple applications across the various stages of an enterprise data warehouse built on top of Oracle Database.

Key Responsibilities:

  • Understanding the business requirements
  • Design and implementation of the ODI-based applications/features
  • Design of various features of the DWH application using ODI, e.g. reconciliation, error handling, and auto-rerun in case of failures
  • ODI KM customization to implement common components and improve the performance of the application
  • Design and implementation of SCD and aggregated facts using ODI KMs (illustrative sketch below)
  • Bug fixing
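
The SCD handling above is implemented through ODI Knowledge Modules, which generate Oracle SQL. Purely as an illustrative, tool-agnostic sketch of the Type 2 pattern (not the actual ODI KM output; the table and column names are hypothetical), the core logic looks like this in Python:

    from datetime import date

    # Illustrative SCD Type 2 merge. Each dimension row carries
    # effective_from / effective_to dates and an is_current flag.
    def scd2_merge(dimension, incoming, key, tracked, today=None):
        today = today or date.today()
        current = {row[key]: row for row in dimension if row["is_current"]}
        for rec in incoming:
            old = current.get(rec[key])
            if old is None:
                # New business key: open a fresh current row.
                dimension.append({**rec, "effective_from": today,
                                  "effective_to": None, "is_current": True})
            elif any(old[c] != rec[c] for c in tracked):
                # A tracked attribute changed: close the old row, open a new one.
                old["effective_to"] = today
                old["is_current"] = False
                dimension.append({**rec, "effective_from": today,
                                  "effective_to": None, "is_current": True})
            # Unchanged records are left as they are.
        return dimension

    dim = []
    scd2_merge(dim, [{"customer_id": 1, "segment": "Retail"}], "customer_id", ["segment"])
    scd2_merge(dim, [{"customer_id": 1, "segment": "Premium"}], "customer_id", ["segment"])
    # dim now holds a closed "Retail" row and a current "Premium" row.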


Description

Worked as Data Architect to design & build the Global Data Lake for the entire organization. The Data Lake is built on top of Azure Cloud, and Talend ETL is used for data ingestion & processing.

Key Responsibilities:

  • Design & Build Data Lake Platform
  • Design & Build ETL/Data Pipelines to ingest & process data
  • Design Unified Data Model
  • Mentoring & guiding multiple project teams to deliver projects on the Global Data Lake


Description

Worked as Big Data Architect to transform/upgrade a traditional data warehouse to modern technologies. The entire data warehouse was upgraded to Big Data & Cloud technologies, with an on-premise environment set up to process ~800 TB of data. Both batch & streaming jobs were built to cater to various kinds of business needs such as operational reports and real-time analytics.

Key Responsibilities:

  • Design & Build upgrade strategies to migrate data & rebuild applications in the new technologies
  • Design & Build frameworks using Big Data technologies like Spark/Hive SQL to do the ETL work (illustrative sketch below)
  • Worked as Performance Lead to optimize the data loading/processing steps, including tuning of Big Data jobs written in HQL and Spark SQL
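
The ETL frameworks mentioned above were built around Hive QL and Spark SQL. The snippet below is only a minimal PySpark sketch of that style of batch transformation, not the project's actual framework code; the paths, table and column names are assumed for illustration:

    from pyspark.sql import SparkSession, functions as F

    # Minimal batch ETL step: read raw sales data, aggregate it,
    # and write it back as a partitioned Hive table.
    # All paths, tables and columns here are hypothetical.
    spark = (SparkSession.builder
             .appName("daily-sales-aggregation")
             .enableHiveSupport()
             .getOrCreate())

    raw = spark.read.parquet("/data/raw/sales")  # hypothetical landing path

    daily = (raw
             .filter(F.col("order_status") == "COMPLETED")
             .groupBy("order_date", "store_id")
             .agg(F.sum("net_amount").alias("total_sales"),
                  F.countDistinct("order_id").alias("order_count")))

    (daily.write
          .mode("overwrite")
          .partitionBy("order_date")
          .saveAsTable("analytics.daily_store_sales"))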


Description

It is a business transformation project where the client is moving from a legacy system (VisionPlus) to WAY4. The plan was to build the platform of a universal bank, and the same platform would be used to migrate all the banks (both Issuing & Acquiring) to the new platform, i.e. WAY4.

Key Responsibilities:

  • Understood the business requirements & designed the architecture/framework for data migration (DM)
  • Interacted with the client and was responsible for sign-off/approval of the technical deliverables
  • Built the migration platform for the universal bank migration using Talend DI & the Swisscom Migration Suite
  • Prepared technical specifications and led a team of 17 people of different nationalities
  • Prepared detailed estimation of the effort required for the DM stream
  • Developed/rebuilt an existing 3rd-party tool for DM activities
  • Built a Data Lake using Cassandra for analytical purposes; data ingestion is done from the AS400 DB to the Cassandra DB using the Talend ETL tool (illustrative sketch below)
  • Developed an in-house tool to maintain the DM artifacts
  • Provided technical guidance to the Performance Testing team
  • Played an active role in explaining all the designs and solutions to the heads of the partner banks
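
The Cassandra ingestion above was implemented with Talend jobs, which are built in Talend's graphical designer rather than hand-written. As a rough, hypothetical equivalent of one such load step using the DataStax Python driver instead of Talend (the contact point, keyspace, table and columns are all assumptions):

    from cassandra.cluster import Cluster

    # Hypothetical equivalent of a Talend load step: insert extracted
    # account records into a Cassandra table.
    cluster = Cluster(["127.0.0.1"])
    session = cluster.connect("datalake")

    insert = session.prepare(
        "INSERT INTO accounts (account_id, customer_name, balance) VALUES (?, ?, ?)"
    )

    extracted_rows = [            # stand-in for rows extracted from the AS400 source
        ("ACC-001", "Alice", 1200.50),
        ("ACC-002", "Bob", 87.10),
    ]

    for account_id, customer_name, balance in extracted_rows:
        session.execute(insert, (account_id, customer_name, balance))

    cluster.shutdown()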
