About Me
11 years of experience in the IT industry with exposure to various projects in different domains. Expertise in SDLC phases including requirements gathering, design, development and testing. Expertise in Big Data and Java/J2EE technologies. Expertise in draw...
Skills
Portfolio Projects
Description
EdgeCache is designed specifically to meet the requirements of Charter Communications LLC and incorporates a lambda architecture. The system contains a batch layer that imports data from Oracle into Hive, and a speed layer that analyses user information and channel subscriptions in real time and saves the data into HBase.
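A minimal sketch of the speed-layer idea described above (consuming subscription events as they arrive and persisting them to HBase) could look like the following Spark Structured Streaming job in Scala. The Kafka broker and topic, event schema, column family and HBase table name are illustrative assumptions, not the actual EdgeCache design.

```scala
import org.apache.spark.sql.{DataFrame, Row, SparkSession}
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
import org.apache.hadoop.hbase.util.Bytes

object SpeedLayerSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("speed-layer-sketch").getOrCreate()

    // Assumed shape of a channel-subscription event.
    val schema = new StructType()
      .add("userId", StringType)
      .add("channelId", StringType)
      .add("eventTime", StringType)

    // Speed layer: consume subscription events in near real time from an assumed Kafka topic.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")      // assumed broker
      .option("subscribe", "subscription-events")             // assumed topic
      .load()
      .select(from_json(col("value").cast("string"), schema).as("e"))
      .select("e.*")

    // Persist each micro-batch into HBase with the plain client API.
    val query = events.writeStream
      .option("checkpointLocation", "hdfs:///checkpoints/speed-layer")   // assumed path
      .foreachBatch { (batch: DataFrame, _: Long) =>
        batch.rdd.foreachPartition { rows: Iterator[Row] =>
          val conn  = ConnectionFactory.createConnection(HBaseConfiguration.create())
          val table = conn.getTable(TableName.valueOf("user_subscriptions"))  // assumed table
          rows.foreach { r =>
            val put = new Put(Bytes.toBytes(r.getAs[String]("userId")))
            put.addColumn(Bytes.toBytes("s"), Bytes.toBytes(r.getAs[String]("channelId")),
                          Bytes.toBytes(r.getAs[String]("eventTime")))
            table.put(put)
          }
          table.close()
          conn.close()
        }
      }
      .start()

    query.awaitTermination()
  }
}
```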
Description
The existing implementation of the system had performance issues retrieving time-series sensor data at the HBase layer and was unable to meet the SLA timelines. The new approach introduced data aggregation to improve retrieval time at the HBase layer. With the aggregation approach, users can generate daily, weekly, monthly, yearly, etc. reports while meeting the SLA timelines.
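The aggregation scheme is not spelled out above, so the following Scala/Spark batch job is only a sketch of the general idea: pre-computing daily/weekly/monthly/yearly rollups so that report queries read a handful of aggregate rows instead of scanning the raw time series. The table names, schema and metrics are assumptions, and the Parquet rollup store here stands in for the HBase layer the project actually used.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object SensorRollupSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("sensor-rollup-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // Raw readings: (deviceId, eventTime, value) -- assumed schema and table name.
    val raw = spark.table("raw_sensor_readings")

    // One pre-aggregated row per device per bucket; report queries hit these
    // rollups instead of the full time series.
    Seq("day" -> "dd", "week" -> "week", "month" -> "mon", "year" -> "year").foreach {
      case (grain, truncFmt) =>
        raw.groupBy(col("deviceId"), date_trunc(truncFmt, col("eventTime")).as("bucket"))
          .agg(
            avg("value").as("avg_value"),
            min("value").as("min_value"),
            max("value").as("max_value"),
            count(lit(1)).as("samples"))
          .withColumn("grain", lit(grain))
          .write.mode("append")
          .partitionBy("grain")
          .parquet("hdfs:///warehouse/sensor_rollups")   // assumed rollup store location
    }
  }
}
```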
Description
Ace Hardware plans to migrate a set of tables from Teradata to HDFS. The migrated tables will be stored as Hive/Impala tables and will be queried and joined by the Webfocus module to produce the required reports. The migration activity includes a scheduling mechanism to pull the data at regular intervals, and incremental-load logic and compression techniques are incorporated to improve the overall performance of the system.
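As an illustration of the incremental-load and compression points, a hypothetical Spark job in Scala might read only rows newer than the last loaded watermark over JDBC and append them as Snappy-compressed Parquet into the Hive table. Connection details, table and column names are assumptions; the real project may have used Sqoop or another scheduler-driven tool.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.functions._

object IncrementalTeradataPull {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("teradata-incremental-pull")
      .enableHiveSupport()
      .getOrCreate()

    // Watermark of the last successful load, read from the target Hive table
    // (assumes at least one prior load exists).
    val lastLoaded = spark.table("sales_orders")
      .agg(max("updated_at")).first().getTimestamp(0)

    // Pull only the delta from Teradata over JDBC; the predicate is pushed down
    // to the source via the subquery.
    val delta = spark.read
      .format("jdbc")
      .option("url", "jdbc:teradata://td-host/DATABASE=retail")   // assumed URL
      .option("driver", "com.teradata.jdbc.TeraDriver")
      .option("dbtable", s"(SELECT * FROM sales_orders WHERE updated_at > '$lastLoaded') t")
      .option("user", sys.env("TD_USER"))
      .option("password", sys.env("TD_PASSWORD"))
      .load()

    // Append the delta as Snappy-compressed Parquet so Hive/Impala reads stay fast.
    delta.write
      .mode(SaveMode.Append)
      .format("parquet")
      .option("compression", "snappy")
      .saveAsTable("sales_orders")
  }
}
```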
Description
The data ingestion framework is a module integrated into an existing Cognizant product. The product leverages the capabilities of SparkSQL to migrate data from various types of sources to various types of destinations. The module also gives the user the flexibility to apply transformations such as sorting, aggregation and filters before the source data is transferred to the destination, and it optimizes the whole pipeline by analysing and merging intermediate transformation phases together where possible.
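The module's actual API is not shown above; a minimal sketch of a configurable transformation chain on top of the DataFrame API could look like the code below. Because the steps are folded into a single logical plan, Spark's Catalyst optimizer can merge or reorder adjacent phases before execution, which is the effect the optimization step described above aims for. All names are illustrative.

```scala
import org.apache.spark.sql.{Column, DataFrame, SparkSession}
import org.apache.spark.sql.functions._

// Hypothetical transformation steps a user could configure between source and destination.
sealed trait Step
case class Filter(condition: Column)                       extends Step
case class Sort(columns: Seq[String])                      extends Step
case class Aggregate(keys: Seq[String], aggs: Seq[Column]) extends Step

object PipelineSketch {
  // Apply the configured steps in order; Spark builds one logical plan and
  // optimizes the phases together before anything executes.
  def run(source: DataFrame, steps: Seq[Step]): DataFrame =
    steps.foldLeft(source) {
      case (df, Filter(cond))          => df.filter(cond)
      case (df, Sort(cols))            => df.sort(cols.map(col): _*)
      case (df, Aggregate(keys, aggs)) => df.groupBy(keys.map(col): _*).agg(aggs.head, aggs.tail: _*)
    }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ingestion-sketch").getOrCreate()
    val src = spark.read.parquet("hdfs:///staging/orders")   // assumed source path
    val out = run(src, Seq(
      Filter(col("status") === "SHIPPED"),
      Aggregate(Seq("region"), Seq(sum("amount").as("total_amount")))
    ))
    out.write.mode("overwrite").parquet("hdfs:///curated/orders_by_region")   // assumed destination
  }
}
```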
Description
Support Project
- Analyze the existing application and provide support to the clients
- Provide on-call support for high-priority issues and guide them to resolution
- Work as part of the ATG and Content Administration modules
- Daily catch-up with the clients to gather the top-priority issues
- Mentor subordinates through completion of their assigned tasks
- Identify value-adds that can be made to the existing production system
Description
- Analyze and gather the requirements for the product implementation
- Design the proposed solution for the data migration framework
- Prepare the unit test cases for the workflows
- Plan and coordinate the local and offshore teams on design and development
- Prepare the class and sequence diagrams for the identified use cases
- Plan and estimate the tasks from the gathered requirements document
- Mentor subordinates through completion of their assigned tasks
Description
British Gas currently stores its enterprise data warehouse on Teradata systems. The Teradata system has maxed out and is unable to meet SLAs. The proposed project will offload all the data from Teradata to Hadoop in order to meet the SLAs, reduce data latency, provide near-real-time data availability, avoid a costly appliance upgrade, and reduce dependency on licensed/costly products.
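A rough sketch of the bulk-offload step, under assumed connection details and table/column names: reading a large Teradata table over JDBC in parallel partitions with Spark and landing it on HDFS as partitioned Parquet. This is illustrative only; the project may equally have used Sqoop or a vendor connector.

```scala
import org.apache.spark.sql.SparkSession

object TeradataOffload {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("teradata-offload").getOrCreate()

    // Read the source table in 32 parallel JDBC partitions split on a numeric key
    // (table, column and bounds are assumptions).
    val usage = spark.read
      .format("jdbc")
      .option("url", "jdbc:teradata://edw-host/DATABASE=billing")   // assumed URL
      .option("driver", "com.teradata.jdbc.TeraDriver")
      .option("dbtable", "billing.customer_usage")
      .option("partitionColumn", "account_id")
      .option("lowerBound", "1")
      .option("upperBound", "50000000")
      .option("numPartitions", "32")
      .option("user", sys.env("TD_USER"))
      .option("password", sys.env("TD_PASSWORD"))
      .load()

    // Land the data on HDFS as Parquet, partitioned by an assumed billing_month column
    // so downstream queries prune partitions.
    usage.write
      .mode("overwrite")
      .partitionBy("billing_month")
      .parquet("hdfs:///warehouse/offload/customer_usage")
  }
}
```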
Description
The data migration framework is a tool to import bulk data from various source systems to various destination systems. The project has connectivity to all major databases such as Oracle, MSSQL, MySQL, DB2, Postgres, etc., as well as to mainframe systems and SFTP/FTP servers, and extracts data into destination systems such as HDFS/Hive/HBase.
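As a hedged sketch of how such source-agnostic connectivity can be modelled, the Scala snippet below dispatches on a simple source descriptor and loads the data through Spark before writing it to a Hive table. The descriptor types, connection details and table names are assumptions, not the framework's real configuration format; mainframe and SFTP sources are represented only as files already staged on HDFS.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

// Hypothetical source descriptors; the real framework's configuration is not shown here.
sealed trait Source
case class Jdbc(url: String, driver: String, table: String, user: String, password: String) extends Source
case class DelimitedFile(path: String, delimiter: String)                                    extends Source

object MigrationSketch {
  def read(spark: SparkSession, src: Source): DataFrame = src match {
    case Jdbc(url, driver, table, user, password) =>
      spark.read.format("jdbc")
        .option("url", url).option("driver", driver).option("dbtable", table)
        .option("user", user).option("password", password)
        .load()
    case DelimitedFile(path, delimiter) =>
      // Files pulled from SFTP/FTP or mainframe extracts would first be staged to HDFS, then read here.
      spark.read.option("header", "true").option("delimiter", delimiter).csv(path)
  }

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("migration-sketch").enableHiveSupport().getOrCreate()

    val source = Jdbc(
      url      = "jdbc:postgresql://pg-host:5432/sales",   // assumed connection details
      driver   = "org.postgresql.Driver",
      table    = "public.invoices",
      user     = sys.env("DB_USER"),
      password = sys.env("DB_PASSWORD"))

    // Write into an assumed landing database in Hive.
    read(spark, source).write.mode("overwrite").saveAsTable("landing.invoices")
  }
}
```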
Description
Profile Search is a tool that traces the social-network identity of candidates and matches them with specific job descriptions. Users are able to search for and find suitable candidates for jobs through various UI interfaces. The data is crawled from the web into MongoDB and from there extracted into HDFS for analysis using MapReduce. My individual responsibility was to assess the feasibility of improving the performance of the existing MapReduce implementation.
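The original analysis ran as Hadoop MapReduce; purely as an illustration of the matching idea, the sketch below uses Spark (not MapReduce) to score candidate profiles exported from MongoDB to HDFS against a set of job-description keywords. The export path, schema and keyword list are assumptions.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object ProfileMatchSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("profile-match-sketch").getOrCreate()

    // Candidate profiles exported from MongoDB to HDFS as JSON lines,
    // assumed fields: id, name, skills (comma-separated string).
    val profiles = spark.read.json("hdfs:///profiles/export")

    // Keywords for one job description (assumed; in practice these would come per posting).
    val jobSkills = Seq("java", "spark", "hive", "hbase")

    // Score each candidate by how many of the job's skills appear in their profile.
    val scored = profiles
      .withColumn("skill", explode(split(lower(col("skills")), ",\\s*")))
      .filter(col("skill").isin(jobSkills: _*))
      .groupBy("id", "name")
      .agg(countDistinct("skill").as("matched_skills"))
      .orderBy(desc("matched_skills"))

    scored.show(20, truncate = false)
  }
}
```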
Description
The Epsilon application is the replacement for the existing LAMDA application, which handles investments for savings, voluntary annuities and personal protection products. The existing LAMDA application was developed using VB 6.0 and has now been converted to the Epsilon application using J2EE and MS SQL Server 2000, retaining its functionality, database structure and user-interface look and feel to the extent possible with the new technology.