Ajay N.

Senior Module Lead

Pune, India

Experience: 9 Years

Rate: 34560 USD / Year

Availability: Immediate

About Me

7.9 years of extensive experience in the architecture, design, and development of J2EE and Big Data (Hadoop) applications. OCJP, SCWCD, and NCFM Module 1 certified. Working at Saba from Dec 2017 to date. Working with technologies such as Java, JSP, Servlets, Struts,...


Portfolio Projects

Description

Saba is a learning product that has been in use by various customers.

As part of the learning team, I am rewriting some existing functionality so that it can support concurrency. Saba primarily used SMF, a proprietary framework for processing large data sets. This product has several drawbacks and suffers from frequent DB locks and concurrency issues. To overcome these drawbacks, I am building a framework that can process large data sets without any concurrency problems.

Created a framework that processes large data sets:

  • A Java project publishes messages to a Hazelcast queue.
  • A shell script triggers a configurable number of JVMs; each triggered JVM listens to the Hazelcast queue and consumes messages.
  • After a message is consumed and processed, the worker returns the resulting SQL CRUD operations, which are published to another Hazelcast queue.
  • A DB-processor JVM consumes these messages and updates the database using JDBC batches, as sketched below.
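A minimal sketch of the DB-processor side of such a pipeline, assuming a Hazelcast queue named "sql-crud-queue", a simple "status,id" message payload, and an example JDBC URL and table (all hypothetical):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    import com.hazelcast.collection.IQueue;
    import com.hazelcast.core.Hazelcast;
    import com.hazelcast.core.HazelcastInstance;

    // Hypothetical DB-processor JVM: takes queued CRUD messages and
    // applies them to the database in JDBC batches.
    public class DbProcessorWorker {
        public static void main(String[] args) throws Exception {
            HazelcastInstance hz = Hazelcast.newHazelcastInstance();
            IQueue<String> sqlQueue = hz.getQueue("sql-crud-queue"); // assumed queue name

            try (Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/svc", "user", "pass")) { // assumed DB
                con.setAutoCommit(false);
                PreparedStatement ps = con.prepareStatement(
                        "UPDATE learning_item SET status = ? WHERE id = ?"); // assumed table

                int batched = 0;
                while (true) {
                    String msg = sqlQueue.take();    // blocks until a message arrives
                    String[] parts = msg.split(","); // assumed "status,id" payload
                    ps.setString(1, parts[0]);
                    ps.setLong(2, Long.parseLong(parts[1]));
                    ps.addBatch();
                    if (++batched == 500) {          // flush every 500 operations
                        ps.executeBatch();
                        con.commit();
                        batched = 0;
                    }
                }
            }
        }
    }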


Description

Matrix is an in-house Deutsche Bank (DB) framework used to calculate exposure for DB's listed derivatives trades. Exposure is calculated using different backtesting approaches:

  • Forward-looking backtesting
  • Statistical historical backtesting
  • Hypothetical backtesting

As an individual contributor, I help the business perform various backtesting runs and work on business change requests for backtesting.

Created a framework that performs valuation for 90k+ trades: a Java project publishes trades to a Hazelcast queue and spawns n JVMs via the Java API; each spawned JVM listens to the Hazelcast queue, takes trade data from it, and prices the trades using the DBANA API. The DBANA results are stored in the DB, and the final result is shared with the business.
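One plausible way to spawn the n worker JVMs from Java is ProcessBuilder; the TradePricingWorker class name here is hypothetical:

    import java.io.File;
    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical launcher: spawns n worker JVMs, each running a class
    // that consumes trades from the Hazelcast queue and prices them.
    public class WorkerLauncher {
        public static void main(String[] args) throws Exception {
            int n = Integer.parseInt(args[0]); // number of worker JVMs to spawn
            String javaBin = System.getProperty("java.home")
                    + File.separator + "bin" + File.separator + "java";
            String classpath = System.getProperty("java.class.path");

            List<Process> workers = new ArrayList<>();
            for (int i = 0; i < n; i++) {
                ProcessBuilder pb = new ProcessBuilder(
                        javaBin, "-cp", classpath, "TradePricingWorker"); // assumed class
                pb.inheritIO(); // forward worker output to this console
                workers.add(pb.start());
            }
            for (Process p : workers) {
                p.waitFor(); // wait until every pricing worker finishes
            }
        }
    }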


Description

Counterparty trade evaluation is done by the Portcalc (PC) team. PC sends this data, and I have written a framework that consumes it and stores it in HBase. The data is stored in Google Protobuf format in a single column family; one row is close to 10 MB in size, and the volume is around 40+ million rows.

Once the data is stored, a MapReduce job I wrote creates another column family under the same row key and stores the metadata in the same table. After this job completes, another Java process I created fetches only the metadata of each message and writes it to a CSV file. Once the CSV is generated, the data is uploaded into an Oracle DB, and the final table is consumed by the Matrix application.
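A rough sketch of such a map-only HBase MapReduce job, assuming a table named "trades" with the blobs in a "data" family and the metadata going into a "meta" family (all names hypothetical, and the protobuf parsing is stubbed out):

    import java.io.IOException;

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
    import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
    import org.apache.hadoop.hbase.mapreduce.TableMapper;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.apache.hadoop.mapreduce.Job;

    // Hypothetical job: reads protobuf blobs from the "data" family and
    // writes extracted metadata into a "meta" family of the same table.
    public class MetadataJob {

        static class MetaMapper extends TableMapper<ImmutableBytesWritable, Put> {
            @Override
            protected void map(ImmutableBytesWritable rowKey, Result row, Context ctx)
                    throws IOException, InterruptedException {
                byte[] blob = row.getValue(Bytes.toBytes("data"), Bytes.toBytes("payload"));
                Put put = new Put(rowKey.get()); // same row key, new family
                put.addColumn(Bytes.toBytes("meta"), Bytes.toBytes("tradeId"),
                        Bytes.toBytes(extractTradeId(blob)));
                ctx.write(rowKey, put);
            }

            private static String extractTradeId(byte[] blob) {
                return ""; // placeholder for the real protobuf deserialization
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(HBaseConfiguration.create(), "metadata-extract");
            job.setJarByClass(MetadataJob.class);
            Scan scan = new Scan();
            scan.addFamily(Bytes.toBytes("data")); // only read the data family
            TableMapReduceUtil.initTableMapperJob("trades", scan,
                    MetaMapper.class, ImmutableBytesWritable.class, Put.class, job);
            TableMapReduceUtil.initTableReducerJob("trades", null, job); // write Puts back
            job.setNumReduceTasks(0); // map-only: Puts go straight to the table
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }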


Description

Set up Hadoop and HBase for storing CMS data. CMS data is stored in HBase, and the result is consumed by the CMS system (Alfresco). I modified the Alfresco API, which previously stored data on NFS; after this implementation, data is both stored in and retrieved from HBase.

Wrote the schedulers below for HBase maintenance (a compaction sketch follows the list):

  • Compaction
  • Restart of the HBase master and region servers every 3 weeks
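A minimal sketch of the compaction task, assuming a table named "cms_content" (hypothetical) and the standard HBase Admin API; in practice this would run from a cron-style scheduler:

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;

    // Hypothetical maintenance task: triggers a major compaction of the
    // CMS table to merge store files and reclaim space.
    public class CompactionTask {
        public static void main(String[] args) throws Exception {
            try (Connection conn =
                         ConnectionFactory.createConnection(HBaseConfiguration.create());
                 Admin admin = conn.getAdmin()) {
                admin.majorCompact(TableName.valueOf("cms_content")); // assumed table name
            }
        }
    }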



Description

AKC stands for American Kennel Club, an organization that arranges events for different kinds of pets in the USA. AKC is the largest purebred dog registry in the world; it offers programs and events that enrich the lives of dogs. As part of this project I was involved in 3 modules:

1) Creating REST services for the Breeds, Judges, and Events modules.
2) Developing login using Spring Security: as part of this module I created authentication services for LDAP, GIGYA, and the DB (a configuration sketch follows the list).
3) Daily sync-up: as part of this module I wrote a script that picks up the affected rows in each Oracle table and applies them to the new MySQL DB.
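A hedged sketch of how LDAP and DB authentication can be registered side by side in Spring Security (classic WebSecurityConfigurerAdapter style; the directory layout, LDAP URL, and DataSource are assumptions, and GIGYA would plug in as an additional custom AuthenticationProvider):

    import javax.sql.DataSource;

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.security.config.annotation.authentication.builders.AuthenticationManagerBuilder;
    import org.springframework.security.config.annotation.web.configuration.EnableWebSecurity;
    import org.springframework.security.config.annotation.web.configuration.WebSecurityConfigurerAdapter;

    // Hypothetical Spring Security setup registering two of the three
    // authentication sources (LDAP and DB).
    @Configuration
    @EnableWebSecurity
    public class SecurityConfig extends WebSecurityConfigurerAdapter {

        @Autowired
        private DataSource dataSource; // assumed DataSource bean for DB auth

        @Override
        protected void configure(AuthenticationManagerBuilder auth) throws Exception {
            // Providers are tried in registration order: LDAP first...
            auth.ldapAuthentication()
                .userDnPatterns("uid={0},ou=people")               // assumed directory layout
                .contextSource()
                .url("ldap://ldap.example.com:389/dc=akc,dc=org"); // assumed LDAP server
            // ...then the relational database.
            auth.jdbcAuthentication()
                .dataSource(dataSource);
        }
    }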


Description

Enhanced Messaging is a campaigning product targeted at individual us.hsbc.com customers. An automated system was created to upload the campaign data of us.hsbc.com customers on a daily basis. On the front end, a portlet was created that reads the data from the database and renders the response. The JSON response is consumed by the content management team, which generates the campaign pages.
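A minimal sketch of such a portlet, assuming the standard JSR-286 resource-serving mechanism; the parameter name and DB lookup are placeholders:

    import java.io.IOException;
    import java.io.PrintWriter;

    import javax.portlet.GenericPortlet;
    import javax.portlet.PortletException;
    import javax.portlet.ResourceRequest;
    import javax.portlet.ResourceResponse;

    // Hypothetical portlet: serves campaign data from the database as
    // JSON for the content management side to consume.
    public class CampaignPortlet extends GenericPortlet {
        @Override
        public void serveResource(ResourceRequest request, ResourceResponse response)
                throws PortletException, IOException {
            response.setContentType("application/json");
            PrintWriter out = response.getWriter();
            // loadCampaignJson(...) stands in for the real DB lookup
            out.print(loadCampaignJson(request.getParameter("customerId")));
        }

        private String loadCampaignJson(String customerId) {
            return "{\"customerId\":\"" + customerId + "\",\"campaigns\":[]}"; // placeholder
        }
    }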


Description

Previously, www.us.hsbc.com used MapQuest as the vendor in the Branch Locator application to show branch details on a map. We have replaced MapQuest with Google, using the Google Geocoding API and Google Maps rendering APIs to show branch details in the Branch Locator application.
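A small sketch of a Google Geocoding API lookup that turns a branch address into coordinates; the API key and sample address are placeholders:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.net.URLEncoder;

    // Calls the Google Geocoding web service and returns the raw JSON,
    // which contains the geometry.location lat/lng for the address.
    public class GeocodeClient {
        public static String geocode(String address) throws Exception {
            String url = "https://maps.googleapis.com/maps/api/geocode/json?address="
                    + URLEncoder.encode(address, "UTF-8") + "&key=YOUR_API_KEY";
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                StringBuilder json = new StringBuilder();
                String line;
                while ((line = in.readLine()) != null) {
                    json.append(line);
                }
                return json.toString();
            }
        }

        public static void main(String[] args) throws Exception {
            System.out.println(geocode("452 Fifth Avenue, New York, NY")); // sample address
        }
    }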


Description

Yodlee is a third-party vendor that holds customer data from all banks. This is a web-based project used for communication between Yodlee and HSBC: Yodlee needs data from HSBC to show on the EasyView screen.


Description

EasyView is a third-party application. Its basic functionality is to show all bank data, within and outside HSBC, on a single screen. Used the open SAML standard to launch the EasyView application.


Description

The Americans with Disabilities Act (ADA) is a compliance requirement stating that all web applications should be usable by blind and low-vision users. Developed new screens to accommodate ADA rules in the Bank to Bank application, and redesigned the PIB screens for the same.


Description

As part of this project we had to launch this product using an SSO server, an entity that stores user data for a limited period. Developed an API to create a token (a unique value), so that instead of sending all of the user's data, sending only the token value is enough for the application to work.
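A minimal sketch of such a token API, assuming an in-memory store and a 15-minute lifetime (both assumptions; the real SSO server would define its own storage and expiry):

    import java.util.Map;
    import java.util.UUID;
    import java.util.concurrent.ConcurrentHashMap;

    // Hypothetical token service: issues an opaque unique token for a
    // user's data and keeps it only for a limited period, so callers
    // can pass the token instead of the full user payload.
    public class TokenService {
        private static final long TTL_MILLIS = 15 * 60 * 1000; // assumed 15-minute lifetime

        private static class Entry {
            final String userData;
            final long expiresAt;
            Entry(String userData, long expiresAt) {
                this.userData = userData;
                this.expiresAt = expiresAt;
            }
        }

        private final Map<String, Entry> store = new ConcurrentHashMap<>();

        // Issue a token that stands in for the user's data.
        public String createToken(String userData) {
            String token = UUID.randomUUID().toString();
            store.put(token, new Entry(userData, System.currentTimeMillis() + TTL_MILLIS));
            return token;
        }

        // Resolve a token back to user data, or null if unknown or expired.
        public String resolve(String token) {
            Entry e = store.get(token);
            if (e == null || e.expiresAt < System.currentTimeMillis()) {
                store.remove(token);
                return null;
            }
            return e.userData;
        }
    }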
