Madhukar S.
Database Analyst

Bangalore, India

Experience: 15 Years

34285.7 USD / Year

  • Immediate: Available

About Me

9 years of experience in the IT industry, with exposure to a wide array of technologies and tools. Database proficient; optimized various SQL scenarios. Handled the DBA role as part of maintenance within the project. Guiding the team in developing pro-active solutio...


Portfolio Projects

Description

  • Created data pipelines using Azure Data Factory.
  • Gathered data stored in the Azure data store, optimized it, and joined it with internal datasets to surface meaningful information.
  • Performed data cleaning using U-SQL code on Azure Data Lake.
  • Aggregated terabyte-scale datasets using U-SQL, saving I/O and CPU time.
  • Adopted DAX for business-logic calculations, providing periodic analytics on system uptime.
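
The U-SQL cleaning and aggregation passes described above can be sketched in plain Python; the field names (`server`, `bytes`) and the cleaning rules are illustrative assumptions, not the project's actual schema.

```python
from collections import defaultdict

def clean_rows(rows):
    """Drop malformed rows and normalize fields (mirrors a U-SQL cleaning pass)."""
    cleaned = []
    for row in rows:
        if not row.get("server") or row.get("bytes") is None:
            continue  # discard rows with missing keys, as a cleaning pass would
        cleaned.append({"server": row["server"].strip().lower(),
                        "bytes": int(row["bytes"])})
    return cleaned

def aggregate_bytes(rows):
    """Group-by-server SUM: the kind of aggregation pushed down to save I/O and CPU."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["server"]] += row["bytes"]
    return dict(totals)
```

Pushing the aggregation to where the data lives (as U-SQL does on the data lake) means only the small per-server totals travel downstream, not the terabyte-scale raw rows.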


Description

Facilitated insightful daily analyses of 100 GB to 1 TB of server data collected in server logs, generating recommendations and tips that increased page performance by 38%.

  • Developed Spark Scala programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the EDW.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
  • Provided design recommendations and thought leadership to sponsors and stakeholders that improved review processes and resolved technical problems.
  • Managed and reviewed Hadoop log files.
  • Tested raw data and executed performance scripts.
  • Shared responsibility for administration of Hadoop, Hive, and Pig.
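
The raw-log parsing step that fed the staging tables might look roughly like this; the project used Spark Scala, but the same shape is shown here in plain Python, and the log line format and field names are invented for the sketch.

```python
import re

# Assumed log line format: "<timestamp> <method> <path> <status> <latency>ms"
LOG_PATTERN = re.compile(
    r"(?P<ts>\S+)\s+(?P<method>\S+)\s+(?P<path>\S+)\s+(?P<status>\d+)\s+(?P<latency>\d+)ms"
)

def parse_line(line):
    """Parse one raw log line into a staging-table record; return None if malformed."""
    m = LOG_PATTERN.match(line)
    if not m:
        return None  # malformed lines are dropped before loading staging tables
    rec = m.groupdict()
    rec["status"] = int(rec["status"])
    rec["latency_ms"] = int(rec.pop("latency"))
    return rec
```

In the Spark version, this per-line function would run inside a `map` over the raw log RDD or DataFrame, with the surviving records written to partitioned EDW tables.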


Description

  • Created HBase tables to load large sets of structured, semi-structured, and unstructured data coming from UNIX, NoSQL, and a variety of portfolios.
  • Supported code/design analysis, strategy development, and project planning.
  • Created reports for the BI team, using Sqoop to import data into HDFS and Hive.
  • Developed multiple MapReduce jobs in Java (Spark) for data cleaning and pre-processing.
  • Assisted with data capacity planning and node forecasting.
  • Collaborated with the infrastructure, network, database, application, and BI teams to ensure data quality and availability.
  • Administered Pig, Hive, and HBase, installing updates, patches, and upgrades.
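
The MapReduce cleaning and pre-processing jobs mentioned above follow the classic map/shuffle/reduce shape; a minimal in-memory Python sketch of that shape, with an invented record layout:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(records):
    """Emit (key, value) pairs; here (user_id, 1) for each valid record.
    Records missing the key are dropped, as a cleaning mapper would do."""
    for rec in records:
        if rec.get("user_id"):
            yield rec["user_id"], 1

def reduce_phase(pairs):
    """Group by key and sum values, as a reducer would."""
    pairs = sorted(pairs, key=itemgetter(0))  # stands in for the shuffle/sort step
    return {k: sum(v for _, v in g) for k, g in groupby(pairs, key=itemgetter(0))}
```

In a real Hadoop job the mapper and reducer run on separate nodes and the framework performs the shuffle; the sketch just makes the data flow between the two phases visible.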


Description

  • Performed analysis of the current optimization model and provided recommendations to move to Oracle's cost-based optimization. Led the stress-test initiative to evaluate the success of this approach.
  • Established and wrote the performance-tuning guidelines for the current module to enhance product quality for all current and future code development.
  • Steered data mart and data warehouse development using Oracle 10.2, taking full advantage of new RDBMS features as they became available and stable.
  • Followed OMS standards (as per CMMI 1.2) in developing and building new applications.
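
Cost-based optimization picks an access path from estimated statistics rather than fixed rules; a toy Python illustration of that decision (the cost formulas are simplified assumptions for the sketch, not Oracle's actual cost model):

```python
def choose_access_path(table_rows, matching_rows, rows_per_block=100):
    """Pick full scan vs index scan from estimated costs, CBO-style (toy model)."""
    full_scan_cost = table_rows / rows_per_block  # read every block exactly once
    index_scan_cost = matching_rows * 1.0         # roughly one block per matched row
    return "index scan" if index_scan_cost < full_scan_cost else "full table scan"
```

The point the toy model makes is the one behind the migration: with up-to-date statistics, the same query flips between an index scan (few matches) and a full scan (many matches), whereas a rule-based optimizer would always choose the same plan.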


Description

Tech Refresh is a module in the Prepaid Cards Treasury Services area. It involves maintaining various card users across the globe with different flavors of prepaid cards, such as Navy Cash, Health Service Account, and Carnival Cruise. The Tech Refresh database for this module is populated with data on the usage of these cards at ship, ATM, IVR, and web channels. The current application enhancement involves moving the encryption methodology from C++ binaries to database encryption with Oracle's DBMS_CRYPTO utility, and adopting the Oracle 11g features of deferred segment creation and interval range partitioning.
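
Interval range partitioning creates new partitions automatically as rows arrive for previously unseen ranges, instead of requiring each partition to be declared up front; a small Python sketch of that routing idea (the monthly interval and the names are illustrative):

```python
from datetime import date

def partition_for(txn_date):
    """Return the partition key (first day of the month) a row lands in."""
    return date(txn_date.year, txn_date.month, 1)

partitions = {}  # partition key -> rows; created lazily, like interval partitions

def insert(row, txn_date):
    """Route a row to its partition, creating the partition on first use."""
    key = partition_for(txn_date)
    partitions.setdefault(key, []).append(row)  # auto-created on demand
    return key
```

Deferred segment creation gives the storage-side analogue: just as the dict entry above only appears when a row needs it, the segment is only allocated when data is first inserted.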


Description

dbDEPOT is a data warehouse capturing GME transactional data. From the dbDEPOT warehouse, various OATS compliance reports are generated and submitted to FINRA. dbDEPOT captures data from front-office trading systems; a trading-system-specific adaptor translates each trading system's message protocol to dbFIX, an in-house message protocol developed by the dbDEPOT team. Adaptors publish dbFIX messages to dBUS JMS topics, which are bridged to queues.
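
The adaptor's job, translating a trading system's message into dbFIX and publishing it to a dBUS topic, might be sketched as follows. dbFIX is an in-house protocol whose wire format is not public, so the tag numbers below (borrowed from standard FIX conventions), the topic name, and the in-memory "bus" are purely hypothetical.

```python
# Hypothetical tag map; real dbFIX tags are internal to the dbDEPOT team.
FIELD_TO_TAG = {"symbol": 55, "side": 54, "qty": 38, "price": 44}

def to_dbfix(message):
    """Translate a trading-system message dict into a FIX-style tag=value string."""
    parts = [f"{FIELD_TO_TAG[k]}={v}" for k, v in message.items() if k in FIELD_TO_TAG]
    return "|".join(parts)

def publish(topic, payload, bus):
    """Stand-in for publishing to a JMS topic (bus is a dict of topic -> messages)."""
    bus.setdefault(topic, []).append(payload)
```

One adaptor per trading system keeps the protocol knowledge at the edge: everything downstream of the topic only ever sees dbFIX.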


Description

The Spider system is a mapping and routing message system with the components of middleware software. It has connectivity to Bloomberg and various exchanges such as Eurex, Telematico, AEX, Xetra, CBOT, and EEX. Spider's messaging system is very robust: it supports certified as well as uncertified messages and acts as a repository system.


Description

The project involves three different systems for Global Documentation for the OTC derivatives business, which mainly involves online trading: GOLD (Global Online Documentation), Edocs (Emerging Documentation), and TGW (Transmission Gateway). GOLD is an integrated imaging, document management, and workflow system that provides global online access to deal-based documentation. Edocs is the entity through which GOLD receives confirmation documents so that trades enter the workflow; because of heavy network traffic, a few trades go missing from the workflow and must be monitored and routed into the correct workflow. TGW is the entity through which documents flowing from different process groups (locations) are stored in the GOLD application, i.e. documents scanned in or faxed out from other locations, so it acts as a gateway for document flow.
