Srinivas T.

Big Data Hadoop

Warangal, India

Experience: 15 Years

66477.1 USD / Year

  • Notice Period: Days

About Me

Having 7 years on Big Data Hadoop. Worked on multiple Hadoop ecosystem components: data ingestion with Sqoop and Flume, data modeling with Hive, data processing with Spark, downstream databases with HBase and Cassandra, Solr for indexing, Spark for streaming...

Portfolio Projects

Description

Client : Wells Fargo, USA

Technologies : Eclipse, EJB, Windows NT and WebLogic Application Server.

Role : System Analyst

Project Description:

The AES application is meant to automate the enrollment process for the various products Wells Fargo Bank offers to its customers. It involves bank personnel filling out Sales Order Forms and Product Enrollment Templates (PETs) in AES to enroll customers in different products. The rationale for such a solution is to expedite the enrollment process by tracking and managing it through AES, thereby reducing the overall time spent on enrollment per customer. It acts more as an infrastructure tool across the enterprise than as a specific application for a single business group.

Responsibilities :

    1. Interacted with the onsite team in daily meetings, took up production bugs, and fixed them
    2. Performed unit testing for the CRs assigned to me
    3. Deployed the resolved bug fixes and code changes for the bug list and new CRs to the production server


Description

Technologies : Java 1.4, Struts 1.1, EJB, JMS, Pramati 4.1, Oracle 9i.

Project Description :

Sales Resource Automation System is designed to cater to all kinds of products offered by ICICI Bank, such as loans, liability accounts (savings accounts, current accounts, etc.), and credit cards. The system is dynamic and flexible enough to also cater to future products without any change or modification to the existing system.

Leads generated through the various channels of ICICI Bank are routed to lead owners based on the product and pre-defined routing parameters such as location and branch, which are set for that product, and are tracked centrally to reduce turnaround time. Leads are converted to deals with the help of lead fulfillers, who act as facilitators between the institution and the customer. Unattended leads are escalated automatically.

Maintaining the data centrally facilitates instant retrieval of customer information and status and also helps in enhancing relationships with customers. Sales Force Automation provides an integrated view of leads across all product lines and consolidates the core lead information, which in the current scenario is spread across disparate business groups.

Responsibilities :

  1. Developed the Transaction module and Global Master module
  2. Performed unit testing for the assigned CRs
  3. Involved in integrating all modules
  4. Performed application deployments on test and production servers


Description

Technologies : Documentum 4.1/5.3, Linux

Role : System Analyst

Project Description :

Documentum is a system that can create various types of documents when it is given the necessary information. The iTrade system receives files from ARENA called DDF files. These files contain the information needed to create Word documents (confirmations, deal tickets, resets) in iTrade.

All 4 Documentum applications make use of a scanning infrastructure. Documents can be scanned into the applications and stored there. The software used to scan in the documents is Kofax’s Ascent Capture. This software is not Documentum software, but it integrates into the 4 Documentum apps. All documents scanned via this software are eventually released and stored in one of the 4 apps.

Responsibilities :

        1. Created templates such as FRA and Equity Option
        2. Performed unit testing for the CRs developed
        3. Deployed new/updated templates to the production server and ensured the changes were reflected in the system
        4. Provided production support


Description

Technologies : IBM Websphere Portal 6.1, RAD, DB2

Role : Developer

Project Description :

Asian Paints intends to introduce to its dealers a web-based portal that can be used for the numerous reasons dealers contact Asian Paints. Currently there are alternate ways to reach Asian Paints for most of these requirements, but no single channel exists over the internet.

The dealer portal is a step in this direction and, as mentioned, would encourage dealers to use it for the various activities for which they would otherwise contact Asian Paints, such as placing orders, logging issues, and viewing status updates and information. The scope of this project is to build such a portal that Asian Paints dealers can use to communicate with Asian Paints for functionalities like placing orders, logging and tracking complaints, and viewing financial details of business done with Asian Paints.

Responsibilities :

    1. Involved in account and complaints module development
    2. Performed unit testing for the CRs developed
    3. Integrated all the modules
    4. Deployed the application on test and production servers


Description

Technologies : Lift web framework, Scala, Hibernate, JavaMail, JMS, Jetty server, web services, Apache Maven, Hadoop MapReduce, HBase, Linux

Role : Team Lead

Project Description :

A learning management system (commonly abbreviated as LMS) is a software application for the administration, documentation, tracking, and reporting of training programs, classroom and online events, e-learning programs, and training content. The application utilizes Documentum to provide document management and distributed workflow management functionality that assists in automating the Credit Administration process and documenting the necessary compliance controls.

Responsibilities :

  1. Developed the Credit Management and Transcripts functionality
  2. Developed the email notification module using JavaMail
  3. Developed reading and updating of XML from Scala
  4. Performed unit testing
  5. Integrated all modules
  6. Performed full application builds and deployments on the production server


Description

Technologies : Java, Servlets, JSP, Web services, Struts, Oracle

Role : Technical Lead

Responsibilities :

Handled multiple projects as a Technical Lead at client location.


Description

Technologies: Java, Nifi, HDFS, Spark, Cassandra, Hive, Kafka, CDH 5.10, Erwin

Description of the project:

For data performance, any action in the RDBMS (insert, update, or delete) is collected as Kafka messages, and those messages are then handled by the Spark engine. The NoSQL database is divided into Archive and Conform zones: the Spark engine copies the data into the Archive zone exactly as it is collected from Kafka, whereas the Conform zone stores the data after transformations.
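
A rough illustration of this flow, assuming Spark Structured Streaming with the spark-sql-kafka and Spark-Cassandra connector packages on the classpath; the broker address, topic, keyspace, and table names are illustrative assumptions, not the project's actual configuration.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions._

object RdbmsChangeFeed {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-to-cassandra")                                // illustrative name
      .config("spark.cassandra.connection.host", "cassandra-host")  // assumed host
      .getOrCreate()

    // RDBMS change events (insert/update/delete) published to a Kafka topic
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "kafka:9092")              // assumed broker
      .option("subscribe", "rdbms-changes")                         // assumed topic
      .load()
      .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value", "timestamp")

    // Each micro-batch is written to both zones:
    //  - Archive zone: the payload exactly as collected from Kafka
    //  - Conform zone: the payload after transformations
    val query = events.writeStream
      .foreachBatch { (batch: DataFrame, _: Long) =>
        batch.write
          .format("org.apache.spark.sql.cassandra")
          .options(Map("keyspace" -> "archive", "table" -> "raw_events"))  // assumed names
          .mode("append")
          .save()

        batch
          .withColumn("ingested_at", current_timestamp())                  // example transformation
          .write
          .format("org.apache.spark.sql.cassandra")
          .options(Map("keyspace" -> "conform", "table" -> "events"))      // assumed names
          .mode("append")
          .save()
      }
      .start()

    query.awaitTermination()
  }
}
```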

Responsibility :

    1. Involved in requirement gathering, use case preparation, and preparing design documents
    2. Involved in creating the data model in Cassandra
    3. Implemented reads from and writes to Cassandra using Scala
    4. Created Archive and Conform zone tables
    5. Created Cucumber test cases
    6. Developed the Transformation module


Description

Technologies: Hive, Spark, AWS, S3

Role: Senior Consultant (Architect)

  1. Understood the client requirements
  2. Stored data in S3
  3. Created a data lake in AWS with multiple source systems (see the sketch after this list)
  4. Mentored team members
  5. Conducted appraisals for the team
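
As a rough sketch of item 3, the snippet below registers raw files landed in S3 as a table in the lake so they can be queried with Hive/Spark SQL; the bucket, prefix, and table names are illustrative assumptions, not the project's actual layout.

```scala
import org.apache.spark.sql.SparkSession

object S3DataLakeSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("s3-data-lake")
      .enableHiveSupport()                          // expose tables through the Hive metastore
      .getOrCreate()

    // Raw files delivered by a source system into S3 (assumed bucket/prefix)
    val sales = spark.read
      .option("header", "true")
      .csv("s3a://example-lake-bucket/raw/sales/")

    // Register the data as a lake table for downstream Hive/Spark SQL consumers (assumed name)
    sales.write
      .mode("overwrite")
      .saveAsTable("lake.raw_sales")
  }
}
```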


Description

Technologies: Teradata, Sqoop, Hive, HDFS, Kafka, Spark, Scala, HDP

Project Description:

The customer has data from multiple sources: structured databases (Teradata, Oracle) and three file systems (Adobe, Salesforce, Epsilon), and wants to process this data in the US region with quick access. Mphasis provided a solution on Hadoop using the stack listed under Technologies. The project builds a data lake from the two structured databases (Teradata, Oracle) and the three file systems (Adobe, Salesforce, Epsilon). Source data is ingested using Sqoop and placed in an HDFS location. Hive tables are created over the HDFS data as part of the Raw layer. In the Processing layer, entity relationships are established between the tables in the Raw layer.
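
A minimal sketch of the Raw-to-Processing step described above, assuming the raw-layer Hive tables have already been loaded via Sqoop; the database, table, and join-key names are illustrative, not the project's actual schema.

```scala
import org.apache.spark.sql.SparkSession

object RawToEtlLayer {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("raw-to-etl")
      .enableHiveSupport()                           // read/write Hive metastore tables
      .getOrCreate()

    // Raw-layer Hive tables created over the Sqoop-ingested HDFS data (assumed names)
    val customers   = spark.table("raw.teradata_customers")
    val touchpoints = spark.table("raw.adobe_touchpoints")

    // Resolve the entity relationship between the raw tables (assumed key)
    val enriched = customers.join(touchpoints, Seq("customer_id"), "left")

    // Publish the result into the ETL layer (assumed table name)
    enriched.write
      .mode("overwrite")
      .saveAsTable("etl.customer_touchpoints")
  }
}
```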

Role : Big Data Lead & Architect

Responsibility :

    1. Designed the data lake with multiple data sources (Teradata, Oracle, Adobe, Salesforce, Epsilon)
    2. Requirement gathering, use case preparation, and preparing design documents
    3. Created the data lake layers in the Hive environment, such as the Raw layer and ETL layer
    4. Wrote messages/logs with the producer and read them with the consumer using the Kafka API
    5. Wrote DataFrame and window function logic using Spark with Scala (see the sketch after this list)
    6. Used Scala features such as higher-order functions and case classes
    7. Wrote Sqoop scripts to extract data from source to target
    8. Created Raw layer and ETL layer Hive tables
    9. Involved in data validation
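
A small sketch of the DataFrame/window-function style referred to in items 5 and 6, using an assumed Order case class and hypothetical data; it only shows the pattern, not the project's actual logic.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

// Case class giving the dataset a typed schema (assumed fields)
case class Order(customerId: String, orderId: String, amount: Double)

object WindowFunctionSketch {
  // A Scala higher-order function: applies the supplied function to every order
  def adjustAll(orders: Seq[Order], adjust: Order => Order): Seq[Order] = orders.map(adjust)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("window-sketch").getOrCreate()
    import spark.implicits._

    val orders = adjustAll(
      Seq(Order("c1", "o1", 120.0), Order("c1", "o2", 80.0), Order("c2", "o3", 200.0)),
      o => o.copy(amount = o.amount * 1.18)          // example adjustment via the higher-order function
    ).toDS()

    // Window function: rank each customer's orders by amount
    val byCustomer = Window.partitionBy($"customerId").orderBy($"amount".desc)
    orders.withColumn("rank", row_number().over(byCustomer)).show()
  }
}
```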


Description

    1. Involved in Requirement gathering, use case preparation and preparing design documents.
    2. Involved in creating data model in Hive
    3. Created Archive and Conform zone tables
    4. Provided production support for the BBRM model.
    5. Bug fixing.


Description

  1. Requirement gathering, solutioning, sizing based on the data volume
  2. Effort estimation, support for code-related problems and for routine, technical reviews of code and test plans created by team members
  3. Assign and monitor work done by the team. Delivery of module as per project plan and quality standards.
  4. Interacted with the client for clarification on use cases
  5. Created Hive tables


Description

  1. Effort estimation, support for code-related problems and for routine, technical reviews of code and test plans created by team members
  2. Wrote Camel routes to read the XSD files
  3. Created Hive tables and Hive partitions for query performance
  4. Wrote UDF scripts
  5. Assign and monitor work done by the team. Delivery of module as per project plan and quality standards


Description

  1. Requirement gathering, solutioning, sizing based on the data volume
  2. Created Hive and Hawq tables for data processing
  3. Assign and monitor work done by the team. Delivery of module as per project plan and quality standards.
  4. Onsite coordination for the client interview of new offshore developers
  5. Timesheet management of offshore development team.


Description

1. Involved in Requirement gathering, use case preparation and preparing design documents.
2. Involved in creating data model in Cassandra
3. Created Archive and Conform zone tables
4. Created Cucumber test cases.
5. Developed Transformation module


Description

1. Involved in Requirement gathering, use case preparation and preparing design documents.
2. Involved in creating data lake
3. Created raw layer hive tables
4. Involved in Data Validation.
