Akepati S.

Hadoop Developer, PySpark Developer

Hyderabad, India

Experience: 6 Years

400368 USD / Year

  • Start Date / Notice Period end date: 2022-01-25

About Me

6.3 years of extensive experience in IT, including 4+ years of professional experience with a demonstrated history of working in US Health Care, Insurance and Ind. Skilled in Hadoop, Big Data, Hive, HBase, Sqoop, Pig, Spark SQL, Talend Open Studio Big Data (ETL)...

Portfolio Projects

C360 (Consumer 360)

Company

C360 (Consumer 360)

Description

The business objective of this project is to build a product that provides a complete 360-degree view of a consumer in a single click. It gathers data from various sources and presents holistic information about a consumer through standard APIs consumed by our Customer Service Representatives. This enables our Customer Service Agents to serve the consumer better, since they now have access to every detail of the member.

• Led and designed the Talend ETL and Oozie workflows which extract data from the Hadoop (Data Lake) environment, transform it to XML/JSON and ingest it into MarkLogic/Cassandra.
• Developed complex Hive queries to extract data from the Hadoop environment, applying the necessary business rules.
• Ideated and converted data extractions from Hive HQL (MapReduce) to Spark SQL, leveraging the in-memory computing features of DataFrames and Datasets, which drastically reduced extraction times and the overall ETL process (a 4-hour extraction in Hive was reduced to 20 minutes in Spark SQL).
• Developed PySpark programs for reading Parquet files, transforming the data and loading it into Hive/CSV/JSON, combining data from multiple source systems (a minimal sketch follows this list).
• Implemented a POC to migrate MapReduce jobs into Spark RDD transformations to extract data from MapR-DB (JSON format), transform it to enterprise canonicals and ingest it into MarkLogic/Cassandra.
• Implemented Python/PySpark programs to read data from the Layer 7 gateway and convert the JSON response data into Hive tables (see the second sketch after this list).
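A minimal PySpark sketch of the Parquet-to-Hive/CSV/JSON extraction pattern described above; the paths, columns and table names are hypothetical placeholders, not the project's actual artifacts.

```python
from pyspark.sql import SparkSession, functions as F

# Hive support is needed so saveAsTable() writes through the Hive metastore.
spark = (SparkSession.builder
         .appName("c360-parquet-extract")        # hypothetical app name
         .enableHiveSupport()
         .getOrCreate())

# Read Parquet files from the data lake (hypothetical paths).
members = spark.read.parquet("/datalake/members/")
claims = spark.read.parquet("/datalake/claims/")

# Combine the source systems and apply a sample business rule.
consumer_view = (members.join(claims, on="member_id", how="left")
                 .filter(F.col("status") == "ACTIVE")
                 .select("member_id", "name", "plan_id", "claim_id", "claim_amount"))

# Load the result into Hive, CSV and JSON targets.
consumer_view.write.mode("overwrite").saveAsTable("c360.consumer_view")
consumer_view.write.mode("overwrite").option("header", "true").csv("/datalake/out/consumer_view_csv")
consumer_view.write.mode("overwrite").json("/datalake/out/consumer_view_json")
```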
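And a similarly hedged sketch of the Layer 7 JSON-response-to-Hive step, assuming the gateway exposes a REST endpoint; the URL, auth header and target table are illustrative assumptions only.

```python
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("layer7-json-to-hive").enableHiveSupport().getOrCreate()

# Call the gateway endpoint (hypothetical URL; the token placeholder is left unfilled).
resp = requests.get(
    "https://api-gateway.example.com/consumer/benefits",
    headers={"Authorization": "Bearer <token>"},
    timeout=30,
)
resp.raise_for_status()
records = resp.json()  # expected to be a list of JSON objects

# Let Spark infer the schema from the parsed records, then persist to Hive.
df = spark.createDataFrame(records)
df.write.mode("append").saveAsTable("c360.layer7_benefits")
```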

EPMP

Company

EPMP

Description

• Experience in designing and developing POCs in Spark using Python to compare the performance of Spark with Hive and SQL/Oracle (a minimal sketch follows this list).
• Experience working with HBase for storing and retrieving reference data and metadata, data reconciliation and reporting.
• DevOps experience: effectively handled communication with the 12 downstream systems during unstable situations such as environment outages, war rooms, lake refreshes, etc. Proactively took responsibility for managing code changes, handshakes on jobs and data loads across environments (Alpha, Bravo, Master and Stage) from a single Talend TAC without conflicts or burdening the cluster.
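A minimal sketch of how such a Spark-versus-Hive timing POC could be framed; the HiveServer2 host, table and query are hypothetical, and pyhive is assumed to be available for the Hive side.

```python
import time

from pyhive import hive                 # HiveServer2 client (assumed available)
from pyspark.sql import SparkSession

# Hypothetical query against a shared metastore table.
QUERY = "SELECT plan_id, COUNT(*) AS cnt FROM epmp.claims GROUP BY plan_id"

# Time the query through HiveServer2 (Hive's own execution engine).
start = time.time()
conn = hive.connect(host="hive-server.example.com", port=10000, username="poc_user")
cursor = conn.cursor()
cursor.execute(QUERY)
hive_rows = cursor.fetchall()
conn.close()
hive_seconds = time.time() - start

# Time the same query in Spark SQL against the same metastore.
spark = SparkSession.builder.appName("epmp-spark-vs-hive-poc").enableHiveSupport().getOrCreate()
start = time.time()
spark_rows = spark.sql(QUERY).collect()
spark_seconds = time.time() - start

print(f"Hive: {hive_seconds:.1f}s ({len(hive_rows)} rows); Spark SQL: {spark_seconds:.1f}s ({len(spark_rows)} rows)")
```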

Mortgage (ARIS, PennyMac, NewDay)

Company

Mortgage (ARIS, PennyMac, NewDay)

Description

The Loan Origination System (LOS) automates the entire lending process from origination through funding. It simplifies taking loan applications by automating the most complex, inefficient and labor-intensive aspects of lending, including automated underwriting, customizable workflow, compliance, document preparation management, a rule-based assignment system, vendor interfaces and customer relationship management.

• Worked as Java Analyst and Developer for Mortgage Clients (Retail & Correspondent). The product's framework integrates Java, JSP, IBM WebSphere, XML and MS SQL Server 2008.
• Handled support and development related issues/requests.
• Provided workaround solutions and temporary fixes for issues that are hard stops for the business and require quick turnaround resolution, while maintaining responsibilities and commitments.
• Created quarterly and annual comparison reports for the company covering production and client details.
• Walked clients through the product and provided training on the various modules, either over a call or over WebEx.
• Prepared report specifications and implemented the metadata layers, including the physical layer.
• Maintained employee and purchase order details of the company.
• Ability to analyze, develop, establish and maintain efficient workflows.
