Venkata A.

Senior Software Engineer

Hyderabad, India

Experience: 9 Years

69397.1 USD / Year

  • Notice Period: Days

About Me

9 years of professional IT experience across various industries, including 4+ years as a Big Data Developer using AWS Cloud, Apache Hadoop, and Spark technologies. Good experience with AWS cloud services such as Amazon EC2, EMR, IAM, S3, Redshift, AWS Lamb...

Portfolio Projects

Description

DICE is an end-to-end process for the countries, from sales data integration to end-user dashboards built in Qlik Sense. Sales Integrator, as part of DICE, delivers harmonized sales data, applying data integrity checks, market definition rules, data transformation rules, etc. All data is stored in an AWS Cloud cluster.

Description

DICE is an end-to-end process for the countries, from sales data integration to end-user dashboards built in Qlik Sense. Sales Integrator, as part of DICE, delivers harmonized sales data, applying data integrity checks, market definition rules, data transformation rules, etc. Sales Integrator is a software product of IQVIA and is installed on the Novartis servers. It is used to load the different data sources, harmonize them, and produce two outputs: a star schema for self-service analysis and aggregated views used by Qlik Sense for standard reporting. Application data has three layers: raw, enriched, and data mart. Physically, all data is stored in HDFS, and permissions are managed with standard Hadoop group security. All data is stored in an AWS Cloud cluster.
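The three-layer flow described above (raw → enriched → data mart) could be sketched as a Spark job in Scala. This is a minimal illustration only; the HDFS paths, table names, and the integrity rule shown are assumptions, not the actual Sales Integrator code.

```scala
// Illustrative sketch of a raw -> enriched -> data-mart pipeline on HDFS.
// Paths, table names, and rules are hypothetical, not the real product.
import org.apache.spark.sql.SparkSession

object SalesLayers {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("SalesLayers")
      .enableHiveSupport()
      .getOrCreate()

    // Raw layer: sales files landed on HDFS as-is.
    val raw = spark.read.parquet("hdfs:///dice/raw/sales")

    // Enriched layer: apply integrity checks and market-definition rules
    // (represented here by a simple filter; real rules are more involved).
    val enriched = raw.filter("quantity >= 0 AND country IS NOT NULL")
    enriched.write.mode("overwrite").parquet("hdfs:///dice/enriched/sales")

    // Data-mart layer: a star-schema fact table consumed by Qlik Sense.
    enriched.write.mode("overwrite").saveAsTable("mart.fact_sales")

    spark.stop()
  }
}
```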

Description

- Confidential -

Description

Project : Global Channel Partner Program (Vistex Data integration)

Client : Dell International, Hyderabad, India

Role : Big Data Developer

  • Confidential

Responsibilities:

  • Developed Spark code in Scala using Spark SQL & DataFrames for aggregation.
  • Created schemas in Hive with performance optimization using bucketing & partitioning.
  • Worked with the Spark ecosystem, using Spark SQL and Scala queries on different formats such as text and CSV files.
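The aggregation and the bucketed, partitioned Hive schema mentioned above could look roughly like the following Scala sketch. All table, column, and path names are hypothetical assumptions for illustration.

```scala
// Hypothetical sketch; schema, columns, and paths are illustrative only.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object VistexAggregation {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("VistexAggregation")
      .enableHiveSupport()
      .getOrCreate()

    // Hive table tuned for performance: partitioned by load date,
    // bucketed by partner id for faster scans and joins.
    spark.sql("""
      CREATE TABLE IF NOT EXISTS channel.partner_sales (
        partner_id STRING,
        product    STRING,
        revenue    DOUBLE
      )
      PARTITIONED BY (load_date STRING)
      CLUSTERED BY (partner_id) INTO 32 BUCKETS
      STORED AS ORC
    """)

    // Read a CSV feed and aggregate with the DataFrame API.
    val sales = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/data/vistex/sales.csv")

    val agg = sales
      .groupBy("partner_id", "product")
      .agg(sum("revenue").as("total_revenue"),
           count(lit(1)).as("txn_count"))

    agg.createOrReplaceTempView("partner_sales_agg")
    spark.stop()
  }
}
```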

Description

Project : Quote Store & COINS (Compensation and Incentives)

Client : Dell International, Hyderabad, India

Role : Big Data Engineer

Responsibilities:

  • Wrote Hive queries and Hive scripts.
  • Handled importing and exporting data to HDFS and Hive using Sqoop.
  • Loaded and transformed large sets of structured and semi-structured data.
  • Wrote Hive queries for data analysis to meet business requirements.
  • Processed data from HDFS using Apache Hive.
  • Used the Spark API over Cloudera Hadoop YARN to perform analytics on data in Hive.
  • Configured Control-M jobs to automate the data flow.
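Running Spark analytics over Hive tables on YARN, as in the bullets above, could be sketched as below. The database, table, and column names are hypothetical; the job would be submitted with something like `spark-submit --master yarn`.

```scala
// Hypothetical sketch of Spark-over-Hive analytics on a YARN cluster;
// table and columns are illustrative, not from the actual project.
import org.apache.spark.sql.SparkSession

object HiveAnalytics {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("HiveAnalytics")
      .enableHiveSupport()   // read tables registered in the Hive metastore
      .getOrCreate()

    // A Hive query for analysis, executed through the Spark API.
    val quotes = spark.sql(
      """SELECT region, count(*) AS quote_count, sum(amount) AS total
         FROM coins.quotes
         GROUP BY region""")

    quotes.show()
    spark.stop()
  }
}
```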

Description

- Confidential -

Description

- Confidential -

Responsibilities:

  • Worked as an individual contributor at Oracle IDC, Bangalore, and was involved in:
  • Working in an Agile, Scrum methodology to ensure delivery of high-quality work with every monthly iteration/sprint.
  • Analyzing the functional specs provided by the data architect and creating technical spec documents for all the mappings.
  • Defining various facts and dimensions in the data mart, including factless facts and aggregate and summary facts.
  • Providing knowledge transfer to the end users and creating extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings.

Description

Legality is an ERP system that extracts data from Inventory and Sales using different applications, performs analysis, and generates forecasted data for Sales and Inventory. Data is loaded daily from Legality to staging, staging archive, and the data warehouse using Informatica. All reference data is looked up and transformed in the SFS Report, and aggregated data is loaded into target tables. These tables are used by packages and reports to generate the required metrics. The business uses these metrics and reports for business intelligence and analytics on the Ship from Store model.

Description

Supply Chain Management consists of sales and operations, pricing, analytics, and inventory management. It mainly focuses on business transformation, process analysis, and operational-benefit projects for the organization. This project processes historical data using ETL jobs, feeds dashboards, generates the required reports, and performs data conversion using EDI tools. The Device Management Database (DMD) and the Verizon Wireless Integrated Inventory Application are used by all internal and external clients to retrieve device details and availability.

Description

INMS is a solution that correlates event feeds from multiple systems to determine root-cause failures across network domains and end-to-end services, and provides real-time interfaces to customer service management systems. Surveillance Manager is an open, configurable fault-monitoring application. It collects fault conditions from numerous sources, analyzes them according to root-cause-analysis algorithms, and executes different actions based on the results of the analysis. The purpose of the fault analysis is to identify the root cause of a problem and to report on the services and customers impacted by the fault condition.
