Abhishek S.

Data Engineer - Big Data Developer with 6+ years of relevant experience

Bengaluru, India

Experience: 6 Years

50713.3 USD / Year

  • Immediate: Available

About Me

  • B.E. (Information Science Engineering) with 6.1 years of experience at product-based organizations Anthem (Legato Health Technologies), American Express GBT, and Atos Syntel Pvt Ltd as a Senior Software Engineer.
  • Received multiple Spot Awards and Kudos from clients and senior managers.
  • Acquired sound knowledge and understanding of Hadoop components, Unix shell scripting, Python, SAP BO (reporting tool), MySQL, core Java, Azure HDInsight (Microsoft's big data cloud), and PySpark.
  • Good experience with Hadoop ecosystem components such as AWS S3, HDFS, Hive, Apache NiFi and MiNiFi, Apache Kafka CLI, Spark batch processing, HBase (NoSQL), Apache Atlas and Ranger, Oozie, crontab, Azure HDInsight, Azure Data Factory, Apache Kylin, and Sqoop.
  • Good experience in UNIX shell scripting, Python, and MySQL.
  • Basic knowledge of components such as MapReduce and HBase.
  • Basic understanding of core Java.
  • Experienced with software tools and utilities such as PuTTY, MobaXterm, AWS and Azure HDInsight big data clouds, Azure Data Factory, SQL Server, Eclipse, and PyCharm.
  • Gained valuable exposure developing ETL applications and data transformations, creating CI/CD data pipelines, using Git for version control, testing, interfacing with clients, and working under stipulated timelines.
  • A team player with exceptionally good organizational, analytical, training, and interpersonal skills.
  • Strong analytical skills and the ability to provide quick, optimal solutions.
  • Good communication skills and an analytical mind to grasp new concepts easily and quickly.

Portfolio Projects

ASO Shared Savings

Role

Full-Stack Developer

Description

Sends monthly revenue reports for Anthem US health insurance members to the client, detailing which kind of health facility each member opted for and the dollar amount claimed.

Roles and Responsibilities

  • Gathered and analyzed business requirements for each change or new request; created Hive tables for monthly and yearly reports.
  • Created and sent different kinds of SAP BO reports to the client.
  • Developed complete end-to-end reports in PySpark; performed unit testing and SIT, prepared code for UAT, and tracked defect resolutions to closure.
  • Prepared weekly status reports covering functionality.
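
The monthly aggregation such a PySpark report performs can be sketched in plain Python (all record fields and sample values below are hypothetical, for illustration only):

```python
from collections import defaultdict

# Hypothetical claim records; in the real job these would come from a Hive table.
claims = [
    {"member_id": "M1", "facility": "inpatient",  "month": "2023-01", "claimed_usd": 1200.0},
    {"member_id": "M2", "facility": "outpatient", "month": "2023-01", "claimed_usd": 300.0},
    {"member_id": "M1", "facility": "inpatient",  "month": "2023-02", "claimed_usd": 450.0},
]

def monthly_totals(rows):
    """Sum claimed dollar amounts per (month, facility type)."""
    totals = defaultdict(float)
    for row in rows:
        totals[(row["month"], row["facility"])] += row["claimed_usd"]
    return dict(totals)

report = monthly_totals(claims)
print(report[("2023-01", "inpatient")])  # 1200.0
```

In the actual PySpark report this logic corresponds to a `groupBy`/`agg` over the monthly Hive partitions.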


Traveler Snapshot

Description

Sends yearly and monthly traveler revenue reports to the client, showing how many times each valuable customer traveled and which hotels and flights were booked, so the business knows what kind of bookings and trip types each traveler made over the past year and month while traveling around the globe.

Roles and Responsibilities

  • Gathered and analyzed business requirements for each change or new request; created Hive tables for monthly and yearly reports.
  • Developed complete end-to-end reports in PySpark; performed unit testing and SIT, prepared code for UAT, and tracked defect resolutions to closure.
  • Prepared weekly status reports covering functionality.


Emp_hierarchy process

Description

Sends a weekly report of employee details for the respective corp IDs (corporations) whenever employees have changes in their profiles. Per each client's requirement, the report is generated as static (latest data only) or fluid (history plus latest data), stamping active and inactive dates and including employees who left the organization.

Roles and Responsibilities

  • Gathered and analyzed business requirements for each change or new request; created Hive HQLs and shell wrapper scripts.
  • Developed complete end-to-end reports; performed unit testing and SIT, prepared code for UAT, and tracked defect resolutions to closure.
  • Created a UI where anyone can enter the corp IDs and the type of report they want (fluid/static).
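
The static-versus-fluid distinction described above can be sketched in plain Python (the field names and sample rows are hypothetical):

```python
from datetime import date

# Hypothetical profile-change history for one corp ID, one row per change.
history = [
    {"emp_id": "E1", "title": "Analyst",    "effective": date(2021, 1, 1)},
    {"emp_id": "E1", "title": "Sr Analyst", "effective": date(2022, 6, 1)},
    {"emp_id": "E2", "title": "Manager",    "effective": date(2021, 3, 15)},
]

def build_report(rows, kind="static"):
    """static -> only the latest record per employee; fluid -> the full history."""
    ordered = sorted(rows, key=lambda r: (r["emp_id"], r["effective"]))
    if kind == "fluid":
        return ordered
    latest = {}
    for row in ordered:
        latest[row["emp_id"]] = row  # later rows overwrite earlier ones
    return list(latest.values())

static_report = build_report(history, "static")
fluid_report = build_report(history, "fluid")
```

The production version does the equivalent in Hive HQL, additionally stamping active/inactive dates on each record.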


Azure HDInsight

Description

As our client wanted to move from Rackspace HDP to the cloud, we migrated all our projects from the Rackspace HDP environment to Azure HDInsight (Microsoft's big data cloud).

Roles and Responsibilities

  • Moved all our projects from the production environment to different Azure clusters, grouped by technology.
  • Made the data available on Azure and set up a common Azure Data Lake Storage (ADLS) account.
  • Configured Hive with a centralized metastore for all clusters so that every cluster can read data from the tables.
  • Performed performance tuning and validation of jobs on the Interactive Query, base, and Spark clusters in Azure.
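
Sharing one metastore across HDInsight clusters is typically done by pointing each cluster's Hive at the same external database; a minimal `hive-site.xml` fragment might look like the following (the server, database, and user values are placeholders, not the actual environment):

```xml
<!-- hive-site.xml: shared external Hive metastore (placeholder values) -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:sqlserver://example-server.database.windows.net:1433;database=hivemetastore</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.microsoft.sqlserver.jdbc.SQLServerDriver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>metastore_user</value>
</property>
```

With every cluster configured this way, a table created on one cluster is immediately visible to the others.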


MI French

Description

Sends yearly and monthly revenue reports for the French market to the client, showing whether they are losing trusted, valuable customers and the percentage growth of their market as customers travel by both flight and rail.

Roles and Responsibilities

  • Gathered and analyzed business requirements for each change or new request.
  • Developed complete end-to-end reports; performed unit testing and SIT, prepared code for UAT, and tracked defect resolutions to closure.
  • Prepared daily/weekly status reports covering functionality.


AEGBT Data Ingestion

Description

Global Business Travel (GBT) is a full-service travel management company that supports businesses and their travelers across the globe and around the clock, providing travel compliance to customers.

The framework is metadata-driven, which reduces the need to update the code every time. The scripts that Sqoop-import data from MySQL and Netezza into the data lake are scheduled via crontab and Oozie, with notifications for success and failure.
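
The crontab side of that scheduling could look like the fragment below; the paths, connection string, and table name are hypothetical, and the real jobs also run under Oozie with success/failure notifications:

```
# Hypothetical crontab entry: nightly MySQL-to-datalake import at 02:00
0 2 * * * /opt/ingest/import_bookings.sh >> /var/log/ingest/bookings.log 2>&1
```

where the wrapped script runs a standard Sqoop import:

```
sqoop import \
  --connect jdbc:mysql://db-host:3306/gbt \
  --username etl_user \
  --password-file /user/etl/.mysql_pw \
  --table bookings \
  --target-dir /datalake/raw/bookings \
  --num-mappers 4
```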

Roles and Responsibilities

  • Contributed to the development of Hive tables with partitions and buckets.
  • Involved in code changes (shell scripts and DDLs) required to keep the application in sync with requirements, bringing data from Netezza and MySQL into the data lake.
  • Developed Oozie workflows for scheduling different jobs.
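
A partitioned and bucketed Hive table of the kind mentioned above might be declared as follows (the database, table, and column names are illustrative only):

```sql
-- Illustrative DDL: monthly-partitioned, bucketed ingestion table
CREATE TABLE IF NOT EXISTS gbt_db.bookings (
  booking_id  STRING,
  traveler_id STRING,
  amount_usd  DECIMAL(12,2)
)
PARTITIONED BY (load_year INT, load_month INT)
CLUSTERED BY (traveler_id) INTO 16 BUCKETS
STORED AS ORC;
```

Partitioning by load period keeps monthly report scans small, while bucketing on the join key speeds up joins.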


DTR

Description

This is a real-time data ingestion project. Data generated every second at the source, on a local system, is pushed via MiNiFi to a NiFi cluster. NiFi processors then filter the data per the requirements and push each client's data to a Kafka producer, with a consumer picking it up downstream. If any client's data is misplaced or not formatted correctly, an email alert is triggered; finally, the data is ingested into HDFS.

Roles and Responsibilities

  • Installed NiFi and MiNiFi.
  • Contributed to the development of the data ingestion pipeline flow.
  • Involved in configuring NiFi processors and the remote site-to-site protocol required to keep the application in sync with requirements.
  • Created Kafka producers, consumers, and topics.
  • Performed unit testing to check whether the data loaded correctly.
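
The validate-and-route step of this pipeline (well-formed records continue to Kafka/HDFS, malformed ones trigger the email alert) can be sketched in plain Python; the required fields and sample records below are hypothetical:

```python
REQUIRED_FIELDS = ("client_id", "trip_id", "booked_at")

def route(records):
    """Split records: well-formed ones go downstream, malformed ones to the alert path."""
    valid, malformed = [], []
    for rec in records:
        if all(rec.get(field) for field in REQUIRED_FIELDS):
            valid.append(rec)
        else:
            malformed.append(rec)  # in the real pipeline this triggers an email alert
    return valid, malformed

sample = [
    {"client_id": "C1", "trip_id": "T9", "booked_at": "2023-05-01T10:00:00"},
    {"client_id": "C2", "trip_id": None, "booked_at": "2023-05-01T11:00:00"},
]
valid, malformed = route(sample)
```

In production, the same check is expressed as NiFi processor logic rather than application code.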
