Prashant G.

Senior Level Hadoop

Bangalore, India

Experience: 5 Years

42306 USD / Year

Notice Period: Days


About Me

Overall 5+ years of IT experience as a Hadoop administrator, along with the Pentaho BI tool and MySQL. 2.8 years of hands-on experience as a Hadoop administrator on MapR and Hortonworks; hands-on experience with ecosystem components Hive, Sqoop, Pig, HBase, ...


Portfolio Projects

Description

Cluster maintenance, commissioning and decommissioning of data nodes (a command sketch follows this list).
Installation and configuration of MapR/Hortonworks Hadoop clusters; design and development of the MapR DR setup; managing data on the MapR/Hortonworks cluster.
End-to-end performance tuning of MapR clusters and Hadoop MapReduce routines against very large data sets; working with the MapR cluster and MapR tables (creation, import, export, scan, list).
Managing and monitoring the cluster.

Performed data balancing on clusters

Production application support on a rotating roster, and Hadoop platform support.

Working on NameNode high availability by customizing ZooKeeper services.


Managing quotas on the MapR File System.

Recovering from node failures and troubleshooting common Hadoop cluster issues.

Responsible for MapR File System data rebalancing.

Responsible for performing backup and restoration of data from MFS to SAN and tape as per the retention policy.
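A minimal command sketch of the node, quota and rebalancing tasks listed above, assuming a MapR cluster with the standard maprcli tooling alongside an HDFS cluster; the host name, exclude-file path, volume name and quota size below are placeholders:

# Decommission a data node on HDFS: add it to the excludes file referenced by
# dfs.hosts.exclude (placeholder path), then refresh the NameNode's node list
echo "worker05.example.com" >> /etc/hadoop/conf/dfs.exclude
hdfs dfsadmin -refreshNodes

# Check node and service status on the MapR cluster
maprcli node list -columns hostname,svc

# Apply a hard quota to a MapR volume (placeholder volume name and size)
maprcli volume modify -name projects.vol -quota 500G

# Rebalance data across HDFS data nodes, targeting a 10% utilization spread
hdfs balancer -threshold 10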


Hadoop Admin

Description


Responsibilities:
• Design, develop, and manage data on the MapR/Hadoop cluster; add nodes to the MapR cluster; end-to-end performance tuning of Hadoop clusters and Hadoop MapReduce routines against very large datasets.
• Hands-on experience installing, configuring and using ecosystem components such as Hadoop MapReduce, HDFS, MapR-FS, HBase, ZooKeeper, Oozie, Hive, Sqoop, Pig and Flume.
• Monitor Hadoop cluster job performance and capacity planning; manage nodes, cluster installation and Hadoop administration/development, as well as Pig, Hive, HBase, MapR, Flume and Sqoop; implement bash shell scripts to automate services and processes on the servers (a sketch follows this list).
• Administration of Linux servers on CentOS and Ubuntu.
• Managing application servers in different zones such as production and staging.
• Actively monitoring idle threads, JVM and CPU utilization, and connection pools, and troubleshooting issues.
• Hadoop cluster connectivity and security; implementing new Hadoop hardware infrastructure.
• HDFS support and ...
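A minimal sketch of the bash automation and monitoring mentioned above, assuming the worker services run as systemd units and the standard Hadoop CLI tools are on the path; the unit names and the alert address are placeholders:

#!/bin/bash
# Restart any stopped Hadoop worker service and send a notification.
SERVICES="hadoop-hdfs-datanode hadoop-yarn-nodemanager"   # placeholder unit names
for svc in $SERVICES; do
    if ! systemctl is-active --quiet "$svc"; then
        systemctl restart "$svc"
        echo "$(date): restarted $svc on $(hostname)" | mail -s "Hadoop service restart" admin@example.com
    fi
done

# Quick capacity and utilization check used during routine monitoring
hdfs dfsadmin -report | grep -E "Configured Capacity|DFS Used%"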



Description

The Unique Identification project was initially conceived as an initiative that would provide identification for each resident across the country and would be used primarily as the basis for efficient delivery of welfare services.


Description

This project involves tracking network fault complaints from customers, examining information flows, both formal and informal, within the customer complaint handling process, and identifying possible improvement areas to strengthen the network signal. Data in the MySQL database is transformed and loaded into HDFS. This data is then analyzed using Hive, which exposes the data in HDFS on a distributed-query-enabled platform. Sqoop is used to extract data from internal structured data stores and load it into HDFS. Data volumes range in the gigabytes, which makes them challenging for a regular analytics platform. A command-level sketch of this flow follows below.
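A minimal sketch of the Sqoop ingestion and Hive analysis flow described above, assuming a MySQL source; the connection string, credentials, table, column and directory names are placeholders:

# Import the complaints table from MySQL into HDFS (-P prompts for the password)
sqoop import \
  --connect jdbc:mysql://dbhost.example.com/crm \
  --username etl_user -P \
  --table network_complaints \
  --target-dir /data/complaints \
  --fields-terminated-by ',' \
  --num-mappers 4

# Expose the imported files to Hive as an external table and aggregate complaints
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS network_complaints (
  complaint_id INT, region STRING, fault_type STRING, created_at STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/complaints';
SELECT region, fault_type, COUNT(*) AS complaints
FROM network_complaints
GROUP BY region, fault_type
ORDER BY complaints DESC;
"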


Description

Exigent has a software product roadmap to build a contract management product that can be deployed both on-cloud and on-premises for Exigent's customers.
