Sudarshan T.

Python | Spark | Hadoop | Big Data Professional

Nashik, India

Experience: 5 Years

90000 USD / Year

  • Notice Period: 90 Days

About Me

  • Experienced professional with a bachelor's degree and a demonstrated history of working in the information technology and services industry.
  • Currently holding an active H-1B visa and working with Infosys as a Technology Analyst.
  • ...

Portfolio Projects

Integrated Planning Solutions Production Support

Company

Integrated Planning Solutions Production Support

Description

This is an end-to-end maintenance and support project, with responsibilities ranging from system administration, batch management, and incident and problem management to change management.

Role and Responsibilities:

    • Participate in business and system requirements sessions.
    • Elicit requirements and translate them into technical specifications.
    • Provide input on solution architecture based on evaluation and understanding of solution alternatives, frameworks, and products.
    • Interact with clients to elicit architectural and non-functional requirements such as performance, scalability, reliability, availability, and maintainability.
    • Coordinate with business analysts and client managers to resolve complex issues, troubleshoot errors, and fix bugs in the production environment.
    • Work closely with stakeholders, the data warehouse team, and project managers on delivery requirements and estimates.
    • Work on cubes, dimensions, rules, TurboIntegrator processes, and chores in TM1 Architect to explain complex business queries and calculations.
    • Debug and troubleshoot TurboIntegrator processes in Cognos TM1 to ensure that data is loaded properly into cubes.
    • Performed root-cause analysis of issues related to TM1 reports, Cognos reports, and Teradata tables.
    • Prepared various documents (technical design document, impact analysis document, requirement specification document, delivery note, test matrix, minutes of meetings).
    • Served as SPOC and incident-compliance anchor for the team; mentored freshers and newcomers.


Integrated Planning Solutions

Company

Integrated Planning Solutions

Description

This is an end-to-end enhancement, maintenance, and support project, with responsibilities ranging from system administration, batch management, and incident and problem management to change management.

Role and Responsibilities:

    • Performed data analysis and gathered column metadata of source systems for requirement-feasibility analysis.
    • Created a logical data flow model in MS Visio from the source-system study, according to business requirements.
    • Worked on Teradata stored procedures and functions to validate the data and load it into the target tables.
    • Developed procedures to populate the customer data warehouse with daily transaction data, monthly summary data, and historical data.
    • Optimized and tuned Teradata views and SQL queries to improve batch performance and data response time for users.
    • Worked with business users to design and develop Cognos BI 10.1 reports in Report Studio, with drill-down, slice-and-dice, and drill-through functionality.
    • Created, enhanced, and maintained Framework Manager models to support business requirements.
    • Developed data visualizations and dashboards using Tableau Desktop.
    • Prepared custom SQL and blended data from multiple databases into one report by selecting primary keys from each database, for data validation in Tableau.
    • Developed storytelling dashboards in Tableau Desktop and published them to Tableau Server.
    • Supported implementation of the design by resolving complex technical issues faced during development, deployment, and support.
    • Knowledge management initiative: prepared various documents (technical design document, impact analysis document, requirement specification document, delivery note, test matrix, etc.).
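The primary-key-based data validation described above can be sketched in plain Python. This is an illustrative reconciliation of two row sets on a shared key; in the project the rows came from multiple databases via custom SQL feeding Tableau, and all table and column names here (`order_id`, `amount`) are hypothetical:

```python
# Minimal sketch of cross-source data validation keyed on a primary key.
# Plain Python dictionaries stand in for rows fetched from two databases.

def reconcile(source_rows, target_rows, key):
    """Compare two row sets on a primary key and report discrepancies."""
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}
    missing = sorted(src.keys() - tgt.keys())  # keys absent from the target
    extra = sorted(tgt.keys() - src.keys())    # keys absent from the source
    changed = sorted(
        k for k in src.keys() & tgt.keys() if src[k] != tgt[k]
    )
    return {"missing": missing, "extra": extra, "changed": changed}

# Illustrative data only; real rows would come from the source databases.
teradata_rows = [
    {"order_id": 1, "amount": 100},
    {"order_id": 2, "amount": 250},
]
warehouse_rows = [
    {"order_id": 2, "amount": 250},
    {"order_id": 3, "amount": 75},
]
report = reconcile(teradata_rows, warehouse_rows, "order_id")
# report == {"missing": [1], "extra": [3], "changed": []}
```

Keying both row sets by the primary key first makes each comparison a set operation, so the check stays linear in the number of rows.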


NextGeneration Analytical Platform - Enterprise Data and Analytics

Company

NextGeneration Analytical Platform - Enterprise Data and Analytics

Description

The project is part of the IT Optimization program and involves maintenance and support of the client's enterprise data warehouse, which holds information and reports on the entire supply chain: product creation, demand and supply planning, order management and delivery, sales, and human resource management. The scope of reporting and analytics ranges from daily operational reports to strategic senior-management dashboards, which enable the business to take critical decisions across the supply chain, including the creation of new products, daily demand and supply management, and organizational management.

This is an end-to-end maintenance and support project, with responsibilities ranging from system administration, batch management, and incident and problem management to change management.

Role and Responsibilities:

    • Participate in business and system requirements sessions for client communication and requirement gathering.
    • Provide input into developing and modifying systems to meet client needs, and develop business specifications to support these modifications.
    • Responsible for setting up, maintaining, and evolving the cloud infrastructure of web applications using Amazon Web Services (AWS) such as EC2, S3, EMR, ELB, Athena, and Auto Scaling.
    • Create Spark jobs for data cleansing, transformation, and aggregation, and load the results into AWS S3 storage buckets.
    • Develop, refactor, fix, test, review, and deploy new software functionality, improvements, and bug fixes to the Snowflake data warehouse, databases, and schemas.
    • Work on the Airflow platform to programmatically author, schedule, and monitor workflows as directed acyclic graphs (DAGs) of tasks.
    • Create Python scripts for the design and development of Airflow DAGs.
    • Responsible for maintaining the performance of Airflow schedulers and workers using Unix/Linux shell scripting.
    • Performed root-cause analysis of issues related to Airflow DAGs, ad hoc Cognos reports, Tableau dashboards, Python scripts, and Snowflake tables.
    • Support implementation of the design by resolving complex technical issues faced during development, deployment, and support.
    • Knowledge management initiative: prepared various documents (technical design document, impact analysis document, requirement specification document, delivery note, test matrix, etc.).
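The cleanse-transform-aggregate flow of the Spark jobs above can be sketched in plain Python so the example stays self-contained. The field names (`order_id`, `region`, `amount`) are illustrative, and the stdlib stands in for PySpark; in the project this logic would be DataFrame operations whose output is written to an S3 bucket:

```python
# Sketch of the cleanse -> transform -> aggregate flow a Spark job performs
# before loading results into S3. Plain Python is used in place of PySpark
# so the example runs anywhere; the shape of the logic is the same.
from collections import defaultdict

def cleanse(records):
    """Drop records missing a primary key and normalise the region field."""
    return [
        {**r, "region": r["region"].strip().upper()}
        for r in records
        if r.get("order_id") is not None and r.get("region")
    ]

def aggregate_by_region(records):
    """Sum order amounts per region (a groupBy/sum in Spark)."""
    totals = defaultdict(float)
    for r in records:
        totals[r["region"]] += r["amount"]
    return dict(totals)

# Illustrative input; a real job would read this from the source system.
raw = [
    {"order_id": 1, "region": " emea ", "amount": 120.0},
    {"order_id": None, "region": "amer", "amount": 50.0},  # dropped: no key
    {"order_id": 3, "region": "EMEA", "amount": 80.0},
]
clean = cleanse(raw)
summary = aggregate_by_region(clean)
# summary == {"EMEA": 200.0}
```

Separating cleansing from aggregation mirrors how such jobs are usually staged, so each step can be tested and rerun independently when a batch fails.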


Verifications

  • Phone Verified

Preferred Language

  • English - Fluent

  • Hindi - Native/Bilingual

Available Timezones

  • Eastern Daylight [UTC -4]

  • Central Daylight [UTC -5]

  • Mountain Daylight [UTC -6]

  • Pacific Daylight [UTC -7]

  • Eastern European [UTC +2]

  • Eastern European Summer [UTC +3]

  • Greenwich Mean [UTC ±0]

  • Further Eastern European [UTC +3]

  • New Delhi [UTC +5:30]

  • Australian CDT [UTC +10:30]

  • Australian EDT [UTC +11]