Swanand S.

Well-organized and customer-focused, with proven skills in migrating on-premise systems to the Azure cloud

Pune, India

Experience: 9 Years

24960 USD / Year

  • Notice Period: Days

About Me

Microsoft-certified Azure professional. Offers a variety of experience ranging from traditional databases such as Oracle, to MPP platforms such as Teradata, to Hadoop file storage. Well-organized and customer-focused, with proven skills in technology platform migration.

Skills:

  • Microsoft Azure: hands-on with services including Databricks, ADLS, Azure Data Factory, Logic Apps, Synapse Analytics (SQL Data Warehouse), Key Vault, Blob Storage, and Virtual Machines
  • Cloud data warehouse tuning and best practices for Synapse Analytics
  • Worked on OLTP and OLAP systems for large companies such as Maersk, Morgan Stanley, BNY Mellon Technologies, and Reliance
  • Scripting languages include Unix shell and Python for GIS
  • Extensive ETL experience with Informatica PowerCenter and Informatica B2B
  • Good knowledge of reporting tools: SAP BO and Power BI
  • Extensive experience in customer communication, requirement gathering, technical feasibility studies, and agile methodologies


Portfolio Projects

Description

  • The existing platform comprised Informatica PowerCenter and a SQL database. The new platform uses various Azure services: Databricks, ADLS, Azure Data Factory, Logic Apps, Synapse Analytics (SQL Data Warehouse), Key Vault, Blob Storage, and Virtual Machines
  • Responsible for maintaining the cloud data warehouse on Azure and keeping program cost below the budget threshold. This included automating start/stop of Azure services, tuning DWUs for Synapse Analytics, and adopting best practices across services
  • Responsible for designing a new global-inventory module for the existing BI system, enabling Maersk to share its asset inventory worldwide. This included requirement gathering, data modelling, designing data pipelines for data loads, and mentoring the team on development tasks
  • Implemented a data security model to restrict data access across different terminals
  • Completed POCs on Power BI, Kafka, and REST APIs
  • Working experience migrating existing storage systems to Data Lake and Blob Storage, with data formats such as CSV, Avro, Parquet, and ORC, and compression formats such as Snappy
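The start/stop automation described above can be sketched roughly as follows. This is a minimal illustration only: the resource names are hypothetical, and the real automation may have run as scheduled Azure Automation runbooks or Logic App jobs rather than ad-hoc CLI calls.

```python
# Sketch: pause a Synapse dedicated SQL pool outside working hours to
# control cost. Resource names ("ws", "dw", "rg") are hypothetical.

def synapse_pool_command(action, workspace, pool, resource_group):
    """Build an `az synapse sql pool pause|resume` CLI invocation."""
    if action not in ("pause", "resume"):
        raise ValueError("action must be 'pause' or 'resume'")
    return [
        "az", "synapse", "sql", "pool", action,
        "--name", pool,
        "--workspace-name", workspace,
        "--resource-group", resource_group,
    ]

def command_for_hour(hour, **kwargs):
    """Resume during a 07:00-19:00 working window, pause otherwise."""
    action = "resume" if 7 <= hour < 19 else "pause"
    return synapse_pool_command(action, **kwargs)
```

In practice the command list would be executed via `subprocess.run` under a scheduler, or replaced by the equivalent Azure SDK calls.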


Description

  • Worked as a consultant for Morgan Stanley
  • Developed data loads that are faster and better tuned than the existing ones
  • Successfully replaced Informatica mappings with BTEQ scripts
  • Met the client's requirement to load data faster and reduce Informatica dependency to a minimum
  • Created generic shell scripts for data ingestion into the Hadoop data lake
  • Responsible for design and development of data loads into the central data warehouse
  • Provided production support during initial runs to assist the production support team
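The generic ingestion scripts mentioned above were shell scripts; an illustrative equivalent of one such step, parameterized by feed name and load date, might look like this. The landing-path layout is a hypothetical example, not the actual convention used.

```python
# Illustrative sketch: stage a local extract file into a dated HDFS
# landing directory. Paths and the feed/date layout are hypothetical.
import posixpath

def hdfs_put_commands(local_file, feed_name, load_date,
                      landing_root="/data/landing"):
    """Build `hdfs dfs` commands to create the target dir and copy the file."""
    target_dir = posixpath.join(landing_root, feed_name,
                                "load_date=" + load_date)
    return [
        ["hdfs", "dfs", "-mkdir", "-p", target_dir],
        ["hdfs", "dfs", "-put", "-f", local_file, target_dir],
    ]
```

Keeping the script generic in this way lets one wrapper serve every feed, with only the parameters changing per scheduled run.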


Description

  • Responsible for designing and developing mappings, sessions, and workflows to load data from source to target database/file system using Informatica PowerCenter
  • Tuned mappings and SQL queries for better performance and efficiency
  • Involved in the ETL lifecycle from refining requirements through development and unit testing
  • Used debugging techniques such as the Informatica Debugger, temporary tables, and session logs to resolve complex issues
  • Completed a POC to read unstructured data (PDF) using Informatica B2B and load it into a database
  • Communicated with the technical BA and business customers to discuss issues and requirements
  • Worked effectively in an Informatica versioned environment and used deployment groups to migrate objects
  • Developed UNIX scripts to transfer target files across various platforms, and implemented file validations to ensure the correct files are FTP'd to the target destination
  • Developed UNIX scripts to accept dynamic inputs from mainframe scheduling tools and configure Informatica workflow runs per the dynamic mainframe ESP inputs
  • Involved in technical feasibility studies for requirements with the technical BA
  • Led walkthrough meetings for design review, detailed design, and finalizing the application design
  • Created governance documents for Informatica development per company standards
  • Part of the Technical Dress Rehearsal team, carrying out simulation activities for Informatica applications before production deployment
  • Provided production support during initial runs to assist the production support team
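The pre-transfer file validation described above can be sketched as a simple check that a file matches the expected naming convention and is non-empty before it is FTP'd. The original validations were UNIX shell scripts; this is a Python equivalent, and the naming pattern shown is a hypothetical example of such a convention.

```python
# Sketch: validate a target file before transfer. The naming pattern
# (FEED_YYYYMMDD.dat) is a hypothetical example, not the real standard.
import os
import re

FILE_PATTERN = re.compile(r"^[A-Z]+_\d{8}\.dat$")  # e.g. SALES_20240131.dat

def is_valid_target_file(path):
    """True if the file name matches the convention and the file has data."""
    name = os.path.basename(path)
    if not FILE_PATTERN.match(name):
        return False
    return os.path.exists(path) and os.path.getsize(path) > 0
```

A transfer wrapper would call this check and skip (and alert on) any file that fails, preventing empty or misnamed files from reaching the target system.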


Description

  • Worked at Reliance on behalf of Inteliment for the entire tenure
  • Responsible for designing and developing mappings, sessions, and workflows using Informatica PowerCenter
  • Developed mappings using transformations such as Update Strategy, Lookup (connected and unconnected), Stored Procedure, Router, Joiner, Sequence Generator, and Expression
  • Performed unit testing and validated the data
  • Involved in the ETL process from development through testing and post-production environments
  • Participated in design sessions where decisions were made on the transformation from source to target
  • Monitored data warehouse month-end loads to ensure successful completion
  • Worked on SAP BusinessObjects 3.1 (including InfoView and Universe Designer)
  • Worked on the Greenplum database (developed Informatica mappings as well as YAML scripts)
