About Me
Over 15 years of experience in Information Technology, including Dell Boomi integration and EDI development. Experience with enterprise data integration platforms like Dell Boomi, and with B2B/EDI terminology and data standards like X12, ...
Skills
Portfolio Projects
Description
- Extract data from different legacy systems and migrate to HANA using SAP Best Practices in BODS.
- Involved in implementing migration for Benefits, Payroll, Employee Data, and Foundation Data using SAP Best Practices. Developed reconciliation reports for the business for source-to-target data comparison (a sample query follows this list). Connected to various source systems such as IBM DB2, SAP BW, MS SQL Server, and Infinium to extract data, then applied cleansing and business rules and loaded the data into SAP ERP 6.0.
- Used BI tools (SAP BO BI Launch Pad and SAP BO IDT) to create BI objects to be used by business users for their decision making, such as universes, tabular reports, cross-tab reports, graphs, and charts, by connecting to the DW environment data.
- Creating local and central repositories as per client requirements.
- Handling administration-related issues for SAP BODS and Business Objects Data Quality Management.
- Extensively worked on management console.
- Creating Users and providing access to them.
- Configuring Job Servers with repositories.
- Scheduling the jobs in the Management Console and creating Global Variables at the job level.
- Troubleshooting day to day technical issues.
- Maintaining BODS Security as per the client specifications.
- Interacted with client on daily status calls.
- Resolving the tickets within a specified SLA time
- Responsible for building Interfaces using Dell Boomi as per the business requirements
- Analyzed the requirement specifications provided by the clients and discussed them with the client to clearly understand the functional and technical requirements
- Responsible for integration development, involving requirement gathering, analysis, process development, testing, and mapping implementation
- Deployment of the interfaces on Cloud according to Client specification
- Good working experience in creating custom objects and integrating them with the SQL Server DB
- Customized and configured SQL queries and actively used DB connectors for various interfaces
- Developed Proof of concept (POC) and provided work/time estimates for design and development efforts
- Responsible for creating XML schemas and WSDLs for the web services, and configured SOAP-based web services using JMS and HTTP transport protocols
- Developed a few interfaces to replace existing systems, using a JMS publisher/subscriber mechanism
- Developed a few asynchronous integrations exposed as web services, sending data to REST web services
- Created unit tests using SoapUI
- Automated Boomi interfaces for load testing of EDI and IDocs
- Worked on EDI transactions like 850, 855, 875, 880, 846, 944, and 812
- Worked on EDI inbounds and outbounds for 3PL and SAP IDocs, generating emails as required by business needs
- Collaborated with internal customers, other functional teams, development teams, and other stakeholders to identify user requirements
- Interacted with infrastructure teams to resolve network/database issues that could affect the Boomi integrations
- Obtained feedback from the SMEs and QA team for assessing the quality of the artifacts and the risks associated with moving into the next phase
- Participated in testing status meetings with production team to identify and resolve any outstanding defects as part of releases
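For illustration, here is a minimal sketch of the kind of source-to-target reconciliation query behind the comparison reports mentioned in this list; every table and column name is hypothetical.

```sql
-- Hypothetical reconciliation: compare row counts per employee group
-- between the legacy staging extract and the migrated target table,
-- returning only the groups that disagree. Names are illustrative only.
SELECT COALESCE(s.emp_group, t.emp_group) AS emp_group,
       s.src_rows,
       t.tgt_rows
FROM (SELECT emp_group, COUNT(*) AS src_rows
        FROM stg_legacy_payroll
       GROUP BY emp_group) s
FULL OUTER JOIN
     (SELECT emp_group, COUNT(*) AS tgt_rows
        FROM tgt_payroll
       GROUP BY emp_group) t
  ON s.emp_group = t.emp_group
WHERE COALESCE(s.src_rows, 0) <> COALESCE(t.tgt_rows, 0);
```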
Description
- Worked with business process owners, stakeholders to understand and document the requirements for GTS source system integration
- Experience in SAP Data Services, data marts, Data Integrator, data cleansing (address cleansing), data profiling (Data Insight), debugging, performance tuning, and Business Objects installation
- Created complex Jobs, Workflows, Data Flows, and Scripts using various Transforms like Integrator, Quality, and Platform to load data from multiple sources into HANA
- Developed data extraction, profiling, cleansing, de-duplication, standardization, conversion, transformation, and loading data into Oracle and HANA
- Strong experience in SQL and PL/SQL
- Good functional knowledge of customer, vendor, product/material, and master data in S/4
- Integrated SAP and Non-SAP Data through SLT and Data Services into HANA DB
- Loaded data into HANA by splitting loads into various batches
- Used data warehousing techniques for data cleansing, Slowly Changing Dimensions (SCD), and Change Data Capture (CDC)
- Implemented delta logic where required (see the sketch after this list)
- Oversaw the BODS work done by offshore and peer reviewed the code before submitting it to the client
- Based on the requirements, extracted data from different sources (Salesforce, CSV files, Excel, Oracle Server, flat files) and loaded it into HANA
- Replicated data into HANA using SLT Replication server & Sybase Replication server
- ETL (Extract, Transform, and Load) and Data Profiling experience from source systems including Flat Files, Legacy systems and SAP using SAP Data Services (BODI/BODS 4.2)
- Experience in installing and configuring SAP Data Services and HANA
- Used Data Services to load data into HANA from flat files and IDocs
- Experienced in data profiling and data cleansing needs such as de-duplication, data validation, harmonization, and missing data
- Used different data manipulation tools such as TOAD, Access, Excel, SQL Server, Oracle, and HANA
- Created a cleansing package using Cleansing Package Builder in SAP Information Steward to standardize customer data
- Used a parsing strategy to parse data on whitespace and on transitions for special characters
- Defined categories and attributes on customer data, following the out-of-the-box attributes suggested by Information Steward
- Defined standard forms and variations to map attribute values to a standard form
- Created rules to ensure the data complies with business requirements using Data Insight
- After creating rules, bound each rule to a specific column so that a quality score could be calculated
- Created data quality scores using the source data in the workspace view under Data Insight
- Calculated the aggregations required for the scorecard to display results using Data Insight
- Created users and groups and assigned security in the Central Management Console
- Used the Data Services Management Console to schedule and execute jobs and manage repositories
- Executed admin tasks such as Job Server configuration, local/central repository creation, CMC, Management Console, and scheduling as required
- Performed BODS code migration using check-in/check-out of central/local repositories to promote code across Dev/QA/Production, using filtering and other techniques
- Backup of repositories, configuration files, and Data Cleanse files:
- Created a checklist of the number of batch jobs/real-time jobs, the number of access servers and job servers, and the configuration paths (datastores) of source and target systems
- Created a checklist of job schedules in the SAP Business Objects Data Services Management Console
- Created a checklist of the groups and users available in the SAP Business Objects Data Services Management Console
- Backed up the local, profiler, and central repositories
- Backed up the related configuration files
- Exported ATL files as required for backup, using exclude and other techniques
- Used the Dell Boomi AtomSphere multi-tenant cloud integration platform for connecting cloud and on-premises data
- Used Dell Boomi to design cloud-based Atoms and transfer data between cloud and on-premises applications
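As a hedged sketch of the delta logic mentioned in this list (not the actual client code), the following upsert applies captured changes from a staging delta table to a target table; all names are hypothetical.

```sql
-- Hypothetical delta (CDC) load: apply inserts and updates captured in
-- a staging delta table to the target. Names are illustrative only.
MERGE INTO tgt_customer t
USING stg_customer_delta s
   ON (t.customer_id = s.customer_id)
WHEN MATCHED THEN
  UPDATE SET t.name        = s.name,
             t.city        = s.city,
             t.last_update = s.extract_ts
WHEN NOT MATCHED THEN
  INSERT (customer_id, name, city, last_update)
  VALUES (s.customer_id, s.name, s.city, s.extract_ts);
```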
Description
- Involved in the requirement definition and analysis in support of Data Warehousing effort
- Interpreted logical and physical data models for Business users to determine data definitions and establish referential integrity of the system
- Experience in data migration techniques like migration with AIO and data quality transformations, receiving and sending IDocs, and integrating SAP ABAP with Data Services
- Developed ETL procedures to ensure compliance with standards and avoid redundancy; translated business rules and functionality requirements into ETL procedures using transforms like Query, Case, Map Operation, Merge, Table Comparison, and History Preserving (a sketch of the equivalent SQL follows this list)
- Extensively used BODS Designer and Management Console
- Involved in the implementation of this application, which involved the Extraction, Transformation, and Loading of Client specific data into Oracle database
- Extensively worked on sales activity for every major distribution channel, including mail service, major retail stores and chains and mass merchandisers
- Worked on Management console to schedule jobs and monitor the logs
- Developed and tested all the backend programs and update processes
- Created Batch Jobs and workflows using BODI Designer
- Extensively worked in the performance tuning of the programs, ETL Procedures and processes
- Responsible for daily verification that all scripts, downloads, and file copies were executed as planned, troubleshooting any steps that failed
- Discussed requirements with the business, other developers, and offshore resources; assigned tasks to offshore and reviewed their transformations before deploying them to the test environment
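The Table Comparison and History Preserving transforms referenced in this list implement a Slowly Changing Dimension Type 2 pattern; a minimal hand-written SQL equivalent, with hypothetical names, might look like this:

```sql
-- Hypothetical SCD Type 2 load: close out the current version of each
-- changed row, then insert the new version. Names are illustrative only.
UPDATE dim_customer
   SET valid_to   = CURRENT_DATE,
       is_current = 'N'
 WHERE is_current = 'Y'
   AND customer_id IN (SELECT customer_id FROM stg_customer_changed);

INSERT INTO dim_customer
       (customer_id, name, city, valid_from, valid_to, is_current)
SELECT customer_id, name, city, CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM stg_customer_changed;
```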
Description
- Created mapping and extraction documents for the development team to work on extraction and transformation logic
- Worked with client’s team in creating ETL architecture, ETL design, analysis, and development, writing technical specifications
- Wrote SQL and PL/SQL queries to verify the extraction and transformation jobs in Oracle and MS SQL Server (a sample verification query follows this list)
- Worked on end-to-end implementation, from extracting data from the legacy database to transforming and loading the files using the EMIGAL structure
- Implemented data migration using SAP BPDM Best Practices for Data Migration on Utilities
- Working on different migration objects like Move in, Security Deposit Request, Security Deposit payments, Budget billing planning, Interaction Records, Contract Accounts, and Business Partners
- Implemented extraction and transformation with control loads, both with and without validations
- Performed data analysis on error records and defects in HP QC to analyze and implement solutions and regenerate the corrected load files
- Provided technical leadership on the use of the technology platform and tools, mentoring the offshore team on optimal use of BODS in a SQL Server, SAP Utilities, and HANA environment
- Established the best modeling and recommended strategies for BODS environment and architecture for SAP HANA4
- Created R/3 dataflow using ABAP and Data Transport
- Pushed ABAP to production using the built-in Generate ABAP code option to write the generated programs to a directory
- Created Transformations, Update Rules and Transfer Rules using ABAP routines in start routines and end routines
- Populated fields that were not provided by the source systems and modified existing data to meet business requirements using ABAP
- Provided the cleansed data to load data into SAP CRM to ABAP resources using EMIGAL
- Set up ABAP data flows for data extracts, and set up shared directories for data loads and ABAP programs for the SAP CRM and IDoc interfaces using Data Services
- Set up RFC server connections on both the BODS and BW sides, and Open Hub on the BW side
- Managed the onshore and offshore development teams and reviewed their project assignments by checking the jobs from the central repository and executing the jobs manually
- Coordinated with offshore, assigned tasks to the team, and made sure to review the code before check-in to the central repository
- Reviewed the TUT and FUT documents from offshore to make sure all coding standards were maintained
- Participated in requirement gathering sessions and developed Business Objects Data Services jobs specifications on excel sheets based on information acquired from analysis of source data, user requirements, Business rules and Enterprise standards
- Used Business Objects Data Services for ETL extraction, transformation and loading data from heterogeneous source systems for all migration related activities
- Used various transformations like Case, Map Operation, Merge, Query, Row Generation, SQL, Validation, Data Cleanse, Address Cleanse, and Match
- Used Business Objects Data Services cleansing transformations extensively for de-duplication, standardization, and address parsing with Information Steward and BODS (Data Services)
- Extensively involved in data cleansing and address standardizing using USA and Geocode (latitude and longitude) transforms in Business Objects Data Services Data Quality
- Used Business Objects Data Services interactive debugger to test the jobs and fixed the bugs
- Experience in debugging execution errors using Data Integrator logs (trace, statistics, and error) and by examining the target data
- Verified the extraction jobs to make sure the extraction criteria meet the requirements in BODS by running the workflows and the individual dataflows
- Verified the validations based on the EMIGAL structure: lookups, mandatory columns, and format checks against check tables from the EMIGAL structure
- Verified the enriched structure to make sure it met the EMIGAL standard for loading the file into SAP Utilities
- Verified the generation of the ISMW file structure for loading the file into EMIGAL
- Verified the enriched tables to validate the requirements
- Worked on SAP tables, performing data conversion and moving data from source to loading tables, then to staging tables, and then to the target database
- Defined separate data store for each database to allow Business Objects Data Services to connect to the source or target database
- Used the Metadata Management module in Information Steward to discover metadata in universes, reports, dashboards, and SAP BW systems
- Used Impact analysis to view the objects that are affected by data within an object
- Performed unit testing and documented the results of unit testing and the ETL process as per requirements
- Administered and maintained the BO Data Services Management Console and performed repository management
- Developed various workflows and data flows in which data is extracted from sources like Oracle, SQL Server, and SAP ECC, loaded into staging tables, and then moved from the staging tables into the relational target tables
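For illustration, here is a minimal sketch of the kind of verification query described in this list, flagging records with missing mandatory columns or failed check-table lookups; every table and column name is hypothetical.

```sql
-- Hypothetical pre-load validation: report records whose mandatory
-- business partner is missing or whose contract account fails a lookup
-- against the check table. Names are illustrative only.
SELECT e.record_id,
       CASE
         WHEN e.business_partner IS NULL THEN 'MISSING_BUSINESS_PARTNER'
         WHEN c.contract_account IS NULL THEN 'FAILED_CA_LOOKUP'
       END AS error_code
  FROM enriched_move_in e
  LEFT JOIN chk_contract_account c
    ON e.contract_account = c.contract_account
 WHERE e.business_partner IS NULL
    OR c.contract_account IS NULL;
```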