Shekar M.

QA Manager

Bangalore, India

Experience: 17 Years

42705.9 USD / Year

  • Notice Period: Days

About Me

Professional Summary

Motivated, results-oriented IT professional; a forward-thinking, capable, and committed test manager with 16-plus years of experience and a proven ability to meet agreed deadlines, coordinate work and...

Testing Tools              Test Methodology/Practices        Test Types

Selenium                   Agile/Scrum                       Functional Test Automation
HP Performance Center      CI/CD                             Performance testing
Jenkins                    DevOps                            API/Web Services testing
Maven                      TDD, BDD and POM framework        Lean testing
JIRA                       Service Virtualization            Pairwise testing
SOAPUI                     Workload Modelling                Integration testing
GitHub                     Test Data Management              System testing
Cucumber                   Weblog Analysis                   Regression testing
JMeter                     Bottleneck Analysis               User Acceptance testing

Portfolio Projects

Description

Managed the overall quality aspects of the application with respect to functional, performance, and web service testing

Analyzed and reviewed business and system requirements to create and finalize test plans and test coverage, incorporating input from all the technical and user communities

Worked directly with the team on preparation and delivery of test artifacts - plan/strategy, automation scripts, reports, and metrics. Analyzed data trends to initiate and drive quality management upstream and influence change across the group. Measured the quality of deliverables against the defined quality metrics and addressed the issues from engineers that impacted deliverable quality.

Created, assisted with, and troubleshot the development of test automation frameworks, standards, procedures, processes, and best practices related to test automation

Facilitated Scrum ceremonies including Sprint Planning, Daily Stand-ups, Sprint Retrospectives, Sprint Demos, Sprint Grooming, Release Planning, and Agile games such as Planning Poker

Worked in a global team environment, communicating effectively across multiple cultures

Managed changes in project scope, identified potential crises, and devised contingency plans

Set up Continuous Integration systems; integrated and scheduled the automation tests to run with Jenkins

Implemented Data Driven, Keyword Driven, and Hybrid (Page Object Model) automation frameworks from scratch (a brief Page Object Model sketch follows this list)

Assisted the team with weblog analysis, workload modelling, performance test script creation, test data preparation, load test execution, and report generation

Managed the daily/weekly activities of the test automation team members

Anticipated release risks and worked with the team to perform dependency management, mitigate/resolve impediments, and make course corrections in order to meet commitments

Maintained and guided test automation suite execution and analyzed the results to ensure that software meets or exceeds specified standards and/or client and technical requirements

Managed multiple projects/efforts while maintaining high-quality deliverables

Worked closely with QA team members and collaborated on designing, developing, and executing testing metrics

Responded to immediate production issues, investigated the required fixes with developers and business, and created test cases to verify the changes made for immediate deployment to production

Reported severe defects to management whenever needed and followed up with the Dev team to get critical defects fixed and deployed for testing

Interacted with client staff or users to gain a better understanding of reported issues

Actively participated in technical discussions within the teams and added value from experience. Helped team members set their objectives and prepare their competency improvement plans. Guided team members by providing appropriate opportunities to achieve their objectives and improve competencies, helping them grow in their careers

Closely observed the performance of the engineers on both deliverables and competencies, and provided appropriate feedback (both appreciation and focus areas) in a timely manner as part of the performance appraisal process

Worked with inter-group teams to implement ongoing quality improvement initiatives

Developed strategies aligned with the organization's vision and objectives; anticipated changes and implemented achievable operational plans

Kept abreast of the latest technologies and best practices, applied those relevant to the organization, and led the adoption of new processes and technologies
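
As an illustration of the Page Object Model approach mentioned above, here is a minimal, hypothetical Java/Selenium sketch; the page, locator, and URL names (LoginPage, username, https://example.test/login, and so on) are invented for the example and are not taken from the project itself.

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

// Page object: encapsulates the locators and actions of one page so that
// tests stay readable and locator changes are made in a single place.
class LoginPage {
    private final WebDriver driver;
    private final By username = By.id("username");   // hypothetical locator
    private final By password = By.id("password");   // hypothetical locator
    private final By loginButton = By.id("login");   // hypothetical locator

    LoginPage(WebDriver driver) { this.driver = driver; }

    // Fills the form, submits, and hands control to the next page object.
    HomePage loginAs(String user, String pass) {
        driver.findElement(username).sendKeys(user);
        driver.findElement(password).sendKeys(pass);
        driver.findElement(loginButton).click();
        return new HomePage(driver);
    }
}

class HomePage {
    private final WebDriver driver;
    HomePage(WebDriver driver) { this.driver = driver; }
    String title() { return driver.getTitle(); }
}

// A test drives the flow through page objects only, never raw locators.
public class LoginFlowSketch {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.test/login");  // placeholder URL
            HomePage home = new LoginPage(driver).loginAs("demo", "secret");
            System.out.println("Landed on: " + home.title());
        } finally {
            driver.quit();
        }
    }
}

In a Data Driven or Hybrid variant, the credentials would come from an external source such as a CSV file or a TestNG @DataProvider, and the suite would typically be triggered from Jenkins through the project's Maven build, as described in the Continuous Integration item above.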

Description

Worked as a Test Consultant handling technical activities such as project consultation, requirements gathering, test planning, and estimation across different target applications

Mentored junior members in all areas of QA, including specification understanding, test design, test case review, and automation tool/framework selection

Worked closely with remote stakeholders to understand requirements well and design effective test strategies

Led and managed multiple teams matrixed across products and specializations (functional, technical, performance, automation, monitoring)

Liaised with product, development, and infrastructure teams to collaborate on delivery of high-quality, high-performance, and high-availability applications

Developed quality programs and processes to push towards zero-defect software releases

Motivated, encouraged, and guided teams while maintaining equilibrium and satisfaction levels

Provided technical/logical solutions to problems faced by the teams and removed hurdles blocking releases

Continuously identified and developed initiatives and process definitions to improve the test unit's capabilities

Proactively highlighted risks to project stakeholders, worked collaboratively with them to mitigate project risks and resolve critical issues, and kept the project on track during the testing phase

Ensured all test results/evidence and defect logs were available, and verified the correctness of the team's defect classification as per the guideline document

Ensured the quality of the DSR, WSR, and MSR prepared by the team/self, and made sure the reports contained concise and precise testing progress updates, risks and issues, and relevant testing metrics

Ensured that test leads and teams worked closely with the delivery functions and complied with the testing, configuration, and release management processes for the projects concerned

Provided inputs to management to enhance testing quality, testing efficiency, new testing approaches, and productivity, thereby reducing the testing window/timelines

Ensured documentation of software product defects and tracked them to closure

Participated in the full cycle of pre-sales activities: direct communication with potential customers, RFP processing, development of proposals for implementation and design of the solution, presentation of the proposed solution to the customer, and participation in meetings with customer representatives

Description

Gathered the requirements and SLAs for the Teliasonera R12 application and studied the compatibility of the QTP tool with the Oracle R12 application

Prepared a complete test plan for the whole testing activity, covering scope, SLAs, strategy, test bed setup, risk management, test estimation and schedule, types of tests, etc.

Led and managed testing efforts on projects, estimating schedules, resources, and dependencies

Understood the non-functional requirements from the business

Analyzed the critical business scenarios and derived the service level agreements/objectives

Developed the test scripts and enhanced them by creating/implementing parameterization, checkpoints, transactions, reusable functions, correlations, actions/blocks, iterations, pacing, and think time (illustrated in the tool-agnostic sketch after this list)

Designed the workload model, conducted test execution, and monitored the tests

Analyzed test results and coordinated with development teams on bug fixes

Generated test summary reports for management review

Analyzed root causes of performance issues and provided corrective actions

Conducted job trainings and provided assistance to junior test engineers as needed

Monitored and tracked project progress and reported findings to leadership

Managed all the deliverables to ensure adherence to deadlines and specifications; implemented performance KPIs and metrics and prepared periodic reports and/or proposals
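
To illustrate the script-enhancement items listed above (parameterization, transactions, checkpoints, think time, pacing), here is a small, tool-agnostic Java sketch using the standard java.net.http client; it is an assumption-based illustration of the concepts, not an excerpt from the actual Performance Center scripts, and the URL and data values are invented.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;

public class PerfScriptSketch {
    // Parameterization: each iteration uses different test data.
    private static final List<String> USER_IDS = List.of("1001", "1002", "1003"); // hypothetical data

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        long pacingMillis = 5_000;                    // fixed interval between iteration starts

        for (String userId : USER_IDS) {              // iterations
            long iterationStart = System.currentTimeMillis();

            // Transaction: time one business step end to end.
            long t0 = System.nanoTime();
            HttpRequest request = HttpRequest.newBuilder(
                    URI.create("https://example.test/api/orders?user=" + userId)) // placeholder URL
                    .GET()
                    .build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            long elapsedMs = (System.nanoTime() - t0) / 1_000_000;

            // Checkpoint: validate the response before trusting the timing.
            if (response.statusCode() != 200) {
                System.err.println("Checkpoint failed for user " + userId);
            }
            System.out.println("Transaction 'get_orders' took " + elapsedMs + " ms");

            Thread.sleep(2_000);                      // think time between user actions

            // Pacing: wait out the remainder of the iteration interval.
            long remaining = pacingMillis - (System.currentTimeMillis() - iterationStart);
            if (remaining > 0) Thread.sleep(remaining);
        }
    }
}

Correlations (capturing dynamic values such as session IDs from one response and feeding them into later requests), reusable functions, and actions/blocks would layer on top of the same structure in the real scripts.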

Description

Conducted a pre-study of the application by performing a proof of concept and suggested a cost-effective, optimized tool

Collected the functional and architectural design documents to study the complexity of the application and how the various components interact with each other

Prepared a test strategy document detailing the various approaches for carrying out the performance testing

Prepared a detailed test plan defining the scope, SLAs, test bed setup, risk management, test estimation and schedule, types of tests, etc.

Created the performance test scripts, applying parameterization, correlation, and customization to make the scripts more robust and adaptable to the dynamic behaviour of the application

Prepared the test data required for the various kinds of test execution by running queries in the TOAD tool

Developed the performance testing scenarios for all the tests, defining the user ramp-up, ramp-down, load pattern, etc. (see the workload sketch after this list)

Conducted the load tests with varying numbers of users

Gathered the performance test results and prepared a detailed report covering the issues faced, the hardware and software bottlenecks identified during the tests, and tuning recommendations on design aspects and parameter settings for the various servers involved

Conducted regular meetings with the customers during all phases of the performance test life cycle
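
A workload model like the one described above specifies how many virtual users are active at each point in the test. The following is a minimal Java sketch of a linear ramp-up / steady-state / ramp-down profile; the user counts and durations are assumed for illustration and are not taken from the project.

public class WorkloadModelSketch {
    // Hypothetical profile: ramp up to 50 users over 10 minutes,
    // hold for 30 minutes, then ramp down over 5 minutes.
    static final int PEAK_USERS = 50;
    static final int RAMP_UP_MIN = 10;
    static final int STEADY_MIN = 30;
    static final int RAMP_DOWN_MIN = 5;

    // Number of virtual users that should be active at a given minute.
    static int activeUsers(int minute) {
        if (minute < RAMP_UP_MIN) {
            return (int) Math.round(PEAK_USERS * (minute / (double) RAMP_UP_MIN));            // ramp-up
        } else if (minute < RAMP_UP_MIN + STEADY_MIN) {
            return PEAK_USERS;                                                                 // steady state
        } else if (minute < RAMP_UP_MIN + STEADY_MIN + RAMP_DOWN_MIN) {
            int intoRampDown = minute - RAMP_UP_MIN - STEADY_MIN;
            return (int) Math.round(PEAK_USERS * (1 - intoRampDown / (double) RAMP_DOWN_MIN)); // ramp-down
        }
        return 0;                                                                              // test complete
    }

    public static void main(String[] args) {
        for (int minute = 0; minute <= RAMP_UP_MIN + STEADY_MIN + RAMP_DOWN_MIN; minute += 5) {
            System.out.println("minute " + minute + ": " + activeUsers(minute) + " active users");
        }
    }
}

In practice the same profile would be configured in the load tool's scenario settings (ramp-up rate, steady-state duration, ramp-down) rather than hand-coded; the sketch only makes the shape of the load explicit.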
