For the original article, please see:
http://www.logigear.com/newslett ... _best_practices.asp
Introduction
The top five pitfalls encountered by managers employing software test automation are:
* Uncertainty and lack of control
* Poor scalability and maintainability
* Low test automation coverage
* Poor methods and disappointing quality of tests
* Technology vs. people issues
Following are five "best practice" recommendations to help avoid those pitfalls and successfully integrate test automation into your testing organization.
1. Focus on the Methodology, Not the Tool
A well-designed test automation methodology can resolve many of the problems associated with test automation and is one of the keys to success. The methodology is the foundation upon which everything else rests: it drives tool selection and the rest of the automation process. It will also shape the approach to any offshoring efforts that may be under consideration, helping to guide where the "appropriate" pieces of the testing process are located, both on- and offshore.
When applying a methodology, it is important that testers and automation engineers understand and accept it. Other stakeholders such as managers, business owners, and auditors should also have a clear understanding of the methodology and the benefits it brings.
2. Choose Extensible Test Tools
Select a test tool that supports extensibility and a team-based Global Test Automation framework (team members are, or may become, distributed), and that offers a solid management platform.
Surveying test tools can be time consuming, but it is important to choose the best tool to meet your overall test needs. Before beginning the survey, however, you should have a good idea of what you need in the first place. This is intimately tied to your overall test methodology.
Make sure your chosen test tool has an "appropriate" automation architecture. Whatever tool is used for the automation, pay attention to how the various technical requirements of test case execution are implemented in a manageable and maintainable way. When looking at tools in light of your methodology, ask the basic questions of how well they address reusability, scalability, and team-based automation (quantitative drivers of productivity), maintainability (a driver for lowering maintenance cost), and visibility (a qualitative driver of productivity and a vehicle for control, measurability, and manageability).
You should strongly consider tools based on Action Based Testing (ABT). ABT creates a hierarchical test development model that lets test engineers (domain experts who may not be skilled in coding) focus on developing executable tests from action keywords, while automation engineers (highly skilled technically, but not necessarily good at designing effective tests) focus on developing the low-level scripts that implement the keyword-based actions used by the test experts.
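To make that division of labor concrete, here is a minimal, hypothetical Python sketch of an ABT-style keyword layer. The class, keyword names, and handlers are illustrative assumptions, not the API of any particular ABT tool: automation engineers register the low-level handlers, while test engineers compose tests purely from action keywords.
```python
from typing import Callable, Dict, List


class ActionRegistry:
    """Maps human-readable action keywords to the automation code behind them."""

    def __init__(self) -> None:
        self._actions: Dict[str, Callable[..., None]] = {}

    def register(self, keyword: str, handler: Callable[..., None]) -> None:
        self._actions[keyword] = handler

    def run(self, keyword: str, *args: str) -> None:
        self._actions[keyword](*args)


# Automation engineers' side: low-level scripts behind each keyword
# (print stands in for real UI-driving code).
registry = ActionRegistry()
registry.register("enter text", lambda field, value: print(f"type '{value}' into {field}"))
registry.register("click", lambda control: print(f"click {control}"))
registry.register("check message", lambda expected: print(f"assert banner == '{expected}'"))

# Test engineers' side: a test case expressed purely in action keywords.
login_test: List[List[str]] = [
    ["enter text", "user name", "jsmith"],
    ["enter text", "password", "secret"],
    ["click", "log in"],
    ["check message", "Welcome, John Smith"],
]

for line in login_test:
    registry.run(line[0], *line[1:])
```
The point of the hierarchy is that the relatively small set of actions is the only place where application changes have to be absorbed, which is what keeps maintenance manageable as the number of test cases grows.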
Care should be taken to avoid simplistic "record-playback 2.0" tools that claim to do test automation with no coding. Being able to automate without having to code is, in itself, a genuine benefit. However, the bottlenecks of "record-playback 2.0" tools quickly show once you get deep into production.
3. Separate Test Design and Test Automation
Test design should be separated from test automation so that automation does not dominate test design. In test automation, it is preferable to use a keyword approach, in which the automation focuses on supplying elementary functionalities that the tester can tie together into tests. This way, the complexity and multitude of the test cases do not lead to an unmanageable number of test scripts.
The testers (domain experts) should focus fully on the development of test cases; those test cases in turn are the input for the automation discipline. The automation engineers (highly skilled technically) can give feedback to the testers when certain test cases are hard to automate, suggesting alternative strategies, but the testers should remain in the driver's seat, not worrying too much about the automation.
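As one illustration of this separation, the hypothetical sketch below (not tied to any specific tool) keeps the test cases as plain keyword rows that testers can author and maintain like a spreadsheet, while the automation layer is reduced to a small set of reusable handlers that execute those rows.
```python
import csv
import io

# Test design: keyword rows owned by the testers; adding or changing a test
# case requires no scripting, only editing this data (an inline CSV for brevity).
TEST_CASES = """\
action,argument 1,argument 2
enter text,user name,jsmith
enter text,password,secret
click,log in,
check message,Welcome,
"""

# Test automation: a small set of handlers owned by the automation engineers
# (print stands in for real UI-driving code).
HANDLERS = {
    "enter text": lambda field, value: print(f"type '{value}' into {field}"),
    "click": lambda control, _: print(f"click {control}"),
    "check message": lambda expected, _: print(f"assert banner starts with '{expected}'"),
}

for row in csv.DictReader(io.StringIO(TEST_CASES)):
    HANDLERS[row["action"]](row["argument 1"], row["argument 2"])
```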
In general, no more than 5% of the effort surrounding testing should be expended in automating the tests.
4. Lower Costs
There are three ways that you can look to lower costs:
1. You can use labor that costs less than your local team
2. You can use a tool that costs less
3. You can use training to increase the tool productivity
It is important, however, when addressing costs, not to focus too closely on one dimension without keeping the overall methodology in mind and considering the impact of any decision on other parts of the process. For example, lowering one cost such as labor by outsourcing may actually increase total costs if that labor does not have the proper skills.
5. Jumpstart with a Pre-Trained Team
Jumpstart the process with a pre-trained outsourcing partner that knows more about test automation success than you do, and that has a competent, well-trained staff of software testers, automation engineers, test engineers, test leads and project managers.
A pre-trained team can:
* Reduce your overall project timeframe, because you don’t need to include training at the beginning of the project schedule
* Reduce risk, because you don’t need to worry about how well the team members will learn the material and how skilled they will be after the training is complete
Conclusion
To summarize the preceding in a simple list, the five suggested best practices for test automation success are:
1. Focus on the methodology, not the tool
2. Choose extensible test tools
3. Separate test design and test automation
4. Lower costs
5. Jumpstart with a pre-trained team