Test automation support
Posted: Fri Feb 06, 2009 10:57 am
Dear TestLink team,
we are looking at TestLink from the perspective of automatic integration testing, since we have been developing an open source framework for integration tests for several years (www.ivalidator.org). We would like to use TestLink to define test suites that are executed automatically.
From this point of view, we propose some extensions to the test case specification that we are missing:
- A test case often consists of several test steps (as several examples show). This is not yet reflected in the data structure or the screens. For test automation, each test step needs its own automation, so the steps should be distinguishable.
- Test steps are often reused across test cases, e.g., logging in, entering customer data, or checking the result data. These steps should be named and reusable ("keyword-driven testing"), possibly with different parameters/data on each occurrence. Note that this concept differs from the keywords you assign to a test case: those do not make it possible to reference a test case (or test step) from another test suite (or test case). A named test step can then be mapped to an automated test step (script, class/method, ...), depending on the test automation framework used.
- Test data is central to automatic testing. It should be entered separately from test suites, test cases, and test steps. Test data is often reused, so it would be useful to have named test data sets that can be referenced from different test suites/cases/steps. Each test step should be able to reference the data it needs, and to name the data it creates when another step needs it (input/output data for each test step). Footnote: for automation we need a structured (parsable) data representation, but that is beyond the scope of TestLink; it depends on the test automation framework used.
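To make the proposal above more concrete, here is a minimal sketch (in Python, purely illustrative; none of the names are TestLink or iValidator APIs) of how named, reusable test steps, named test data sets, and input/output data passed between steps could fit together:

```python
# Sketch of the proposed concepts: a registry of named test steps
# ("keyword-driven testing"), named test data sets defined separately,
# and a shared context carrying each step's output to later steps.
# All identifiers here are hypothetical.

STEP_REGISTRY = {}

def step(name):
    """Register a function as the automation behind a named test step."""
    def register(fn):
        STEP_REGISTRY[name] = fn
        return fn
    return register

@step("log_in")
def log_in(data, context):
    # Consumes the "credentials" data set; produces a session (output data).
    context["session"] = f"session-for-{data['user']}"

@step("enter_customer")
def enter_customer(data, context):
    # Reuses the session created by a previous step (output -> input).
    assert "session" in context
    context["customer_id"] = data["name"].upper()

@step("check_result")
def check_result(data, context):
    # Verifies the data produced by the previous step.
    assert context["customer_id"] == data["expected_id"]

# Named test data sets, maintained separately from suites/cases/steps.
DATA_SETS = {
    "credentials": {"user": "alice"},
    "customer_A": {"name": "smith"},
    "expected_A": {"expected_id": "SMITH"},
}

# A test case is a sequence of (step name, data set name) references,
# so the same named step can appear in many cases with different data.
TEST_CASE = [
    ("log_in", "credentials"),
    ("enter_customer", "customer_A"),
    ("check_result", "expected_A"),
]

def run(case):
    context = {}  # shared input/output data between steps
    for step_name, data_name in case:
        STEP_REGISTRY[step_name](DATA_SETS[data_name], context)
    return context

result = run(TEST_CASE)
print(result["customer_id"])  # SMITH
```

A second test case could reuse "log_in" with a different data set, which is exactly the reuse the keyword concept is meant to enable.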
Regards,
Gerald
gerald@ivalidator.org