Requirements: Should I create a new project for each release
Hi,
I'd like to use TestLink's requirements features for TC/Req traceability, but I have a question about best practice.
The Short Version
---------------------
I see that TestLink Requirement IDs cannot be duplicated. Is the expectation that we should create a new TestLink project for each new set of requirements produced for each release of a given product?
More Details
---------------
Let's say my company produces a product called "Product A". I want to manage all requirements & test cases for this product as it matures over time. I create a TL project called "Product A" to hold requirements and test cases for this product. A collection of regression test cases will eventually build up within my "Product A" project, and I will do many rounds of system testing for each release, each with new requirements and test cases.
In my organization, the business team creates requirements documents for each release like this:
RELEASE A (2nd quarter)
1.0 Feature A
1.1 An aspect of the feature
1.1.1 A facet
1.1.2 Another facet
2.0 Feature B
2.1 Some aspect of this feature
2.1.1 A facet
2.1.2 Another facet
RELEASE B (3rd quarter)
1.0 Feature C
1.1 An aspect of the feature
1.1.1 A facet
1.1.2 Another facet
2.0 Feature D
2.1 Some aspect of this feature
2.1.1 A facet
2.1.2 Another facet
So as you can see... the requirements document for each release always starts with 1.0 and has similar numbering. How do I capture these requirements and trace them to my test cases if TestLink doesn't allow duplicate requirement IDs? Is my only option to create a new TL project for each release (i.e. "Product A- Release A", "Product A- Release B", etc.)? If I do that, I can't re-use 1 common set of regression test cases for each release unless I do some kind of export/import process to carry the regression test cases forward.
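(For what it's worth, here is a rough sketch of what that carry-forward could look like against TestLink's XML-RPC API, just to show the amount of ceremony involved. The URL, dev key, and suite/project IDs are made up, and the exact field names the API returns vary a bit between versions, so this is only an illustration, not something I'd want to maintain every release.)

import xmlrpc.client

# Rough sketch only: copy the common regression cases from one suite/project
# into another. URL, dev key, and IDs below are placeholders.
URL = "http://testlink.example.com/lib/api/xmlrpc/v1/xmlrpc.php"
DEV_KEY = "my-api-dev-key"
SRC_SUITE_ID = 111      # suite holding the common regression cases (placeholder)
DST_SUITE_ID = 222      # target suite in the new release project (placeholder)
DST_PROJECT_ID = 333    # the new release project (placeholder)

api = xmlrpc.client.ServerProxy(URL)

# List the cases in the source suite, then fetch each one in full.
cases = api.tl.getTestCasesForTestSuite(
    {"devKey": DEV_KEY, "testsuiteid": SRC_SUITE_ID, "deep": True, "details": "simple"})

for c in cases:
    # tl.getTestCase returns a one-element list; field names ("name", "steps",
    # "preconditions", ...) may differ slightly by version, so inspect one response first.
    full = api.tl.getTestCase({"devKey": DEV_KEY, "testcaseid": c["id"]})[0]
    api.tl.createTestCase({
        "devKey": DEV_KEY,
        "testprojectid": DST_PROJECT_ID,
        "testsuiteid": DST_SUITE_ID,
        "authorlogin": "admin",
        "testcasename": full["name"],
        "summary": full.get("summary", ""),
        "preconditions": full.get("preconditions", ""),
        "steps": full.get("steps", []),
    })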
Am I misunderstanding something about how requirements are expected to be used?
Re: Requirements: Should I create a new project for each rel
IMHO a simple solution (though I may be wrong) is to name the requirements this way:
RELEASE A => REQ 1.0 => use 'RA 1.0' in TestLink
RELEASE B => REQ 1.0 => use 'RB 1.0' in TestLink
because IMHO the REQ ID has to be UNIQUE for the whole test project, no matter the release.
Hope this helps.
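By the way, if you have many requirements per release, a small script can add the prefix for you before import. This is just a sketch; the CSV column layout (doc-id, title, description) is an assumption, so match it to whatever the requirement import screen of your TestLink version expects:

import csv

# Sketch: prepend a release prefix so the requirement doc-id stays unique
# across releases inside one test project. Column layout is assumed.
release_prefix = "RA"   # use "RB" for the next release, and so on
requirements = [
    ("1.0", "Feature A", "Top-level feature"),
    ("1.1", "An aspect of the feature", ""),
    ("1.1.1", "A facet", ""),
]

with open("release_a_reqs.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for number, title, description in requirements:
        writer.writerow([f"{release_prefix} {number}", title, description])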
Re: Requirements: Should I create a new project for each rel
Thanks. I suspected that would be the workaround, and I have explored that in the past. However, a different challenge arises from this approach:
There is clearly significant value in specifying how many TC are needed to satisfy a given requirement, and the "Requirements Overview" page should be a very useful tool to visually determine whether we have achieved 100% requirement coverage. I really enjoy and depend on that feature.
If we add prefixes to the IDs as suggested above (so that we can hold requirements for multiple releases in a single project without ID collisions), then *all requirements* will load every time we visit the "Requirements Overview" page, i.e. the requirements for my current release plus all past releases. In a year's time we might have 4-8 releases, each with 100-300 requirements, and if *all requirements* load every time I open this page, the initial page load will eventually become unusably slow, won't it? I see we can collapse sections once the page loads, but there is no way to prevent the older requirements from loading at all. Furthermore, the most recently created requirement specification always displays at the bottom of the "Requirements Overview" page (regardless of its order within the Requirements Specification tree), so that will become inconvenient over time.
I'm not complaining nor requesting an enhancement here, just wanting to confirm a negative side effect from this workaround, which again leads me to believe that a TestLink project was only intended to hold the requirements for a single release.
If that is true, then my thoughts shift to using separate projects to track each release's requirements & TCs, but then a different problem arises: my pool of regression test cases for that product will never be globally accessible from release to release unless I export/import (duplicate) them with each new release/project. I vaguely remember seeing someone mention that we can include test cases from a separate (common regression) project in a test plan of my current release project. Is that possible? I just tried it in v1.9.13 and didn't see an obvious way to do it.
Is my overall workflow non-standard, or has anyone come up with an elegant way to hold multiple releases in a single project, OR is there an easy way to include regression test cases from a different project and track their execution metrics in my current release project?
Re: Requirements: Should I create a new project for each rel
>> I'm not complaining nor requesting an enhancement here
It's not a matter of complaining or not. The important thing is presenting a clear use case, because that allows us to think about a possible solution.
What can probably be done is to change the requirements report so that it hides requirements with a certain status.
That way you could set the older requirements to OBSOLETE.
This does not mean that such a solution is ready to be used right now.
Regarding:
>> I vaguely remember seeing someone mention that we can include test cases from a separate (common regression) project into a test plan
This cannot be done. What you can do, using the ghost feature, is link from test cases in your other test projects to test cases in the common test project.
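If it helps, the ghost reference goes inside a step of the test case in the other test project and looks roughly like this (I am writing the keys from memory, so please check the documentation / forum examples for your exact version):

[ghost]"Step":1,"TestCase":"REG-12"[/ghost]

where REG-12 would be the external id of a test case in the common regression project.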