Scenario Testing - Ways to Create Good Scenarios

    [Me]:- I tried writing some scenarios for testing, but I was lost midway. Can you guide me on ways to create good scenarios?

[Guruji]:- Well, while you create scenarios, bear in mind the pointers below. They will help you tremendously.

List possible users, analyze their interests and objectives
Consider disfavored users: how do they want to abuse your system?
List system events. How does the system handle them?
List special events. What accommodations does the system make for these?
List benefits and create end-to-end tasks to check them.
Interview users about famous challenges and failures of the old system.
Study complaints about the predecessor to this system or its competitors.
Work alongside users: see how they work & what they do
Read about what systems like this are supposed to do.
Convert real-life data from a competing or predecessor application.
Look at the specific transactions that people try to complete. What are all the steps, data items, outputs, displays, etc.?
Look for sequences: people typically do task X in an order. What are the most common orders (sequences) of subtasks in achieving X?

Designing Scenarios

[Me]:-  Guruji, How do we design scenarios for testing?

[Guruji]:- Scenario designers rely on much the same information as requirements analysts, but they use it differently.

[Me]:- Oh, so how does a requirements analyst differ from a tester in this respect?

[Guruji]:- That's a good question.

| Requirements Analyst | Tester |
| --- | --- |
| Tries to foster agreement about the system to be built | Exploits disagreements to predict problems with the system |
| Has to reach conclusions and make recommendations about how the product should work | Doesn't have to reach conclusions or make recommendations about how the product should work; her task is to expose credible concerns to the stakeholders |
| Can make product design trade-offs by exposing consequences | Doesn't have to make the product design trade-offs; she exposes the consequences of those trade-offs, especially unanticipated consequences or ones more serious than expected |
| Has to respect prior agreements with stakeholders | Doesn't have to respect prior agreements (caution: testers who belabor the wrong issues lose credibility) |
| Work is exhaustive | Work need not be exhaustive, just useful |


Test Plan - Must Not's

[Me]:- Guruji, can you tell me what are the items we should avoid in a test plan?

[Guruji]:- Well, a test plan is a very important document, and it is just as important to know what it should not include:

Test plans should not repeat information that can be found in other documents such as functional specs, design docs, the QA Overview, etc.

Overall product platforms, languages and other product dependencies, integration points, and configurations should be noted in the product's QA Overview document.

General tools/infrastructure used across teams should not be part of the test plan but should be noted in the QA Overview document.

Actual details of tool design and use should be in a separate document.

Scenario Testing

[Me]:-  Guruji, I have been hearing a lot of things about Scenario Testing. Can you tell me what it is?

[Guruji]:- Sure. A scenario test is a test based on a scenario, and a scenario typically involves a sequence of steps or tasks. The tests are based on a story about how the program is used, including information about the motivations of the people involved. Usually the story involves a complex use of the program, a complex environment or a complex set of data. The story needs to be realistic: it not only could happen in the real world, but stakeholders should believe that something like it probably will happen. The story also needs to be motivating; only then will a stakeholder with influence push to fix a program that failed this test.
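
For instance, here is a minimal sketch of how such a story could be automated with MSTest. The store, its orders and the price-adjustment rule are hypothetical stand-ins invented for illustration, not a real API:

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical store objects, invented only to make the story executable.
public class Order { public string Sku; public decimal PricePaid; }
public class Refund { public decimal Amount; }

public class FakeStore
{
    private readonly Dictionary<string, decimal> prices = new Dictionary<string, decimal>();

    public Order PlaceOrder(string sku, decimal price)
    {
        prices[sku] = price;
        return new Order { Sku = sku, PricePaid = price };
    }

    public void ChangePrice(string sku, decimal newPrice) => prices[sku] = newPrice;

    // Assumed refund rule: the customer gets back the difference.
    public Refund RequestPriceAdjustment(Order order) =>
        new Refund { Amount = order.PricePaid - prices[order.Sku] };
}

[TestClass]
public class RefundScenarioTests
{
    // Story: a customer buys an item, the price drops the next day,
    // and she asks for a price adjustment - realistic and motivating.
    [TestMethod]
    public void PriceDrop_CustomerGetsPartialRefund()
    {
        var store = new FakeStore();

        var order = store.PlaceOrder("SKU-1001", price: 50m);    // step 1: purchase
        store.ChangePrice("SKU-1001", newPrice: 40m);            // step 2: price drops
        var refund = store.RequestPriceAdjustment(order);        // step 3: support flow

        Assert.AreEqual(10m, refund.Amount);   // the benefit the story promises
    }
}
```

Notice that the test reads as the story itself: each step is a plausible user action, and the final assertion checks the benefit the story promises.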

[Me]:-  But why Scenario test?

[Guruji]:- Well, you can use scenario tests to:

Expose failures to deliver desired benefits
Connect testing to documented requirements
Bring requirements-related issues to the surface, which might involve reopening old requirements discussions (with new data) or surfacing not-yet-identified requirements
Detect errors in the interaction of two or more features

Test Plan - What is it?

[Me]:- Guruji, what's a test plan?

[Guruji]:- A test plan is a document written prior to actual testing. A good test plan helps organize and manage the testing effort. It contains a high-level overview of the strategies for testing the feature or component, a list of any specific tools required, and the test oracles to be used while testing the feature or component.

[Me]:- But what are its benefits?

[Guruji]:- The benefits of having such a document are many, but these are the ones I value most:


Test plans facilitate the technical tasks of testing     
Test plans improve communication about testing tasks and process     
Test plans provide structure for organizing, scheduling and managing the test project

Test Automation - Code Review Guidelines

[Me]:- Guruji, we have some automation in place. Are there any review guidelines similar to the code review standards we have in Dev?

[Guruji]:- Yes, automation is typically code or script, and it needs to be reviewed. And yes, there are a few guidelines you can follow for effectiveness.

[Me]:-  Sounds good. Can you share some code review guidelines?

[Guruji]:- There are several guidelines; I have listed a few for your convenience.

Duplicate code - code that is the same, almost the same, or even similar to code elsewhere:
1. Creating two separate test case methods for similar test conditions such as min/max values, nulls or other boundary conditions for parameters. Instead, extract the common logic of the two test cases into a method that both can call (see the sketch below).
2. Testing the same functionality shipped in two different releases by rewriting existing test cases just for the new release. Existing test code should be written so that it works across multiple releases.
3. If an existing test case changes from one release to another, pre-compile directives can be used; if the test case becomes fundamentally different, a new test case should be written.
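
To illustrate the first point, here is a minimal MSTest sketch; the account-name rule is a hypothetical stand-in for the system under test. The min, max and over-max cases share one verification helper instead of three near-identical test bodies:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class AccountNameLengthTests
{
    // Hypothetical rule under test: account names must be 1-64 characters.
    private static bool CreateAccount(string name) =>
        name != null && name.Length >= 1 && name.Length <= 64;

    [TestMethod]
    public void Name_AtMinimumLength_IsAccepted() =>
        VerifyNameLength(new string('a', 1), expected: true);

    [TestMethod]
    public void Name_AtMaximumLength_IsAccepted() =>
        VerifyNameLength(new string('a', 64), expected: true);

    [TestMethod]
    public void Name_AboveMaximumLength_IsRejected() =>
        VerifyNameLength(new string('a', 65), expected: false);

    // The shared logic lives once; a new boundary case is a one-liner.
    private static void VerifyNameLength(string name, bool expected) =>
        Assert.AreEqual(expected, CreateAccount(name),
            "Unexpected result for name of length " + name.Length);
}
```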
Duplicating functionality - re-written libraries are expensive to build and maintain:
1. Creating separate random or project data generation when a new product revision is released. Instead, data generation libraries should be re-used as much as possible across projects (a sketch follows below).
2. Writing separate releases of library code to handle multiple different namespaces of the same product. Pre-processor directives, reflection or generics should be used to make code independent of project namespaces. This applies specifically to code that will be used to test multiple releases of the product.
3. Avoid re-writing functions whose functionality is already partly implemented by other methods.
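
As a sketch of the re-use idea (all names here are hypothetical), one shared generator library serves every project and release, rather than each revision growing its own copy:

```csharp
using System;

// One shared data-generation library, re-used across projects and
// releases instead of being re-written for each revision.
public static class TestDataGenerator
{
    private static readonly Random Rng = new Random();

    // Every project that needs a unique customer name calls this one method.
    public static string RandomCustomerName(string prefix = "cust") =>
        prefix + "-" + Guid.NewGuid().ToString("N");

    // Bounded random integers for quantity-style data, defined once.
    public static int RandomQuantity(int min = 1, int max = 100) =>
        Rng.Next(min, max + 1);
}
```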
Incorrect exception handling - "swallowing" exceptions is the worst practice, but even logging alone is bad. In general, test code should not need exception handling, as the underlying test harness will deal with failures automatically. The exception is when you have significant information to add to the message; in that case, pass the original exception along as an InnerException:
1. System resource leaks - especially common when using SqlConnection, SqlCommand and similar classes. The best way to use these is to wrap them in a "using" statement; anything that implements IDisposable should be wrapped this way (see the sketch below).
2. Any resources used should be opened pessimistically, i.e. in such a way that they are automatically closed once the usage is complete. This includes opening files in read-only mode unless read/write access is required, and closing database connections immediately after usage.
3. In the case of unmanaged pointers, system resources need to be released properly. A good design approach is to implement a SafeHandle-derived class to manage this for you; it forces you to implement the ReleaseHandle code, which can include memory clean-up.
4. Unmanaged code should be avoided and C# used as much as possible. Where it is unavoidable, its usage should be called out and clearly documented.
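
Here is a minimal sketch of points 1 and 2 combined, assuming the classic System.Data.SqlClient API; the class name, query and connection string are placeholders. Both disposables are wrapped in "using" blocks, and the single catch adds context while preserving the original exception as an InnerException:

```csharp
using System;
using System.Data.SqlClient;

public static class CustomerQueries
{
    // Hypothetical helper; the query and connection string are placeholders.
    public static int CountCustomers(string connectionString)
    {
        // "using" guarantees Dispose() runs even if an exception is thrown,
        // so the connection and command can never leak.
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT COUNT(*) FROM Customers", connection))
        {
            connection.Open();
            try
            {
                return (int)command.ExecuteScalar();
            }
            catch (SqlException ex)
            {
                // Catch only to add significant context, and keep the
                // original exception as the InnerException - never swallow it.
                throw new InvalidOperationException(
                    "Customer count query failed against the test database.", ex);
            }
        }
    }
}
```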
Non-use of verification functionality - test code should not throw exceptions but rather use Assert.* methods, falling back to Assert.Fail() when nothing else fits. Don't hesitate to incorporate Assert.* calls into your libraries - some of the most efficient test cases delegate all verification to library methods. All test cases should rely on some form of verification:
1. Verifications should cover all possible fields of the object under test, including core fields and audit fields such as created and updated dates.
2. Verifications should be bundled together for an entire object and implemented hierarchically. For example, if a customer object has an emails collection, there should be a separate verification method for email collections that is called from the customer verification method (as in the sketch below).
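
A minimal sketch of hierarchical verification, with hypothetical Customer and Email types: the object-level method checks core and audit fields, then calls a separate verifier for the email collection:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical domain types used only to illustrate the pattern.
public class Email { public string Address; public bool IsPrimary; }
public class Customer
{
    public string Name;
    public DateTime CreatedDate;   // audit field - verify these too
    public List<Email> Emails = new List<Email>();
}

public static class Verify
{
    // Object-level verifier: checks core and audit fields, then delegates
    // the collection to its own verifier - one method per object type.
    public static void VerifyCustomer(Customer expected, Customer actual)
    {
        Assert.AreEqual(expected.Name, actual.Name, "Name mismatch");
        Assert.AreEqual(expected.CreatedDate, actual.CreatedDate, "CreatedDate mismatch");
        VerifyEmails(expected.Emails, actual.Emails);
    }

    public static void VerifyEmails(List<Email> expected, List<Email> actual)
    {
        Assert.AreEqual(expected.Count, actual.Count, "Email count mismatch");
        for (int i = 0; i < expected.Count; i++)
        {
            Assert.AreEqual(expected[i].Address, actual[i].Address, "Email address mismatch");
            Assert.AreEqual(expected[i].IsPrimary, actual[i].IsPrimary, "IsPrimary mismatch");
        }
    }
}
```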
Magic numbers and the like - hard-coded values can be a maintenance nightmare down the road. Centralize these settings somewhere in your code instead:
1. All configuration and other data values should live in a configuration file. Initially, they can be exposed through a separate "Settings" class with static fields or methods whose values are inline; later, the class can be modified to read the values from a different source, such as a configuration file (a sketch follows below).
2. Settings such as web service URLs, database connection strings and assembly names/paths are further examples.
3. All paths should be relative, not absolute. Use reflection to determine the execution directory and configuration paths.
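
Finally, a minimal sketch of such a "Settings" class; the URL and connection string shown are placeholders:

```csharp
using System.IO;
using System.Reflection;

// Hypothetical "Settings" class with inline placeholder values. Callers never
// know where the values come from, so swapping in a configuration file later
// only touches this class.
public static class Settings
{
    public static string ServiceUrl => "http://localhost:8080/api";        // placeholder
    public static string ConnectionString =>
        "Server=(local);Database=TestDb;Integrated Security=true";         // placeholder

    // Resolve paths relative to the executing assembly via reflection,
    // never as absolute paths baked into the code.
    public static string ExecutionDirectory =>
        Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);

    public static string ConfigPath => Path.Combine(ExecutionDirectory, "test.config");
}
```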

In addition to these, a summary document in the solution or code comments can help significantly, and don't lose out on the ReadMe file: it should include any external dependencies, URLs, special set-up instructions, and configuration values that need to be changed for a local machine. Testers can generate a .chm file to give it an extra professional look, especially when an adopting tester will use it.
