
  • Examples of Validation

    Examples of Validation are one of the easier items to find as long as the definition referred to in some of the earlier posts is used. Validation covers all active testing where code has been generated and can actually be executed.

    So it includes at least the following list of Test Phases:

    • Unit Testing
    • Integration Testing
    • System Testing
    • Acceptance Testing

    And the following list of Test Types:

    • Initial Testing
      1. Configuration Testing
      2. Compatibility/Conversion Testing
      3. Installability Testing
    • On-Going Testing
      1. Functionality Testing
      2. Facility Testing
      3. Security Testing
    • Performance Testing
      1. Volume Testing
      2. Stress Testing
      3. Load Testing
      4. Speed of Response Testing
      5. Storage Testing
    • Post Functional Testing
      1. Usability Testing
      2. Reliability Testing
      3. Recovery Testing
    • Post Installation Testing
      1. Serviceability Testing

    Despite the length of the two lists above, they are not that hard to deal with: they break the testing into distinct groups, making it easier to ensure that nothing is omitted from consideration.

    The two key points are determining into which phase each Type of Testing falls and whether it is worth doing. The second question is much harder to answer than the first.

    If you ask the stakeholders for most projects, they will simply answer that all Types of Testing should be fully completed with all possible depth and speed. If you ask them to budget for it, you usually end up with a minimization statement asking for the cheapest testing. The best answer we have found to this conundrum is to provide the following:

    1. Provide the above list of Validation testing types.
    2. Define each one briefly.
    3. List the recommended phase.
    4. Provide the recommendation for inclusion or exclusion.
    5. Provide the risk of failure to include them.

    This provides the stakeholders with all the information they need to weigh the merits of each type of testing and make the correct decision. If they still cannot come to a decision, add a recommendation based on the project.
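
    One way to keep all five of those items in front of the stakeholders is a single summary structure they can review in one sitting. The sketch below is a minimal illustration in Python; the test types come from the list above, but the definitions, phases, recommendations, and risk wording are entirely hypothetical placeholders that each project would replace with its own.

      # Illustrative only: a stakeholder-facing summary of Validation test types.
      # Each entry carries the five items listed above.
      validation_summary = [
          {
              "type": "Configuration Testing",
              "definition": "Check that the system runs on each supported configuration.",
              "phase": "System Testing",
              "recommendation": "Include",
              "risk_if_excluded": "Failures on customer configurations appear only in production.",
          },
          {
              "type": "Stress Testing",
              "definition": "Push the system beyond expected load to find its breaking points.",
              "phase": "System Testing",
              "recommendation": "Include only if peak load is a known concern",
              "risk_if_excluded": "Behaviour under peak load remains unknown at release.",
          },
      ]

      for item in validation_summary:
          print(f"{item['type']} ({item['phase']}): {item['recommendation']}")
          print(f"  Risk if excluded: {item['risk_if_excluded']}")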

  • Validation Applicability in Software Testing

    Many people believe that Validation is Software Testing and that is all there is to the profession. As was stated in our earlier blogs, Validation is only half of it. It is the more expensive ‘half’, but it has had far more effort put into it by many people.

    Validation is applicable in Software Testing as soon as you have any code at all to test, so it cannot start until some development has been completed and some code generated. Once that is done, however, Validation techniques can be applied.

    Validation can be applied at the Unit or Module level. This is the most cost-effective place to use the technique, since any error discovered at this stage is reasonably easy and cheap to fix and retest. A single module can be tested through the use of Drivers and Stubs and executed while the tester watches the progress of individual variables and conditions.
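
    As a concrete illustration, the following minimal sketch uses Python and the standard unittest module; the module under test, the stub, and all names are hypothetical and stand in for whatever the project actually builds. The test case acts as the Driver, and the stub replaces a dependency that may not exist yet, so the module can be executed and its behaviour observed in isolation.

      import unittest

      # Hypothetical module under test: computes an order total using a tax service.
      def order_total(subtotal, tax_service):
          return round(subtotal + tax_service.tax_for(subtotal), 2)

      # Stub: stands in for the real tax service, which may not be built yet.
      class TaxServiceStub:
          def tax_for(self, subtotal):
              return subtotal * 0.13  # fixed, predictable behaviour for the test

      # Driver: the test case that executes the module and checks the result.
      class OrderTotalTest(unittest.TestCase):
          def test_total_includes_tax(self):
              self.assertEqual(order_total(100.00, TaxServiceStub()), 113.00)

      if __name__ == "__main__":
          unittest.main()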

    Validation can also be applied at the Integration level, where individual modules are strung together. There will still be a need for Drivers and Stubs to drive the calling and called modules, but the effort will center on testing the interfaces between the modules.
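
    The same kind of harness serves at this level, but with the real modules wired together so the assertion exercises the contract between them rather than the internals of either one. The sketch below is again hypothetical Python; the formatter and report builder stand in for whatever two modules are actually being integrated.

      import unittest

      # Two hypothetical modules being integrated.
      def format_currency(amount):
          return f"${amount:,.2f}"

      def build_report_line(label, amount, formatter):
          # The interface under test: the report builder must hand the amount
          # to the formatter and combine the result with the label.
          return f"{label}: {formatter(amount)}"

      class ReportIntegrationTest(unittest.TestCase):
          def test_line_uses_formatter_output(self):
              # Both real modules run together; the check is on the interface.
              self.assertEqual(
                  build_report_line("Total", 1234.5, format_currency),
                  "Total: $1,234.50",
              )

      if __name__ == "__main__":
          unittest.main()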

    Validation can next be applied at the System and Acceptance Testing levels, although there will be more emphasis placed on some of the non-code aspects of the system.

    Lastly, Validation can be applied to the Non-Functional testing aspects of the system. This requires some consideration as to what is important and what is cost-effective to test at this stage. A lot of funds can be expended with little result if due consideration is not given to the payback. If you have questions about this type of Validation Testing and would like to determine what to do, give us a call or email us at neil@nvp.ca.

  • Validation

    Quality Assurance has a number of subdivisions; one of those being Quality Control (sometimes referred to as Software Testing). We have been discussing Verification for the last four weeks; see our blog. Today we start into Validation.

    We define Validation as the active part of testing, where completed, compiled, and promoted code is being run using data and generating results, while Verification is the static part of testing. Note that ‘promoted’ includes Unit testing on the developer’s workstation, so the promotion may not be very formal, nor may the compile amount to much.

    Validation includes all aspects of active testing, including the major divisions of White Box and Black Box (along with all the various Grey Boxes in between). The fact that the code has to be completed means that it is difficult to launch into Validation before a lot of work has already been done. This is what adds to the cost of completing this type of testing and means that any defects found tend to be expensive to correct and retest. Hence our emphasis on Verification as being more generally cost-effective.

    A lot of work has been completed to enable Validation to proceed smoothly and effectively. Many testers start their careers in this area and then move into various aspects of the process as they become more specialised. Some go towards Automation; others move into specialised areas like Performance or Security testing.

    While the concept is simple, the identification of the most cost-effective place to use Validation techniques is not always easy. There is no point in completing Validation without some end in view and a positive ROI. Give us a call at 416-927-0960 or visit our website at NVP.ca to find out where you would benefit from the implementation of Validation techniques in your organisation.

  • Quality Assurance Supports Verification

    Quality Assurance supports verification for a number of reasons, not least of which is the potential for cost savings for you! The key is to know where to apply the appropriate level of verification. It is quite possible to verify too many items and lose the benefits of reduced rework since it is all used up in the actual verification process. It is equally possible to verify too few times and lose the momentum and the benefit of corporate knowledge.

    Some projects have problems identifying the correct place to implement verification techniques and either wait until it is too late in the project or start too intensely too early. The last mistake that some organisations make is failing to match the verification technique to their level of process maturity. There are prerequisites that are needed to make the verification process effective and, without those in place, the positive impact is limited. It is also necessary to have the correct stakeholders involved so that they can give their input at the correct point and with sufficient weight.

    For all of the above reasons, it is critical that an assessment be carried out on the existing processes to determine the following:

    • The appropriate place to use verification techniques in your projects.
    • The payback realised from the implementation of verification techniques.
    • The level of Verification carried out for each project artifact.
    • The relevant stakeholders who gain from the process and can see the benefits.
    • The use of the correct techniques based on SDLC maturity.

    Once all of the above are known, the prerequisites can be put in place and an effective verification process implemented for your benefit. NVP Software Solutions completes that assessment and provides Recommendations and a Roadmap to get you there. You realise instant benefits in reduced rework and reduced cost.

  • Verification Applicability in Software Testing

    Last week we introduced the concept of verification and defined it. This week we want to discuss verification applicability. Many people agree that verification should be done but come to a halt when deciding where they can apply it and what degree of formality should be used. Verification really covers everything that can be done in the project in terms of testing, excluding the actual Quality Control or active testing. In other words, we can verify almost everything.

    • Business Objectives – can be verified for feasibility, correctness, completeness and how accurately they reflect what the business needs
    • Requirements – can be verified for correctness, completeness, feasibility, testability, and how well they reflect the underlying business objectives
    • Design – can be verified for correctness, completeness, feasibility, testability, and how well it represents the defining requirements
    • Code – can be verified for adherence to standards, readability, level of comments, structure, whether it reflects the design, maintainability, portability, testability
    • Test Plans – can be verified for readability, correctness, applicability to the testing at hand, ability to execute what is required, scope
    • Test Cases – can be verified for coverage, maintainability, readability, ability to execute, portability
    • Test Data – can be verified for coverage, security (anonymity), maintainability, portability, retention
    • Test Reports – can be verified for usefulness, readability, completeness

    The above is only a partial list of the major items that can be verified. The list can include any artifact that may be produced during the project, like Minutes of Meetings, online conversations, transcribed verbal conversations, and emails. If it can be read, it can be verified!
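
    Some of these checks can even be partially automated. The sketch below, in Python, is one small mechanical slice of that idea: it verifies that a test case document contains the sections a reviewer would expect to find. The required section names are illustrative assumptions, not a standard; substitute the headings from your own template.

      # Illustrative static verification check: does a test case document
      # contain the expected sections? The section names are assumptions.
      REQUIRED_SECTIONS = ["Objective", "Preconditions", "Steps", "Expected Results"]

      def missing_sections(document_text):
          """Return the required sections that do not appear in the document."""
          return [s for s in REQUIRED_SECTIONS if s not in document_text]

      sample = "Objective: ...\nSteps: ...\nExpected Results: ..."
      print(missing_sections(sample))  # ['Preconditions']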

    The issue is to find the most cost-effective places in your development process to implement verification techniques. There is no point in completing Verification processes without some end in view and a positive ROI. Give us a call at 416-927-0960 or visit our website at NVP.ca to find out where you would benefit from the implementation of Verification techniques in your organisation.

  • Examples of Verification in Software Testing

    In our last blog post, we included a list of Examples of Verification without going into too much detail. In this blog we will take a couple of those verification list items and expand on them. The examples of verification that tend to give software testers the best return are Test Plans, Test Cases, Test Data and Test Results.

    Test Plans

    Applying verification techniques to a Test Plan can save hours of effort. The three methods we can use to review a test plan are:

      • Walkthrough – in this method, the author of the Test Plan ‘walks’ one or more of their peers through the test plan, explaining the sections and what is meant by the content. The role of the peers is to find problems, omissions, extra content, incomplete items, inconsistent items and to add things that they might feel were missed. The intent is to have a better document more accurately reflecting the needs of the project stakeholders. A new version is issued after all the errors are corrected and it is used going forward.
      • Document Review – In this method, a review committee (preferably selected from the interested stakeholders) reviews the documents and records the same items as listed above for the walkthrough. Once they have finished their individual reviews, they come together to create a final list of problems which need to be corrected. A new version is issued after all the errors are corrected and it is used going forward.
      • Inspections – In this method, formalized roles are defined and assigned and a procedure is followed to ensure proper inspection of the document. The intent and result is the same as the previous two methods. The only difference is the degree of formality.

    What’s the point in all of this? The payback from reviewing the plan (using any of the methods above) more than pays for itself in terms of fewer errors going forward and less work to be undone, redone, and redone (again).

    If you know your test plan is poor, or you’re not even sure where to start, give us a call at 416-927-0960 or visit our website at NVP.ca to find out where you would benefit from the implementation of Verification techniques in your organisation.

  • Verification

    Quality Assurance has a number of subdivisions; one of those being Quality Control (sometimes referred to as Software Testing). One of the largely underrated aspects of Quality Control is Verification.

    We define Verification as the static part of testing, while Validation is the active part of testing where completed, compiled, and promoted code is being run using data and generating results. Verification refers to aspects of testing that do not involve actually executing the code and includes Inspections, Walkthroughs, and Reviews.

    Since we do not need code, it is possible to launch into Verification much earlier in the process without waiting for code to be completed. As someone once pointed out, the first time you pick up some artifact from the project whether it be a document, minutes of meetings, a communication or even a conversation and start to look at it, you are performing your very first test of that system.

    The three major subdivisions mentioned above differ in their degree of formality and where they may be applied. In general, Inspections are the most formal of the processes and have a well-developed methodology. Reviews can be formal or informal and can be done at various points depending on how necessary they are and what the intent is. Walkthroughs can be very informal and used with minimal effort and training.

    The major impact of this type of testing is to discover issues, misconceptions, lack of clarity, incompleteness and misinterpretations as soon as possible.

    While the concept is simple, the identification of the most cost-effective place to use Verification techniques is not always easy. There is no point in completing Verification processes without some end in view and a positive ROI. Give us a call at 416-927-0960 or visit our website at NVP.ca to find out where you would benefit from the implementation of Verification techniques in your organisation.

  • Test Tools – Selection is Important

    Test Tools are integral to all the work we complete these days. There are thousands of test tools in existence. A Google search obtained 941,000,000 results https://www.google.ca/#q=test+tools and some of the initial results are lists of tools for particular technologies or development platforms. No doubt searching for such a generalized term has brought a lot of results that are not really relevant to the particular task at hand, but even a few of these sites will yield a lot of choice. Hence the title “Test Tools – Selection is Important”, since we need to choose among all of these.

    The key questions to ask:

    • What do we want the test tool to do?
    • What is our current testing process?
    • What is our budget?
    • What are our current competencies?
    • Who wants to use the tool?
    • How will it aid their work?

    These are all important questions to ask before we run the search listed above. There are specific, and probably conflicting, needs within the acquiring organisation that have to be balanced. The question is how to balance those competing requirements.

    Quality Assurance will make the decision about what to acquire much more obvious. The focus on process analysis will identify the areas or steps that are costing a lot in time or resources, and the statistics gathered in the process analysis allow immediate identification of the largest possible return on investment. Balancing the expected return against the acquisition cost, implementation cost, training cost and support cost provides an instant upper limit to the budget. There will be no issue deciding whether something will pay for itself.
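
    As a rough illustration of that balancing act, the back-of-the-envelope calculation below uses entirely hypothetical figures; only the shape of the comparison matters, weighing the saving identified by the process analysis against the full cost of acquiring, implementing, training for, and supporting the tool.

      # Hypothetical figures only: payback check for a candidate test tool
      # over a three-year horizon.
      acquisition = 20_000       # licence cost
      implementation = 8_000     # installation and integration effort
      training = 5_000           # initial training for the team
      support_per_year = 4_000   # annual maintenance and support
      saving_per_year = 18_000   # estimated from the process analysis

      years = 3
      total_cost = acquisition + implementation + training + support_per_year * years
      total_saving = saving_per_year * years

      print(f"Cost over {years} years:   {total_cost}")    # 45000
      print(f"Saving over {years} years: {total_saving}")  # 54000
      print("Pays for itself" if total_saving > total_cost else "Does not pay for itself")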

    Obviously, in addition to the above process orientation, there are going to be technical questions (compatibility with existing software and hardware) that further restrict the candidate list to a manageable number. These are not Quality Assurance questions.