Tag: Quality Control

  • RCA – Why – Part 2

Three weeks ago we asked why we would bother with Root Cause Analysis at all. We asked some questions and provided some answers. However, that analysis concentrated on a single defect and the use of Root Cause Analysis in that one case. The real power of Root Cause Analysis shows when we can solve a whole class of defects.

There is a good chance that a problem that occurs in one place in the code has been repeated elsewhere. Errors have a tendency to be repeated! This came up recently when we were asked to look into a fix that had been implemented 20 years ago and seemed to be unravelling. This is what is sometimes called a latent defect (one that has been sitting in production code and suddenly makes itself known). Suffice to say, the same fix had been applied throughout the code, and Root Cause Analysis at the time the problem originally occurred would have been very helpful: it would have prevented the expensive and time-consuming repair required recently. Root Cause Analysis would have paid for itself many times over.

So the next time you have an error, take a look at it and see if you can add something to the defect description indicating where it might have originated. Even a simple classification will go a long way toward showing where we need to concentrate more resources.
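As a minimal illustration of what such a classification could look like (the category names and fields are assumptions for this sketch, not taken from any particular defect tracker), even a small script makes it visible where defects cluster:

# Minimal sketch: tag each defect with a suspected origin so that repeated
# classes of errors become visible. Categories and fields are illustrative
# assumptions, not from any specific defect-tracking tool.
from collections import Counter
from dataclasses import dataclass

ORIGIN_CATEGORIES = ["requirements", "design", "coding", "configuration", "data"]

@dataclass
class Defect:
    defect_id: str
    description: str
    suspected_origin: str  # one of ORIGIN_CATEGORIES

def origin_breakdown(defects):
    """Count defects per suspected origin to show where effort should go."""
    return Counter(d.suspected_origin for d in defects)

defects = [
    Defect("D-101", "Date parsing fails for leap years", "coding"),
    Defect("D-102", "Timeout value wrong on the new server", "configuration"),
    Defect("D-103", "Spec never defined rounding rules", "requirements"),
    Defect("D-104", "Same date parsing error in the report module", "coding"),
]
print(origin_breakdown(defects))
# Counter({'coding': 2, 'configuration': 1, 'requirements': 1})

Even a tally this crude makes it obvious when the same root cause keeps resurfacing.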

October is Quality Month. Sign up for our newsletter to read more about Quality Month, or request a copy if you are too late for this month.

    Photo by JJ Ying on Unsplash

  • Register this week for the October events at TASSQ and KWSQA

    Last chance to register for TASSQ and KWSQA

       

NVP is at StarCanada on Wednesday and Thursday this week. Stop by our booth for your free stress-relieving bug.

       

    If you are in the Greater Toronto Area or Kitchener-Waterloo you might want to consider these events to network with other QA people or learn some of the new ideas in QA.


NVP Software Solutions will be participating in the following software testing and quality assurance events happening this October in Ontario, Canada. The events are located in Toronto and Kitchener-Waterloo in the coming weeks. Check out the relevant websites for more information and to register. This is a great opportunity to connect with other software testing and quality assurance professionals. We hope to see you there!


    Photo by Antenna on Unsplash




THE POWER OF DESIGN SPRINTS FOR PRODUCT TEAMS

October 29, 2019, 6:00 p.m. – The Albany Club, 91 King Street East, Toronto, Ontario

    Presenters:  Leah Oliveira and Carlos Oliveira

    Reality Driven Testing

October 30, 2019, 11:30 a.m. – University of Waterloo

Presenter: Rob Sabourin

  • October 2019 Software QA Events in the GTA and beyond

    If you are in the Greater Toronto Area or Kitchener-Waterloo you might want to consider these events to network with other QA people or learn some of the new ideas in QA.


NVP Software Solutions will be participating in the following software testing and quality assurance events happening this October in Ontario, Canada. The events are located in Toronto and Kitchener-Waterloo in the coming weeks. Check out the relevant websites for more information and to register. This is a great opportunity to connect with other software testing and quality assurance professionals. We hope to see you there!


    Photo by Antenna on Unsplash




THE POWER OF DESIGN SPRINTS FOR PRODUCT TEAMS

October 29, 2019, 6:00 p.m. – The Albany Club, 91 King Street East, Toronto, Ontario

    Presenters:  Leah Oliveira and Carlos Oliveira

    Reality Driven Testing

October 30, 2019, 11:30 a.m. – University of Waterloo

Presenter: Rob Sabourin

  • Trend analysis on Defects – Part 1

One item that is frequently missed in most organisations is trend analysis on defects: the trend over time and across multiple projects. We are looking at whether things are improving, or not, as time goes on. There are a number of prerequisites, which we discuss today before returning to this topic in a couple of weeks with further information.

    Prerequisites

1. A consistent method of classifying defects with regard to Priority and Severity. I include definitions and standards for Priority and Severity either in the Test Strategy for an organisation or in every test plan for every project. Note that without this in place, it is impossible to do any trend analysis.
2. A commitment to recording defects fairly in every phase of the project, no matter what methodology you use. Otherwise it is possible to make certain phases look better than others by ignoring or failing to record defects.
    3. A commitment from management to the years it will take to obtain sufficient statistics for trends to be meaningful.
4. A commitment to, and an understanding of, the methodology used to reconcile projects of different sizes that are built in different ways. A very large project built in-house cannot be directly compared to a small project where the software was purchased.
5. An understanding of the statistical processes that will be used to generate the trends. It is far too easy to draw the wrong conclusion.
If any of these prerequisites is not in place before anything starts, you are embarking on an expensive failure! A rough sketch of the kind of normalised comparison these prerequisites make possible is shown below.
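The following sketch shows one way such a trend might be computed once defects are classified consistently. The severity weights, the use of person-months as a size measure, and the figures themselves are assumptions for illustration only, not a prescribed standard.

# Rough sketch: a weighted defect density per release, normalised by project
# size so that large and small projects can be compared over time.
# Severity weights, the size measure and all figures are illustrative assumptions.

releases = [
    # (release, project size in person-months, defect counts by severity)
    ("2017-R1", 40, {"critical": 6, "major": 22, "minor": 51}),
    ("2018-R1", 55, {"critical": 5, "major": 20, "minor": 60}),
    ("2019-R1", 38, {"critical": 2, "major": 14, "minor": 44}),
]

SEVERITY_WEIGHTS = {"critical": 5, "major": 3, "minor": 1}  # assumed weighting

def weighted_defect_density(size, counts):
    """Weighted defects per person-month, so releases of different sizes compare."""
    weighted = sum(SEVERITY_WEIGHTS[sev] * n for sev, n in counts.items())
    return weighted / size

for release, size, counts in releases:
    print(f"{release}: {weighted_defect_density(size, counts):.2f} weighted defects per person-month")

Whether a rising or falling figure is meaningful still depends on prerequisite 5: the statistics behind the trend, not just the arithmetic.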

    Photo by Stephan Henning on Unsplash

  • September Events in the GTA and Beyond

    If you are in the Greater Toronto Area or Kitchener-Waterloo you might want to consider these events to network with other QA people or learn some of the new ideas in QA.


    NVP Software Solutions will be participating in the following software testing and quality assurance events happening this September in Ontario, Canada. The events are located in Toronto and Kitchener-Waterloo in the coming weeks. Check out the relevant websites for more information and to register. This is a great opportunity to connect with other software testing and quality assurance professionals. We hope to see you there!


    Photo by Antenna on Unsplash




    PRESIDENT’S DINNER – V2X (VEHICLE TO EVERYTHING) COMMUNICATIONS

September 24, 2019, 6:00 p.m. – The Albany Club, 91 King Street East, Toronto, Ontario

Presenters: Charles Finlay, Peter Watkins, Tony Hui, Ysni Semsedini; Moderator: Colin Dhillon

    Targeting Quality Conference

September 23–24, 2019, 6:00 p.m. – Cambridge Hotel & Conference Centre, easily located on Hespeler Road in Cambridge, just off the 401.

  • What to do with the reported information?

We have spent the last couple of months asking what should be reported from testing and to whom it should be reported. However, one of the major issues is that a lot of information and statistics are generated and then ignored until the end of the project. We end up with reams of results, counts, trends (sometimes), comments, analysis and opinions, along with change documents and other project documentation, with no obvious home and no one looking at any of it.

    The end result is that decisions are taken without all the required information.

This is obviously not just a QA issue but a project issue; our concern, however, is with the QA information that has been generated. Although it may seem obvious, we need to make sure the items that need attention are raised at the correct time and place for a decision. Many people colour-code their results to make sure that any critical issues are addressed, but that does not help if no one is paying attention. We come back to the Quality Assurance aspect of determining the process that should be followed, and to Process Improvement as we tweak that process to work for us. We need to identify who needs the information and from whom decisions must be obtained. Statistics and impact analysis also help here to identify what could go wrong. A few weeks ago we mentioned trend analysis and thresholds; these are also critical for any of the information generated.
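One lightweight way to stop critical results from being buried is to check reported metrics against agreed thresholds automatically, so that anything crossing a limit is escalated for a decision. The sketch below illustrates the idea; the metric names and threshold values are invented for this example, not recommended limits.

# Minimal sketch: flag QA metrics that cross agreed thresholds so they are
# raised for a decision rather than buried in a report. Metric names and
# threshold values are illustrative assumptions.

THRESHOLDS = {
    "open_critical_defects": 0,     # any open critical defect needs a decision
    "failed_test_percentage": 5.0,  # more than 5% failed tests needs escalation
    "defect_reopen_rate": 10.0,     # more than 10% reopened suggests poor fixes
}

def items_needing_attention(metrics):
    """Return the metrics that exceed their threshold, for escalation."""
    return {
        name: value
        for name, value in metrics.items()
        if name in THRESHOLDS and value > THRESHOLDS[name]
    }

current = {"open_critical_defects": 2, "failed_test_percentage": 3.2, "defect_reopen_rate": 12.5}
print(items_needing_attention(current))
# {'open_critical_defects': 2, 'defect_reopen_rate': 12.5}

Whatever form the check takes, the point is that someone accountable sees the flagged items while a decision can still be made.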

    Photo by Vladislav Babienko on Unsplash
• Does your Software Testing Pay – Part 2

“Does your Software Testing Pay” is the question we posed two weeks ago.
There are obviously two parts to this question: one is how much testing costs and the other is how much it saves.

    The cost portion is reasonably easy:

1. Chargeback rate on resources (which means hours per release must be recorded).
2. Equipment and space usage (if not included in the above).
3. Test tool costs (amortized over all the projects).
4. Opportunity cost, if the resources should be engaged in something else.

    The savings portion is somewhat harder:

What would each defect found by testing have cost to fix in production? This is obviously an estimate. One calculation is supplied below.

    1. Look at each defect found by testing.
2. Estimate the probability of it occurring after release of the code to the users. You might want to take estimates from several people and average them. The probabilities must be expressed as figures between 0 and 1 (convert percentages before using them).
3. Estimate the cost to the organisation on the assumption that the defect does occur in production. This cost includes the direct costs to the company (fixing, testing and deploying) and the indirect costs (administration, etc.) that are often hidden (the iceberg effect: roughly nine-tenths of the costs are hidden).
4. Add the cost to the customers in rework, lost data and the inability to respond properly.
5. Add up the costs per defect and multiply by the probability from step 2.
6. Add up the resulting figures for all defects found by testing for a release.

If Savings > Costs, your software testing is paying for itself. A back-of-the-envelope version of the calculation is sketched below.
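This small sketch shows the arithmetic of the steps above. All figures (probabilities, costs, the testing budget) are invented for illustration; real numbers would come from your own defect records and estimates.

# Back-of-the-envelope sketch of the savings calculation described above.
# All figures are invented for illustration.

defects_found_in_testing = [
    # (probability the defect would have reached production,
    #  estimated total cost if it had: fix, retest, deploy, hidden costs, customer impact)
    (0.8, 12000.0),
    (0.3, 40000.0),
    (0.5, 3000.0),
]

testing_costs = 20000.0  # chargeback, equipment, tool amortisation, opportunity cost

# Expected saving per defect = probability of occurrence * estimated production cost
savings = sum(probability * cost for probability, cost in defects_found_in_testing)

print(f"Estimated savings: {savings:,.0f}")
print(f"Testing costs:     {testing_costs:,.0f}")
print("Testing paid for itself" if savings > testing_costs else "Testing did not pay for itself")

With these made-up numbers the estimated savings (23,100) exceed the costs (20,000), so testing paid for itself; your own figures will decide the answer for your projects.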

  • Review of the Year – Automation

    Automation of software testing is something that seems to be on the minds of many Quality Assurance Managers and Test Leads. It has been a popular topic for many years.

Currently we get requests for particular tools and for knowledge of their attributes in particular environments; these requests are usually straightforward to service. The current state of the market seems to be a separate tool for every need and environment, and sometimes for every organisation. Based on past experience, in a few years someone will consolidate all the disparate technologies under one umbrella tool. Then the cycle will start again, with people inventing new tools for specific niches and products.

We also receive requests to “automate our testing” with no decision on a tool attached. This is a completely different question and requires some discussion before any attempt to automate even starts. We need to know the what, the when and the why of the company’s wish to automate its testing. What we want to avoid is the cycle below.

    1. Purchase Automated Test Tool.
    2. Install Tool.
    3. Wait for successful automation to save all the cost of the tool and cost of manual testing.
    4. Become disillusioned.
5. Go to step 1 and repeat ad infinitum.

This has occurred over and over in different organisations. A lot of money gets used up with no progress, and eventually the organisation gives up on the cycle and goes back to manual testing (see last week’s blog).

    Our best recommendations are as follows:

Work out where you sit on the technology maturity scale with your current technology.

Decide what you need in terms of automation (criteria are available; a small illustrative scoring sketch follows these recommendations) and what you are capable of handling based on that maturity level.

Then create a plan to implement your automation. Never assume it will just happen. It won’t.
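As one illustration of the kind of criteria that can drive the “decide what you need” step, the sketch below scores candidate tests for automation suitability. The criteria, weights and scores are assumptions made up for this example, not a standard checklist.

# Small sketch: score candidate test cases against simple automation-suitability
# criteria before committing to a tool. Criteria, weights and scores are
# illustrative assumptions, not a standard checklist.

CRITERIA_WEIGHTS = {
    "runs_per_release": 3,        # how often the test is executed
    "stability": 2,               # how stable the feature under test is
    "manual_effort": 2,           # how long the test takes to run by hand
    "data_setup_simplicity": 1,   # how easy the test data is to prepare
}

candidates = {
    # scores from 1 (poor candidate) to 5 (strong candidate) per criterion
    "login regression":    {"runs_per_release": 5, "stability": 5, "manual_effort": 3, "data_setup_simplicity": 4},
    "quarterly report UI": {"runs_per_release": 1, "stability": 2, "manual_effort": 4, "data_setup_simplicity": 2},
}

def automation_score(scores):
    """Weighted score: higher means a better candidate for automation."""
    return sum(CRITERIA_WEIGHTS[criterion] * value for criterion, value in scores.items())

for name, scores in sorted(candidates.items(), key=lambda item: -automation_score(item[1])):
    print(f"{name}: {automation_score(scores)}")

Ranking candidates this way keeps the automation plan focused on the tests that will actually repay the effort.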

    Want to discuss how to automate effectively? Contact us.