QA Glossary

The ultimate guide to essential QA terms and best practices

Acceptance Testing

Definition:

A type of testing conducted to determine whether a system meets the requirements and is ready for delivery.

Example:

Running a final set of tests against the system to confirm that the software functions as expected before release.

Best practices:

  • Collaborate closely with stakeholders to define acceptance criteria.
  • Use real-world scenarios to validate the product.
  • Document test cases clearly to ensure consistency.

Alpha Testing

Definition:

An initial phase of testing conducted by internal teams before releasing the product to external users.

Example:

In-house testers running tests on an early version of the software to identify bugs and usability issues.

Best practices:

  • Conduct in a controlled environment to catch issues early.
  • Focus on key functionalities and common user flows.
  • Gather feedback from testers to improve the product before beta release.

Automation Testing

Definition:

Using software tools to execute pre-scripted tests on an application before it is released into production.

Example:

Automating regression tests to quickly validate that new code changes haven’t broken existing functionality.

Best practices:

  • Prioritize test cases that are time-consuming and repetitive.
  • Maintain test scripts regularly to keep up with software changes.
  • Use reliable automation tools like Selenium, Cypress, or Playwright.
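
For illustration, a minimal Playwright sketch of an automated check like the regression example above; the URL, selectors, and credentials are placeholders, not a real application.

```ts
import { test, expect } from '@playwright/test';

// Hypothetical URL, selectors, and credentials, shown only to illustrate the pattern.
test('existing login flow still works after new changes', async ({ page }) => {
  await page.goto('https://example.com/login');
  await page.fill('#username', 'demo-user');
  await page.fill('#password', 'demo-pass');
  await page.click('button[type="submit"]');

  // The assertion encodes the expected outcome, so the script can run unattended.
  await expect(page).toHaveURL(/dashboard/);
});
```
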
Beta Testing

    Definition:

    A phase of testing where a product is released to a limited audience outside the development team to identify bugs and gather user feedback.

    Example:

    Releasing a new app version to a select group of users who provide feedback on performance and usability.

    Best practices:

    • Choose diverse testers that represent your target audience.
    • Collect feedback systematically and act on critical issues.
    • Use beta feedback to refine the product before the full launch.

    Black Box Testing

    Definition:

Testing a system without knowledge of its internal workings, focusing only on inputs and expected outputs.

    Example:

    Testing a login function by entering different usernames and passwords to see if the system responds correctly.

    Best practices:

    • Define clear input and expected output scenarios.
    • Test edge cases to ensure the application handles unexpected inputs.
    • Use black box testing alongside white box testing for comprehensive coverage.
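
A small table-driven sketch of the login example above; the login function here is only a stand-in so the snippet runs on its own, since black box testing cares only about observable behaviour.

```ts
import assert from 'node:assert';

// Stand-in implementation so the sketch is runnable; in real black box testing
// only the deployed system's observable behaviour matters, not this code.
function login(username: string, password: string): 'ok' | 'rejected' {
  return username === 'valid-user' && password === 'correct-pass' ? 'ok' : 'rejected';
}

// Each case pairs an input with the output the specification promises,
// with no reference to how login() works internally.
const cases: Array<[string, string, 'ok' | 'rejected']> = [
  ['valid-user', 'correct-pass', 'ok'],
  ['valid-user', 'wrong-pass', 'rejected'],
  ['', '', 'rejected'],                         // edge case: empty inputs
  ['valid-user', ' '.repeat(500), 'rejected'],  // edge case: oversized input
];

for (const [user, pass, expected] of cases) {
  assert.strictEqual(login(user, pass), expected);
}
console.log('All black box cases passed');
```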

    Bug Tracking

    Definition:

    The process of capturing, reporting, and managing defects in software during the development and testing phases.

    Example:

Using bug tracking software to log defects and track their status until resolution.

    Best practices:

    • Use descriptive titles and detailed steps to reproduce bugs.
    • Prioritize bugs based on severity and impact on users.
    • Regularly review and update bug statuses to keep the team informed.
    • Use a tool like Heurio for prioritization, status management, and team collaboration.

    Continuous Integration (CI)

    Definition:

    A software development practice where developers frequently integrate code changes into a shared repository, followed by automated builds and tests.

    Example:

    Developers push code changes multiple times a day, and the CI server automatically runs tests to catch issues early.

    Best practices:

  • Set up automated test suites to run after each code commit.
  • Ensure quick feedback loops to fix broken builds immediately.
  • Use CI tools like Jenkins, GitLab CI, or CircleCI.
End-to-End (E2E) Testing

    Definition:

    A testing approach that validates the complete functionality of an application, including the integration of all components from start to finish.

    Example:

    Testing an e-commerce site by completing a full purchase flow from product search to checkout and payment confirmation.

    Best practices:

    • Focus on user scenarios that represent typical use cases.
    • Use E2E testing tools like Cypress or TestCafe.
    • Automate repetitive E2E tests to save time and ensure consistency.
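
As a sketch of the purchase-flow example, here is roughly what a Cypress spec might look like; the URLs, selectors, and button text are assumptions and would need to match the real storefront.

```ts
/// <reference types="cypress" />

// Hypothetical selectors and URLs; Cypress provides describe/it/cy as globals.
describe('checkout flow', () => {
  it('lets a shopper search, add to cart, and pay', () => {
    cy.visit('https://shop.example.com');
    cy.get('[data-test="search"]').type('running shoes{enter}');
    cy.contains('Add to cart').click();
    cy.get('[data-test="checkout"]').click();
    cy.get('[data-test="card-number"]').type('4242424242424242');
    cy.contains('Place order').click();
    cy.contains('Payment confirmed').should('be.visible');
  });
});
```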

    Exploratory Testing

    Definition:

    An informal testing approach where testers actively explore the application to identify defects based on their intuition and experience.

    Example:

    Manually navigating a newly developed feature to identify any unexpected behaviors or errors.

    Best practices:

    • Combine exploratory testing with structured testing to cover gaps.
    • Keep notes of any defects or observations for documentation.
    • Use exploratory testing sessions to identify areas needing further attention.

    Integration Testing

    Definition:

Testing how individual components work together to ensure they function correctly as a group.

    Example:

    Testing the interaction between a web application's front-end and back-end APIs.

    Best practices:

    • Start testing small components before moving to larger integrated systems.
    • Use integration tests to validate data flow and communication between components.
    • Prioritize integration points that are complex or have historically failed.
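
A minimal sketch of an integration check between an HTTP API and the service behind it, using Node's built-in test runner; the endpoint and payload are hypothetical.

```ts
import assert from 'node:assert';
import { test } from 'node:test';

// Hypothetical endpoint and payload; the point is that two components are
// exercised together: the HTTP layer and the service that stores the order.
test('order API accepts a valid order and returns its id', async () => {
  const res = await fetch('http://localhost:3000/api/orders', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ sku: 'ABC-123', quantity: 2 }),
  });

  assert.strictEqual(res.status, 201);
  const body = await res.json();
  assert.ok(body.id, 'response should include the id created by the back end');
});
```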

    Load Testing

    Definition:

    A type of performance testing that determines how a system behaves under expected and peak loads.

    Example:

    Testing an e-commerce website by simulating thousands of users checking out simultaneously.

    Best practices:

    • Define realistic load scenarios based on expected traffic patterns.
    • Use load testing tools like Apache JMeter or LoadRunner.
    • Monitor system performance metrics, such as response time and resource utilization.
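
A toy load driver in TypeScript, only to make the idea concrete; the target URL and user count are assumptions, and dedicated tools such as JMeter or LoadRunner are better suited to real load tests.

```ts
// Fires a burst of concurrent requests and reports simple timing statistics.
const TARGET = 'https://staging.example.com/checkout'; // assumed URL
const VIRTUAL_USERS = 100;                              // assumed concurrency

async function oneUser(): Promise<number> {
  const start = Date.now();
  await fetch(TARGET);        // a real scenario would script the whole user flow
  return Date.now() - start;  // response time in milliseconds
}

const timings = await Promise.all(
  Array.from({ length: VIRTUAL_USERS }, () => oneUser()),
);

const avg = timings.reduce((a, b) => a + b, 0) / timings.length;
console.log(`average: ${avg.toFixed(0)} ms, slowest: ${Math.max(...timings)} ms`);
```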

    Penetration Testing (Pen Testing)

    Definition:

    A security testing method used to identify vulnerabilities in a system by simulating an attack.

    Example:

    Testing a web application for SQL injection vulnerabilities by attempting unauthorized data access.

    Best practices:

    • Conduct regular pen tests, especially after significant code changes.
    • Use both automated tools and manual techniques for thorough testing.
    • Address identified vulnerabilities promptly to maintain security.
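
One automated probe for the SQL injection example, sketched with Node's test runner against a hypothetical login endpoint; real penetration testing combines many such probes with manual exploration.

```ts
import assert from 'node:assert';
import { test } from 'node:test';

// Hypothetical endpoint; the payload is the classic "' OR '1'='1" injection string.
test('login endpoint rejects a basic SQL injection payload', async () => {
  const res = await fetch('http://localhost:3000/api/login', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ username: "' OR '1'='1", password: 'anything' }),
  });

  // The injection attempt must not be treated as a successful login.
  assert.notStrictEqual(res.status, 200);
});
```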

    Performance Testing

    Definition:

    A testing technique to determine how a system performs in terms of responsiveness and stability under various conditions.

    Example:

    Measuring how quickly a web page loads under different levels of user traffic.

    Best practices:

    • Focus on response time, throughput, and resource utilization.
    • Use tools like Apache JMeter, LoadRunner, or Gatling.
    • Test under normal, peak, and stress conditions.
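
A minimal sketch of measuring response times and a rough 95th percentile in plain TypeScript; the URL and sample size are assumptions, and tools like JMeter or Gatling report these metrics out of the box.

```ts
// Samples page response times sequentially and reports median and approximate p95.
const TARGET_URL = 'https://staging.example.com/'; // assumed URL
const SAMPLES = 50;

const times: number[] = [];
for (let i = 0; i < SAMPLES; i++) {
  const start = performance.now();
  await fetch(TARGET_URL);
  times.push(performance.now() - start);
}

times.sort((a, b) => a - b);
const median = times[Math.floor(SAMPLES / 2)];
const p95 = times[Math.floor(SAMPLES * 0.95) - 1];
console.log(`median: ${median.toFixed(0)} ms, p95: ${p95.toFixed(0)} ms`);
```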

    Regression Testing

    Definition:

    Testing existing functionality to ensure that recent code changes have not introduced new bugs.

    Example:

    Running a suite of tests after a bug fix to confirm that other functionalities remain intact.

    Best practices:

    • Automate regression tests for quicker feedback and consistency.
    • Prioritize tests that cover critical application paths.
    • Continuously update regression tests to include new bug fixes and features.
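
A small sketch of the pattern behind the bug-fix example: once a defect is fixed, a test that reproduces the original report is added so it cannot quietly return. The function and the earlier bug are hypothetical.

```ts
import assert from 'node:assert';
import { test } from 'node:test';

// Stand-in for the function under test so the sketch runs on its own.
function formatPrice(amount: number): string {
  return `$${amount.toFixed(2)}`;
}

// Guards a hypothetical earlier fix: zero amounts once rendered as "$0".
test('zero amounts render as $0.00', () => {
  assert.strictEqual(formatPrice(0), '$0.00');
});
```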

    Smoke Testing

    Definition:

    A preliminary test to check the basic functionality of an application before conducting more detailed testing.

    Example:

    After a new build, verifying that the application starts, navigates between screens, and performs basic functions without crashing.

    Best practices:

    • Run smoke tests after every build to catch fundamental issues early.
    • Keep smoke tests simple and focused on the main application workflows.
    • Automate smoke tests to save time and ensure consistency.
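
A minimal smoke suite, assuming a hypothetical staging URL: it only asks whether the new build comes up at all, leaving deeper checks to later stages.

```ts
import assert from 'node:assert';
import { test } from 'node:test';

const BASE = 'https://staging.example.com'; // assumed staging URL

test('home page responds after the new build', async () => {
  const res = await fetch(`${BASE}/`);
  assert.strictEqual(res.status, 200);
});

test('health endpoint reports ok', async () => {
  const res = await fetch(`${BASE}/health`);
  assert.strictEqual(res.status, 200);
});
```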

    Stress Testing

    Definition:

    Testing how a system behaves under extreme conditions, often beyond its normal operational capacity.

    Example:

    Overloading a web server to determine its breaking point and recovery time.

    Best practices:

    • Gradually increase load to identify the failure threshold.
    • Monitor system recovery to ensure it can return to normal after stress.
    • Use stress tests to plan for scaling and capacity upgrades.

    Test Case

    Definition:

    A set of conditions or variables used to determine if a system under test meets requirements.

    Example:

    A test case for a login feature might include inputs for valid and invalid usernames and passwords.

    Best practices:

    • Write clear, concise, and reusable test cases.
    • Include expected results to compare against actual outcomes.
    • Regularly review and update test cases as application requirements change.
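
One way to keep test cases clear and reusable is to capture them as structured data; the fields below are a common minimal shape, not a required standard.

```ts
// A test case captured as data, so it can be reviewed, versioned, and reused.
interface TestCase {
  id: string;
  title: string;
  preconditions: string[];
  steps: string[];
  expectedResult: string;
}

const loginWithValidCredentials: TestCase = {
  id: 'TC-LOGIN-001', // hypothetical identifier
  title: 'Login succeeds with valid credentials',
  preconditions: ['A registered user account exists'],
  steps: [
    'Open the login page',
    'Enter a valid username and password',
    'Click "Sign in"',
  ],
  expectedResult: 'The user lands on the dashboard and sees their name in the header',
};
```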

    Test Data Management (TDM)

    Definition:

    The process of creating, managing, and maintaining data necessary for software testing.

    Example:

    Generating realistic test data for validating user workflows in an application.

    Best practices:

    • Use anonymized production data where possible to ensure realism.
    • Regularly refresh test data to avoid stale scenarios.
    • Ensure test data covers all edge cases and scenarios.
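
A small sketch of generating anonymized test users; the field names are assumptions, and in practice libraries such as Faker are often used instead of a hand-rolled generator.

```ts
// Generates predictable, anonymized users that never contain real personal data.
interface TestUser {
  id: number;
  email: string;
  country: string;
}

function makeTestUsers(count: number): TestUser[] {
  const countries = ['US', 'DE', 'JP', 'BR']; // vary locales to widen coverage
  return Array.from({ length: count }, (_, i) => ({
    id: i + 1,
    email: `user${i + 1}@test.invalid`, // reserved domain, never a real address
    country: countries[i % countries.length],
  }));
}

console.log(makeTestUsers(3));
```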

    Test Environment

    Definition:

    A configured environment used for testing, including hardware, software, and network settings that mimic production as closely as possible.

    Example:

    Setting up a staging server with the same configurations as the live server to test new features.

    Best practices:

    • Keep test environments isolated to avoid affecting production data.
    • Regularly refresh environments with the latest data and configurations.
    • Monitor environments to ensure they remain stable and reflective of production.

    Unit Testing

    Definition:

    Testing individual components of software in isolation to ensure they work as expected.

    Example:

    Testing a single function in a codebase, like a calculation method, to ensure it returns the correct result.

    Best practices:

    • Keep unit tests small, focusing on one aspect of the code at a time.
    • Run unit tests frequently to catch bugs early in the development cycle.
    • Use frameworks like Jest for JavaScript or NUnit for .NET to automate unit tests.
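
A minimal Jest example of the calculation case above; the function, its rounding rule, and the expected values are assumptions made for the sketch.

```ts
import { expect, test } from '@jest/globals';

// Hypothetical function under test: sums prices and applies a tax rate.
export function calculateTotal(prices: number[], taxRate: number): number {
  const subtotal = prices.reduce((sum, price) => sum + price, 0);
  return Math.round(subtotal * (1 + taxRate) * 100) / 100; // round to cents
}

test('applies the tax rate to the sum of prices', () => {
  expect(calculateTotal([10, 20], 0.1)).toBe(33);
});

test('returns 0 for an empty cart', () => {
  expect(calculateTotal([], 0.1)).toBe(0);
});
```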

    Usability Testing

    Definition:

    Testing how easily end-users can interact with a software product, focusing on the user experience.

    Example:

    Observing users as they navigate an app to identify areas of confusion or difficulty.

    Best practices:

    • Test with real users who match your target audience.
    • Focus on areas where users struggle or hesitate.
    • Use feedback to iterate on design and improve usability.

    User Acceptance Testing (UAT)

    Definition:

    The final phase of testing where end-users validate the functionality before the system goes live.

    Example:

    A group of end-users testing a new feature to confirm it meets their needs and expectations.

    Best practices:

    • Define clear UAT criteria and involve real users in testing.
    • Document feedback and ensure all critical issues are resolved before release.
    • Use scenarios that reflect actual user workflows.

    Vulnerability Scanning

    Definition:

    An automated process that scans software for known security vulnerabilities.

    Example:

    Running a vulnerability scan to identify outdated libraries that may pose security risks.

    Best practices:

    • Regularly update your scanning tools and definitions.
    • Integrate vulnerability scanning into the CI/CD pipeline.
    • Review and remediate identified vulnerabilities promptly.

    White Box Testing

    Definition:

    Testing based on an understanding of the internal structure, logic, and design of the application.

    Example:

    Testing individual code paths in a function to ensure they execute correctly under different conditions.

    Best practices:

    • Use code coverage tools to identify untested paths.
    • Test boundary conditions and error handling within the code.
    • Combine white box testing with black box testing for full coverage.
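
A small white box sketch: the function is hypothetical, and each check is chosen by reading its code so that every branch, boundary, and error path is exercised.

```ts
import assert from 'node:assert';

// Hypothetical function with two branches and an error path.
function shippingCost(weightKg: number): number {
  if (weightKg <= 0) throw new RangeError('weight must be positive');
  return weightKg <= 5 ? 4.99 : 4.99 + (weightKg - 5) * 1.5;
}

assert.strictEqual(shippingCost(5), 4.99);          // boundary of the flat-rate branch
assert.strictEqual(shippingCost(7), 7.99);          // per-kilogram branch
assert.throws(() => shippingCost(0), RangeError);   // error-handling path
console.log('all paths covered');
```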