
Taguchi, Genichi Developed a set of practices for improving quality while reducing costs, known in the U.S. as Taguchi Methods. Taguchi Methods focus on the design of efficient experiments and on increasing signal-to-noise ratios. Dr. Taguchi also developed the quality loss function. Currently, he is executive director of the American Supplier Institute and director of the Japan Industrial Technology Institute.
Tampering Adjusting a process without first distinguishing between common cause and special cause variation. Dr. Deming cautions against tampering with systems that are "in control." It is very common for management to react to variation which is in fact normal, thereby starting wild goose chases after sources of problems which don't exist. Tampering with stable processes actually increases variation.
TBD To Be Determined 
Team Feasibility Commitment A commitment by the Product Quality Planning Team that the design can be manufactured, assembled, tested, packaged, and shipped in sufficient quantity at an acceptable cost, and on schedule. 
Test case generator. (IEEE) A software tool that accepts as input source code, test criteria, specifications, or data structure definitions; uses these inputs to generate test input data; and, sometimes, determines expected results.
Test case. (IEEE) Documentation specifying inputs, predicted results, and a set of execution conditions for a test item. Syn: test case specification.  
Test design. (IEEE) Documentation specifying the details of the test approach for a software feature or combination of software features and identifying the associated tests. See: testing, functional; cause effect graphing; boundary value analysis; equivalence class partitioning; error guessing; testing, structural; branch analysis; path analysis; statement coverage; condition coverage; decision coverage; multiple-condition coverage.
Test documentation. (IEEE) Documentation describing plans for, or results of, the testing of a system or component. Types include test case specification, test incident report, test log, test plan, test procedure, test report.
Test driver. (IEEE) A software module used to invoke a module under test and, often, provide test inputs, control and monitor execution, and report test results. Syn: test harness. 
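As an illustration, the test driver idea can be sketched in a few lines of Python; `add` and the cases below are hypothetical stand-ins for a real module under test:

```python
# Minimal test driver: invoke a function under test with predefined
# inputs, record actual vs. expected results, and report pass/fail.
# `add` is a hypothetical stand-in for the module under test.

def add(a, b):
    return a + b

def run_tests(func, cases):
    """Run each (inputs, expected) case; return result tuples."""
    results = []
    for inputs, expected in cases:
        actual = func(*inputs)
        results.append((inputs, expected, actual, actual == expected))
    return results

cases = [((1, 2), 3), ((0, 0), 0), ((-1, 1), 0)]
for inputs, expected, actual, passed in run_tests(add, cases):
    print(inputs, "PASS" if passed else f"FAIL: expected {expected}, got {actual}")
```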
Test incident report. (IEEE) A document reporting on any event that occurs during testing that requires further investigation. See: failure analysis.
Test log. (IEEE) A chronological record of all relevant details about the execution of a test.
TEST OF SIGNIFICANCE A procedure to determine whether a quantity subjected to random variation differs from a postulated value by an amount greater than that due to random variation alone.
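A minimal sketch of such a procedure, assuming a two-sided one-sample z-test with known population standard deviation (the function name and the sample figures are illustrative):

```python
# Two-sided one-sample z-test: does the sample mean differ from a
# postulated value by more than random variation alone would explain?
# Assumes the population standard deviation (sigma) is known.
from math import sqrt
from statistics import NormalDist

def z_test(sample_mean, postulated_mean, sigma, n, alpha=0.05):
    """Return (z, p_value, reject_h0) for H0: mean == postulated_mean."""
    z = (sample_mean - postulated_mean) / (sigma / sqrt(n))
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value, p_value < alpha

z, p, reject = z_test(sample_mean=10.5, postulated_mean=10.0, sigma=1.0, n=25)
print(f"z = {z:.2f}, p = {p:.4f}, reject H0: {reject}")
```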
Test phase. (IEEE) The period of time in the software life cycle in which the components of a software product are evaluated and integrated, and the software product is evaluated to determine whether or not requirements have been satisfied.
Test plan. (IEEE) Documentation specifying the scope, approach, resources, and schedule of intended testing activities. It identifies test items, the features to be tested, the testing tasks, responsibilities, required resources, and any risks requiring contingency planning. See: test design, validation protocol.
Test procedure. (NIST) A formal document developed from a test plan that presents detailed instructions for the setup, operation, and evaluation of the results for each defined test. See: test case.
Test readiness review. (IEEE) (1) A review conducted to evaluate preliminary test results for one or more configuration items; to verify that the test procedures for each configuration item are complete, comply with test plans and descriptions, and satisfy test requirements; and to verify that a project is prepared to proceed to formal testing of the configuration items. (2) A review as in (1) for any hardware or software component. Contrast with code review, design review, formal qualification review, requirements review.
Test report. (IEEE) A document describing the conduct and results of the testing carried out for a system or system component.
Test result analyzer. A software tool used to test output data reduction, formatting, and printing.
Test script. Documentation specifying a sequence of actions and expected results to accomplish a system task.
Test. (IEEE) An activity in which a system or component is executed under specified conditions, the results are observed or recorded and an evaluation is made of some aspect of the system or component.
Testability. (IEEE) (1) The degree to which a system or component facilitates the establishment of test criteria and the performance of tests to determine whether those criteria have been met. (2) The degree to which a requirement is stated in terms that permit establishment of test criteria and performance of tests to determine whether those criteria have been met.  
Testing, acceptance. (IEEE) Testing conducted to determine whether or not a system satisfies its acceptance criteria and to enable the customer to determine whether or not to accept the system. Contrast with testing, development; testing, operational. See: testing, qualification.
Testing, alpha [α]. (Pressman) Acceptance testing performed by the customer in a controlled environment at the developer's site. The software is used by the customer in a setting approximating the target environment, with the developer observing and recording errors and usage problems.
Testing, assertion. (NBS) A dynamic analysis technique which inserts assertions about the relationship between program variables into the program code. The truth of the assertions is determined as the program executes.
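A minimal Python sketch of the idea, with a hypothetical `apply_discount` function carrying embedded assertions about the relationship between its input and output:

```python
# Assertion testing: assertions about relationships between program
# variables are embedded in the code and checked during execution.
# `apply_discount` is a hypothetical function under test.

def apply_discount(price, rate):
    assert 0 <= rate <= 1, "discount rate must be a fraction"
    discounted = price * (1 - rate)
    # Relationship that must hold between input and output:
    assert 0 <= discounted <= price, "discounted price out of range"
    return discounted

print(apply_discount(100.0, 0.2))   # assertions hold for valid input
```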
Testing, beta [β]. (1) (Pressman) Acceptance testing performed by the customer in a live application of the software, at one or more end user sites, in an environment not controlled by the developer. (2) For medical device software, such use may require an Investigational Device Exemption [IDE] or Institutional Review Board [IRB] approval.
Testing, boundary value. A testing technique using input values at, just below, and just above, the defined limits of an input domain; and with input values causing outputs to be at, just below, and just above, the defined limits of an output domain. See: boundary value analysis; testing, stress.
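The technique can be sketched as follows; the `valid_age` function and its 18-65 input domain are hypothetical:

```python
# Boundary value testing: probe values at, just below, and just above
# the limits of the input domain. `valid_age` and its 18-65 domain
# are hypothetical.

LOW, HIGH = 18, 65

def valid_age(age):
    return LOW <= age <= HIGH

for age in (LOW - 1, LOW, LOW + 1, HIGH - 1, HIGH, HIGH + 1):
    print(age, valid_age(age))
```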
Testing, branch. (NBS) Testing technique to satisfy coverage criteria which require that for each decision point, each possible branch [outcome] be executed at least once. Contrast with testing, path; testing, statement. See: branch coverage.
Testing, compatibility. The process of determining the ability of two or more systems to exchange information. In a situation where the developed software replaces an already working program, an investigation should be conducted to assess possible compatibility problems between the new software and other programs or systems. See: different software system analysis; testing, integration; testing, interface.
Testing, design based functional. (NBS) The application of test data derived through functional analysis extended to include design functions as well as requirement functions. See: testing, functional.
Testing, development. (IEEE) Testing conducted during the development of a system or component, usually in the development environment by the developer. Contrast with testing, acceptance; testing, operational.
Testing, exhaustive. (NBS) Executing the program with all possible combinations of values for program variables. Feasible only for small, simple programs. 
Testing, formal. (IEEE) Testing conducted in accordance with test plans and procedures that have been reviewed and approved by a customer, user, or designated level of management. Antonym: informal testing.
Testing, functional. (IEEE) (1) Testing that ignores the internal mechanism or structure of a system or component and focuses on the outputs generated in response to selected inputs and execution conditions. (2) Testing conducted to evaluate the compliance of a system or component with specified functional requirements and corresponding predicted results. Syn: black-box testing, input/output driven testing. Contrast with testing, structural.
Testing, interface. (IEEE) Testing conducted to evaluate whether systems or components pass data and control correctly to one another. Contrast with testing, unit; testing, system. See: testing, integration.
Testing, invalid case. A testing technique using erroneous [invalid, abnormal, or unexpected] input values or conditions. See: equivalence class partitioning.
Testing, mutation. (IEEE) A testing methodology in which two or more program mutations are executed using the same test cases to evaluate the ability of the test cases to detect differences in the mutations.
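A toy sketch of the idea: the same test cases are run against the original function and a deliberately mutated copy, and a good test set should "kill" (fail) the mutant:

```python
# Mutation testing in miniature: the same test cases are run against
# the original function and a mutated copy; test cases that detect
# the difference "kill" the mutant.

def original(a, b):
    return a + b

def mutant(a, b):          # '+' mutated to '-'
    return a - b

# Note: the (0, 0) case alone would not kill this mutant.
test_cases = [((2, 3), 5), ((0, 0), 0)]

def passes(func, cases):
    return all(func(*inp) == exp for inp, exp in cases)

print("original passes:", passes(original, test_cases))
print("mutant killed:", not passes(mutant, test_cases))
```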
Testing, operational. (IEEE) Testing conducted to evaluate a system or component in its operational environment. Contrast with testing, development; testing, acceptance. See: testing, system.
Testing, parallel. (ISO) Testing a new or an alternate data processing system with the same source data that is used in another system. The other system is considered as the standard of comparison. Syn: parallel run.
Testing, path. (NBS) Testing to satisfy coverage criteria that each logical path through the program be tested. Often paths through the program are grouped into a finite set of classes; one path from each class is then tested. Syn: path coverage. Contrast with testing, branch; testing, statement; branch coverage; condition coverage; decision coverage.
Testing, qualification. (IEEE) Formal testing, usually conducted by the developer for the consumer, to demonstrate that the software meets its specified requirements. See: testing, acceptance; testing, system.
Testing, regression. (NIST) Rerunning test cases which a program has previously executed correctly in order to detect errors spawned by changes or corrections made during software development and maintenance.
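A minimal sketch, with a hypothetical `format_name` function and a small suite of previously passing cases rerun after a change:

```python
# Regression testing in miniature: rerun previously passing cases
# after a change to detect newly introduced errors. `format_name`
# and the suite below are illustrative.

def format_name(first, last):
    return f"{last}, {first}"

regression_suite = [
    (("Ada", "Lovelace"), "Lovelace, Ada"),
    (("Alan", "Turing"), "Turing, Alan"),
]

failures = [(inp, exp, format_name(*inp))
            for inp, exp in regression_suite
            if format_name(*inp) != exp]
print("regressions:", failures)
```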
Testing, storage. A determination of whether or not certain processing conditions use more storage [memory] than estimated.
Testing, stress. (IEEE) Testing conducted to evaluate a system or component at or beyond the limits of its specified requirements. Syn: testing, boundary value.
Testing, structural. (1) (IEEE) Testing that takes into account the internal mechanism [structure] of a system or component. Types include branch testing, path testing, statement testing. (2) Testing to ensure each program statement is made to execute during testing and that each program statement performs its intended function. Contrast with functional testing. Syn: white-box testing, glass-box testing, logic driven testing.
Testing, system. (IEEE) The process of testing an integrated hardware and software system to verify that the system meets its specified requirements. Such testing may be conducted in both the development environment and the target environment.
Testing, unit. (1) (NIST) Testing of a module for typographic, syntactic, and logical errors, for correct implementation of its design, and for satisfaction of its requirements. (2) (IEEE) Testing conducted to verify the implementation of the design for one software element, e.g., a unit or module, or a collection of software elements. Syn: component testing.
Testing, usability. Testing designed to evaluate the machine/user interface: is the information displayed in an understandable fashion, enabling the operator to interact correctly with the system?
Testing, volume. Testing designed to challenge a system's ability to manage the maximum amount of data over a period of time. This type of testing also evaluates a system's ability to handle overload situations in an orderly fashion.
Testing, worst case. Testing which encompasses upper and lower limits, and circumstances which pose the greatest chance of finding errors. Syn: most appropriate challenge conditions. See: testing, boundary value; testing, invalid case; testing, special case; testing, stress; testing, volume.
Testing, integration. (IEEE) An orderly progression of testing in which software elements, hardware elements, or both are combined and tested, to evaluate their interactions, until the entire system has been integrated.
Testing, performance. (IEEE) Functional testing conducted to evaluate the compliance of a system or component with specified performance requirements.
Testing, special case. A testing technique using input values that seem likely to cause program errors; e.g., "0", "1", NULL, empty string. See: error guessing.
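The special case idea can be sketched as follows; `safe_len` is a hypothetical function probed with the error-prone inputs the entry names:

```python
# Special case testing: probe inputs that commonly expose errors,
# such as 0, 1, None, and the empty string. `safe_len` is a
# hypothetical function under test.

def safe_len(value):
    """Length of a value's string form, treating None as empty."""
    if value is None:
        return 0
    return len(str(value))

for value in ("0", "1", None, ""):
    print(repr(value), "->", safe_len(value))
```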
Testing, statement. (NIST) Testing to satisfy the criterion that each statement in a program be executed at least once during program testing. Syn: statement coverage. Contrast with testing, branch; testing, path; branch coverage; condition coverage; decision coverage; multiple condition coverage; path coverage.
Testing, valid case. A testing technique using valid [normal or expected] input values or conditions. See: equivalence class partitioning.
Testing. (IEEE) (1) The process of operating a system or component under specified conditions, observing or recording the results, and making an evaluation of some aspect of the system or component. (2) The process of analyzing a software item to detect the differences between existing and required conditions, i.e., bugs, and to evaluate the features of the software items. See: dynamic analysis, static analysis, software engineering.
TGR Things Gone Right. 
TGW Things Gone Wrong. 
THEORY A plausible or scientifically acceptable general principle offered to explain phenomena. 
Time sharing. (IEEE) A mode of operation that permits two or more users to execute computer programs concurrently on the same computer system by interleaving the execution of their programs. May be implemented by time slicing, priority-based interrupts, or other scheduling methods.
Time to market: The time that begins when resources are assigned to assess a product's feasibility and ends when the first production unit is delivered.
Timekeeper Team member who keeps track of time spent on each agenda item during team meetings. This job can easily be rotated among team members.
Timing analyzer. (IEEE) A software tool that estimates or measures the execution time of a computer program or portion of a computer program, either by summing the execution times of the instructions along specified paths or by inserting probes at specified points in the program and measuring the execution time between probes.
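A rudimentary sketch of the probe-based approach, using only the standard library (a real timing analyzer would instrument many points and aggregate results; this measures a single span):

```python
# Probe-based timing: record the clock before and after a call and
# report the elapsed execution time between the two probes.
import time

def timed(func, *args):
    """Call func(*args); return (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = func(*args)
    elapsed = time.perf_counter() - start
    return result, elapsed

result, elapsed = timed(sum, range(100_000))
print(f"result = {result}, elapsed = {elapsed:.6f} s")
```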
Timing and sizing analysis. (IEEE) Analysis of the safety implications of safety-critical requirements that relate to execution time, clock time, and memory allocation.
Timing Plan A plan that lists tasks, assignments, events, and timing required to provide a product that meets customer needs and expectations. 
Timing. (IEEE) The process of estimating or measuring the amount of execution time required for a software system or component. Contrast with sizing.
TO BE Model Models that are the result of applying improvement opportunities to the current (AS IS) business environment. 
Top-down design. Pertaining to design methodology that starts with the highest level of abstraction and proceeds through progressively lower levels. See: structured design.
TOPS Team Oriented Problem Solving 
Total Quality Management (TQM) Management and control activities based on the leadership of top management and on the involvement of all employees and all departments, from planning and development to sales and service. These activities focus on quality assurance: the qualities which satisfy the customer are built into products and services during these processes and then offered to consumers.
Total Quality Management Managing for quality in all aspects of an organization focusing on employee participation and customer satisfaction. Often used as a catch-all phrase for implementing various quality control and improvement tools.
Total Quality Management/Total Quality Leadership (TQM/TQL) Both a philosophy and a set of guiding principles that represent the foundation of the continuously improving organization. TQM/TQL is the application of quantitative methods and human resources to improve the material and services supplied to an organization, all the processes within an organization, and the degree to which the needs of the customer are met, now and in the future. TQM/TQL integrates fundamental management techniques, existing improvement efforts, and technical tools under a disciplined approach focused on continuous improvement.
TQM Total Quality Management: A management approach of an organization centered on quality. 
Trace. (IEEE) (1) A record of the execution of a computer program, showing the sequence of instructions executed, the names and values of variables, or both. Types include execution trace, retrospective trace, subroutine trace, symbolic trace, variable trace. (2) To produce a record as in (1). (3) To establish a relationship between two or more products of the development process; e.g., to establish the relationship between a given requirement and the design element that implements that requirement.
Traceability analysis. (IEEE) The tracing of (1) Software Requirements Specifications requirements to system requirements in concept documentation, (2) software design descriptions to software requirements specifications and software requirements specifications to software design descriptions, (3) source code to corresponding design specifications and design specifications to source code. Analyze identified relationships for correctness, consistency, completeness, and accuracy. See: traceability, traceability matrix.
Traceability matrix. (IEEE) A matrix that records the relationship between two or more products; e.g., a matrix that records the relationship between the requirements and the design of a given software component. See: traceability, traceability analysis.
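In miniature, a traceability matrix can be represented as a simple mapping and checked for gaps; the requirement and design-element names here are invented for illustration:

```python
# Traceability matrix in miniature: requirements mapped to the design
# elements that implement them, then checked for untraced requirements.
# All names are invented for illustration.

matrix = {
    "REQ-1": ["login_form"],
    "REQ-2": ["audit_log", "event_store"],
    "REQ-3": [],               # no design element traces to this yet
}

untraced = [req for req, elements in matrix.items() if not elements]
print("untraced requirements:", untraced)
```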
Traceability The ability to trace a product back through the process, and identify all sub-processes, components, and equipment that were involved in its manufacture. 
Traceability. (IEEE) (1) The degree to which a relationship can be established between two or more products of the development process, especially products having a predecessor-successor or master-subordinate relationship to one another; e.g., the degree to which the requirements and design of a given software component match. See: consistency. (2) The degree to which each element in a software development product establishes its reason for existing; e.g., the degree to which each element in a bubble chart references the requirement that it satisfies. See: traceability analysis, traceability matrix.
Transaction analysis. A structured software design technique, deriving the structure of a system from analyzing the transactions that the system is required to process.
Transaction flow-graph. (Beizer) A model of the structure of the system's [program's] behavior, i.e., functionality.
Transaction matrix. (IEEE) A matrix that identifies possible requests for database access and relates each request to information categories or elements in the database.
Transaction. (ANSI) (1) A command, message, or input record that explicitly or implicitly calls for a processing action, such as updating a file. (2) An exchange between an end user and an interactive system. (3) In a database management system, a unit of processing activity that accomplishes a specific purpose such as a retrieval, an update, a modification, or a deletion of one or more data elements of a storage structure.
Transform analysis. A structured software design technique in which system structure is derived from analyzing the flow of data through the system and the transformations that must be performed on the data.
Transition Period Time when an organization is moving away from an old way of thinking to the new way.
Tree diagram A chart used to break any task, goal, or category into increasingly detailed levels of information. Family trees are the classic example of a tree diagram.  
TRIZ Theory of Inventive Problem Solving 
Trojan horse. A method of attacking a computer system, typically by providing a useful program which contains code intended to compromise a computer system by secretly providing for unauthorized access, the unauthorized collection of privileged system or user data, the unauthorized reading or altering of files, the performance of unintended and unexpected functions, or the malicious destruction of software and hardware. See: bomb, virus, worm.
Truth table. (1) (ISO) An operation table for a logic operation. (2) A table that describes a logic function by listing all possible combinations of input values and indicating, for each combination, the output value.
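For example, a truth table for logical implication can be generated by enumerating every input combination:

```python
# Truth table: list every combination of input values for a logic
# function (here, implication: p -> q) with the resulting output.
from itertools import product

def implies(p, q):
    return (not p) or q

print("p     q     p->q")
for p, q in product([False, True], repeat=2):
    print(f"{p!s:5} {q!s:5} {implies(p, q)}")
```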
Tuning. (NIST) Determining what parts of a program are being executed the most. A tool that instruments a program to obtain execution frequencies of statements is a tool with this feature.
TWO SIDED ALTERNATIVE The values of a parameter which designate an upper and lower bound.  
Type I error Rejecting something that is acceptable. Also known as an alpha error.
Type II error Accepting something that should have been rejected. Also known as beta error.



Site created by VisionMasters. Hosted by Immix.net
Copyright 1999 Adams Six Sigma. All rights reserved.
Revised: June 18, 2002.