

DEBUGGING. The check out of equipment, complex items, or computer programs prior to use in order to detect design and/or operational difficulties.

DECISION FUNCTION. A decision function is a rule of conduct which, at any stage of a sampling investigation, tells the statistician whether to take further observations or whether enough information has been collected, and in the latter case, what decision to make upon it. At each stage beyond the first the decision function is a function of the preceding observations. [22]
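A decision function of this kind can be sketched as a small rule mapping the observations collected so far to an action; the sampling plan, thresholds, and observation stream below are hypothetical, chosen only to illustrate the idea.

```python
# A toy sequential sampling plan (all numbers hypothetical):
# after each observation the rule returns "accept", "reject", or "continue".
def decision(defects_so_far, inspected):
    """Decision function: maps the preceding observations to an action."""
    if defects_so_far >= 3:      # enough evidence of poor quality
        return "reject"
    if inspected >= 10:          # enough information collected
        return "accept"
    return "continue"            # take a further observation

# Apply the rule to a fixed stream of observations (1 = defective item).
observations = [0, 0, 1, 0, 0, 1, 1]
inspected = defects = 0
action = "continue"
for obs in observations:
    inspected += 1
    defects += obs
    action = decision(defects, inspected)
    if action != "continue":
        break
# Here sampling stops at the 7th observation with the decision "reject".
```

Note that, as the definition requires, at each stage beyond the first the rule depends only on the preceding observations.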

DECISION UNDER CERTAINTY. If a choice must be made between two or more actions, decision under certainty involves that realm where each action is known to lead invariably to a specific outcome.

DECISION UNDER RISK. Decision under risk is concerned with that realm of choice where each action leads to one of a set of possible specific outcomes, each outcome occurring with a known probability.

DECISION UNDER UNCERTAINTY. Decision under uncertainty involves that realm of choice where each action or combination has as its consequence a set of possible specific outcomes, but where the probabilities of these outcomes are completely unknown or are not even meaningful.
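Under risk, a standard rule is to choose the action with the best expected outcome, since the outcome probabilities are known; the actions, probabilities, and payoffs below are invented for illustration.

```python
# Hypothetical actions, each with a known probability distribution of payoffs.
outcomes = {
    "launch": [(0.6, 120.0), (0.4, -50.0)],  # (probability, payoff)
    "delay":  [(1.0, 10.0)],                 # a certain outcome
}

def expected_value(dist):
    """Probability-weighted average payoff of one action."""
    return sum(p * v for p, v in dist)

# Decision under risk: pick the action maximizing expected payoff.
best = max(outcomes, key=lambda a: expected_value(outcomes[a]))
```

Under certainty every action would have a single known outcome (like "delay" here); under uncertainty the probabilities would be unknown and this rule would not apply.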

DECOMPOSITION PRINCIPLE. A method for subdividing a large linear programming problem into smaller, more manageable linear programs, solving the smaller problems in the standard way, and then combining these solutions to solve the original problem. In particular, linear programs which qualify for decomposition are those for which the variables can be separated into classes X, Y, Z, ... and the constraints into classes C0, C1, C2, ... such that: the constraints in class C0 can involve all the variables; the constraints in class C1 involve only the variables in X; the constraints in class C2 involve only the variables in Y; the constraints in class C3 involve only the variables in Z; and so on. The subsets of constraints, each involving only a part of the variables, are called "subprograms," while the subset of constraints in C0 is called the "master program." [12]
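The block-angular structure described above can be checked mechanically from the nonzero pattern of the constraint matrix; the tiny matrix and variable classes below are assumed for illustration.

```python
# Nonzero pattern of a hypothetical constraint matrix.
# Variable classes: X = columns 0-1, Y = columns 2-3.
rows = [
    [1, 1, 1, 1],  # may touch every variable -> master program (C0)
    [1, 1, 0, 0],  # touches only X           -> subprogram C1
    [1, 1, 0, 0],
    [0, 0, 1, 1],  # touches only Y           -> subprogram C2
]
X, Y = [0, 1], [2, 3]

def touches(row, cols):
    """True if the constraint row has a nonzero coefficient in any column."""
    return any(row[j] != 0 for j in cols)

master = [i for i, r in enumerate(rows) if touches(r, X) and touches(r, Y)]
sub_x  = [i for i, r in enumerate(rows) if touches(r, X) and not touches(r, Y)]
sub_y  = [i for i, r in enumerate(rows) if touches(r, Y) and not touches(r, X)]
```

Rows in `master` form C0; the remaining rows split cleanly into subprograms, which is what makes the decomposition method applicable.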

DEFECT. A departure of a quality characteristic from its intended level or state that occurs with a severity sufficient to cause an associated product or service not to satisfy intended normal, or reasonably foreseeable, usage requirements.

DEGENERATE SOLUTION. For linear programs, a basic solution in which at least one basic variable is zero. [9]
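A small numeric instance (assumed for illustration): in the standard-form constraints x1 + x2 + s1 = 4 and x1 + s2 = 4, choosing the basis {x1, s1} yields a basic solution in which the basic variable s1 is zero, i.e., a degenerate solution.

```python
# Basis matrix B for the basic variables (x1, s1) and right-hand side b,
# taken from the hypothetical constraints above.
B = [[1.0, 1.0],
     [1.0, 0.0]]
b = [4.0, 4.0]

# Solve B [x1, s1]^T = b by Cramer's rule (2x2 case).
det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
x1 = (b[0] * B[1][1] - B[0][1] * b[1]) / det
s1 = (B[0][0] * b[1] - b[0] * B[1][0]) / det

degenerate = (x1 == 0.0) or (s1 == 0.0)  # a basic variable at zero
```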

DEGRADATION FAILURE. (See FAILURE, DEGRADATION.) [20]

DELAY. (See WAITING TIME.) [33]

DEPENDABILITY. A measure of the item operating condition at one or more points during the mission, including the effects of reliability, maintainability, and survivability, given the item condition(s) at the start of the mission. It may be stated as the probability that an item will (a) enter or occupy any one of its required operational modes during a specified mission, and (b) perform the functions associated with those operational modes. [28]

DEPENDENT VARIABLE. (See INDEPENDENT VARIABLE.)

DERATING. (1) Using an item in such a way that applied stresses are below rated values, or (2) The lowering of the rating of an item in one stress field to allow an increase in rating in another stress field. [28]

DESIGNATED IMPERFECTIONS. A category of imperfections which, because of their type and/or severity, are to be treated as an event for control purposes.

DETERMINISTIC MODEL. As opposed to a stochastic model, a model which contains no random elements and for which, therefore, the future course of the system is determined by its state at present (and/or in the past).

DIET PROBLEM. A linear programming problem in which the constraints represent minimum daily requirements for specified nutrients and the variables represent foods. The ith coefficient of the jth activity vector represents the amount of the ith nutrient in one unit of the jth food. The right-hand side coefficients indicate the daily requirement of each nutrient, while the objective function coefficients represent the cost per unit of each food. [15]
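A minimal instance with invented foods, nutrients, and numbers: since the optimum of a linear program (when one exists) occurs at a vertex of the feasible region, a two-variable diet problem can be solved by enumerating intersections of constraint boundaries.

```python
from itertools import combinations

# Hypothetical diet data: two foods, two nutrients.
A = [[2.0, 1.0],   # units of nutrient i per unit of food j
     [1.0, 3.0]]
b = [8.0, 9.0]     # minimum daily requirement of each nutrient
c = [3.0, 2.0]     # cost per unit of each food

# Boundary lines a1*x1 + a2*x2 = rhs: the nutrient constraints plus x1=0, x2=0.
lines = [(A[0][0], A[0][1], b[0]), (A[1][0], A[1][1], b[1]),
         (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]

def feasible(x):
    """Nonnegative and meets every minimum nutrient requirement."""
    return all(xi >= -1e-9 for xi in x) and all(
        sum(aij * xj for aij, xj in zip(row, x)) >= bi - 1e-9
        for row, bi in zip(A, b))

best = None
for (a1, a2, r1), (a3, a4, r2) in combinations(lines, 2):
    det = a1 * a4 - a2 * a3
    if abs(det) < 1e-12:
        continue                  # parallel boundaries: no vertex
    x = [(r1 * a4 - a2 * r2) / det, (a1 * r2 - r1 * a3) / det]
    if feasible(x):
        cost = c[0] * x[0] + c[1] * x[1]
        if best is None or cost < best[0]:
            best = (cost, x)
# Cheapest feasible diet: 3 units of food 1 and 2 units of food 2, cost 13.
```

Vertex enumeration is only practical for toy instances; real diet problems are solved with the simplex method or a comparable LP algorithm.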

DIOPHANTINE PROGRAMMING. (See INTEGER LINEAR PROGRAMMING.) [19]

DISCRETE VARIABLE PROBLEM. (See INTEGER LINEAR PROGRAMMING.) [19]

DISTRIBUTION-FREE METHOD. A method, e.g., of testing a hypothesis or of setting up a confidence interval, which does not depend on the form of the underlying distribution.

DOWNTIME. (See TIME, DOWN; Z94.17 WORK MEASUREMENT) [28]

DUAL LINEAR PROGRAMMING PROBLEMS. A pair of linear programs, called "primal" and "dual," of the following form:

Unsymmetric Case:

Primal: Min cx

Subject to Ax = b

x ≥ 0

Dual: Max wb

Subject to wA ≤ c

Symmetric Case:

Primal: Min cx

Subject to Ax ≥ b

x ≥ 0

Dual: Max wb

Subject to wA ≤ c

w ≥ 0

DUAL PROBLEM. (See DUAL LINEAR PROGRAMMING PROBLEMS.) [19]

DUAL SIMPLEX ALGORITHM. Starting with a dual feasible solution, the algorithm selects the most infeasible vector to leave the primal basis and then computes which vector must enter to maintain dual feasibility. As a consequence of this change of basis, other variables may become primal infeasible, but since the objective value changes monotonically toward optimum, the algorithm is finitely convergent. It is exactly the primal algorithm applied to the dual problem. [19]

DUALITY THEOREMS FOR LINEAR PROGRAMMING. (1) Main theorem: If both the primal and dual problems have a finite optimum, then the optimum values of the objective functions are equal. (2) Corollary: If either problem has a feasible finite optimum, then so does the other, and the optimum values are equal. (3) Corollary: A feasible but unbounded solution to one problem implies no feasible solution for the other. (4) Corollary: No feasible solution to one problem implies that the other is either unbounded or infeasible. (5) Weak theorem of the alternative: A variable and its complementary slack are not both nonzero. (6) Strong theorem of the alternative: Among all alternate optima, at least one solution exists in which a variable and its complementary slack are not both zero, and the one of the pair that is zero in this solution is zero in all alternate solutions. [19]
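The main theorem and complementary slackness can be checked numerically on a small symmetric primal/dual pair; the data and the two optimal solutions below are assumed for illustration, and exact rational arithmetic avoids rounding noise.

```python
from fractions import Fraction as F

# Hypothetical symmetric pair: primal min cx s.t. Ax >= b, x >= 0;
# dual max wb s.t. wA <= c, w >= 0.
A = [[F(2), F(1)], [F(1), F(3)]]
b = [F(8), F(9)]
c = [F(3), F(2)]

x = [F(3), F(2)]          # assumed primal optimum
w = [F(7, 5), F(1, 5)]    # assumed dual optimum

Ax = [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
wA = [sum(w[i] * A[i][j] for i in range(2)) for j in range(2)]
cx = sum(ci * xi for ci, xi in zip(c, x))
wb = sum(wi * bi for wi, bi in zip(w, b))

primal_feasible = all(Ax[i] >= b[i] for i in range(2)) and all(xi >= 0 for xi in x)
dual_feasible = all(wA[j] <= c[j] for j in range(2)) and all(wi >= 0 for wi in w)

# Complementary slackness: a variable and its complementary slack not both nonzero.
slackness = (all(x[j] * (c[j] - wA[j]) == 0 for j in range(2))
             and all(w[i] * (Ax[i] - b[i]) == 0 for i in range(2)))
```

For this pair both objective values equal 13, matching the main theorem, and every variable-slack product is zero.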

DYNAMIC PROGRAMMING. A method for optimizing a set of decisions which may be made sequentially. Characteristically, each decision may be made in the light of the information embodied in a small number of observables called state variables. The incurred cost for each period is a mathematical function of the current state and decision variables, while future states are functions of these variables. The aim of the decision policy is to minimize the total incurred cost, or, equivalently, the average cost per period. [19]
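The backward recursion typical of dynamic programming can be sketched on a tiny staged problem (all costs hypothetical): the minimal total cost from stage t in state s is the best decision's immediate cost plus the minimal cost-to-go from the resulting state.

```python
# cost[t][s][d]: cost at stage t of moving from state s to state d (hypothetical).
cost = [
    [[1.0, 2.0], [3.0, 3.0]],  # stage 0
    [[2.0, 5.0], [4.0, 1.0]],  # stage 1
    [[1.0, 4.0], [3.0, 1.0]],  # stage 2
]
T, states = len(cost), [0, 1]

# Backward recursion: f[t][s] = min over d of cost[t][s][d] + f[t+1][d],
# with terminal values f[T][s] = 0.
f = [[0.0] * len(states) for _ in range(T + 1)]
for t in reversed(range(T)):
    for s in states:
        f[t][s] = min(cost[t][s][d] + f[t + 1][d] for d in states)

# f[0][0] is the minimal total cost starting from state 0.
```

Each stage's decision uses only the current state, so the cost of solving grows linearly in the number of stages rather than exponentially in the number of decision sequences.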