Appendix 6: Decision Trees.
Decision Tree Methodology: Decision trees can be used to help make decisions. The idea is to concretely identify the choice points and map the sequence of decisions from beginning to end. The advantage is that the reasoning behind a decision is made explicit, and others faced with the same questions can reuse the decision tree.
A decision tree begins with a decision that must be made: for example, whether or not to arrest a suspect. A square (representing this decision) is drawn on the left-hand side of the paper. From this box, lines are drawn out towards the right for each possible solution, and the solutions are written along those lines. At the end of each line, the results are considered. If the result is outside the decision-maker's control--that is, if nature makes the next move--then a circle is drawn at the end of the line. If the result is another decision, a square is drawn. If there is a final consequence, a solid dot is drawn.
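The drawing conventions above can be sketched as a small data structure. The labels and values are taken from the worked example later in this appendix; the nested-dictionary representation itself, and the exact branch names "interrogate" and "release", are illustrative assumptions about the figure, not part of the text:

```python
# Node types, following the drawing conventions in the text:
#   square = decision node (the decision maker chooses a branch)
#   circle = event node (nature moves; branches carry probabilities)
#   dot    = terminal node (a final consequence with a utility value)

arrest_tree = {
    "type": "decision",          # square: what to do with the suspect
    "branches": {
        "interrogate": {
            "type": "event",     # circle: the outcome is outside our control
            "outcomes": [
                # (estimated probability, resulting node)
                (0.1, {"type": "terminal", "label": "describes plot to attack", "value": 10}),
                (0.8, {"type": "terminal", "label": "no evidence of plot", "value": 2}),
            ],
        },
        "release": {
            "type": "terminal",  # dot: a final consequence
            "label": "released on bail",
            "value": 1,
        },
    },
}
```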
The procedure for choosing a decision strategy from a decision tree is called backward induction analysis. (In game theory, backward induction is an algorithm used to compute subgame perfect equilibria in sequential games.) This analysis can be summarized as follows: First, each terminal node (marked by a black dot) is assigned a number that represents the worth or utility of the final consequence to the decision maker. For example, at the top of the figure above, the consequence of "describes plot to attack" is given a high value of 10; at the bottom right-hand corner, the consequence of "released on bail" is given a low value of 1. Second, each event node (the circles) is assigned a number that represents the expected utility of the node. This is the weighted average utility of the event node. For node (1), for example, this is the consequence of "describes plot to attack" multiplied by the likelihood of that outcome (p), estimated at 1/10 or 0.1, giving 10*0.1=1.0, plus the consequence of "no evidence of plot" multiplied by the likelihood of that outcome (p), estimated at 8/10 or 0.8, giving 2*0.8=1.6; that is, [10(.10)+2(.8)=2.6]. Finally, each decision node is assigned a number that is the maximum value of the nodes that branch out from it. Thus, since 2.6>1, "interrogate" is the best choice at the decision point.
Working from right to left, these calculations are:
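Using the utilities and probabilities quoted above, the right-to-left calculation can be sketched as follows. The function and variable names are illustrative; the numbers come from the worked example in the text:

```python
# Backward induction over a decision tree: terminal nodes return their
# utility; event nodes take the probability-weighted average of their
# outcomes; decision nodes take the maximum over their branches.

def evaluate(node):
    kind = node["type"]
    if kind == "terminal":
        return node["value"]
    if kind == "event":
        # Expected utility: sum of probability * value over outcomes.
        return sum(p * evaluate(child) for p, child in node["outcomes"])
    if kind == "decision":
        # The decision maker picks the best available branch.
        return max(evaluate(child) for child in node["branches"].values())
    raise ValueError(f"unknown node type: {kind}")

# Values from the worked example:
interrogate = {
    "type": "event",
    "outcomes": [
        (0.1, {"type": "terminal", "value": 10}),  # describes plot to attack
        (0.8, {"type": "terminal", "value": 2}),   # no evidence of plot
    ],
}
release = {"type": "terminal", "value": 1}         # released on bail

tree = {"type": "decision",
        "branches": {"interrogate": interrogate, "release": release}}

print(evaluate(interrogate))  # 10(.1) + 2(.8) = 2.6
print(evaluate(tree))         # max(2.6, 1): "interrogate" wins
```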
The value of decision trees is that (1) the choices are made very explicit; (2) each choice is explicitly evaluated in terms of the importance of its outcome and the probability of that outcome; and (3) how a decision will be made--or was made--can be communicated to another person. As with any technique or tool suggested here, decision trees should be used to guide decisions, not to make them. The final decision is left up to the operator.