# decision tree


## Decision tree

Schematic way of representing alternative sequential decisions and the possible outcomes from these decisions.

## Decision Tree

In risk analysis, a diagram of decisions and their potential consequences, used to help determine the most straightforward (and cheapest) way to arrive at a stated goal. It is represented by potential decisions (drawn as squares) branching off into their proximate chance consequences (drawn as circles) and potential end results (drawn as triangles).
Fig. 32 Decision tree. The businessman has two options: to open a new factory to boost production capacity, or not to open a new factory; and he has to consider two states of nature, or events, which can occur: economic boom or recession. The businessman must assess the likelihood of each of these events occurring and, in this case, based on his knowledge and experience, he estimates that there is a one-in-two (0.5) probability of a boom and an equal 0.5 probability of a recession. Finally, the businessman estimates the financial consequences as an £80,000 profit for the new factory if there is a boom, and a £30,000 loss if there is a recession.
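The rollback logic implied by such a diagram — probability-weighted averages at chance nodes (circles) and a best-branch choice at decision nodes (squares) — can be sketched in a few lines. This is an illustrative sketch, not from the source; the class and function names are my own:

```python
from dataclasses import dataclass

@dataclass
class End:            # triangle: a terminal payoff
    payoff: float

@dataclass
class Chance:         # circle: outcomes weighted by probability
    branches: list    # list of (probability, node) pairs

@dataclass
class Decision:       # square: the decision-maker picks the best branch
    options: dict     # label -> node

def value(node):
    """Roll the tree back: expected value at circles, best option at squares."""
    if isinstance(node, End):
        return node.payoff
    if isinstance(node, Chance):
        return sum(p * value(child) for p, child in node.branches)
    if isinstance(node, Decision):
        return max(value(child) for child in node.options.values())
    raise TypeError(f"unknown node type: {node!r}")

# The Fig. 32 situation: open the factory (facing boom or recession) or do nothing.
tree = Decision(options={
    "open factory": Chance(branches=[(0.5, End(80_000)), (0.5, End(-30_000))]),
    "do not open": End(0.0),
})
print(value(tree))   # 25000.0
```

Rolling back the Fig. 32 tree this way reproduces the comparison between the £25,000 probability-weighted value of opening the factory and the £0 of not opening it.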

## decision tree

an aid to decision-making in uncertain conditions that sets out alternative courses of action and the financial consequences of each alternative, and assigns subjective probabilities to the likelihood of future events occurring. For example, a firm thinking of opening a new factory, the success of which will depend upon consumer spending (and thus the state of the economy), would have a decision tree like Fig. 32.

In order to make a decision, the manager needs a decision criterion to enable him to choose the best of the alternatives and, since these choices involve an element of risk, we need to know something about his attitude to risk. If the manager were neutral in his attitude to risk, then we could calculate the certainty equivalent of the ‘open factory’ alternative using the expected money value criterion, which takes the financial consequence of each outcome and weights it by the probability of its occurrence, thus:

expected money value = (0.5 × £80,000) + (0.5 × −£30,000) = £25,000

which, being greater than the £0 for certain of not opening the factory, would justify going ahead with the factory project.

However, if the manager were averse to risk, then he might not regard the expected money value criterion as appropriate, for he might require a risk premium to induce him to take the risk. Application of a more cautious certainty equivalent criterion would reduce the certainty equivalent of the ‘open factory’ branch and might even tip the decision against going ahead, on the grounds of the ‘downside risk’ of losing £30,000. See UNCERTAINTY AND RISK.
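One concrete way to model a more cautious certainty equivalent criterion — this particular utility function is my assumption, not the dictionary's — is exponential (constant absolute risk aversion) utility, under which the certainty equivalent of the ‘open factory’ gamble shrinks, and eventually turns negative, as risk aversion grows:

```python
import math

def certainty_equivalent(a, outcomes):
    """Certainty equivalent under exponential utility u(x) = -exp(-a*x),
    where a > 0 is the risk-aversion coefficient.
    outcomes: list of (probability, payoff) pairs."""
    expected_utility = sum(p * math.exp(-a * x) for p, x in outcomes)
    return -math.log(expected_utility) / a

factory = [(0.5, 80_000), (0.5, -30_000)]   # Fig. 32 payoffs

# Mild risk aversion: the certainty equivalent stays positive,
# so opening the factory is still preferred to the certain £0.
print(certainty_equivalent(1e-5, factory))   # ≈ 10,580

# Stronger risk aversion: the certainty equivalent turns negative --
# the £30,000 downside now tips the decision against going ahead.
print(certainty_equivalent(1e-4, factory))   # ≈ -23,070
```

As a approaches zero the certainty equivalent approaches the £25,000 expected money value, recovering the risk-neutral case; sufficiently strong aversion to the £30,000 downside reverses the decision.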

## decision tree

a graphical representation of the decision-making process in relation to a particular economic decision. The decision tree illustrates the possibilities open to the decision-maker in choosing between alternative strategies. It is possible to specify the financial consequence of each ‘branch’ of the decision tree and to gauge the PROBABILITY of particular events occurring that might affect the consequences of the decisions made. See RISK AND UNCERTAINTY.
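As a small worked illustration of how the probabilities assigned to particular events drive the consequences of the decision (reusing the Fig. 32 payoffs; the break-even analysis itself is my addition, not part of the entry):

```python
profit_if_boom = 80_000      # £ payoff on the 'open factory' branch in a boom
loss_if_recession = -30_000  # £ payoff on the same branch in a recession

def emv(p_boom):
    """Expected money value of opening the factory for a given boom probability."""
    return p_boom * profit_if_boom + (1 - p_boom) * loss_if_recession

# With the one-in-two probabilities, opening the factory is worth £25,000.
print(emv(0.5))   # 25000.0

# Break-even boom probability, below which the £0 'do nothing' branch wins:
p_break_even = -loss_if_recession / (profit_if_boom - loss_if_recession)
print(round(p_break_even, 3))   # 0.273
```

So the ‘open factory’ branch only dominates as long as the decision-maker's subjective probability of a boom exceeds roughly 3/11, which shows why gauging those probabilities carefully matters.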
