decision tree


Decision tree

Schematic way of representing alternative sequential decisions and the possible outcomes of these decisions.

Decision Tree

In risk analysis, a diagram of decisions and their potential consequences, used to help determine the most straightforward (and cheapest) way to arrive at a stated goal. The diagram consists of potential decisions (drawn as squares) branching off into their proximate consequences (drawn as circles) and potential end results (drawn as triangles).
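To make this notation concrete, here is a minimal sketch of the three node types, assuming Python as the language; the class names and fields are hypothetical and are not part of the entry:

from dataclasses import dataclass
from typing import List, Tuple, Union

# Forward references so each node type can point at any other node type.
Node = Union["DecisionNode", "ChanceNode", "EndNode"]

@dataclass
class EndNode:
    """Triangle: a potential end result with its financial consequence."""
    payoff: float

@dataclass
class ChanceNode:
    """Circle: a proximate consequence, branching on uncertain events."""
    branches: List[Tuple[float, Node]]  # (probability, next node)

@dataclass
class DecisionNode:
    """Square: a potential decision, branching on the options available."""
    options: List[Tuple[str, Node]]  # (option label, next node)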
Fig. 32 Decision tree. The businessman has two options: to open a new factory to boost production capacity, or not to open a new factory; and he has to consider two states of nature or events which can occur: economic boom or recession. The businessman must assess the likelihood of each of these events occurring and, in this case, based on his knowledge and experience, he estimates that there is a one-in-two chance of a boom and a 0.5 probability of a recession. Finally, the businessman estimates the financial consequences as an £80,000 profit for the new factory if there is a boom, and a £30,000 loss if there is a recession.

decision tree

an aid to decision-making in uncertain conditions that sets out alternative courses of action and the financial consequences of each alternative, and assigns subjective probabilities to the likelihood of future events occurring. For example, a firm thinking of opening a new factory, the success of which will depend upon consumer spending (and thus the state of the economy), would have a decision tree like Fig. 32.

In order to make a decision, the manager needs a decision criterion to enable him to choose which he regards as the best of the alternatives and, since these choices involve an element of risk, we therefore need to know something about his attitudes to risk. If the manager were neutral in his attitude to risk, then we could calculate the certainty equivalent of the ‘open factory’ alternative using the expected money value criterion, which takes the financial consequence of each outcome and weights it by the probability of its occurrence, thus:

expected money value = (0.5 × £80,000) + (0.5 × −£30,000) = £25,000

which, being greater than the £0 for certain from not opening the factory, would justify going ahead with the factory project.
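As a purely illustrative aid (Python is assumed, and the variable and function names below are hypothetical rather than part of the entry), the same calculation can be written as a roll-back of the Fig. 32 tree:

# Expected money value (EMV) of the 'open factory' branch of Fig. 32.
# Probabilities and payoffs are taken from the text above.
open_factory = [
    (0.5, 80_000),   # boom: £80,000 profit with probability 0.5
    (0.5, -30_000),  # recession: £30,000 loss with probability 0.5
]

def expected_money_value(outcomes):
    # Weight each financial consequence by the probability of its occurrence.
    return sum(p * payoff for p, payoff in outcomes)

emv_open = expected_money_value(open_factory)  # 25_000.0
emv_do_nothing = 0.0                           # £0 for certain

# A risk-neutral manager chooses the branch with the higher expected money value.
decision = "open factory" if emv_open > emv_do_nothing else "do not open factory"
print(decision, emv_open)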

However, if the manager were averse to risk, then he might not regard the expected money value criterion as appropriate, for he might require a risk premium to induce him to take the risk. Application of a more cautious certainty equivalent criterion would reduce the certainty equivalent of the ‘open factory’ branch and might even tip the decision against going ahead on the grounds of the ‘downside risk’ of losing £30,000. See UNCERTAINTY AND RISK.
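One way to make the certainty equivalent criterion concrete, offered only as a sketch (the entry does not specify a utility function; the exponential utility and the risk-tolerance values below are assumptions for illustration), is to evaluate the risky branch under a concave utility and note how its certainty equivalent falls below the £25,000 expected money value as risk aversion increases:

import math

# Certainty equivalent of the 'open factory' branch under an assumed
# exponential utility u(x) = 1 - exp(-x / r), where r is the manager's
# risk tolerance. The utility function is illustrative, not from the entry.
outcomes = [(0.5, 80_000.0), (0.5, -30_000.0)]

def certainty_equivalent(outcomes, risk_tolerance):
    expected_utility = sum(p * (1 - math.exp(-x / risk_tolerance)) for p, x in outcomes)
    # Invert the utility to express the risky prospect as a sure sum of money.
    return -risk_tolerance * math.log(1 - expected_utility)

for r in (1_000_000, 100_000, 30_000):  # smaller r = more risk averse
    print(r, round(certainty_equivalent(outcomes, r)))
# Roughly £23,500, £10,600 and -£10,000: a sufficiently risk-averse manager
# would reject the factory despite its positive expected money value.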

decision tree

a graphical representation of the decision-making process in relation to a particular economic decision. The decision tree illustrates the possibilities open to the decision-maker in choosing between alternative strategies. It is possible to specify the financial consequence of each ‘branch’ of the decision tree and to gauge the PROBABILITY of particular events occurring that might affect the consequences of the decisions made. See RISK AND UNCERTAINTY.