The use of multiplication for risk estimation, e.g. R = S x P x D, should be debunked once and for all.
If real numbers are used (e.g. money to represent severity, actual probability of harm), then it is correct to multiply probability and severity to estimate risk.
However, most manufacturers use a roughly logarithmic scheme (e.g. 1-4 for severity, 1-6 for probability). These schemes are not numerically grounded: a severity of "4", for example, is not four times more severe than a severity of "1". The numbers are just labels for the levels. Thus performing any mathematical operation on the "numbers" is a joke. Really.
Even if the numbers are carefully chosen to be strictly logarithmic (e.g. S = 1 for $1, 2 for $100, 3 for $10,000, 4 for $1,000,000; P = 1 for 10^-6, 2 for 10^-5, etc.), the mathematically correct operation would be to add the numbers, not multiply them, since logarithms turn multiplication into addition. That makes the above multiplication an even bigger joke. A lawyer's picnic if it ever got into court.
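A minimal sketch of the point, assuming an illustrative scheme (not any particular manufacturer's) where both scales use base 10: S is log10 of the severity in dollars and P is 7 + log10 of the probability. Adding the labels then tracks the real expected loss exactly, while multiplying them can rank a smaller risk above a larger one:

```python
def real_risk(S, P):
    # Illustrative base-10 scheme (an assumption, not from the original post):
    # S = log10(severity in dollars), P = 7 + log10(probability of harm).
    severity_dollars = 10 ** S
    probability = 10 ** (P - 7)
    # Real risk is expected loss: severity times probability.
    return severity_dollars * probability

a = real_risk(1, 6)  # $10 loss at p = 0.1   -> expected loss $1.00
b = real_risk(3, 3)  # $1000 loss at p = 1e-4 -> expected loss $0.10

# Multiplying labels ranks b above a (3*3 = 9 > 1*6 = 6),
# even though a's real expected loss is 10x larger.
# Adding labels gets it right (1+6 = 7 > 3+3 = 6), because
# S + P = log10(real risk) + 7, a monotone function of the real risk.
```

The product of two logarithms has no meaning; only their sum does.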
Detection makes the situation even worse. Again, if real numbers are used it's fine: detection is just another factor in the probability. If we want to use the formula R = S x P x D, the parameter D should represent the probability that detection fails. With real numbers it quickly becomes apparent that detection effectiveness needs to be very high (90% or more, so D = 0.1 or less) to achieve any significant reduction in risk. In a log scheme, D must follow the same log base as P (e.g. +1 for each power of 10); if it does, then the addition R = S + P + D is correct.
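A sketch of the detection point, treating D as the probability that detection fails, on illustrative numbers of my own choosing: a 50%-effective detection only halves the risk (a third of a decade on a log scale), while 99% effectiveness is needed for a 100x reduction:

```python
def residual_risk(severity_dollars, p_harm, detection_effectiveness):
    # Detection only removes the events it catches; the residual risk
    # is driven by the fraction it misses. That fraction is the "real" D.
    p_missed = 1.0 - detection_effectiveness
    return severity_dollars * p_harm * p_missed

base = residual_risk(10_000, 1e-3, 0.0)   # no detection:  $10.00 expected loss
weak = residual_risk(10_000, 1e-3, 0.5)   # 50% detection:  $5.00 (barely moves)
good = residual_risk(10_000, 1e-3, 0.99)  # 99% detection:  $0.10 (100x reduction)
```

In a log scheme the same arithmetic means D contributes log10(p_missed) as an added term, so a 50%-effective control is worth only about 0.3 of a level, which is why D must be very small (detection very effective) before it matters.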
Otherwise, any mathematical operation is meaningless.
Adding an arbitrary factor D which is then multiplied in to determine the risk, a la R = S x P x D, is beyond a joke; it should be "go to jail, do not pass go, do not collect $200".
Of course, the real story is that regardless of the scheme (even using real numbers), risk is almost always impossible to estimate. So the numbers are all fudged anyway, usually adjusted to support the status quo. So it doesn't really matter ...
My
