AHP-OS and AHP Judgment Scales – Published Articles

My latest articles related to AHP:

AHP-OS:

Goepel, K.D. (2018). Implementation of an Online Software Tool for the Analytic Hierarchy Process (AHP-OS). International Journal of the Analytic Hierarchy Process, Vol. 10, Issue 3, 2018, pp. 469–487

https://doi.org/10.13033/ijahp.v10i3.590

https://www.ijahp.org/index.php/IJAHP/article/view/590/652

AHP Judgment scales:

Goepel, K.D. (2018). Comparison of Judgment Scales of the Analytical Hierarchy Process — A New Approach. International Journal of Information Technology & Decision Making, published Dec 11, 2018

https://doi.org/10.1142/S0219622019500044

AHP Judgment Scales

A revised version of my paper can now be downloaded:

Goepel, K.D. (2017). Comparison of Judgment Scales of the Analytical Hierarchy Process – A New Approach. Preprint of an article submitted for consideration in International Journal of Information Technology and Decision Making © 2017 World Scientific Publishing Company, http://www.worldscientific.com/worldscinet/ijitdm

Why the AHP Balanced Scale is not balanced

As part of my current work about AHP scales, here an important finding for the balanced scale:

Salo and Hämäläinen [1] pointed out that the integers from 1 to 9 yield local weights that are not equally dispersed. Based on this observation, they proposed a balanced scale, where local weights are evenly dispersed over the weight range [0.1, 0.9]. They state that for a given set of priority vectors the corresponding ratios can be computed from the inverse relationship

r = w / (1 – w)      (1a)

The priorities 0.5, 0.55, 0.6, …, 0.85, 0.9 lead, for example, to the scale values 1, 1.22, 1.5, 1.86, 2.33, 3.00, 4.00, 5.67 and 9.00. This scale can be computed by

wbal = 0.45 + 0.05 x     (1b)

with x = 1 … 9 and

c = wbal / (1 – wbal)      (1c)

c (resp. 1/c) is the entry value in the decision matrix, and x the pairwise comparison judgment on the scale from 1 to 9.
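As a quick numerical check (a Python sketch added here for illustration, not part of AHP-OS), eqs. 1b and 1a reproduce the scale values listed above:

```python
# Balanced scale: priorities wbal = 0.45 + 0.05*x (eq. 1b) for x = 1 ... 9,
# converted to decision matrix entries c = wbal / (1 - wbal) (eq. 1a/1c)
priorities = [0.45 + 0.05 * x for x in range(1, 10)]
scale = [round(w / (1 - w), 2) for w in priorities]
print(scale)  # [1.0, 1.22, 1.5, 1.86, 2.33, 3.0, 4.0, 5.67, 9.0]
```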

In fact, eq. 1a and its inverse cover only the special case of a single pairwise comparison of two criteria. If we take into account the complete n × n decision matrix for n criteria, the resulting weight for one criterion, judged x-times more important than all others, can be calculated as:

w = x / (x + n – 1)      (2)

Eq. 2 simplifies to eq. 1a for n=2.
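A short check of this relation (a Python sketch; in my reading, eq. 2 gives w = x / (x + n − 1), which for n = 2 reduces to w = x / (x + 1), the inverse of eq. 1a — the function name is mine):

```python
def w_dominant(x, n):
    """Weight of one criterion judged x-times more important
    than all other n - 1 criteria in an n x n decision matrix (eq. 2)."""
    return x / (x + n - 1)

# For n = 2 this is the inverse of eq. 1a: w = r / (1 + r)
for x in range(1, 10):
    assert abs(w_dominant(x, 2) - x / (x + 1)) < 1e-12

print(w_dominant(9, 2))  # 0.9 -> corresponds to the scale value 9 in eq. 1a
print(w_dominant(9, 5))  # ~0.69: with more criteria the top weight is lower
```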

With eq. 2 we can formulate the general case for the balanced scale, resulting in evenly dispersed weights for n criteria and a judgment x with x from 1 to M:

wbal(x) = wmin + (x – 1) Δw      (3)

with

wmin = 1/n      (3a)

wmax = M / (M + n – 1)      (3b)

Δw = (wmax – wmin) / (M – 1)      (3c)

We get the general balanced scale (balanced-n) as

c(x) = (n – 1) wbal(x) / (1 – wbal(x))      (4)
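Putting eqs. 3 to 4 together (a Python sketch of my reading of the formulas; variable and function names are mine), the balanced-n scale can be computed as:

```python
def balanced_n_scale(n, M=9):
    """Generalized balanced scale (eqs. 3, 3a-3c, 4):
    weights evenly dispersed between 1/n and M/(M + n - 1)."""
    w_min = 1 / n                     # eq. 3a
    w_max = M / (M + n - 1)           # eq. 3b
    dw = (w_max - w_min) / (M - 1)    # eq. 3c
    scale = []
    for x in range(1, M + 1):
        w = w_min + (x - 1) * dw      # eq. 3
        c = (n - 1) * w / (1 - w)     # eq. 4
        scale.append(round(c, 2))
    return scale

# For n = 2 the classic balanced scale is recovered:
print(balanced_n_scale(2))  # [1.0, 1.22, 1.5, 1.86, 2.33, 3.0, 4.0, 5.67, 9.0]
print(balanced_n_scale(5))  # scale entries for five criteria, again from 1 to 9
```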

Continue reading Why the AHP Balanced Scale is not balanced


AHP Group Consensus Indicator – how to understand and interpret?

BPMSG’s AHP Excel template and the AHP online software AHP-OS can be used for group decision making by asking several participants to give their inputs to a project in the form of pairwise comparisons. Aggregation of individual judgments (AIJ) is done by calculating the element-wise geometric mean over all individual decision matrices; this consolidated decision matrix is then used to derive the group priorities.
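The AIJ step described above can be sketched in a few lines of Python (a simplified illustration with made-up judgments, not the AHP-OS implementation):

```python
import math

def aggregate_aij(matrices):
    """Consolidate individual pairwise comparison matrices by taking the
    element-wise geometric mean (aggregation of individual judgments, AIJ)."""
    n = len(matrices[0])
    k = len(matrices)
    return [[math.prod(m[i][j] for m in matrices) ** (1 / k)
             for j in range(n)] for i in range(n)]

# Two participants comparing two criteria: judgments 4 and 9
group = aggregate_aij([[[1, 4], [1/4, 1]],
                       [[1, 9], [1/9, 1]]])
print(round(group[0][1], 2))  # 6.0 = sqrt(4 * 9); reciprocity is preserved
```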

AHP consensus indicator

In [1] I proposed an AHP group consensus indicator to quantify the consensus of the group, i.e. to estimate the agreement on the resulting priorities between participants. The indicator ranges from 0% to 100%: zero percent corresponds to no consensus at all, 100% to full consensus. It is derived from the concept of diversity based on Shannon alpha and beta entropy, as described in [2]. It is a measure of the homogeneity of priorities between the participants and can also be interpreted as a measure of overlap between the priorities of the group members.
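The underlying entropy idea can be illustrated with a bare-bones Python sketch (my simplification; the exact definition and normalization of the indicator are given in [1] and differ from this version):

```python
import math

def shannon(p):
    """Shannon entropy of a priority vector p (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def consensus_sketch(priorities):
    """Rough homogeneity measure from alpha/beta entropy:
    H_alpha = mean entropy of the individual priority vectors,
    H_gamma = entropy of the mean priority vector,
    H_beta  = H_gamma - H_alpha; exp(H_beta) measures the diversity of the
    group, and 1/exp(H_beta) is a similarity in (0, 1].
    NOTE: this is a simplified sketch, not the AHP-OS formula from [1]."""
    k = len(priorities)
    n = len(priorities[0])
    h_alpha = sum(shannon(p) for p in priorities) / k
    mean_p = [sum(p[i] for p in priorities) / k for i in range(n)]
    h_beta = shannon(mean_p) - h_alpha
    return 1 / math.exp(h_beta)

identical = [[0.6, 0.3, 0.1]] * 3
print(round(consensus_sketch(identical), 2))  # 1.0: full agreement
opposed = [[0.9, 0.05, 0.05], [0.05, 0.9, 0.05]]
print(consensus_sketch(opposed))              # well below 1: low agreement
```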

Continue reading AHP Group Consensus Indicator – how to understand and interpret?


The Analytical Hierarchy Process (AHP) – Is it old and outdated?

This was a question on researchgate.net, and the answer of Prof. Saaty – the creator of the method – is of course: “The AHP is the only accurate and rigorous mathematical way known for the measurement on intangibles. It is not going to get old for a long time.” A lot of answers from others followed.

When it comes to AHP, it seems the scientific world is still divided into opponents and advocates of the method.

I answered with the statistics of my website: BPMSG has more than 4000 users of the online software AHP-OS, 600 of them active, with 1000 projects and more than 3500 decision makers. My AHP Excel template has reached nearly 21 thousand downloads. This clearly shows that the method is not outdated.

As a reply, Nolberto wrote:

No, I don’t think that AHP is outdated, but the fact that more than 1000 projects have been developed using AHP does not mean that their results are correct (which is impossible to check), or that the method is sound (which is easily challenged)…

Here is my answer:

Yes, I agree, the numbers only show that AHP is not outdated (which was the original question). They don’t show whether the results are correct or incorrect, and they also do not show whether the users did or did not realise the method’s drawbacks and limitations.

For me, as a practitioner, AHP is one of the supporting tools in decision making. The intention of a tool is what it does: a hammer intends to strike, a lever intends to lift. That is what they are made for.

From my users’ feedback I sometimes get the impression that some of them expect a decision making support tool to make the decision for them, and this is not what it is made for.

In my practical applications AHP helped me and my teams a lot to gain better insight into a decision problem, to separate important from less important criteria, and to reach group consensus and agreement on how to tackle a problem or proceed with a project. Probably this could be achieved with other tools too, but, as you say, AHP is simple, understandable and easy.

For sure, real-world problems are complex. Therefore they have to be broken down and simplified to be handled with the method, and I agree that over-simplification can be dangerous. On the other hand, what approach other than breaking complex problems down into digestible pieces is possible?

Finally, it’s not the tool that produces the decision, but the humans behind it. They will be accountable for the decision, and it is their responsibility to find an appropriate model of the decision problem and the right balance between rational and non-rational arguments and the potential consequences of their decision.

Let me know your opinion!