Dempster-Shafer theory (DST) of evidence is an effective approach for decision analysis, especially in cases of high uncertainty. It is an evidence-based probabilistic reasoning technique that differs from traditional decision-support methods by assigning belief functions not only to sets of propositions but also to all of their subsets, which makes it possible to distinguish belief in a proposition from the uncertainty surrounding it. However, a significant factor in the reliability of reasoning systems is the fairness that characterizes both their processes and their outcomes. This paper proposes a modified, fairness-by-design Dempster-Shafer reasoning system in which quantitative fairness metrics are taken into account within the algorithmic procedure. For each piece of evidence provided, a dedicated fairness estimation function determines whether the evidence complies with the pre-defined ethical/legal regulations of a given fairness framework. Each fairness estimation function acts as a doubt factor for the evidence: it reduces the belief value of the corresponding hypothesis and increases the uncertainty associated with that hypothesis. In this way, unfairness limits the trustworthiness of the corresponding evidence and consequently weakens its contribution. The proposed solution is tested on a simulated queue-surveillance use case, in which input from two CCTV cameras is used to infer malicious behavior of people in the queue. As a proof of concept, one of the two cameras introduces a discrimination bias that violates the pre-defined fairness regulation. Results show that the modified DST system tolerates unfairness effectively while retaining a satisfactory level of algorithmic accuracy.
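The doubt-factor mechanism described above is consistent in form with classical Shafer discounting; the following is a minimal sketch under that assumption, where the coefficient \(\alpha_i \in [0,1]\) (a notation introduced here for illustration, not taken from the paper) denotes the fairness score returned by the fairness estimation function for evidence source \(i\):

\[
m_i^{\alpha_i}(A) = \alpha_i \, m_i(A) \quad \text{for all } A \subsetneq \Theta,
\qquad
m_i^{\alpha_i}(\Theta) = \alpha_i \, m_i(\Theta) + (1 - \alpha_i),
\]

where \(m_i\) is the basic mass function of source \(i\) and \(\Theta\) is the frame of discernment. Mass removed from the focal elements is transferred to \(\Theta\), so an unfair source (\(\alpha_i < 1\)) contributes lower belief to its hypotheses and higher residual uncertainty, matching the behavior described above.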