Date of Award

2010

Publication Type

Doctoral Thesis

Degree Name

Ph.D.

Department

Computer Science

Keywords

Artificial Intelligence

Supervisor

Wu, Dan (School of Computer Science)

Rights

info:eu-repo/semantics/openAccess

Abstract

Probabilistic reasoning methods, Bayesian networks (BNs) in particular, have emerged as an effective and central tool for reasoning under uncertainty. In a multi-agent environment, agents equipped with local knowledge often need to collaborate and reason about a larger uncertain domain. Multiply sectioned Bayesian networks (MSBNs) provide a framework for the probabilistic reasoning of cooperative agents in such a setting.

In this thesis, we first aim to improve the efficiency of current MSBN exact inference algorithms. We show that by exploiting the calculation schema and the semantic meaning of inter-agent messages, we can significantly reduce an agent's local computational cost as well as the inter-agent communication overhead. Our novel technical contributions include (1) a new message-passing architecture based on an MSBN linked junction tree forest (LJF); (2) a suite of algorithms, extended from our work on BNs, that provide semantic analysis of inter-agent messages; and (3) a fast marginal calibration algorithm designed for an LJF, which guarantees exact results with minimal local and global cost.

We then investigate how to incorporate approximation techniques into the MSBN framework. We present a novel local adaptive importance sampler (LLAIS) designed to apply localized stochastic sampling while maintaining the LJF structure. The LLAIS sampler provides accurate estimates of local posterior beliefs and supports efficient calculation of inter-agent messages.

We also address the problem of online monitoring for cooperative agents. As the MSBN model is restricted to static domains, we introduce an MA-DBN model that combines the MSBN and dynamic Bayesian network (DBN) models. We show that effective multi-agent online monitoring with bounded error is possible in an MA-DBN through a new secondary inference structure and a factorized representation of forward messages.
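The abstract refers to localized stochastic sampling for estimating posterior beliefs. As a generic illustration of that idea only, and not the thesis's LLAIS algorithm, the sketch below shows likelihood weighting, a basic form of importance sampling, applied to a small hypothetical Bayesian network; the network, its probabilities, and all names are assumptions made for this example.

    # Generic illustration: likelihood weighting, a simple form of importance
    # sampling for estimating posterior beliefs in a Bayesian network.
    # This is NOT the thesis's LLAIS algorithm; the toy network (Cloudy -> Rain
    # -> WetGrass, all binary) and its probabilities are hypothetical.
    import random

    P_CLOUDY = 0.5
    P_RAIN_GIVEN_CLOUDY = {True: 0.8, False: 0.1}
    P_WET_GIVEN_RAIN = {True: 0.9, False: 0.2}

    def likelihood_weighting(evidence_wet, n_samples=100_000):
        """Estimate P(Rain | WetGrass = evidence_wet) by sampling the
        non-evidence variables from their conditionals and weighting each
        sample by the likelihood of the observed evidence."""
        weighted_rain = 0.0
        total_weight = 0.0
        for _ in range(n_samples):
            cloudy = random.random() < P_CLOUDY
            rain = random.random() < P_RAIN_GIVEN_CLOUDY[cloudy]
            # The evidence variable is not sampled; the sample is weighted by
            # the probability of the evidence given its sampled parent.
            p_wet = P_WET_GIVEN_RAIN[rain]
            weight = p_wet if evidence_wet else (1.0 - p_wet)
            total_weight += weight
            if rain:
                weighted_rain += weight
        return weighted_rain / total_weight

    if __name__ == "__main__":
        # With the probabilities above, the exact answer is about 0.786.
        print("P(Rain | WetGrass=true) ~", likelihood_weighting(True))

Adaptive importance samplers refine the proposal distribution as samples accumulate rather than fixing it in advance as this sketch does; the sketch only conveys the basic weighting scheme underlying such estimators.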
