Causal Explanation with Bayesian Networks

28 July 2017

Supervisors: Associate Professor Kevin Korb and Professor Ann Nicholson

Both supervisors are leading international researchers in the specialised area of Bayesian networks, now the dominant technology for probabilistic causal modelling in intelligent systems.

This PhD offers the opportunity to:

  • achieve excellence at a world top 100 university
  • be part of an internationally recognised faculty
  • generate a new causal interpretation of Bayesian networks

The Project

How to generate explanations from Bayesian networks is a long-standing problem that has attracted many different answers, for example, using mutual information (Suermondt, 1992, Explanation in Bayesian Belief Networks, PhD thesis, Stanford University). Since those early days, the causal interpretation of Bayesian networks has flourished, especially through interventionist accounts of causation, for example in the work of computer scientists Judea Pearl (Causality, Cambridge University Press, 2009) and Joseph Halpern, and also in the work of philosophers of science (James Woodward and Chris Hitchcock, among others).

Within this tradition we have recently developed a causal information theory that can assist in understanding and using causal Bayesian networks, combining mutual information with an interventionist theory of causality (e.g., Korb, Nyberg & Hope, 2011, "A New Causal Power Theory"). While causal information theory is a very promising tool, the theory itself is not fully developed, and its precise application in making sense of causal explanations is not clear. An example of the former: multiple causes often interact in their joint effects, yet existing accounts of the types of interaction, and of how to measure them, are deficient. An example of the latter: causal explanation depends on context (e.g., which conditions are assumed as part of a causal query), but how to translate explanatory context into conditions or settings on a causal Bayesian network is not well understood. This project will aim to answer these questions.
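The core idea can be illustrated with a toy sketch (the model and numbers below are invented for illustration and are not part of the project): in a network where a confounder Z influences both X and Y, the observational mutual information I(X;Y) mixes the direct X→Y effect with the confounding path, whereas the mutual information computed under an intervention do(X), which severs the Z→X arc, reflects only the causal channel.

```python
import itertools
from math import log2

# Hypothetical binary network (illustrative only): Z -> X, Z -> Y, X -> Y.
P_Z = {0: 0.5, 1: 0.5}
P_X_given_Z = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.3, 1: 0.7}}   # P(X | Z)
P_Y_given_ZX = {(0, 0): {0: 0.9, 1: 0.1}, (0, 1): {0: 0.6, 1: 0.4},
                (1, 0): {0: 0.5, 1: 0.5}, (1, 1): {0: 0.1, 1: 0.9}}

def joint_xy(intervene=False, px=None):
    """Joint P(X, Y), observationally or under do(X ~ px).

    Intervening replaces P(X | Z) with the fixed distribution px,
    cutting the Z -> X arc as in Pearl-style graph surgery."""
    joint = {}
    for z, x, y in itertools.product([0, 1], repeat=3):
        p_x = px[x] if intervene else P_X_given_Z[z][x]
        p = P_Z[z] * p_x * P_Y_given_ZX[(z, x)][y]
        joint[(x, y)] = joint.get((x, y), 0.0) + p
    return joint

def mutual_information(joint):
    """I(X;Y) in bits, from a joint distribution over (x, y) pairs."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

obs = mutual_information(joint_xy())

# Intervene with X's observational marginal, so only the confounding
# path (not X's distribution) changes between the two quantities.
px = {x: sum(P_Z[z] * P_X_given_Z[z][x] for z in (0, 1)) for x in (0, 1)}
causal = mutual_information(joint_xy(intervene=True, px=px))
```

In this toy model the observational mutual information exceeds the interventional one, because the former also counts the dependence induced by the confounder Z; the gap is one simple way of seeing what a causal information measure must disentangle.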

This PhD project is funded with a full, independent scholarship and travel money, and is part of a larger, international project, Bayesian ARgumentation via Delphi (BARD), aimed at using Bayesian network technology in argument analysis within the large CREATE (Crowdsourcing Evidence, Argumentation, Thinking and Evaluation) program run by the US government's IARPA. BARD is a consortium led by Monash and including researchers at Birkbeck, University of London, University College London and the University of Strathclyde.

We are seeking an outstanding PhD candidate who can commence this project as soon as possible. Please note the awarding of this scholarship is not bound by the University’s scholarship application deadlines.

The Opportunity

Scholarship: This PhD project is fully funded and includes a full tuition fee offset, as well as a stipend at the RTP rate ($26,682 p.a.). Additional travel funds, over and above Faculty and University travel funding, are also provided. You will be based in the Faculty of Information Technology, Monash University.


Candidate Requirements

This is a great opportunity for an outstanding candidate to be at the forefront of generating a new causal interpretation of Bayesian networks. The candidate will need to work within a team setting. A strong interest in philosophy of science, causal Bayesian networks and argument analysis is required. You will need a good background in artificial intelligence (Bayesian networks), philosophy of science, or mathematics (Bayesian methods).

Candidates need to be eligible to undertake a PhD in the Faculty of IT at Monash University. Please check your eligibility on the How to apply page and if you meet the criteria please submit an Expression of Interest.

Contact:

For any questions or further information, please email:

fit-bard-project-officer@monash.edu

Contact name: Associate Professor Kevin Korb & Professor Ann Nicholson