James Cussens
Picture: On Craig-y-dorth with Sugar Loaf (Mynydd Pen-y-fâl) in the background.
[Research]
[PhD supervision]
[Projects]
[Software]
[Teaching]
[Professional Activities]
[Pastoral and admin roles]
[Personal history]
[Contact information]
Funded PhDs are available in Bristol from the Centre for Doctoral Training in Practice-Oriented Artificial Intelligence (PrO-AI).
I do not take on summer interns or accept students to do MSc by Research, so if you contact me about either of these you will not get a reply.
There appears to me to be a difficulty in this conclusion: that
happenings which depend upon an infinite number of cases cannot be
determined by a finite number of experiments; indeed nature has her
own habits, born from the return of causes, but only 'in general'. And
so, who will say whether a subsequent experiment will not stray
somewhat from the rule of all the preceding experiments, because of
the very mutabilities of things? [Letter from Leibniz to Bernoulli, 3
December 1703. Quoted in: Cussens, Probability and Statistics in Antognazza (ed.) The Oxford Handbook of Leibniz, OUP, 2018.]
Research
GOBNILP software for Bayesian network structure learning
Google scholar profile
University of Bristol research profile
Recent papers
- Shouta Sugahara, Koya Kato, James Cussens and Maomi Ueno. Learning Bayesian Network Classifiers to Minimize Class Variable Parameters, Journal of Machine Learning Research, 27(21):1-41, 2026.
- James Cussens, Julia Hatamyar, Vishalie Shah and Noemi Kreif. Fast Learning of Optimal Policy Trees, arXiv 2506.15435, June 2025.
- James Cussens. Conditional independence constraints in score-based learning of Bayesian Networks, Proceedings of the 13th Workshop on Uncertainty Processing (WUPES'25), 82-91, Třešť, Czech Republic, June 2025.
Recent talks
I am seeking suitably qualified PhD students and have a page with suggested PhD topics. If you are interested in probabilistic graphical models, integer programming applications in machine learning or statistical relational learning then contact me.
Current students
Former students
- Teny Handhayani - A Kernel-based Approach for Learning Causal Graphs From Mixed Data Containing Missing Values
- Durdane Kocacoban - Online Causal Structure Learning in the Presence of Latent Variables
- Mark Balmer (MSc by Research)
- Garo Panikian - Statistical inference of dynamical systems with application to modelling fish populations
- Eman Aljohani - Informative priors for learning graphical models
- Waleed Alsanie - Learning PRISM programs
- Joanne Powell - PrediCtoR: Predicting the Recovery of Ancient DNA and Ancient Proteins (with Matthew Collins, Archaeology)
- Adel Aloraini - Extending the graphical representation of KEGG pathways for a better understanding of prostate cancer using machine learning
- Barnaby Fisher - Inductive Logic Programming and Mercury (MSc by Research)
- Heather Maclaren - Inductive Logic Programming for Software Agents: Algorithms and Implementations
Software
GOBNILP
GOBNILP learns Bayesian network structure by encoding the
problem as an integer program. Use the following links to get the C/SCIP and
Python/Gurobi versions of GOBNILP, respectively:
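As a rough illustration of the optimisation GOBNILP solves, here is a brute-force sketch (not GOBNILP's integer-programming encoding, and not its actual API): given a local score for each (variable, parent set) pair, choose one parent set per variable so that the resulting graph is acyclic and the total score is maximal. The variable names and scores below are made up for illustration.

```python
from itertools import product

def is_acyclic(parents):
    """Check acyclicity by repeatedly removing variables whose chosen
    parents have all already been removed (a topological-sort test)."""
    removed = set()
    changed = True
    while changed:
        changed = False
        for v, ps in parents.items():
            if v not in removed and ps <= removed:
                removed.add(v)
                changed = True
    return len(removed) == len(parents)

def best_network(local_scores):
    """local_scores maps each variable to a dict {parent_set: score}.
    Return the highest-scoring acyclic choice of parent sets."""
    variables = list(local_scores)
    best, best_score = None, float("-inf")
    for choice in product(*(local_scores[v].items() for v in variables)):
        parents = {v: ps for v, (ps, _) in zip(variables, choice)}
        total = sum(s for _, s in choice)
        if total > best_score and is_acyclic(parents):
            best, best_score = parents, total
    return best, best_score

# Hypothetical local scores for two variables A and B: each edge
# improves the score, but choosing both A->B and B->A would be cyclic.
scores = {
    "A": {frozenset(): 0.0, frozenset({"B"}): 2.0},
    "B": {frozenset(): 0.0, frozenset({"A"}): 1.0},
}
dag, score = best_network(scores)
```

GOBNILP avoids this exponential enumeration by expressing the same choice as binary "family" variables in an integer program and delegating the search to SCIP or Gurobi.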
Some other GOBNILP-related links:
FASTPOLICYTREE
fastpolicytree is an R package available on CRAN which aims to do the same job (constructing optimal policy trees) as the existing policytree R package, but faster. The source code is available on github. It is also possible to create a standalone executable (as long as you have a C compiler). See here for information on how to do this.
fastpolicytree was developed as part of the MRC-funded project Tailoring health policies to improve outcomes using machine learning, causal inference and operations research methods (01/07/20-30/06/23).
PYADTREE
When learning Bayesian networks from discrete data it is important to construct contingency tables quickly. AD-trees were introduced by Moore and Lee to do this (Cached Sufficient Statistics for Efficient Machine Learning with Large Datasets, Journal of Artificial Intelligence Research 8 (1998) 67-91). A Python module (wrapping C code) called adtree can be obtained via the link below. To install it you need both SWIG and a C compiler installed. In a comparative test, using adtree was about 600 times faster than the groupby.count() method from pandas and about 2500 times faster than simply traversing the data in Python. If adtree is available, the most recent Python version of GOBNILP uses it to speed up its computation, although sometimes the speedup is modest.
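As a rough illustration of what a contingency table query computes (a pure-Python sketch, not the adtree module, and without the AD-tree caching that makes repeated queries fast): count how often each combination of values occurs for a chosen subset of columns. The toy data below is made up.

```python
from collections import Counter

def contingency_table(data, cols):
    """One pass over the data, counting each value combination of the
    given columns. An AD-tree caches these counts so that every later
    query over any column subset avoids re-traversing the data."""
    return Counter(tuple(row[c] for c in cols) for row in data)

# Toy dataset: rows of discrete values for three variables.
data = [
    (0, 1, 0),
    (0, 1, 1),
    (1, 0, 1),
    (0, 1, 1),
]
table = contingency_table(data, [0, 1])  # counts over columns 0 and 1
```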
Projects
Teaching
Currently, at Bristol:
Professional Activities
General chair
Programme chair
Invited speaker
- Bilbao Data Science Workshop, Bilbao, November 2019
- Graphical Models: Conditional Independence and Algebraic Structures, Munich, October 2019
- CP 2018, Lille, 27-31 August, 2018
- Workshop on Learning with Structured Data and Natural Language, Toulouse, 9-11 December, 2015
- Joint Workshop on Limit Theorems and Algebraic Statistics, Prague Stochastics 2014, August 25-29, 2014
- ICLP workshop on Probabilistic logic programming, 17 July 2014
- ILP-MLG-SRL 09
- UKKDD-2007
- AC05
- ILP04
Area chair/Senior PC
- AAAI 2026
- UAI 2025, AAAI 2025, IJCAI 2025 (Senior Area Chair)
- AAAI 2024, IJCAI 2024, UAI 2024
- IJCAI 2023, AAAI 2023
- IJCAI-ECAI 2022
- IJCAI-21, UAI 2021, ECML/PKDD 21
- ECAI 2020
- IJCAI-19
- IJCAI-ECAI-18
- IJCAI-17
- IJCAI-15 (Main track), IJCAI-15 (Machine Learning track)
- IJCAI-13
- ECAI-12
- IJCAI-11, AAAI-11
- ECML/PKDD-08
- ECML/PKDD-07
- ECML/PKDD-06
- ICML05
Co-organiser
PC member / Reviewer
- AISTATS 2026, CLeaR 2026
- PLP 2025, IJCLR 2025, AISTATS 2025, CLeaR 2025
- PLP 2024, PGM 2024, IJCLR 2024
- IJCLR 2023
- PGM 2022, ILP 2022
- NeurIPS 2021 (Outstanding Reviewer Award), ICML 2021, AISTATS 2021, ILP2020-21@IJCLR, PLP 2021
- NeurIPS 2020, AAAI-20, AISTATS 2020, UAI 2020, PGM 2020, ILP 2020
- AAAI-19, AISTATS 2019, ICML-19, ILP 2019
- NIPS18 (Outstanding Reviewer Award), ICML-18, AAAI-18, AISTATS 2018, ICLR 2018, PGM 2018, UAI 2018, ILP 2018
- NIPS17, AAAI-17, AISTATS 2017, ICML 2017, ECML/PKDD 2017, UAI 2017, ILP 2017
- NIPS16, AAAI-16, KDD 2016, UAI 2016, IJCAI-16, ECML/PKDD 2016, ECAI 2016, StarAI 2016, PGM 2016, PLP 2016
- NIPS15, ECML/PKDD 2015, UAI 2015, AAAI-15, ILP 2015, PLP 2015
- NIPS14, ICML 2014, UAI 2014, ILP 2014, ECML/PKDD 2014, AAAI-14, KR 2014, ECAI'14, BUDA 2014
- NIPS13, ICML 2013, UAI 2013, ILP 2013, ECML/PKDD 2013, EMNLP 2013, NAACL-HLT 2013, LML workshop at ECML/PKDD 2013
- NIPS12, ICML 2012, UAI 2012, ILP 2012, ECML/PKDD 2012, AAAI-12, KR 2012, StaRAI-12, CoCoMile 2012, ACL 2012, Cognitive 2012
- ICML 2011, UAI 2011, ILP 2011, ECML/PKDD 2011
- ILP 2010, AAAI-10, ECAI-2010, ECML/PKDD 2010, SBIA 2010
- NIPS09, EACL09, ICML 09, ILP-09, SRL-09, Terminologie et intelligence artificielle (TIA 2009), IJCAI-09, AISTATS 09, CoNLL 09, NAACL-HLT 09, EACL Cognitive 2009, NAACL-2009 Workshop on Unsupervised and Minimally Supervised Learning of Lexical Semantics
- NIPS08, ICML 08, ILP 08, ECAI 08, SBIA 08, CoNLL 08
- ICML 07, UAI07, ILP07, ACL-2007 Workshop on Cognitive Aspects of Computational Language Acquisition, TIA'07
- EACL06, UAI06, ILP06, AAAI-06, SRL06, CoNLL06
- IJCAI05, ICML05, UAI05, ILP05, ECML/PKDD05, LLLL, CoNLL05, TIA05
- NIPS04, ICML04, UAI04, ECML04, CIFT04, SRL04, CoNLL04, Psycho-computational models ...
- ICML03, UAI03, ILP03, CoNLL03, Acquisition, apprentissage et ..., SRL2003, ECML03
- ICML02, UAI02, ILP02, CIFT02, CoNLL02
- ILP01, ECML01, CoNLL01, LLL01
- ILP00, CoNLL00, LLL00
- ILP99, LLL99
- ILP98
Miscellaneous
Pastoral and admin roles
- Lead Senior Tutor, School of Computer Science
Personal history
Contact information
Address: School of Computer Science, University of Bristol, Merchant Venturers Building, Woodland Road, Bristol, BS8 1UB, UK
Phone: +44 (0)117 455 8723
Email: firstname.lastname AT bristol DOT ac DOT uk