I obtained my PhD from UCD under the supervision of Nial Friel (2015). My PhD research focused on overcoming intractable likelihoods in Bayesian analysis. I investigated using Markov chain Monte Carlo (MCMC) to target tractable approximate posterior distributions that were ‘close’ to the true posterior distributions of Gibbs random fields, and I obtained empirical and theoretical results for a variety of these so-called ‘noisy’ MCMC methods.

I completed a year as a postdoctoral researcher in the Insight Centre for Data Analytics at UCD with Andrew Parnell (2016). The project was in conjunction with Clavis Insight and focused on machine learning algorithms, in particular supervised and unsupervised text classification.

In 2019 I will deliver lectures for the Data Analysis for Decision Makers module in the UCD Lochlann Quinn School of Business.

I previously lectured an introductory statistics module (Practical Statistics) at UCD. Topics covered included summary statistics, graphical methods, basic probability theory, confidence intervals, regression and correlation. The students were also taught the basics of Minitab and R.

I was a tutor in UCD from 2010 until 2015. Modules I covered included Bayesian Statistics, Linear Models, Probability Theory, Time Series, Data Programming and Actuarial Statistics.

**Efficient MCMC for Gibbs Random Fields using pre-computation (2018)**. Electronic Journal of Statistics.

**Abstract**

*Bayesian inference of Gibbs random fields (GRFs) is often referred to as a doubly intractable problem, since the likelihood function is intractable. The exploration of the posterior distribution of such models is typically carried out with a sophisticated Markov chain Monte Carlo (MCMC) method, the exchange algorithm (Murray et al., 2006), which requires simulations from the likelihood function at each iteration. The purpose of this paper is to consider an approach to dramatically reduce this computational overhead. To this end we introduce a novel class of algorithms which use realizations of the GRF model, simulated offline, at locations specified by a grid that spans the parameter space. This strategy speeds up dramatically the posterior inference, as illustrated on several examples. However, using the pre-computed graphs introduces a noise in the MCMC algorithm, which is no longer exact. We study the theoretical behaviour of the resulting approximate MCMC algorithm and derive convergence bounds using a recent theoretical development on approximate MCMC methods.*
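To give a flavour of the pre-computation idea, here is a minimal illustrative sketch, not code from the paper: the toy model (iid Bernoulli sites forming a one-parameter exponential family, so exact simulation is cheap), the grid spacing, the bank size and all function names are my own assumptions. Stored sufficient statistics from the nearest grid point stand in for fresh simulations inside an exchange-style acceptance ratio.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50  # number of Bernoulli sites (toy stand-in for a GRF)

def simulate(theta, size=1):
    """Draw the sufficient statistic s(x) under p(x|theta) ∝ exp(theta * s(x)),
    here iid Bernoulli(sigmoid(theta)) sites, so s(x) ~ Binomial(n, p)."""
    p = 1.0 / (1.0 + np.exp(-theta))
    return rng.binomial(n, p, size=size)

theta_true = 0.5
s_obs = simulate(theta_true)[0]  # observed sufficient statistic

# Offline pre-computation: simulate at grid points spanning the parameter space.
grid = np.linspace(-2.0, 3.0, 51)
bank = {i: simulate(t, size=200) for i, t in enumerate(grid)}

def noisy_exchange(iters=5000, sigma=0.5):
    theta, draws = 0.0, []
    for _ in range(iters):
        prop = theta + sigma * rng.normal()
        # Instead of a fresh simulation at prop, reuse a stored draw from the
        # nearest grid point -- the source of the 'noise' in the chain.
        j = int(np.argmin(np.abs(grid - prop)))
        s_aux = bank[j][rng.integers(200)]
        # Exchange-style log ratio: normalising constants Z(theta) cancel.
        log_alpha = (prop - theta) * s_obs + (theta - prop) * s_aux
        log_alpha += -(prop**2 - theta**2) / (2 * 10.0)  # N(0, 10) prior
        if np.log(rng.uniform()) < log_alpha:
            theta = prop
        draws.append(theta)
    return np.array(draws)

samples = noisy_exchange()
```

The grid look-up replaces the per-iteration simulation that dominates the cost of the exact exchange algorithm; the price is the approximation error studied in the paper.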

**Noisy Monte Carlo: Convergence of Markov chains with approximate transition kernels (2014)**. Statistics and Computing.

**Abstract**

*Monte Carlo algorithms often aim to draw from a distribution \(\pi\) by simulating a Markov chain with transition kernel \(P\) such that \(\pi\) is invariant under \(P\). However, there are many situations for which it is impractical or impossible to draw from the transition kernel \(P\). For instance, this is the case with massive datasets, where it is prohibitively expensive to calculate the likelihood, and is also the case for intractable likelihood models arising from, for example, Gibbs random fields, such as those found in spatial statistics and network analysis. A natural approach in these cases is to replace \(P\) by an approximation \(\hat{P}\). Using theory from the stability of Markov chains we explore a variety of situations where it is possible to quantify how ‘close’ the chain given by the transition kernel \(\hat{P}\) is to the chain given by \(P\). We apply these results to several examples from spatial statistics and network analysis.*
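The massive-dataset setting above can be sketched in a few lines. This is a minimal illustration under my own assumptions, not the paper's construction: the exact kernel \(P\) is random-walk Metropolis-Hastings with the full Gaussian log-likelihood, and the approximate kernel \(\hat{P}\) swaps in a rescaled log-likelihood computed on a fixed subsample; all names and parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=100_000)  # 'massive' dataset, known unit variance

def mh(loglike, iters=3000, sigma=0.05):
    """Random-walk Metropolis-Hastings for the mean, with a flat prior."""
    mu, ll, chain = 0.0, loglike(0.0), []
    for _ in range(iters):
        prop = mu + sigma * rng.normal()
        ll_prop = loglike(prop)
        if np.log(rng.uniform()) < ll_prop - ll:
            mu, ll = prop, ll_prop
        chain.append(mu)
    return np.array(chain)

def full_loglike(mu):
    # Exact kernel P: touches all 100,000 observations each iteration.
    return -0.5 * np.sum((data - mu) ** 2)

sub = rng.choice(data, size=1_000, replace=False)  # fixed subsample

def approx_loglike(mu):
    # Approximate kernel P-hat: rescaled subsample log-likelihood.
    return -0.5 * (len(data) / len(sub)) * np.sum((sub - mu) ** 2)

exact_chain = mh(full_loglike)
noisy_chain = mh(approx_loglike)
```

The two chains concentrate near the full-data and subsample means respectively; bounding the gap between such chains is the kind of question the paper's stability results address.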

**The prognostic utility of the transcription factor SRF in docetaxel-resistant prostate cancer: in-vitro discovery and in-vivo validation (2017).** BMC Cancer.

**The Effect of Hand Dominance on Functional Outcome Following Single Row Rotator Cuff Repair (2017).** The Open Orthopaedics Journal.