Hi, I’m Aidan Boland. I currently work as a Senior Data Scientist with Edge by Ascential, a leading firm in ecommerce insights and analytics. My work focuses on researching and implementing statistical and machine learning techniques to improve and automate processes within the company.

I’m also an occasional lecturer in the School of Mathematics and Statistics at University College Dublin, where I currently lecture Introduction to Data Analytics, which forms part of the Online MSc in Data Analytics.

- Edge by Ascential (Clavis Insight)

Clavis Insight was acquired by Ascential in 2017 and rebranded as Edge by Ascential in 2018. I have worked at Clavis Insight, and subsequently Edge by Ascential, since 2016.

- Postdoc (2015/2016)

I completed a year as a Postdoctoral researcher in the Insight Centre for Data Analytics at UCD with Prof. Andrew Parnell. The project was carried out in conjunction with Clavis Insight and focused on machine learning algorithms, in particular supervised and unsupervised text classification.

- PhD (2011-2015)

I obtained my PhD from University College Dublin under the supervision of Prof. Nial Friel. My PhD research focused on using Markov chain Monte Carlo (MCMC) methods to estimate models with intractable likelihoods, in particular Gibbs random fields.

**Young-ISA 2019**

In October 2019 I was invited to speak at the inaugural Young Irish Statistical Association (Young-ISA) meeting.

I spoke about my work as a data scientist and explored the evolving nature of job titles for statisticians and data scientists.

**EARL Boston 2017**

At the Enterprise Applications of the R Language (EARL) conference in Boston in November 2017, I presented a case study on integrating R into a production environment. This was achieved by creating an API using the plumber R package.
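To give a flavour of the approach, here is a minimal plumber sketch; the endpoint name and payload below are purely illustrative, not the production API from the case study:

```r
# plumber.R -- a toy endpoint for illustration only.
library(plumber)

#* Return summary statistics for a numeric input vector
#* @param x numeric values
#* @post /summarise
function(x = numeric(0)) {
  x <- as.numeric(x)
  list(n = length(x), mean = mean(x), sd = sd(x))
}
```

Running `plumber::plumb("plumber.R")$run(port = 8000)` serves the function over HTTP, which is what allows an R model to sit behind a language-agnostic interface in a production stack.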

See the Documents section in the navbar above for more presentations and posters.

**2019**

In the Autumn trimester of the 2019/2020 academic year I lectured Introduction to Data Analytics, which forms part of the Online MSc in Data Analytics in the School of Mathematics and Statistics at University College Dublin.

In the first half of 2019 I delivered lectures for the Data Analysis for Decision Makers module in the Lochlann Quinn School of Business at University College Dublin.

**2016**

I previously lectured an introductory statistics module (Practical Statistics) at UCD. Topics covered included summary statistics, graphs, basic probability theory, confidence intervals, regression and correlation. The students were also taught the basics of Minitab and R.

**2010-2015**

I was a tutor in UCD from 2010 until 2015. Modules I covered included Bayesian Statistics, Linear Models, Probability Theory, Time Series, Data Programming and Actuarial Statistics.

**Efficient MCMC for Gibbs Random Fields using pre-computation (2018)**. Electronic Journal of Statistics.

*Bayesian inference of Gibbs random fields (GRFs) is often referred to as a doubly intractable problem, since the likelihood function is intractable. The exploration of the posterior distribution of such models is typically carried out with a sophisticated Markov chain Monte Carlo (MCMC) method, the exchange algorithm (Murray et al., 2006), which requires simulations from the likelihood function at each iteration. The purpose of this paper is to consider an approach to dramatically reduce this computational overhead. To this end we introduce a novel class of algorithms which use realizations of the GRF model, simulated offline, at locations specified by a grid that spans the parameter space. This strategy speeds up dramatically the posterior inference, as illustrated on several examples. However, using the pre-computed graphs introduces a noise in the MCMC algorithm, which is no longer exact. We study the theoretical behaviour of the resulting approximate MCMC algorithm and derive convergence bounds using a recent theoretical development on approximate MCMC methods.*

**Noisy Monte Carlo: Convergence of Markov chains with approximate transition kernels (2014)**. Statistics and Computing.

*Monte Carlo algorithms often aim to draw from a distribution \(\pi\) by simulating a Markov chain with transition kernel \(P\) such that \(\pi\) is invariant under \(P\). However, there are many situations for which it is impractical or impossible to draw from the transition kernel \(P\). For instance, this is the case with massive datasets, where is it prohibitively expensive to calculate the likelihood and is also the case for intractable likelihood models arising from, for example, Gibbs random fields, such as those found in spatial statistics and network analysis. A natural approach in these cases is to replace \(P\) by an approximation \(\hat{P}\). Using theory from the stability of Markov chains we explore a variety of situations where it is possible to quantify how ‘close’ the chain given by the transition kernel \(\hat{P}\) is to the chain given by \(P\). We apply these results to several examples from spatial statistics and network analysis.*

**The prognostic utility of the transcription factor SRF in docetaxel-resistant prostate cancer: in-vitro discovery and in-vivo validation (2017)**. BMC Cancer.

**The Effect of Hand Dominance on Functional Outcome Following Single Row Rotator Cuff Repair (2017)**. The Open Orthopaedics Journal.