Hi, I’m Aidan Boland. I currently work as a Data Architect at Cubic Telecom, a technology company that provides connectivity software for the Internet of Things.
My work focuses on building end-to-end machine learning solutions within the analytics team at Cubic.
Previously, I was a Senior Data Scientist at Edge by Ascential, where I researched and implemented statistical and machine learning techniques to improve and automate processes within the company.
I was also an occasional lecturer in the School of Mathematics and Statistics at University College Dublin, where I taught Introduction to Data Analytics as part of the Online Masters in Data Analytics.
RSS
2020
In September 2020 I was invited to speak at the Royal Statistical
Society annual conference. I spoke about my work developing Shiny dashboards.
Young-ISA
2019
In October 2019 I was invited to speak at the inaugural Young-Irish
Statistical Association meeting.
I spoke about my work as a data scientist, and explored the evolving
nature of job titles for statisticians/data scientists.
EARL Boston
2017
At the Enterprise Applications of the R Language (EARL) conference in Boston in November 2017,
I presented a case study on integrating R into a production environment.
This was achieved by creating an API using the plumber R package.
See the Documents section in the navbar above for more presentations
and posters.
Efficient MCMC for Gibbs Random Fields using
pre-computation (2018). Electronic
Journal of Statistics.
Bayesian inference of Gibbs random fields (GRFs) is often referred
to as a doubly intractable problem, since the likelihood function is
intractable. The exploration of the posterior distribution of such
models is typically carried out with a sophisticated Markov chain Monte
Carlo (MCMC) method, the exchange algorithm (Murray et al., 2006), which
requires simulations from the likelihood function at each iteration. The
purpose of this paper is to consider an approach to dramatically reduce
this computational overhead. To this end we introduce a novel class of
algorithms which use realizations of the GRF model, simulated offline,
at locations specified by a grid that spans the parameter space. This
strategy dramatically speeds up posterior inference, as illustrated
on several examples. However, using the pre-computed graphs introduces
noise into the MCMC algorithm, which is no longer exact. We study the
theoretical behaviour of the resulting approximate MCMC algorithm and
derive convergence bounds using a recent theoretical development on
approximate MCMC methods.
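The pre-computation idea can be sketched in a few lines. The toy below is an assumption-laden stand-in, not the paper’s implementation: a tractable Gaussian model (density proportional to exp(θ·y − y²/2), so the sufficient statistic is simply y) replaces a genuine Gibbs random field so that the script is runnable, and all names (`bank`, `simulate_precomputed`, the grid bounds) are hypothetical. Draws from the model are simulated offline on a parameter grid, then reused inside an exchange-style acceptance ratio instead of simulating afresh at every iteration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy exponential-family stand-in for a GRF:
# unnormalised density q(y | theta) = exp(theta * y - y**2 / 2),
# i.e. y ~ Normal(theta, 1), sufficient statistic s(y) = y.
y_obs = 1.3        # hypothetical observed statistic
prior_sd = 10.0    # wide Gaussian prior on theta

# --- Offline phase: simulate from the model on a grid spanning theta ---
grid = np.linspace(-5.0, 5.0, 201)
draws_per_point = 500
bank = rng.normal(loc=grid[:, None], scale=1.0,
                  size=(grid.size, draws_per_point))

def simulate_precomputed(theta):
    """Reuse an offline draw from the nearest grid point instead of
    simulating from the model online (the source of the approximation)."""
    i = np.argmin(np.abs(grid - theta))
    return bank[i, rng.integers(draws_per_point)]

# --- Online phase: exchange-style MCMC using the pre-computed bank ---
def run_chain(n_iter=5000, step=0.5):
    theta = 0.0
    out = np.empty(n_iter)
    for t in range(n_iter):
        prop = theta + step * rng.normal()
        y_aux = simulate_precomputed(prop)  # auxiliary variable draw
        # exchange log-ratio: (theta' - theta) * (s(y_obs) - s(y_aux))
        log_r = (prop - theta) * (y_obs - y_aux)
        log_r += (theta**2 - prop**2) / (2 * prior_sd**2)  # prior term
        if np.log(rng.uniform()) < log_r:
            theta = prop
        out[t] = theta
    return out

samples = run_chain()
print(samples[2500:].mean())  # posterior mean, close to y_obs
```

The expensive simulation now happens once, offline; the online chain only indexes into the stored draws, at the cost of the grid-discretisation noise discussed above.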
Noisy Monte Carlo: Convergence of Markov chains with
approximate transition kernels (2014). Statistics
and Computing.
Monte Carlo algorithms often aim to draw from a distribution \(\pi\) by simulating a Markov chain with
transition kernel \(P\) such that \(\pi\) is invariant under \(P\). However, there are many situations for
which it is impractical or impossible to draw from the transition kernel
\(P\). For instance, this is the case
with massive datasets, where it is prohibitively expensive to calculate
the likelihood and is also the case for intractable likelihood models
arising from, for example, Gibbs random fields, such as those found in
spatial statistics and network analysis. A natural approach in these
cases is to replace \(P\) by an
approximation \(\hat{P}\). Using theory
from the stability of Markov chains we explore a variety of situations
where it is possible to quantify how ‘close’ the chain given by the
transition kernel \(\hat{P}\) is to the
chain given by \(P\). We apply these
results to several examples from spatial statistics and network
analysis.
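A minimal numerical illustration of the idea, under stated assumptions: the sketch below runs a random-walk Metropolis chain with the exact kernel \(P\) targeting a standard normal, and an approximate kernel \(\hat{P}\) obtained by perturbing the log acceptance ratio with Gaussian noise (a deliberately artificial perturbation chosen for simplicity, not one of the paper’s examples). Small perturbations of the kernel produce correspondingly small shifts in the chain’s summaries.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_pi(x):
    # log target density, standard normal up to a constant
    return -0.5 * x * x

def mh_chain(noise_sd=0.0, n_iter=20000, step=1.0):
    """Random-walk Metropolis. With noise_sd > 0 the log acceptance
    ratio is perturbed, so the chain uses an approximate kernel
    P-hat rather than the exact kernel P."""
    x = 0.0
    out = np.empty(n_iter)
    for t in range(n_iter):
        prop = x + step * rng.normal()
        log_r = log_pi(prop) - log_pi(x)
        log_r += noise_sd * rng.normal()  # kernel perturbation
        if np.log(rng.uniform()) < log_r:
            x = prop
        out[t] = x
    return out

exact = mh_chain(noise_sd=0.0)   # chain driven by P
noisy = mh_chain(noise_sd=0.1)   # chain driven by P-hat

# Both chains give means near 0 and variances near 1;
# the gap between them shrinks as noise_sd does.
print(abs(exact.mean() - noisy.mean()))
print(exact.var(), noisy.var())
```

The paper’s contribution is to make this empirical observation rigorous: bounding how far the \(\hat{P}\)-chain can drift from the \(P\)-chain in terms of the size of the kernel perturbation.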