
Courses matching "Quadratic Forms in Statistics: Evaluating Contribu" 
Mathematical Statistics III Statistical methods used in practice are based on a foundation of statistical theory. One branch of this theory uses the tools of probability to establish important distributional results that are used throughout statistics. Another major branch of statistical theory is statistical inference. It deals with issues such as how we define a "good" estimator or hypothesis test, how we recognise one, and how we construct one. This course is concerned with the fundamental theory of random variables and statistical inference. Topics covered are: calculus of distributions, moments, moment generating functions; multivariate distributions, marginal and conditional distributions, conditional expectation and variance operators, change of variable, multivariate normal distribution, exact distributions arising in statistics; weak convergence, convergence in distribution, weak law of large numbers, central limit theorem; statistical inference, likelihood, score and information; estimation, minimum variance unbiased estimation, the Cramér-Rao lower bound, exponential families, sufficient statistics, the Rao-Blackwell theorem, efficiency, consistency, maximum likelihood estimators, large sample properties; tests of hypotheses, most powerful tests, the Neyman-Pearson lemma, likelihood ratio, score and Wald tests, large sample properties.
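The large-sample topics above (the weak law of large numbers and the central limit theorem) are easy to see in a short simulation. This is a hypothetical sketch, not part of the course materials: standardised means of Exponential(1) samples should look approximately standard normal.

```python
import random
import statistics

# Standardised means of Exponential(1) draws; by the central limit
# theorem these approach the standard normal distribution.
random.seed(0)

def standardised_mean(n):
    xs = [random.expovariate(1.0) for _ in range(n)]
    # Exp(1) has mean 1 and variance 1, so centre and scale by sqrt(n)
    return (sum(xs) / n - 1.0) * n ** 0.5

draws = [standardised_mean(200) for _ in range(2000)]
print(round(statistics.mean(draws), 2), round(statistics.stdev(draws), 2))
```

The printed mean and standard deviation should be close to 0 and 1 respectively.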

Probability and Statistics Probability theory is the branch of mathematics that deals with modelling uncertainty. It is important because of its direct application in areas such as genetics, finance and telecommunications. It also forms the fundamental basis for many other areas in the mathematical sciences including statistics, modern optimisation methods and risk modelling. This course provides an introduction to probability theory, random variables and Markov processes. Topics covered are: probability axioms, conditional probability; Bayes' theorem; discrete random variables, moments, bounding probabilities, probability generating functions, standard discrete distributions; continuous random variables, uniform, normal, Cauchy, exponential, gamma and chi-square distributions, transformations, the Poisson process; bivariate distributions, marginal and conditional distributions, independence, covariance and correlation, linear combinations of two random variables, bivariate normal distribution; sequences of independent random variables, the weak law of large numbers, the central limit theorem; definition and properties of a Markov chain and probability transition matrices; methods for solving equilibrium equations, absorbing Markov chains.
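The last topic above, solving equilibrium equations, can be demonstrated on a hypothetical two-state chain (the transition probabilities below are invented for illustration):

```python
# Equilibrium of a two-state Markov chain: solve pi P = pi together with
# pi_0 + pi_1 = 1.  For two states, balancing the probability flow gives
# pi_0 * P[0][1] = pi_1 * P[1][0], hence a closed form.
P = [[0.9, 0.1],   # P[i][j] = probability of moving from state i to j
     [0.5, 0.5]]

pi_1 = P[0][1] / (P[0][1] + P[1][0])
pi_0 = 1.0 - pi_1
print(pi_0, pi_1)

# check the equilibrium equation pi P = pi directly
assert abs(pi_0 * P[0][0] + pi_1 * P[1][0] - pi_0) < 1e-12
```

Here the chain spends 5/6 of its time in state 0 and 1/6 in state 1.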
Events matching "Quadratic Forms in Statistics: Evaluating Contribu" 
Stability of time-periodic flows 15:10 Fri 10 Mar, 2006 :: G08 Mathematics Building University of Adelaide :: Prof. Andrew Bassom, School of Mathematics and Statistics, University of Western Australia
Time-periodic shear layers occur naturally in a wide
range of applications from engineering to physiology. Transition to
turbulence in such flows is of practical interest and there have been
several papers dealing with the stability of flows composed of a
steady component plus an oscillatory part with zero mean. In such
flows a possible instability mechanism is associated with the mean
component so that the stability of the flow can be examined using some
sort of perturbation-type analysis. This strategy fails when the mean
part of the flow is small compared with the oscillatory component
which, of course, includes the case when the mean part is precisely
zero.
This difficulty with analytical studies has meant that the stability
of purely oscillatory flows has relied on various numerical
methods. Until very recently such techniques have only ever predicted
that the flow is stable, even though experiments suggest that they do
become unstable at high enough speeds. In this talk I shall expand on
this discrepancy with emphasis on the particular case of the so-called
flat Stokes layer. This flow, which is generated in a deep layer of
incompressible fluid lying above a flat plate which is oscillated in
its own plane, represents one of the few exact solutions of the
Navier-Stokes equations. We show theoretically that the flow does
become unstable to waves which propagate relative to the basic motion
although the theory predicts that this occurs much later than has been
found in experiments. Reasons for this discrepancy are examined by
reference to calculations for oscillatory flows in pipes and
channels. Finally, we propose some new experiments that might reduce
this disagreement between the theoretical predictions of instability
and practical realisations of breakdown in oscillatory flows. 

A mathematical look at dripping honey 15:10 Fri 4 May, 2007 :: G08 Mathematics Building University of Adelaide :: Dr Yvonne Stokes :: University of Adelaide
Honey dripping from an upturned spoon is an everyday example of a flow that extends and breaks up into drops. Such flows have been of interest for over 300 years, attracting the attention of Plateau and Rayleigh among others. Theoretical understanding has, however, lagged behind experimental investigation, with major progress being made only in the last two decades, driven by industrial applications including inkjet printing, spinning of polymer and glass fibres, blow-moulding of containers, light bulbs and glass tubing, and rheological measurement by fibre extension. Nevertheless, the exact details of the final stages of breakup are yet to be fully resolved.
An aspect that is relatively unexplored is the evolution of drop and filament from some initial configuration, and the influence of initial conditions on the final breakup. We will consider a drop of very viscous fluid hanging beneath a solid boundary, similar to honey dripping from an upturned spoon, using methods that allow examination of development and behaviour from early time, when a drop and filament begin to form, out to large times when the bulk of the fluid forms a drop at the bottom of a long thin filament which connects it with the upper boundary. The roles of gravity, inertia and surface tension will be examined. 

An Introduction to invariant differential pairings 14:10 Tue 24 Jul, 2007 :: Mathematics G08 :: Jens Kroeske
On homogeneous spaces G/P, where G is a semisimple Lie group and P is a
parabolic subgroup (the ordinary sphere or projective spaces being
examples), invariant operators, that is operators between certain
homogeneous bundles (functions, vector fields or forms being amongst the
typical examples) that are invariant under the action of the group G, have
been studied extensively. Especially on so-called Hermitian symmetric spaces,
which arise through a 1-grading of the Lie algebra of G, there exists a
complete classification of first order invariant linear differential
operators, even on more general manifolds (that allow a so-called almost
Hermitian structure).
This talk will introduce the notion of an invariant bilinear differential
pairing between sections of the aforementioned homogeneous bundles. Moreover
we will discuss a classification (excluding certain totally degenerate
cases) of all first order invariant bilinear differential pairings on
manifolds with an almost Hermitian symmetric structure. The similarities and
connections with the linear operator classification will be highlighted and
discussed.


Insights into the development of the enteric nervous system and Hirschsprung's disease 15:10 Fri 24 Aug, 2007 :: G08 Mathematics building University of Adelaide :: Assoc. Prof. Kerry Landman :: Department of Mathematics and Statistics, University of Melbourne
During the development of the enteric nervous system, neural crest (NC) cells must first migrate into and colonise the entire gut from stomach to anal end. The migratory precursor NC cells change type and differentiate into neurons and glia cells. These cells form the enteric nervous system, which gives rise to normal gut function and peristaltic contraction. Failure of the NC cells to invade the whole gut results in a lack of neurons in a length of the terminal intestine. This potentially fatal condition, marked by intractable constipation, is called Hirschsprung's Disease. The interplay between cell migration, cell proliferation and embryonic gut growth is important to the success of the NC cell colonisation process.
Multiscale models are needed in order to model the different spatiotemporal scales of the NC invasion. For example, the NC invasion wave moves into unoccupied regions of the gut with a wave speed of around 40 microns per hour. New time-lapse techniques have shown that there is a web-like network structure within the invasion wave. Furthermore, within this network, individual cell trajectories vary considerably.
We have developed a population-scale model for basic rules governing NC cell invasive behaviour incorporating the important mechanisms. The model predictions were tested experimentally. Mathematical and experimental results agreed. The results provide an understanding of why many of the genes implicated in Hirschsprung's Disease influence NC population size. Our recently developed individual cell-based model also produces an invasion wave with a well-defined wave speed; in addition, however, individual cell trajectories within the invasion wave can be extracted. Further challenges in modeling the various scales of the developmental system will be discussed. 

Fermat's Last Theorem and modular elliptic curves 15:10 Wed 5 Sep, 2007 :: G08 Mathematics Building University of Adelaide :: Dr Mark Kisin
I will give a historical talk, explaining the steps by which one can deduce Fermat's Last Theorem from a statement about modular forms and elliptic curves. 

Regression: a backwards step? 13:10 Fri 7 Sep, 2007 :: Maths G08 :: Dr Gary Glonek
Most students of high school mathematics will have encountered the technique of fitting a line to data by least squares. Those who have taken a university statistics course will also have heard this method referred to as regression. However, it is not obvious from common dictionary definitions why this should be the case. For example, "reversion to an earlier or less advanced state or form". In this talk, the mathematical phenomenon that gave regression its name will be explained and will be shown to have implications in some unexpected contexts.


The Linear Algebra of Internet Search Engines 15:10 Fri 5 Oct, 2007 :: G04 Napier Building University of Adelaide :: Dr Lesley Ward :: School of Mathematics and Statistics, University of South Australia
We often want to search the web for information on a given topic. Early web-search algorithms worked by counting up the number of times the words in a query topic appeared on each webpage. If the topic words appeared often on a given page, that page was ranked highly as a source of information on that topic.
More recent algorithms rely on Link Analysis. People make judgments about how useful a given page is for a given topic, and they express these judgments through the hyperlinks they choose to put on their own webpages. Link-analysis algorithms aim to mine the collective wisdom encoded in the resulting network of links.
I will discuss the linear algebra that forms the common underpinning of three link-analysis algorithms for web search. I will also present some work on refining one such algorithm, Kleinberg's HITS algorithm.
This is joint work with Joel Miller, Greg Rae, Fred Schaefer, Ayman Farahat, Tom LoFaro, Tracy Powell, Estelle Basor, and Kent Morrison. It originated in a Mathematics Clinic project at Harvey Mudd College. 
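As a rough sketch of the iteration behind HITS (the toy graph and node names below are invented, and production implementations normalise to unit length rather than unit sum):

```python
# Minimal HITS power iteration on a toy web graph.
links = {              # page -> list of pages it links to
    "a": ["d", "b"],
    "b": ["d"],
    "c": ["d", "a"],
    "d": [],
}

hub = {p: 1.0 for p in links}
auth = {p: 1.0 for p in links}

for _ in range(50):
    # authority score: sum of hub scores of pages that link in
    auth = {p: sum(hub[q] for q in links if p in links[q]) for p in links}
    # hub score: sum of authority scores of pages linked to
    hub = {p: sum(auth[q] for q in links[p]) for p in links}
    # normalise so the iteration converges
    sa, sh = sum(auth.values()), sum(hub.values())
    auth = {p: v / sa for p, v in auth.items()}
    hub = {p: v / sh for p, v in hub.items()}

print(max(auth, key=auth.get))  # "d", the page everyone links to
```

The fixed point of this iteration is the leading singular vector pair of the link matrix, which is the linear-algebra connection the talk refers to.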

Global and Local stationary modelling in finance: Theory and empirical evidence 14:10 Thu 10 Apr, 2008 :: G04 Napier Building University of Adelaide :: Prof. Dominique Guégan :: Université Paris 1 Panthéon-Sorbonne
To model real data sets using second order stochastic processes requires that the data sets satisfy the second order stationarity condition. This stationarity condition concerns the unconditional moments of the process. It is in that context that most of the models developed since the sixties have been studied; we refer to the ARMA processes (Brockwell and Davis, 1988), the ARCH, GARCH and EGARCH models (Engle, 1982, Bollerslev, 1986, Nelson, 1990), the SETAR process (Lim and Tong, 1980 and Tong, 1990), the bilinear model (Granger and Andersen, 1978, Guégan, 1994), the EXPAR model (Haggan and Ozaki, 1980), the long memory process (Granger and Joyeux, 1980, Hosking, 1981, Gray, Zang and Woodward, 1989, Beran, 1994, Giraitis and Leipus, 1995, Guégan, 2000), and the switching process (Hamilton, 1988). For all these models, we get an invertible causal solution under specific conditions on the parameters; the forecast points and forecast intervals are then available.
Thus, the stationarity assumption is the basis for a general asymptotic theory for identification, estimation and forecasting. It guarantees that the increase of the sample size leads to more and more information of the same kind which is basic for an asymptotic theory to make sense.
Nonstationary modelling also has a long tradition in econometrics. It is based on the conditional moments of the data generating process. It appears mainly in the heteroscedastic and volatility models, like the GARCH and related models, and stochastic volatility processes (Ghysels, Harvey and Renault, 1997). This nonstationarity also appears in a different way with structural change models like the switching models (Hamilton, 1988), the stop-break model (Diebold and Inoue, 2001, Breidt and Hsu, 2002, Granger and Hyung, 2004) and the SETAR models, for instance. It can also be observed in linear models with time-varying coefficients (Nicholls and Quinn, 1982, Tsay, 1987).
Thus, using stationary unconditional moments suggests global stationarity for the model, but using nonstationary unconditional moments, nonstationary conditional moments, or assuming the existence of states suggests that this global stationarity fails and that we only observe locally stationary behavior.
The growing evidence of instability in the stochastic behavior of stocks, exchange rates and some economic data sets such as growth rates, characterized by volatility or by jumps in the variance or in the level of prices, forces us to question the assumption of global stationarity and its consequences for modelling, particularly for forecasting. Thus we can address several questions with respect to these remarks.
1. What kinds of nonstationarity affect the major financial and economic data sets? How to detect them?
2. Local and global stationarities: How are they defined?
3. What is the impact of evidence of nonstationarity on the statistics computed from globally nonstationary data sets?
4. How can we analyze data sets in the globally nonstationary framework? Does the asymptotic theory work in a nonstationary framework?
5. What kind of models create local stationarity instead of global stationarity? How can we use them to develop a modelling and a forecasting strategy?
These questions have begun to be discussed in the economic literature. For some of them the answers are known; for others, very few works exist. In this talk I will discuss all these problems and will propose two new strategies and models to solve them. Several interesting topics in empirical finance awaiting future research will also be discussed.
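A toy illustration of the local-versus-global distinction (entirely hypothetical data, not from the talk): white noise whose variance changes at a single break point is stationary on each half of the sample but not globally.

```python
import random

random.seed(1)
n = 4000
# variance jumps from 1 to 4 halfway through: a single structural break
series = ([random.gauss(0, 1) for _ in range(n // 2)]
          + [random.gauss(0, 2) for _ in range(n // 2)])

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

v1, v2 = var(series[:n // 2]), var(series[n // 2:])
print(round(v1, 2), round(v2, 2))  # roughly 1 and 4
```

Any statistic computed from the pooled sample (here, the overall variance) describes neither regime, which is the sense in which globally stationary modelling fails.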


Betti's Reciprocal Theorem for Inclusion and Contact Problems 15:10 Fri 1 Aug, 2008 :: G03 Napier Building University of Adelaide :: Prof. Patrick Selvadurai :: Department of Civil Engineering and Applied Mechanics, McGill University
Enrico Betti (1823-1892) is recognized in the mathematics community for his pioneering contributions to topology. An equally important contribution is his formulation of the reciprocity theorem applicable to elastic bodies that satisfy the classical equations of linear elasticity. Although James Clerk Maxwell (1831-1879) proposed a law of reciprocal displacements and rotations in 1864, the contribution of Betti is acknowledged for its underlying formal mathematical basis and generality. The purpose of this lecture is to illustrate how Betti's reciprocal theorem can be used to full advantage to develop compact analytical results for certain contact and inclusion problems in the classical theory of elasticity. Inclusion problems are encountered in a number of areas in applied mechanics ranging from composite materials to geomechanics. In composite materials, the inclusion represents an inhomogeneity that is introduced to increase either the strength or the deformability characteristics of the resulting material. In geomechanics, the inclusion represents a constructed material region, such as a ground anchor, that is introduced to provide load transfer from structural systems. Similarly, contact problems have applications ranging from the modelling of the behaviour of indentors used in materials testing to the study of foundations used to distribute loads transmitted from structures. In the study of conventional problems the inclusions and the contact regions are directly loaded and this makes their analysis quite straightforward. When the interaction is induced by loads that are placed exterior to the indentor or inclusion, the direct analysis of the problem becomes inordinately complicated both in terms of formulation of the integral equations and their numerical solution. 
It is shown by a set of selected examples that the application of Betti's reciprocal theorem leads to the development of exact closed form solutions to what would otherwise be approximate solutions achievable only through the numerical solution of a set of coupled integral equations. 

Probabilistic models of human cognition 15:10 Fri 29 Aug, 2008 :: G03 Napier Building University of Adelaide :: Dr Daniel Navarro :: School of Psychology, University of Adelaide
Over the last 15 years a fairly substantial psychological literature has developed in which human reasoning and decisionmaking is viewed as the solution to a variety of statistical problems posed by the environments in which we operate. In this talk, I briefly outline the general approach to cognitive modelling that is adopted in this literature, which relies heavily on Bayesian statistics, and introduce a little of the current research in this field. In particular, I will discuss work by myself and others on the statistical basis of how people make simple inductive leaps and generalisations, and the links between these generalisations and how people acquire word meanings and learn new concepts. If time permits, the extensions of the work in which complex concepts may be characterised with the aid of nonparametric Bayesian tools such as Dirichlet processes will be briefly mentioned. 

What on Earth is Computational Advertising? 15:10 Wed 28 Jan, 2009 :: Napier G03 :: Dr John Tomlin :: Yahoo! Research Labs
This talk will begin with a brief introduction to, and
overview of, the topic we have come to call "computational advertising",
by which we mean the algorithmic techniques useful for the optimal
placement, scheduling and context of online advertisements. Such
advertisements encompass a large and growing fraction of the advertising
industry, and, in the forms of display advertising, content match, and
search marketing, bring in a large fraction of the income derived from
the web. In addition to the overview, we give two examples of
optimization models applied to problems in sponsored search and display
advertising. 
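As a cartoon of the kind of optimization involved (the numbers and the bid-times-CTR ranking rule below are illustrative assumptions, not the models from the talk):

```python
# Rank candidate ads for one query by expected revenue per impression,
# bid * estimated click-through rate, and fill the available slots.
ads = [                      # (name, bid per click, estimated CTR)
    ("ad1", 2.00, 0.03),
    ("ad2", 0.50, 0.10),
    ("ad3", 1.00, 0.04),
]
slots = 2
ranked = sorted(ads, key=lambda a: a[1] * a[2], reverse=True)
chosen = [name for name, bid, ctr in ranked[:slots]]
print(chosen)  # ['ad1', 'ad2']
```

The real formulations add budgets, reserve prices and scheduling constraints, which is what turns this from a sort into a genuine optimization problem.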

Multiscale tools for interpreting cell biology data 15:10 Fri 17 Apr, 2009 :: Napier LG29 :: Dr Matthew Simpson :: University of Melbourne
Trajectory data from observations of a random walk process are often used to characterize macroscopic transport coefficients and to infer motility mechanisms in cell biology. New continuum equations describing the average moments of the position of an individual agent in a population of interacting agents are derived and validated. Unlike standard non-interacting random walks, the new moment equations explicitly represent the interactions between agents as they are coupled to the macroscopic agent density. Key issues associated with the validity of the new continuum equations and the interpretation of experimental data will be explored. 
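The link between trajectory data and a macroscopic transport coefficient can already be seen for a non-interacting walk; a hypothetical sketch (not the talk's interacting model):

```python
import random

random.seed(2)
# For an unbiased +/-1 random walk the mean-squared displacement grows
# linearly, MSD(t) ~ 2 D t, so the slope estimates the diffusivity D.
steps, walkers = 500, 2000
finals = []
for _ in range(walkers):
    x = 0
    for _ in range(steps):
        x += random.choice((-1, 1))
    finals.append(x)

msd = sum(x * x for x in finals) / walkers
print(round(msd / steps, 2))  # estimate of 2D; the exact value is 1.0
```

The talk's point is that once agents interact, this simple picture breaks and the moment equations become coupled to the agent density.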

Curved pipe flow and its stability 15:10 Fri 11 Sep, 2009 :: Badger Labs G13 Macbeth Lecture Theatre :: Dr Richard Clarke :: University of Auckland
The unsteady flow of a viscous fluid through a curved pipe is a widely occurring and well studied problem. The stability of such flows, however, has largely been overlooked; this is in marked contrast to flow through a straight pipe, examination of which forms a cornerstone of hydrodynamic stability theory. Importantly, however, flow through a curved pipe exhibits an array of flow structures that are simply not present in the zero curvature limit, and it is natural to expect these to substantially impact upon the flow's stability. By considering two very different kinds of flows through a curved pipe, we illustrate that this can indeed be the case. 

Understanding hypersurfaces through tropical geometry 12:10 Fri 25 Sep, 2009 :: Napier 102 :: Dr Mohammed Abouzaid :: Massachusetts Institute of Technology
Given a polynomial in two or more variables, one may study the
zero locus from the point of view of different mathematical subjects
(number theory, algebraic geometry, ...). I will explain how tropical
geometry allows one to encode all topological aspects by elementary
combinatorial objects called "tropical varieties."
Mohammed Abouzaid received a B.S. in 2002 from the University of Richmond, and a Ph.D. in 2007 from the University of Chicago under the supervision of Paul Seidel. He is interested in symplectic topology and its interactions with algebraic geometry and differential topology, in particular the homological mirror symmetry conjecture. Since 2007 he has been a postdoctoral fellow at MIT, and a Clay Mathematics Institute Research Fellow. 

Contemporary frontiers in statistics 15:10 Mon 28 Sep, 2009 :: Badger Labs G31 Macbeth Lecture Theatre :: Prof. Peter Hall :: University of Melbourne
The availability of powerful computing equipment has had a dramatic impact on statistical methods and thinking, changing forever the way data are analysed. New data types, larger quantities of data, and new classes of research problem are all motivating new statistical methods. We shall give examples of each of these issues, and discuss the current and future directions of frontier problems in statistics. 

Analytic torsion for twisted de Rham complexes 13:10 Fri 30 Oct, 2009 :: School Board Room :: Prof Mathai Varghese :: University of Adelaide
We define analytic torsion for the twisted de Rham complex, consisting of differential forms on a compact Riemannian manifold X with coefficients in a flat vector bundle E, with a differential given by a flat connection on E plus a closed odd degree differential form on X. The definition in our case is more complicated than in the case discussed by Ray-Singer, as it uses pseudodifferential operators. We show that this analytic torsion is independent of the choice of metrics on X and E, establish some basic functorial properties, and compute it in many examples. We also establish the relationship of an invariant version of analytic torsion for T-dual circle bundles with closed 3-form flux. This is joint work with Siye Wu. 
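For orientation, the classical Ray-Singer definition that the twisted torsion generalises can be written as

```latex
% Ray-Singer analytic torsion of (X, E): a zeta-regularised combination
% of determinants of the Laplacians \Delta_q on E-valued q-forms,
\log T(X,E) \;=\; \frac{1}{2}\sum_{q=0}^{\dim X} (-1)^q \, q \,
\zeta_{\Delta_q}'(0),
\qquad
\zeta_{\Delta_q}(s) \;=\; \sum_{\lambda > 0} \lambda^{-s},
```

where the zeta sum runs over the positive eigenvalues of $\Delta_q$. In the twisted setting described above, the de Rham differential is replaced by the flat connection plus the closed odd-degree form, which is what forces the use of pseudodifferential operators.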

Critical sets of products of linear forms 13:10 Mon 14 Dec, 2009 :: School Board Room :: Dr Graham Denham :: University of Western Ontario, Canada
Suppose $f_1,f_2,\ldots,f_n$ are linear polynomials in $\ell$
variables and $\lambda_1,\lambda_2,\ldots,\lambda_n$ are nonzero complex numbers. The product
$$
\Phi_\lambda=\prod_{i=1}^n f_i^{\lambda_i},
$$
called a master function,
defines a (multivalued) function on $\ell$-dimensional complex space, or more precisely, on the complement of a set of hyperplanes. Then it is easy to ask (but harder to answer) what the set of critical points of a master function looks like, in terms of some properties of the input polynomials and $\lambda_i$'s.
In my talk I will describe the motivation for considering such a question. Then I will indicate how the geometry and combinatorics of hyperplane arrangements can be used to provide at least a partial answer. 
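The smallest nontrivial instance (my example, not the speaker's): take $f_1 = x$, $f_2 = x-1$ and $\lambda_1 = \lambda_2 = 1$, so $\Phi_\lambda = x(x-1)$. Critical points satisfy $\lambda_1/f_1 + \lambda_2/f_2 = 0$, which here gives the single point $x = 1/2$; the sketch below recovers it numerically.

```python
# Critical points of log Phi for Phi = x(x-1):
#   d/dx log Phi = 1/x + 1/(x-1) = 0  on the complement of {0, 1}.
def dlog_phi(x):
    return 1.0 / x + 1.0 / (x - 1.0)

# dlog_phi is strictly decreasing on (0, 1), so bisection applies
lo, hi = 0.1, 0.9
for _ in range(60):
    mid = (lo + hi) / 2.0
    if dlog_phi(mid) > 0:
        lo = mid
    else:
        hi = mid

print(mid)  # 0.5 up to bisection tolerance
```

With n hyperplanes in general position the count of such critical points is governed by the combinatorics of the arrangement, which is the partial answer the talk describes.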

Hartogs-type holomorphic extensions 13:10 Tue 15 Dec, 2009 :: School Board Room :: Prof Roman Dwilewicz :: Missouri University of Science and Technology
We will review holomorphic extension problems starting with the famous Hartogs extension theorem (1906), via Severi-Kneser-Fichera-Martinelli theorems, up to some recent (partial) results of Al Boggess (Texas A&M Univ.), Zbigniew Slodkowski (Univ. Illinois at Chicago), and the speaker. The holomorphic extension problems for holomorphic or Cauchy-Riemann functions are fundamental problems in complex analysis of several variables. The talk will be very elementary, with many figures, and accessible to graduate and even advanced undergraduate students. 

A solution to the GromovVaserstein problem 15:10 Fri 29 Jan, 2010 :: Engineering North N 158 Chapman Lecture Theatre :: Prof Frank Kutzschebauch :: University of Berne, Switzerland
Any matrix in $SL_n (\mathbb C)$ can be written as a product of elementary matrices using the Gauss elimination process. If instead of the field of complex numbers, the entries in the matrix are elements of a more general ring, this becomes a delicate question. In particular, rings of complexvalued functions on a space are interesting cases. A deep result of Suslin gives an affirmative answer for the polynomial ring in $m$ variables in case the size $n$ of the matrix is at least 3. In the topological category, the problem was solved by Thurston and Vaserstein. For holomorphic functions on $\mathbb C^m$, the problem was posed by Gromov in the 1980s. We report on a complete solution to Gromov's problem. A main tool is the Oka-Grauert-Gromov h-principle in complex analysis. Our main theorem can be formulated as follows: In the absence of obvious topological obstructions, the Gauss elimination process can be performed in a way that depends holomorphically on the matrix. This is joint work with Björn Ivarsson. 
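For a concrete feel for the elementary-matrix factorisation over $\mathbb{C}$ itself, here is the standard $2 \times 2$ identity (a textbook fact, with an invented example matrix): any $A \in SL_2$ with $c \neq 0$ equals $E_{12}((a-1)/c)\,E_{21}(c)\,E_{12}((d-1)/c)$.

```python
# Verify the three-shear factorisation of an SL_2 matrix with c != 0.
def mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def E12(t):          # upper-triangular elementary (shear) matrix
    return [[1, t], [0, 1]]

def E21(t):          # lower-triangular elementary (shear) matrix
    return [[1, 0], [t, 1]]

a, b, c, d = 2, 3, 1, 2                      # det = a*d - b*c = 1
A = mul(mul(E12((a - 1) / c), E21(c)), E12((d - 1) / c))
print(A)  # numerically equal to [[2, 3], [1, 2]], i.e. (a b; c d)
```

Gromov's question is whether the parameters of such shears can be chosen to depend holomorphically on the matrix, which is exactly what the theorem above answers.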

Some unusual uses of usual symmetries and some usual uses of unusual symmetries 12:10 Wed 10 Mar, 2010 :: School board room :: Prof Phil Broadbridge :: La Trobe University
Ever since Sophus Lie around 1880, continuous groups of invariance transformations have been used to reduce variables and to construct special solutions of PDEs. I will outline the general ideas, then show some variations on the usual reduction algorithm that I have used to solve some practical nonlinear boundary value problems. Applications include soilwater flow, metal surface evolution and population genetics. 
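A textbook instance of the reduction algorithm (a standard example, not one of the talk's applications): the heat equation $u_t = u_{xx}$ is invariant under the scaling $(x, t) \mapsto (\lambda x, \lambda^2 t)$, and the invariant (similarity) variable reduces the PDE to an ODE:

```latex
% scaling-invariant combination and the reduced equation
z = \frac{x}{\sqrt{t}}, \qquad u(x,t) = F(z)
\;\Longrightarrow\;
F''(z) + \tfrac{z}{2}\,F'(z) = 0,
\qquad F(z) = c_1 \operatorname{erf}\!\left(\tfrac{z}{2}\right) + c_2 .
```

The nonlinear boundary value problems mentioned above follow the same pattern: find a symmetry, build its invariants, and solve the reduced lower-dimensional problem.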

American option pricing in a Markov chain market model 15:10 Fri 19 Mar, 2010 :: School Board Room :: Prof Robert Elliott :: School of Mathematical Sciences, University of Adelaide
This paper considers a model for asset pricing in a world where
the randomness is modeled by a Markov chain rather than Brownian motion.
In this paper we develop a theory of optimal stopping and related
variational inequalities for American options in this model. A version of
Saigal's Lemma is established and numerical algorithms developed.
This is joint work with John van der Hoek. 
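In a Markov chain market the American pricing problem becomes a finite optimal stopping problem; a schematic value iteration (with invented payoffs, transition matrix and discount factor, not the paper's construction) looks like:

```python
# Value iteration for optimal stopping on a 3-state chain:
#   V(i) = max( payoff(i), beta * sum_j P[i][j] * V(j) ).
payoff = [0.0, 1.0, 3.0]         # immediate exercise value per state
P = [[0.5, 0.5, 0.0],            # transition probabilities
     [0.3, 0.4, 0.3],
     [0.0, 0.5, 0.5]]
beta = 0.9                       # one-period discount factor

V = payoff[:]
for _ in range(200):
    cont = [beta * sum(P[i][j] * V[j] for j in range(3)) for i in range(3)]
    V = [max(payoff[i], cont[i]) for i in range(3)]

print([round(v, 3) for v in V])  # exercise is optimal only in state 2
```

The iteration is a contraction for beta < 1, so it converges to the value function; the variational inequality in the talk is the continuous-time analogue of the max above.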

Estimation of sparse Bayesian networks using a score-based approach 15:10 Fri 30 Apr, 2010 :: School Board Room :: Dr Jessica Kasza :: University of Copenhagen
The estimation of Bayesian networks given high-dimensional data sets, with more variables than there are observations, has been the focus of much recent research. These structures provide a flexible framework for the representation of the conditional independence relationships of a set of variables, and can be particularly useful in the estimation of genetic regulatory networks given gene expression data.
In this talk, I will discuss some new research on learning sparse networks, that is, networks with many conditional independence restrictions, using a score-based approach. In the case of genetic regulatory networks, such sparsity reflects the view that each gene is regulated by relatively few other genes. The presented approach allows prior information about the overall sparsity of the underlying structure to be included in the analysis, as well as the incorporation of prior knowledge about the connectivity of individual nodes within the network.


Interpolation of complex data using spatiotemporal compressive sensing 13:00 Fri 28 May, 2010 :: Santos Lecture Theatre :: A/Prof Matthew Roughan :: School of Mathematical Sciences, University of Adelaide
Many complex datasets suffer from missing data, and interpolating these missing
elements is a key task in data analysis. Moreover, it is often the case that we
see only a linear combination of the desired measurements, not the measurements
themselves. For instance, in network management, it is easy to count the traffic
on a link, but harder to measure the end-to-end flows. Additionally, typical
interpolation algorithms treat the spatial or the temporal components of the
data separately, but many real datasets have strong
spatiotemporal structure that we would like to exploit in reconstructing the
missing data. In this talk I will describe a novel reconstruction algorithm that
exploits concepts from the growing area of compressive sensing to solve all of
these problems and more. The approach works so well on Internet traffic matrices
that we can obtain a reasonable reconstruction with as much as 98% of the
original data missing. 
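To see why low-rank structure makes such interpolation possible at all, consider an exactly rank-one "traffic matrix" (a toy stand-in, not the algorithm from the talk): any missing entry is determined by three observed ones.

```python
# A rank-one matrix satisfies M[i][j] = M[i][k] * M[l][j] / M[l][k],
# so one missing entry can be filled from a same-row observation, a
# same-column observation and their "corner".
r = [1.0, 2.0, 4.0]                      # row factors (invented)
c = [3.0, 5.0, 7.0]                      # column factors (invented)
M = [[ri * cj for cj in c] for ri in r]  # rank-one matrix

i, j = 1, 2                              # pretend M[1][2] is unobserved
k, l = 0, 0                              # an observed column k and row l
estimate = M[i][k] * M[l][j] / M[l][k]
print(estimate, M[i][j])  # 14.0 14.0
```

Real traffic matrices are only approximately low rank and the observations are linear combinations, which is where the compressive-sensing machinery comes in.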

A variance constraining ensemble Kalman filter: how to improve forecasts using climatic data of unobserved variables 15:10 Fri 28 May, 2010 :: Santos Lecture Theatre :: A/Prof Georg Gottwald :: The University of Sydney
Data assimilation aims to solve one of the fundamental problems of numerical
weather prediction: estimating the optimal state of the
atmosphere given a numerical model of the dynamics, and sparse, noisy
observations of the system. A standard tool in attacking this
filtering problem is the Kalman filter.
We consider the problem when only partial observations are available.
In particular we consider the situation where the observational space
consists of variables which are directly observable with known
observational error, and of variables of which only their climatic
variance and mean are given. We derive the corresponding Kalman
filter in a variational setting.
We analyze the variance constraining Kalman filter (VCKF) for
a simple linear toy model and determine its range of optimal
performance. We explore the variance constraining Kalman filter in an
ensemble transform setting for the Lorenz-96 system, and show that
incorporating the information on the variance on some unobservable
variables can improve the skill and also increase the stability of
the data assimilation procedure.
Using methods from dynamical systems theory we then study systems where the
unobserved variables evolve deterministically but chaotically on a
fast time scale.
This is joint work with Lewis Mitchell and Sebastian Reich.
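For readers new to filtering, the scalar Kalman analysis step that these ensemble variants build on is (with invented numbers):

```python
# One scalar Kalman update: combine a forecast (prior) with an
# observation, weighting each by the inverse of its error variance.
x_prior, P_prior = 0.0, 4.0    # forecast mean and its error variance
y, R = 2.0, 1.0                # observation and its error variance

K = P_prior / (P_prior + R)             # Kalman gain in [0, 1]
x_post = x_prior + K * (y - x_prior)    # analysis mean, pulled toward y
P_post = (1.0 - K) * P_prior            # analysis variance, reduced

print(K, x_post, round(P_post, 3))  # 0.8 1.6 0.8
```

The variance constraint of the talk adds, in effect, an extra "observation" of the climatic variance for the variables that are never directly measured.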


Some thoughts on wine production 15:05 Fri 18 Jun, 2010 :: School Board Room :: Prof Zbigniew Michalewicz :: School of Computer Science, University of Adelaide
In the modern information era, managers (e.g. winemakers) recognize the
competitive opportunities represented by decision-support tools, which can
provide significant cost savings and revenue increases for their businesses.
Wineries make daily decisions on the processing of grapes, from harvest time
(prediction of maturity of grapes, scheduling of equipment and labour, capacity
planning, scheduling of crushers) through tank farm activities (planning and
scheduling of wine and juice transfers on the tank farm) to packaging processes
(bottling and storage activities). As such operation is quite complex, the whole
area is loaded with interesting ORrelated issues. These include the issues of
global vs. local optimization, relationship between prediction and optimization,
operating in dynamic environments, strategic vs. tactical optimization, and
multiobjective optimization & tradeoff analysis. During the talk we address
the above issues; a few realworld applications will be shown and discussed to
emphasize some of the presented material. 

Dr 10:10 Thu 5 Aug, 2010 :: 10 Pulteney St :: Gary Glonek :: University of Adelaide
Gary will introduce Statistics without Parameters. 

Counting lattice points in polytopes and geometry 15:10 Fri 6 Aug, 2010 :: Napier G04 :: Dr Paul Norbury :: University of Melbourne
Counting lattice points in polytopes arises in many areas of pure and applied mathematics. A basic counting problem is this: how many different ways can one give change of 1 dollar into 5, 10, 20 and 50 cent coins? This problem counts lattice points in a tetrahedron, and if there also must be exactly 10 coins then it counts lattice points in a triangle. The number of lattice points in polytopes can be used to measure the robustness of a computer network, or in statistics to test independence of characteristics of samples. I will describe the general structure of lattice point counts and the difficulty of calculations. I will then describe a particular lattice point count in which the structure simplifies considerably allowing one to calculate easily. I will spend a brief time at the end describing how this is related to the moduli space of Riemann surfaces. 
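The coin-change count in the abstract can be checked directly by brute-force enumeration; this sketch (not part of the talk) counts the lattice points, optionally slicing to a fixed number of coins.

```python
def count_change(amount, coins, exact_coins=None):
    """Number of ways to write `amount` as a non-negative integer combination
    of `coins` (lattice points in a simplex); if `exact_coins` is given,
    restrict to solutions using exactly that many coins (a lower-dimensional slice)."""
    def rec(remaining, idx, used):
        if remaining == 0:
            return 1 if exact_coins is None or used == exact_coins else 0
        if idx == len(coins):
            return 0
        c, total, k = coins[idx], 0, 0
        while k * c <= remaining:
            total += rec(remaining - k * c, idx + 1, used + k)
            k += 1
        return total
    return rec(amount, 0, 0)

# Change for 1 dollar in 5, 10, 20 and 50 cent coins (lattice points in a
# tetrahedron), and the same count restricted to exactly 10 coins (a triangle).
print(count_change(100, [5, 10, 20, 50]))
print(count_change(100, [5, 10, 20, 50], exact_coins=10))
```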

The two envelope problem 12:10 Wed 11 Aug, 2010 :: Napier 210 :: A/Prof Gary Glonek :: University of Adelaide
The two envelope problem is a long-standing paradox in
probability theory. Although its formulation has elements in common
with the celebrated Monty Hall problem, the underlying paradox is
apparently far more subtle. In this talk, the problem will be
explained and various aspects of the paradox will be discussed.
Connections to Bayesian inference and other areas of statistics will
be explored. 
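One sanity check of the paradox (illustrative only, not from the talk): if the envelope pair (x, 2x) is drawn from any fixed finite prior, exhaustive enumeration shows that always switching and always keeping have identical expected payoff.

```python
from fractions import Fraction

def expected_values(amounts):
    """Enumerate every outcome: a pair (x, 2x) with x uniform on `amounts`,
    and a uniformly chosen first envelope. Compare keeping with switching."""
    keep = switch = Fraction(0)
    n = 0
    for x in amounts:
        for picked, other in ((x, 2 * x), (2 * x, x)):
            keep, switch, n = keep + picked, switch + other, n + 1
    return keep / n, switch / n

keep, switch = expected_values([1, 2, 4, 8])
print(keep, switch)  # the two expectations coincide
```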

A spatial-temporal point process model for fine resolution multi-site rainfall data from Roma, Italy 14:10 Thu 19 Aug, 2010 :: Napier G04 :: A/Prof Paul Cowpertwait :: Auckland University of Technology
A point process rainfall model is further developed that has storm origins occurring in space-time according to a Poisson process. Each storm origin has a random radius so that storms occur as circular regions in two-dimensional
space, where the storm radii are taken to be independent exponential random
variables. Storm origins are of random type z, where z follows a continuous
probability distribution. Cell origins occur in a further spatial Poisson
process and have arrival times that follow a Neyman-Scott point process. Cell
origins have random radii so that cells form discs in two-dimensional space.
Statistical properties up to third order are derived and used to fit the model
to 10 min series taken from 23 sites across the Roma region, Italy.
Distributional properties of the observed annual maxima are compared to
equivalent values sampled from series that are simulated using the fitted
model. The results indicate that the model will be of use in urban drainage
projects for the Roma region.
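The storm-origin mechanism described above (Poisson origins, exponential radii) is straightforward to simulate; the following minimal sketch uses made-up parameter values and is not the fitted model from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_storms(rate, region, mean_radius):
    """Storm origins: homogeneous Poisson process on a rectangular region;
    each origin gets an independent exponential radius, giving circular storms."""
    (x0, x1), (y0, y1) = region
    n = rng.poisson(rate * (x1 - x0) * (y1 - y0))   # Poisson number of storms
    xs = rng.uniform(x0, x1, n)                     # origins uniform given n
    ys = rng.uniform(y0, y1, n)
    radii = rng.exponential(mean_radius, n)
    return np.column_stack([xs, ys, radii])         # one row per storm: x, y, r

storms = simulate_storms(rate=0.5, region=((0.0, 10.0), (0.0, 10.0)), mean_radius=2.0)
```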


Compound and constrained regression analyses for EIV models 15:05 Fri 27 Aug, 2010 :: Napier LG28 :: Prof Wei Zhu :: State University of New York at Stony Brook
In linear regression analysis, randomness often exists in the independent variables and the resulting models are referred to as errors-in-variables (EIV) models. The existing general EIV modeling framework, the structural model approach, is parametric and dependent on the usually unknown underlying distributions. In this work, we introduce a general nonparametric EIV modeling framework, the compound regression analysis, featuring an intuitive geometric representation and a one-to-one correspondence to the structural model. Properties, examples and further generalizations of this new modeling approach are discussed in this talk. 

Simultaneous confidence band and hypothesis test in generalised varying-coefficient models 15:05 Fri 10 Sep, 2010 :: Napier LG28 :: Prof Wenyang Zhang :: University of Bath
Generalised varying-coefficient models (GVC) are very important
models, and there is a considerable literature addressing them.
However, most of the existing literature is devoted to the estimation
procedure. In this talk, I will systematically investigate the statistical
inference for GVC, including confidence bands as well as hypothesis tests. I
will show the asymptotic distribution of the maximum discrepancy between the
estimated functional coefficient and the true functional coefficient. I will
compare different approaches for the construction of confidence band and
hypothesis test. Finally, the proposed statistical inference methods are used to
analyse the data from China about contraceptive use there, which leads to some
interesting findings. 

Principal Component Analysis Revisited 15:10 Fri 15 Oct, 2010 :: Napier G04 :: Assoc. Prof Inge Koch :: University of Adelaide
Since the beginning of the 20th century, Principal Component Analysis (PCA) has been an important tool in the analysis of multivariate data. The principal components summarise data in fewer than the original number of variables without losing essential information, and thus allow a split of the data into signal and noise components. PCA is a linear method, based on elegant mathematical theory.
The increasing complexity of data together with the emergence of fast computers in the later parts of the 20th century has led to a renaissance of PCA. The growing numbers of variables (in particular, high-dimensional low sample size problems), non-Gaussian data, and functional data (where the data are curves) are posing exciting challenges to statisticians, and have resulted in new research which extends the classical theory.
I begin with the classical PCA methodology and illustrate the challenges presented by the complex data that we are now able to collect. The main part of the talk focuses on extensions of PCA: the duality of PCA and the Principal Coordinates of Multidimensional Scaling, Sparse PCA, and consistency results relating to principal components, as the dimension grows. We will also look at newer developments such as Principal Component Regression and Supervised PCA, nonlinear PCA and Functional PCA.
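As background to the classical methodology, PCA reduces to a singular value decomposition of the centred data matrix; a minimal sketch (illustrative, not from the talk):

```python
import numpy as np

def pca(X, k):
    """Classical PCA via SVD of the centred data: rows of `components` are the
    top-k directions; `var` holds the sample variances along them (descending)."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]
    scores = Xc @ components.T
    var = s[:k] ** 2 / (len(X) - 1)
    return scores, components, var

# Toy data with one dominant direction of variation.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5)) * np.array([3.0, 2.0, 1.0, 0.5, 0.1])
scores, components, var = pca(X, 2)
```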


TBA 15:05 Fri 22 Oct, 2010 :: Napier LG28 :: Dr Andy Lian :: University of Adelaide


Arbitrage bounds for weighted variance swap prices 15:05 Fri 3 Dec, 2010 :: Napier LG28 :: Prof Mark Davis :: Imperial College London
This paper builds on earlier work by Davis and Hobson (Mathematical Finance,
2007) giving model-free (except for a 'frictionless markets' assumption)
necessary and sufficient conditions for absence of arbitrage given a set of
current-time put and call options on some underlying asset. Here we suppose
that the prices of a set of put options, all maturing at the same time, are
given and satisfy the conditions for consistency with absence of arbitrage.
We now add a path-dependent option, specifically a weighted variance swap, to
the set of traded assets and ask what are the conditions on its time-0 price
under which consistency with absence of arbitrage is maintained. In the present
work, we work under the extra modelling assumption that the underlying asset
price process has continuous paths. In general, we find that there is always a
non-trivial lower bound to the range of arbitrage-free prices, but only in the
case of a corridor swap do we obtain a finite upper bound. In the case of, say,
the vanilla variance swap, a finite upper bound exists when there are additional
traded European options which constrain the left wing of the volatility
surface in appropriate ways. 

Queues with skill-based routing under FCFS–ALIS regime 15:10 Fri 11 Feb, 2011 :: B17 Ingkarni Wardli :: Prof Gideon Weiss :: The University of Haifa, Israel
We consider a system where jobs of several types are served by servers
of several types, and a bipartite graph between server types and job types
describes feasible assignments. This is a common situation in manufacturing,
call centers with skill-based routing, matching of parent-child in adoption or
matching in kidney transplants, etc. We consider the case of the first come first
served policy: jobs are assigned to the first available feasible server in
order of their arrivals. We consider two types of policies for assigning
customers to idle servers: a random assignment and assignment to the longest
idle server (ALIS). We survey some results for four different situations:
- For a loss system we find conditions for reversibility and insensitivity.
- For a manufacturing type system, in which there is enough capacity to serve all jobs, we discuss a product form solution and waiting times.
- For an infinite matching model, in which an infinite sequence of customers of IID types and an infinite sequence of servers of IID types are matched according to first come first served, we obtain a product form stationary distribution for this system, which we use to calculate matching rates.
- For a call center model with overload and abandonments we make some plausible observations.
This talk surveys joint work with Ivo Adan, Rene Caldentey, Cor Hurkens, Ed
Kaplan and Damon Wischik, as well as work by Jeremy Visschers, Rishy Talreja and
Ward Whitt.


Mathematical modelling in nanotechnology 15:10 Fri 4 Mar, 2011 :: 7.15 Ingkarni Wardli :: Prof Jim Hill :: University of Adelaide
In this talk we present an overview of the mathematical modelling contributions of the Nanomechanics Groups at the Universities of Adelaide and Wollongong. Fullerenes and carbon nanotubes have unique properties, such as low weight, high strength, flexibility, high thermal conductivity and chemical stability, and they have many potential applications in nanodevices. In this talk we first present some new results on the geometric structure of carbon nanotubes and on related nanostructures. One concept that has attracted much attention is the creation of nano-oscillators, to produce frequencies in the gigahertz range, for applications such as ultrafast optical filters and nanoantennae. The sliding of an inner shell inside an outer shell of a multi-walled carbon nanotube can generate oscillatory frequencies up to several gigahertz, and the shorter the inner tube the higher the frequency. A C60-nanotube oscillator generates high frequencies by oscillating a C60 fullerene inside a single-walled carbon nanotube. Here we discuss the underlying mechanisms of nano-oscillators, and use the Lennard-Jones potential together with the continuum approach to mathematically model the C60-nanotube nano-oscillator. Finally, three illustrative examples of recent modelling in hydrogen storage, nanomedicine and nanocomputing are discussed. 
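For context (a toy calculation, not the group's models): the Lennard-Jones 12-6 potential used in such continuum modelling has its well minimum at r = 2^(1/6) sigma with depth -epsilon, which a direct grid search recovers.

```python
import numpy as np

def lennard_jones(r, epsilon=1.0, sigma=1.0):
    """The 12-6 Lennard-Jones potential V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

# Locate the minimum numerically; analytically it sits at r = 2**(1/6) * sigma.
r = np.linspace(0.9, 3.0, 10001)
r_min = r[np.argmin(lennard_jones(r))]
```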

Bioinspired computation in combinatorial optimization: algorithms and their computational complexity 15:10 Fri 11 Mar, 2011 :: 7.15 Ingkarni Wardli :: Dr Frank Neumann :: The University of Adelaide
Bioinspired computation methods, such as evolutionary algorithms and ant colony
optimization, are being applied successfully to complex engineering and
combinatorial optimization problems. The computational complexity analysis of
this type of algorithms has significantly increased the theoretical
understanding of these successful algorithms. In this talk, I will give an
introduction into this field of research and present some important results
that we achieved for problems from combinatorial optimization. These results
can also be found in my recent textbook "Bioinspired Computation in
Combinatorial Optimization  Algorithms and Their Computational Complexity". 

Classification for highdimensional data 15:10 Fri 1 Apr, 2011 :: Conference Room Level 7 Ingkarni Wardli :: Associate Prof Inge Koch :: The University of Adelaide
For twoclass classification problems Fisher's discriminant rule performs
well in many scenarios provided the dimension, d, is much smaller than the sample
size n. As the dimension increases, Fisher's rule may no longer be
adequate, and can perform as poorly as random guessing.
In this talk we look at new ways of overcoming this poor performance for
high-dimensional data by suitably modifying Fisher's rule, and in particular
we describe the 'Features Annealed Independence Rule' (FAIR) of Fan and Fan
(2008) and a rule based on canonical correlation analysis. I describe some
theoretical developments, and also show analyses of data which illustrate the
performance of these modified rules. 
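For reference, Fisher's two-class rule mentioned above can be written down in a few lines when the pooled covariance is invertible (so d well below n); a minimal sketch, not the modified rules discussed in the talk:

```python
import numpy as np

def fisher_rule(X0, X1):
    """Fisher's two-class discriminant: w = S^{-1}(m1 - m0) with S the pooled
    sample covariance; classify x to class 1 when w.(x - midpoint) > 0."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    S = ((len(X0) - 1) * np.cov(X0, rowvar=False)
         + (len(X1) - 1) * np.cov(X1, rowvar=False)) / (len(X0) + len(X1) - 2)
    w = np.linalg.solve(S, m1 - m0)   # requires S nonsingular, i.e. d << n
    mid = (m0 + m1) / 2
    return lambda x: int(w @ (x - mid) > 0)

# Two well-separated classes in the plane.
X0 = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # class 0
X1 = X0 + 5.0                                                    # class 1
classify = fisher_rule(X0, X1)
```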

Spherical tube hypersurfaces 13:10 Fri 8 Apr, 2011 :: Mawson 208 :: Prof Alexander Isaev :: Australian National University
We consider smooth real hypersurfaces in a complex vector space. Specifically, we are interested in tube hypersurfaces, i.e., hypersurfaces represented as the direct product of the imaginary part of the space and hypersurfaces lying in its real part. Tube hypersurfaces arise, for instance, as the boundaries of tube domains. The study of tube domains is a classical subject in several complex variables and complex geometry, which goes back to the beginning of the 20th century. Indeed, already Siegel found it convenient to realise certain symmetric domains as tubes.
One can endow a tube hypersurface with a so-called CR-structure, which is the remnant of the complex structure on the ambient vector space. We impose on the CR-structure the condition of sphericity. One way to state this condition is to require a certain curvature (called the CR-curvature of the hypersurface) to vanish identically. Spherical tube hypersurfaces possess remarkable properties and are of interest from both the complex-geometric and affine-geometric points of view. In my talk I will give an overview of the theory of such hypersurfaces. In particular, I will mention an algebraic construction arising from this theory that has applications in abstract commutative algebra and singularity theory. I will speak about these applications in detail in my colloquium talk later today. 

On parameter estimation in population models 15:10 Fri 6 May, 2011 :: 715 Ingkarni Wardli :: Dr Joshua Ross :: The University of Adelaide
Essential to applying a mathematical model to a realworld application is
calibrating the model to data. Methods for calibrating population models
often become computationally infeasible when the population size (more generally
the size of the state space) becomes large, or other complexities, such as
time-dependent transition rates or sampling error, are present. Here we
will discuss the use of diffusion approximations to perform estimation in several
scenarios, with successively reduced assumptions: (i) under the assumption
of stationarity (the process had been evolving for a very long time with
constant parameter values); (ii) transient dynamics (the assumption of stationarity
is invalid, and thus only constant parameter values may be assumed); and, (iii)
time-inhomogeneous chains (the parameters may vary with time) and accounting
for observation error (a sample of the true state is observed). 

When statistics meets bioinformatics 12:10 Wed 11 May, 2011 :: Napier 210 :: Prof Patty Solomon :: School of Mathematical Sciences
Bioinformatics is a new field of research which encompasses mathematics, computer science, biology, medicine and the physical sciences. It has arisen from the need to handle and analyse the vast amounts of data being generated by the new genomics technologies. The interface of these disciplines used to be information-poor, but is now information-megarich, and statistics plays a central role in processing this information and making it intelligible. In this talk, I will describe a published bioinformatics study which claimed to have developed a simple test for the early detection of ovarian cancer from a blood sample. The US Food and Drug Administration was on the verge of approving the test kits for market in 2004 when demonstrated flaws in the study design and analysis led to its withdrawal. We are still waiting for an effective early biomarker test for ovarian cancer. 

Statistical challenges in molecular phylogenetics 15:10 Fri 20 May, 2011 :: Mawson Lab G19 lecture theatre :: Dr Barbara Holland :: University of Tasmania
This talk will give an introduction to the ways that mathematics and statistics are used in the inference of evolutionary (phylogenetic) trees. Taking a model-based approach to estimating the relationships between species has proven to be enormously effective; however, some tricky statistical challenges remain. The increasingly plentiful amount of DNA sequence data is a boon, but it is also throwing a spotlight on some of the shortcomings of current best practice, particularly in how we (1) assess the reliability of our phylogenetic estimates, and (2) choose appropriate models. This talk will aim to give a general introduction to this area of research and will also highlight some results from two of my recent PhD students. 

Optimal experimental design for stochastic population models 15:00 Wed 1 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Dan Pagendam :: CSIRO, Brisbane
Markov population processes are popular models for studying a wide range of
phenomena including the spread of disease, the evolution of chemical reactions
and the movements of organisms in population networks (metapopulations). Our
ability to use these models effectively can be limited by our knowledge about
parameters, such as disease transmission and recovery rates in an epidemic.
Recently, there has been interest in devising optimal experimental designs for
stochastic models, so that practitioners can collect data in a manner that
maximises the precision of maximum likelihood estimates of the parameters for
these models. I will discuss some recent work on optimal design for a variety
of population models, beginning with some simple oneparameter models where the
optimal design can be obtained analytically and moving on to more complicated
multiparameter models in epidemiology that involve latent states and
non-exponentially distributed infectious periods. For these more complex
models, the optimal design must be arrived at using computational methods and we
rely on a Gaussian diffusion approximation to obtain analytical expressions for
Fisher's information matrix, which is at the heart of most optimality criteria
in experimental design. I will outline a simple crossentropy algorithm that
can be used for obtaining optimal designs for these models. We will also
explore the improvements in experimental efficiency when using the optimal
design over some simpler designs, such as the design where observations are
spaced equidistantly in time. 
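To illustrate the role of Fisher information in design (a toy example, not one of the talk's epidemic models): for a one-parameter decay model x(t) = exp(-theta*t) observed once with Gaussian noise, the information about theta is (dx/dtheta)^2 / sigma^2, and the best single observation time maximising it is t = 1/theta.

```python
import numpy as np

def fisher_info(t, theta, sigma=1.0):
    """Fisher information about theta from one observation of
    y = exp(-theta * t) + N(0, sigma^2): I(t) = (dx/dtheta)^2 / sigma^2,
    with dx/dtheta = -t * exp(-theta * t)."""
    return (t * np.exp(-theta * t)) ** 2 / sigma ** 2

theta = 0.5
times = np.linspace(0.01, 10.0, 1000)
best = times[np.argmax(fisher_info(times, theta))]   # maximised near t = 1/theta
```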

Priority queueing systems with random switchover times and generalisations of the Kendall-Takacs equation 16:00 Wed 1 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge
In this talk I will review existing analytical results for priority queueing
systems with Poisson incoming flows, general service times and a single server
which needs some (random) time to switch between requests of different priority.
Specifically, I will discuss analytical results for the busy period and workload
of such systems with a special structure of switchover times.
The results related to the busy period can be seen as generalisations of the
famous Kendall-Tak\'{a}cs functional equation for the $M|G|1$ queue:
being formulated in terms of LaplaceStieltjes transform, they represent systems
of functional recurrent equations.
I will present a methodology and algorithms of their numerical solution;
the efficiency of these algorithms is achieved by acceleration of the numerical
procedure of solving the classical Kendall-Tak\'{a}cs equation.
At the end I will identify open problems with regard to such systems; these open
problems are mainly related to the modelling of switchover times.
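For the special case of exponential service, the Kendall-Takacs fixed point can be iterated numerically and checked against the known closed form for the M|M|1 busy-period transform; a sketch (not the priority-queue generalisation of the talk):

```python
import math

def busy_period_lst(s, lam, mu, iters=200):
    """Fixed-point iteration for the Kendall-Takacs equation
    beta(s) = B*(s + lam - lam * beta(s)), with exponential service
    B*(u) = mu / (mu + u); converges to the busy-period transform for rho < 1."""
    beta = 0.0
    for _ in range(iters):
        beta = mu / (mu + s + lam - lam * beta)
    return beta

lam, mu, s = 0.5, 1.0, 0.3
numeric = busy_period_lst(s, lam, mu)
# Known closed form for the M|M|1 busy-period Laplace-Stieltjes transform:
closed = (lam + mu + s - math.sqrt((lam + mu + s) ** 2 - 4 * lam * mu)) / (2 * lam)
```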


Inference and optimal design for percolation and general random graph models (Part I) 09:30 Wed 8 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge
The problem of optimal arrangement of nodes of a random weighted graph
is discussed in this workshop. The nodes of graphs under study are fixed, but
their edges are random and established according to the so-called
edge-probability function. This function is assumed to depend on the weights
attributed to the pairs of graph nodes (or distances between them) and a
statistical parameter. It is the purpose of experimentation to make inference on
the statistical parameter and thus to extract as much information about it as
possible. We also distinguish between two different experimentation scenarios:
progressive and instructive designs.
We adopt a utility-based Bayesian framework to tackle the optimal design problem
for random graphs of this kind. Simulation based optimisation methods, mainly
Monte Carlo and Markov Chain Monte Carlo, are used to obtain the solution. We
study the optimal design problem for inference based on partial observations of
random graphs by employing a data augmentation technique. We prove that the
infinitely growing or diminishing node configurations asymptotically represent
the worst node arrangements. We also obtain the exact solution to the optimal
design problem for proximity (geometric) graphs and a numerical solution for
graphs with threshold edge-probability functions.
We consider inference and optimal design problems for finite clusters from bond
percolation on the integer lattice $\mathbb{Z}^d$ and derive a range of both
numerical and analytical results for these graphs. We introduce inner-outer
plots by deleting some of the lattice nodes and show that the "mostly populated"
designs are not necessarily optimal in the case of incomplete observations under
both progressive and instructive design scenarios. Some of the obtained results
may generalise to other lattices. 

Inference and optimal design for percolation and general random graph models (Part II) 10:50 Wed 8 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge
The problem of optimal arrangement of nodes of a random weighted graph
is discussed in this workshop. The nodes of graphs under study are fixed, but
their edges are random and established according to the so-called
edge-probability function. This function is assumed to depend on the weights
attributed to the pairs of graph nodes (or distances between them) and a
statistical parameter. It is the purpose of experimentation to make inference on
the statistical parameter and thus to extract as much information about it as
possible. We also distinguish between two different experimentation scenarios:
progressive and instructive designs.
We adopt a utility-based Bayesian framework to tackle the optimal design problem
for random graphs of this kind. Simulation based optimisation methods, mainly
Monte Carlo and Markov Chain Monte Carlo, are used to obtain the solution. We
study the optimal design problem for inference based on partial observations of
random graphs by employing a data augmentation technique. We prove that the
infinitely growing or diminishing node configurations asymptotically represent
the worst node arrangements. We also obtain the exact solution to the optimal
design problem for proximity (geometric) graphs and a numerical solution for
graphs with threshold edge-probability functions.
We consider inference and optimal design problems for finite clusters from bond
percolation on the integer lattice $\mathbb{Z}^d$ and derive a range of both
numerical and analytical results for these graphs. We introduce inner-outer
plots by deleting some of the lattice nodes and show that the "mostly populated"
designs are not necessarily optimal in the case of incomplete observations under
both progressive and instructive design scenarios. Some of the obtained results
may generalise to other lattices. 

Probability density estimation by diffusion 15:10 Fri 10 Jun, 2011 :: 7.15 Ingkarni Wardli :: Prof Dirk Kroese :: University of Queensland
One of the beautiful aspects of Mathematics is that seemingly
disparate areas can often have deep connections. This talk is about
the fundamental connection between probability density estimation,
diffusion processes, and partial differential equations. Specifically,
we show how to obtain efficient probability density estimators by
solving partial differential equations related to diffusion processes.
This new perspective leads, in combination with Fast Fourier
techniques, to very fast and accurate algorithms for density
estimation. Moreover, the diffusion formulation unifies most of the
existing adaptive smoothing algorithms and provides a natural solution
to the boundary bias of classical kernel density estimators. This talk
covers topics in Statistics, Probability, Applied Mathematics, and
Numerical Mathematics, with a surprise appearance of the theta
function. This is joint work with Zdravko Botev and Joe Grotowski. 

Quantitative proteomics: data analysis and statistical challenges 10:10 Thu 30 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Peter Hoffmann :: Adelaide Proteomics Centre


Introduction to functional data analysis with applications to proteomics data 11:10 Thu 30 Jun, 2011 :: 7.15 Ingkarni Wardli :: A/Prof Inge Koch :: School of Mathematical Sciences


Object oriented data analysis 14:10 Thu 30 Jun, 2011 :: 7.15 Ingkarni Wardli :: Prof Steve Marron :: The University of North Carolina at Chapel Hill
Object Oriented Data Analysis is the statistical analysis of populations of complex objects. In the special case of Functional Data Analysis, these data objects are curves, where standard Euclidean approaches, such as principal components analysis, have been very successful. Recent developments in medical image analysis motivate the statistical analysis of populations of more complex data objects which are elements of mildly non-Euclidean spaces, such as Lie Groups and Symmetric Spaces, or of strongly non-Euclidean spaces, such as spaces of tree-structured data objects. These new contexts for Object Oriented Data Analysis create several potentially large new interfaces between mathematics and statistics. Even in situations where Euclidean analysis makes sense, there are statistical challenges because of the High Dimension Low Sample Size problem, which motivates a new type of asymptotics leading to nonstandard mathematical statistics. 

Object oriented data analysis of tree-structured data objects 15:10 Fri 1 Jul, 2011 :: 7.15 Ingkarni Wardli :: Prof Steve Marron :: The University of North Carolina at Chapel Hill
The field of Object Oriented Data Analysis has made a lot of
progress on the statistical analysis of the variation in populations
of complex objects. A particularly challenging example of this type
is populations of treestructured objects. Deep challenges arise,
which involve a marriage of ideas from statistics, geometry, and
numerical analysis, because the space of trees is strongly
non-Euclidean in nature. These challenges, together with three
completely different approaches to addressing them, are illustrated
using a real data example, where each data point is the tree of blood
arteries in one person's brain. 

Horocycle flows at prime times 13:10 Wed 10 Aug, 2011 :: B.19 Ingkarni Wardli :: Prof Peter Sarnak :: Institute for Advanced Study, Princeton
The distribution of individual orbits of unipotent flows in homogeneous spaces is well
understood thanks to the work of Marina Ratner. It is conjectured that this property
is preserved on restricting the times from the integers to primes, this being important in the study of prime numbers as well as in such dynamics. We review progress in understanding this conjecture, starting with Dirichlet (a finite system), Vinogradov (rotation of a circle or torus), Green and Tao (translation on a nilmanifold) and Ubis and Sarnak (horocycle flows in the semisimple case).


IGA-AMSI Workshop: Group-valued moment maps with applications to mathematics and physics 10:00 Mon 5 Sep, 2011 :: 7.15 Ingkarni Wardli
Lecture series by Eckhard Meinrenken, University of Toronto.
Titles of individual lectures: 1) Introduction to G-valued moment maps. 2) Dirac geometry and Witten's volume formulas.
3) Dixmier-Douady theory and prequantization. 4) Quantization of group-valued moment maps. 5) Application to Verlinde formulas. These lectures will be supplemented by additional talks by invited speakers. For more details, please see the conference webpage. 

Can statisticians do better than random guessing? 12:10 Tue 20 Sep, 2011 :: Napier 210 :: A/Prof Inge Koch :: School of Mathematical Sciences
In the finance or credit risk area, a bank may want to assess whether a client is going to default, or be able to meet the repayments. In the assessment of benign or malignant tumours, a correct diagnosis is required. In these and similar examples, we make decisions based on data. The classical t-tests provide a tool for making such decisions. However, many modern data sets have more variables than observations, and the classical rules may not be any better than random guessing. We consider Fisher's rule for classifying data into two groups, and show that it can break down for high-dimensional data. We then look at ways of overcoming some of the weaknesses of the classical rules, and I show how these "postmodern" rules perform in practice. 

Estimating disease prevalence in hidden populations 14:05 Wed 28 Sep, 2011 :: B.18 Ingkarni Wardli :: Dr Amber Tomas :: The University of Oxford
Estimating disease prevalence in "hidden" populations such as injecting
drug users or men who have sex with men is an important public health
issue. However, traditional design-based estimation methods are
inappropriate because they assume that a list of all members of the
population is available from which to select a sample. Respondent Driven
Sampling (RDS) is a method developed over the last 15 years for sampling
from hidden populations. Similarly to snowball sampling, it leverages the
fact that members of hidden populations are often socially connected to
one another. Although RDS is now used around the world, there are several
common population characteristics which are known to cause estimates
calculated from such samples to be significantly biased. In this talk I'll
discuss the motivation for RDS, as well as some of the recent developments
in methods of estimation. 

Understanding the dynamics of event networks 15:00 Wed 28 Sep, 2011 :: B.18 Ingkarni Wardli :: Dr Amber Tomas :: The University of Oxford
Within many populations there are frequent communications between
pairs of individuals. Such communications might be emails sent within a
company, radio communications in a disaster zone or diplomatic
communications between states. Often it is of interest to understand the
factors that drive the observed patterns of such communications, or to
study how these factors are changing over time. Communications can be
thought of as events occurring on the edges of a network which connects
individuals in the population. In this talk I'll present a model for such
communications which uses ideas from social network theory to account for
the complex correlation structure between events. Applications to the
Enron email corpus and the dynamics of hospital ward transfer patterns
will be discussed. 

Statistical modelling for some problems in bioinformatics 11:10 Fri 14 Oct, 2011 :: B.17 Ingkarni Wardli :: Professor Geoff McLachlan :: The University of Queensland
In this talk we consider some statistical analyses of data arising in
bioinformatics. The problems include the detection of differential
expression in microarray gene-expression data, the clustering of
time-course gene-expression data and, lastly, the analysis of
modern-day cytometric data. Extensions are considered to the procedures
proposed for these three problems in McLachlan et al. (Bioinformatics, 2006),
Ng et al. (Bioinformatics, 2006), and Pyne et al. (PNAS, 2009), respectively.
The latter references are available at http://www.maths.uq.edu.au/~gjm/. 

On the role of mixture distributions in the modelling of heterogeneous data 15:10 Fri 14 Oct, 2011 :: 7.15 Ingkarni Wardli :: Prof Geoff McLachlan :: University of Queensland
We consider the role that finite mixture distributions have played in the modelling of heterogeneous data, in particular for clustering continuous data via mixtures of normal distributions. A very brief history is given starting with the seminal papers by Day and Wolfe in the sixties before the appearance of the EM algorithm. It was the publication in 1977 of the latter algorithm by Dempster, Laird, and Rubin that greatly stimulated interest in the use of finite mixture distributions to model heterogeneous data. This is because the fitting of mixture models by maximum likelihood is a classic example of a problem that is simplified considerably by the EM's conceptual unification of maximum likelihood estimation from data that can be viewed as being incomplete. In recent times there has been a proliferation of applications in which the number of experimental units n is comparatively small but the underlying dimension p is extremely large as, for example, in microarray-based genomics and other high-throughput experimental approaches. Hence there has been increasing attention given not only in bioinformatics and machine learning, but also in mainstream statistics, to the analysis of complex data in this situation where n is small relative to p. The latter part of the talk shall focus on the modelling of such high-dimensional data using mixture distributions. 

Metric geometry in data analysis 13:10 Fri 11 Nov, 2011 :: B.19 Ingkarni Wardli :: Dr Facundo Memoli :: University of Adelaide
The problem of object matching under invariances can be
studied using certain tools from metric geometry. The central idea is
to regard
objects as metric spaces (or metric measure spaces). The type of
invariance that one wishes to have in the matching is encoded by the
choice of the metrics with which one endows the objects. The standard
example is matching objects in Euclidean space under rigid isometries:
in this
situation one would endow the objects with the Euclidean metric. More
general scenarios are possible in which the desired invariance cannot
be reflected by the preservation of an ambient space metric. Several
ideas due to M. Gromov are useful for approaching this problem. The
Gromov-Hausdorff distance is a natural candidate for doing this.
However, this metric leads to very hard combinatorial optimization
problems and it is difficult to relate to previously reported
practical approaches to the problem of object matching. I will discuss
different variations of these ideas, and in particular will show a
construction of an L^p version of the Gromov-Hausdorff metric, called
the Gromov-Wasserstein distance, which is based on mass transportation
ideas. This new metric directly leads to quadratic optimization
problems on continuous variables with linear constraints.
As a consequence of establishing several lower bounds, it turns out
that several invariants of metric measure spaces are
quantitatively stable in the GW sense. These invariants provide
practical tools for the discrimination of shapes and connect the GW
ideas to a number of pre-existing approaches. 
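As a toy illustration of the lower-bound idea, here is a minimal sketch (my own illustrative example, not the speaker's construction; the function names and the use of sorted pairwise distances as a crude isometry-invariant signature are assumptions): comparing the sorted distance distributions of two equal-size point clouds gives a cheap, rigid-motion-invariant discrepancy that vanishes for isometric clouds.

```python
import numpy as np

def distance_distribution(X):
    # Sorted list of all pairwise Euclidean distances of a point cloud;
    # this signature is invariant under rigid motions of the cloud.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    i, j = np.triu_indices(len(X), k=1)
    return np.sort(D[i, j])

def distance_distribution_gap(X, Y):
    # 1-D Wasserstein-style distance between the two sorted distance
    # samples (equal-size clouds assumed): zero whenever X and Y are
    # isometric, so it acts as a cheap lower-bound-style proxy.
    return np.mean(np.abs(distance_distribution(X) - distance_distribution(Y)))

rng = np.random.default_rng(3)
X = rng.random((30, 2))
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Y = X @ R.T + np.array([5.0, -2.0])   # a rigid motion of X
Z = rng.random((30, 2))               # an unrelated cloud
```

The gap is (numerically) zero for the rotated-and-translated copy but positive for the unrelated cloud; computing the actual Gromov-Wasserstein distance requires solving the quadratic optimization problem mentioned above.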

Stability analysis of non-parallel unsteady flows via separation of variables 15:30 Fri 18 Nov, 2011 :: 7.15 Ingkarni Wardli :: Prof Georgy Burde :: Ben-Gurion University
The problem of separation of variables in the linear stability
equations, which govern the disturbance behaviour in viscous
incompressible fluid flows, is discussed.
Stability of some unsteady non-parallel three-dimensional flows (exact
solutions of the Navier-Stokes equations)
is studied via separation of variables using a semi-analytical, semi-numerical approach.
In this approach, a solution with separated variables is defined in a new coordinate system which is sought together with the solution form. As a result, the linear stability problems are reduced to eigenvalue problems for ordinary differential equations which can be solved numerically.
In some specific cases, the eigenvalue
problems can be solved analytically. These unique examples of exact
(explicit) solution of the non-parallel unsteady flow stability
problems provide a very useful test for methods used in
hydrodynamic stability theory. Exact solutions of the stability problems for some stagnation-type flows are presented. 

Space of 2D shapes and the Weil-Petersson metric: shapes, ideal fluid and Alzheimer's disease 13:10 Fri 25 Nov, 2011 :: B.19 Ingkarni Wardli :: Dr Sergey Kushnarev :: National University of Singapore
The Weil-Petersson metric is an exciting metric on a space of simple
plane curves. In this talk the speaker will introduce the shape space and
demonstrate the connection with the Euler-Poincaré equations on the group
of diffeomorphisms (EPDiff). A numerical method for finding geodesics
between two shapes will be demonstrated and applied to the surface of the hippocampus to study the effects of Alzheimer's disease. As another application the speaker will discuss how to do statistics on the shape space and what should be done to improve it. 

Mixing, dynamics, and probability 15:10 Fri 2 Mar, 2012 :: B.21 Ingkarni Wardli :: A/Prof Gary Froyland :: University of New South Wales
Many interesting natural phenomena are hard to predict.
When modelled as a dynamical system, this unpredictability is often the result of rapid separation of nearby trajectories.
Viewing the dynamics as acting on a probability measure, the mixing property states that two measurements (or random variables), evaluated at increasingly separated times, become independent in the time-separation limit.
Thus, the later measurement becomes increasingly difficult to predict, given the outcome of the earlier measurement.
If this approach to independence occurs exponentially quickly in time, one can profitably use linear operator tools to analyse the dynamics.
I will give an overview of these techniques and show how they can be applied to answer mathematical questions, describe observed behaviour in fluid mixing, and analyse models of the ocean and atmosphere. 
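The decay of correlations behind exponential mixing is easy to see numerically. A minimal sketch (my own toy example, not from the talk): for the doubling map T(x) = 2x mod 1 with observable g(x) = x - 1/2, the lag-k correlation is exactly 2^(-k)/12, so each time step halves the correlation.

```python
import numpy as np

rng = np.random.default_rng(2)

# Doubling map T(x) = 2x mod 1: a standard exponentially mixing system.
# Lebesgue measure on [0,1) is invariant, so we sample x0 uniformly.
N = 200_000
x0 = rng.random(N)

def T(x):
    return (2.0 * x) % 1.0

def g(x):
    return x - 0.5            # zero-mean observable

corr = []
xk = x0.copy()
for k in range(5):
    corr.append(np.mean(g(x0) * g(xk)))   # Monte Carlo estimate of E[g * (g o T^k)]
    xk = T(xk)
# corr[k] is close to 2**(-k) / 12: successive correlations roughly halve
```

The lag-0 value is the variance 1/12 of the uniform distribution, and the geometric decay rate 1/2 is what makes linear operator (transfer operator) tools effective for this map.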

Forecasting electricity demand distributions using a semiparametric additive model 15:10 Fri 16 Mar, 2012 :: B.21 Ingkarni Wardli :: Prof Rob Hyndman :: Monash University
Electricity demand forecasting plays an important role in short-term load allocation and long-term planning for future generation facilities and transmission augmentation. Planners must adopt a probabilistic view of potential peak demand levels; therefore density forecasts (providing estimates of the full probability distributions of the possible future values of the demand) are more helpful than point forecasts, and are necessary for utilities to evaluate and hedge the financial risk accrued by demand variability and forecasting uncertainty.
Electricity demand in a given season is subject to a range of uncertainties, including underlying population growth, changing technology, economic conditions, prevailing weather conditions (and the timing of those conditions), as well as the general randomness inherent in individual usage. It is also subject to some known calendar effects due to the time of day, day of week, time of year, and public holidays.
I will describe a comprehensive forecasting solution designed to take all the available information into account, and to provide forecast distributions from a few hours ahead to a few decades ahead. We use semiparametric additive models to estimate the relationships between demand and the covariates, including temperatures, calendar effects and some demographic and economic variables. Then we forecast the demand distributions using a mixture of temperature simulation, assumed future economic scenarios, and residual bootstrapping. The temperature simulation is implemented through a new seasonal bootstrapping method with variable blocks.
The model is being used by the state energy market operators and some electricity supply companies to forecast the probability distribution of electricity demand in various regions of Australia. It also underpinned the Victorian Vision 2030 energy strategy. 
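The general idea behind bootstrapping a dependent series can be sketched as follows (a generic moving-block bootstrap under my own simplifying assumptions, with hypothetical data; this is not the variable-block seasonal scheme described in the talk): resampling contiguous blocks rather than single observations preserves short-range dependence within each block.

```python
import numpy as np

def block_bootstrap(series, block_len, rng):
    """Resample a time series by concatenating randomly chosen
    contiguous blocks, preserving short-range dependence within blocks."""
    n = len(series)
    pieces = []
    total = 0
    while total < n:
        start = rng.integers(0, n - block_len + 1)
        pieces.append(series[start:start + block_len])
        total += block_len
    return np.concatenate(pieces)[:n]

rng = np.random.default_rng(4)
# a toy 'temperature' series: seasonal cycle plus noise (hypothetical data)
t = np.arange(1000)
temps = 25 + 5 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1.0, size=1000)
resampled = block_bootstrap(temps, block_len=24, rng=rng)
```

Repeating the resampling many times yields an ensemble of plausible temperature paths that can be fed through the demand model to build forecast distributions.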

The de Rham Complex 12:10 Mon 19 Mar, 2012 :: 5.57 Ingkarni Wardli :: Mr Michael Albanese :: University of Adelaide
The de Rham complex is of fundamental importance in differential geometry. After first introducing differential forms (in the familiar setting of Euclidean space), I will demonstrate how the de Rham complex elegantly encodes one half (in a sense which will become apparent) of the results from vector calculus. If there is time, I will indicate how results from the remaining half of the theory can be concisely expressed by a single, far more general theorem. 

Revenge of the undead statistician part II 13:10 Tue 24 Apr, 2012 :: 7.15 Ingkarni Wardli :: Mr Jono Tuke :: School of Mathematical Sciences
If you only go to one undergraduate seminar this year, then you should have gone to Jim Denier's - it was cracking - but if you decide to go to another, then this one has cholera, Bayesian statistics, random networks and zombies.
Warning: may contain an overuse of pop culture references to motivate an interest in statistics. 

Acyclic embeddings of open Riemann surfaces into new examples of elliptic manifolds 13:10 Fri 4 May, 2012 :: Napier LG28 :: Dr Tyson Ritter :: University of Adelaide
In complex geometry a manifold is Stein if there are, in a certain
sense, "many" holomorphic maps from the manifold into C^n. While this
has long been well understood, a fruitful definition of the dual
notion has until recently been elusive. In Oka theory, a manifold is
Oka if it satisfies several equivalent definitions, each stating that
the manifold has "many" holomorphic maps into it from C^n. Related to
this is the geometric condition of ellipticity due to Gromov, who
showed that it implies a complex manifold is Oka.
We present recent contributions to three open questions involving
elliptic and Oka manifolds. We show that affine quotients of C^n are
elliptic, and combine this with an example of Margulis to construct
new elliptic manifolds of interesting homotopy types. It follows that
every open Riemann surface properly acyclically embeds into an
elliptic manifold, extending an existing result for open Riemann
surfaces with abelian fundamental group.


Multiscale models of collective cell behaviour: Linear or nonlinear diffusion? 15:10 Fri 4 May, 2012 :: B.21 Ingkarni Wardli :: Dr Matthew Simpson :: Queensland University of Technology
Continuum diffusion models are often used to represent the collective motion of cell populations. Most previous studies have simply used linear diffusion to represent collective cell spreading, while others found that degenerate nonlinear diffusion provides a better match to experimental cell density profiles. There is no guidance available in the mathematical biology literature with regard to which approach is more appropriate. Furthermore, there is no knowledge of particular experimental measurements that can be made to distinguish between situations where these two models are appropriate. We provide a link between individual-based and continuum models using a multiscale approach in which we analyse the collective motion of a population of interacting agents in a generalized lattice-based exclusion process. For round agents that occupy a single lattice site, we find that the relevant continuum description is a linear diffusion equation, whereas for elongated rod-shaped agents that occupy L adjacent lattice sites we find that the relevant continuum description is a nonlinear diffusion equation related to the porous media equation. We show that there are several reasonable approaches for dealing with agent size effects, and that these different approaches are related mathematically through the concept of mean action time. We extend our results to consider proliferation and travelling waves where greater care must be taken to ensure that the continuum model replicates the discrete process. This is joint work with Dr Ruth Baker (Oxford) and Dr Scott McCue (QUT). 

Evaluation and comparison of the performance of Australian and New Zealand intensive care units 14:10 Fri 25 May, 2012 :: 7.15 Ingkarni Wardli :: Dr Jessica Kasza :: The University of Adelaide
Recently, the Australian Government has emphasised the need for monitoring and comparing the performance of Australian hospitals. Evaluating the performance of intensive care units (ICUs) is of particular importance, given that the most severe cases are treated in these units. Indeed, ICU performance can be thought of as a proxy for the overall performance of a hospital. We compare the performance of the ICUs contributing to the Australian and New Zealand Intensive Care Society (ANZICS) Adult Patient Database, the largest of its kind in the world, and identify those ICUs with unusual performance.
It is well-known that there are many statistical issues that must be accounted for in the evaluation of healthcare provider performance. Indicators of performance must be appropriately selected and estimated, investigators must adequately adjust for case-mix, statistical variation must be fully accounted for, and adjustment for multiple comparisons must be made. Our basis for dealing with these issues is the estimation of a hierarchical logistic model for the in-hospital death of each patient, with patients clustered within ICUs. Both patient and ICU-level covariates are adjusted for, with a random intercept and random coefficient for the APACHE III severity score. Given that we expect most ICUs to have similar performance after adjustment for these covariates, we follow Ohlssen et al., JRSS A (2007), and estimate a null model that we expect the majority of ICUs to follow. This methodology allows us to rigorously account for the aforementioned statistical issues, and accurately identify those ICUs contributing to the ANZICS database that have comparatively unusual performance. This is joint work with Prof. Patty Solomon and Assoc. Prof. John Moran. 

Epidemiological consequences of household-based antiviral prophylaxis for pandemic influenza 14:10 Fri 8 Jun, 2012 :: 7.15 Ingkarni Wardli :: Dr Joshua Ross :: The University of Adelaide
Antiviral treatment offers a fast-acting alternative to vaccination. It is viewed as a first line of defence against pandemic influenza, protecting families and household members once infection has been detected. In clinical trials antiviral treatment has been shown to be efficacious in preventing infection, limiting disease and reducing transmission, yet its impact in containing the 2009 influenza A(H1N1)pdm outbreak was limited. I will describe some of our work, which attempts to understand this seeming discrepancy, through the development of a general model and computationally efficient methodology for studying household-based interventions.
This is joint work with Dr Andrew Black (Adelaide), and Prof. Matt Keeling and Dr Thomas House (Warwick, U.K.). 

Adventures with group theory: counting and constructing polynomial invariants for applications in quantum entanglement and molecular phylogenetics 15:10 Fri 8 Jun, 2012 :: B.21 Ingkarni Wardli :: Dr Peter Jarvis :: The University of Tasmania
In many modelling problems in mathematics and physics, a standard
challenge is dealing with several repeated instances of a system under
study. If linear transformations are involved, then the machinery of
tensor products steps in, and it is the job of group theory to control how
the relevant symmetries lift from a single system, to having many copies.
At the level of group characters, the construction which does this is
called PLETHYSM.
In this talk all this will be contextualised via two case studies:
entanglement invariants for multipartite quantum systems, and Markov
invariants for tree reconstruction in molecular phylogenetics. By the end
of the talk, listeners will have understood why Alice, Bob and Charlie
love Cayley's hyperdeterminant, and they will know why the three squangles
(polynomial beasts of degree 5 in 256 variables, with a modest 50,000
terms or so) can tell us a lot about quartet trees! 

Probability, what can it tell us about health? 13:10 Tue 9 Oct, 2012 :: 7.15 Ingkarni Wardli :: Prof Nigel Bean :: School of Mathematical Sciences
Clinical trials are the way in which modern medical systems test whether individual treatments are worthwhile. Sophisticated statistics is used to try and make the conclusions from clinical trials as meaningful as possible. What can a very simple probability model then tell us about the worth of multiple treatments? What might the implications of this be for the whole health system?
This talk is based on research currently being conducted with a physician at a major Adelaide hospital. It requires no health knowledge and was not tested on animals. All you need is an enquiring and open mind.


Multiscale models of evolutionary epidemiology: where is HIV going? 14:00 Fri 19 Oct, 2012 :: Napier 205 :: Dr Lorenzo Pellis :: The University of Warwick
An important component of pathogen evolution at the population level is evolution within hosts, which can alter the composition of genotypes available for transmission as infection progresses. I will present a deterministic multiscale model, linking the within-host competition dynamics with the transmission dynamics at a population level. I will take HIV as an example of how this framework can help clarify the conflicting evolutionary pressures an infectious disease might be subject to. 

Epidemic models in socially structured populations: when are simple models too simple? 14:00 Thu 25 Oct, 2012 :: 5.56 Ingkarni Wardli :: Dr Lorenzo Pellis :: The University of Warwick
Both age and household structure are recognised as important heterogeneities affecting epidemic spread of infectious pathogens, and many models exist nowadays that include either or both forms of heterogeneity. However, different models may fit aggregate epidemic data equally well and nevertheless lead to different predictions of public health interest. I will here present an overview of stochastic epidemic models with increasing complexity in their social structure, focusing in particular on household models. For these models, I will present recent results about the definition and computation of the basic reproduction number R0 and its relationship with other threshold parameters. Finally, I will use these results to compare models with no, either, or both forms of age and household structure, with the aim of quantifying the conditions under which each form of heterogeneity is relevant and therefore providing some criteria that can be used to guide model design for real-time predictions. 

Dynamics of microbial populations from a copper sulphide leaching heap 12:30 Mon 12 Nov, 2012 :: B.21 Ingkarni Wardli :: Ms Susana Soto Rojo :: University of Adelaide
We are interested in the dynamics of the microbial population from a copper sulphide bioleaching heap. The composition of the microbial consortium is closely related to the kinetics of the oxidation processes that lead to copper recovery. Using a nonlinear model, which considers the effect of substrate depletion and incorporates spatial dependence, we analyse the correlation between adjacent strips, patterns of microbial succession, the relevance of pertinent physico-chemical parameters and the implications of the absence of barriers between the three lifts of the heap. We also explore how the dynamics of the microbial community relate to the mineral composition of the individual strips of the bioleaching pile. 

Twistor theory and the harmonic hull 15:10 Fri 8 Mar, 2013 :: B.18 Ingkarni Wardli :: Prof Michael Eastwood :: Australian National University
Harmonic functions are real-analytic and so automatically extend as functions of complex variables. But how far do they extend? This question may be answered by twistor theory, the Penrose transform, and associated conformal geometry. Nothing will be supposed about such matters: I shall base the constructions on an elementary yet mysterious formula of Bateman from 1904. This is joint work with Feng Xu. 

Modular forms: a rough guide 12:10 Mon 18 Mar, 2013 :: B.19 Ingkarni Wardli :: Damien Warman :: University of Adelaide
I recently found the need to learn a little about what I had naively believed to be an abstruse branch of number theory, but which turns out to be a ubiquitous and intriguing theory.
I'll introduce some of the geometry underlying the elementary theory of modular functions and modular forms. We'll look at some pictures and play with sage, time permitting. 

Multiscale modelling couples patches of wave-like simulations 12:10 Mon 27 May, 2013 :: B.19 Ingkarni Wardli :: Meng Cao :: University of Adelaide
A multiscale model is proposed to significantly reduce the expensive numerical simulations of complicated waves over large spatial domains. The multiscale model is built from given microscale simulations of complicated physical processes such as sea ice or turbulent shallow water. Our long-term aim is to enable macroscale simulations obtained by coupling small patches of simulations together over large physical distances. This initial work explores the coupling of patch simulations of wave-like PDEs. With water waves as the target application, we discuss the dynamics of two complementary fields called the 'depth' h and 'velocity' u. A staggered grid is used for the microscale simulation of the depth h and velocity u. We introduce a macroscale staggered grid to couple the microscale patches. Linear or quadratic interpolation provides boundary conditions on the field in each patch. Linear analysis of the whole coupled multiscale system establishes that the resultant macroscale dynamics is appropriate. Numerical simulations support the linear analysis. This multiscale method should empower the feasible computation of large-scale simulations of wave-like dynamics with complicated underlying physics. 

Invariant Theory: The 19th Century and Beyond 15:10 Fri 21 Jun, 2013 :: B.18 Ingkarni Wardli :: Dr Jarod Alper :: Australian National University
A central theme in 19th century mathematics was invariant theory, which was viewed as a bridge between geometry and algebra. David Hilbert revolutionized the field with two seminal papers in 1890 and 1893 with techniques such as Hilbert's basis theorem, Hilbert's Nullstellensatz and Hilbert's syzygy theorem that spawned the modern field of commutative algebra. After Hilbert's groundbreaking work, the field of invariant theory remained largely inactive until the 1960s when David Mumford revitalized the field by reinterpreting Hilbert's ideas in the context of algebraic geometry, which ultimately led to the influential construction of the moduli space of smooth curves. Today invariant theory remains a vital research area with connections to various mathematical disciplines: representation theory, algebraic geometry, commutative algebra, combinatorics and nonlinear differential operators.
The goal of this talk is to provide an introduction to invariant theory with an emphasis on Hilbert's and Mumford's contributions. Time permitting, I will explain recent research with Maksym Fedorchuk and David Smyth which exploits the ideas of Hilbert, Mumford as well as Kempf to answer a classical question concerning the stability of algebraic curves. 

Quadratic Forms in Statistics: Evaluating Contributions of Individual Variables 11:10 Tue 27 Aug, 2013 :: Ingkarni Wardli Level 5 Room 5.57 :: A/Prof Inge Koch :: University of Adelaide


Knots and Quantum Computation 15:10 Fri 6 Sep, 2013 :: B.18 Ingkarni Wardli :: Dr Scott Morrison :: Australian National University
I'll begin with the Jones polynomial, a knot invariant discovered 30 years ago that radically changed our view of topology. From there, we'll visit the complexity of evaluating the Jones polynomial, the topological field theories related to the Jones polynomial, and how all these ideas come together to offer an unorthodox model for quantum computation. 

Random Wanderings on a Sphere... 11:10 Tue 17 Sep, 2013 :: Ingkarni Wardli Level 5 Room 5.57 :: A/Prof Robb Muirhead :: University of Adelaide
This will be a short talk (about 30 minutes) about the following problem. (Even if I tell you all I know about it, it won't take very long!)
Imagine the earth is a unit sphere in 3 dimensions. You're standing at a fixed point, which we may as well take to be the North Pole. Suddenly you get moved to another point on the sphere by a random (uniform) orthogonal transformation. Where are you now? You're not at a point which is uniformly distributed on the surface of the sphere (so, since most of the earth's surface is water, you're probably drowning). But then you get moved again by the same orthogonal transformation. Where are you now? And what happens to your location if this happens repeatedly? I have only a partial answer to this question, for 2 and 3 transformations. (There's nothing special about 3 dimensions here - results hold for all dimensions which are at least 3.)
I don't know of any statistical application for this! This work was motivated by a talk I heard, given by Tom Marzetta (Bell Labs) at a conference at MIT. Although I know virtually nothing about signal processing, I gather Marzetta was trying to encode signals using powers of random orthogonal matrices. After carrying out simulations, I think he decided it wasn't a good idea. 
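The setup is easy to simulate (a sketch under my own conventions; the `random_orthogonal` helper and the use of the QR trick for Haar-uniform sampling are my choices, not material from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_orthogonal(n=3):
    # Haar-distributed orthogonal matrix via QR decomposition of a
    # Gaussian matrix, with the standard sign fix on the diagonal of R.
    A = rng.standard_normal((n, n))
    Q, R = np.linalg.qr(A)
    return Q * np.sign(np.diag(R))

north_pole = np.array([0.0, 0.0, 1.0])
Q = random_orthogonal()

# apply the SAME random transformation repeatedly
positions = []
x = north_pole
for _ in range(3):
    x = Q @ x
    positions.append(x)
```

Every position stays on the unit sphere, but because the same Q is reused, the successive positions are highly dependent, which is exactly why their distributions are not uniform.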

Conformal geometry in four variables and a special geometry in five 12:10 Fri 20 Sep, 2013 :: Ingkarni Wardli B19 :: Dr Dennis The :: Australian National University
Starting with a split signature 4-dimensional conformal manifold, one can build a 5-dimensional bundle over it equipped with a 2-plane distribution. Generically, this is a (2,3,5)-distribution in the sense of Cartan's five variables paper, an aspect that was recently pursued by Daniel An and Pawel Nurowski (finding new examples concerning the geometry of rolling bodies where the (2,3,5)-distribution has G2-symmetry). I shall explain how to understand some elementary aspects of this "twistor construction" from the perspective of parabolic geometry. This is joint work with Michael Eastwood and Katja Sagerschnig. 

Controlling disease, one household at a time. 12:10 Mon 23 Sep, 2013 :: B.19 Ingkarni Wardli :: Michael Lydeamore :: University of Adelaide
Pandemics and Epidemics have always caused significant disruption to society. Attempting to model each individual in any reasonable sized population is unfeasible at best, but we can get surprisingly good results just by looking at a single household in a population. In this talk, I'll try to guide you through the logic I've discovered this year, and present some of the key results we've obtained so far, as well as provide a brief indication of what's to come. 

A mathematician walks into a bar..... 12:10 Mon 30 Sep, 2013 :: B.19 Ingkarni Wardli :: Ben Rohrlach :: University of Adelaide
Man is, by his very nature, inquisitive. Our need to know has been the reason we've always evolved as a species. From discovering fire to exploring the galaxy with those Vulcan guys in that documentary I saw, knowing the answer to a question has always driven human kind. Clearly then, I had to ask something. Something that by its very nature is a thing. A thing that, specifically, I had to know. That thing that I had to know was this:
Do mathematicians get stupider the more they drink? Is this effect more pronounced than for normal (Gaussian) people?
At the quiz night that AUMS just ran I managed to talk two tables into letting me record some key drinking statistics. I'll be using those statistics to introduce some different statistical tests commonly seen in most analyses you'll see in other fields. Oh, and I'll answer those questions I mentioned earlier too, hopefully. Let's do this thing. 
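One of the simplest tests of this kind compares the means of two groups. A minimal sketch with hypothetical data (the scores below are simulated, not the actual quiz-night numbers): Welch's two-sample t statistic, which does not assume equal variances.

```python
import numpy as np

def welch_t(a, b):
    # Welch's two-sample t statistic (unequal variances allowed)
    ma, mb = np.mean(a), np.mean(b)
    va, vb = np.var(a, ddof=1), np.var(b, ddof=1)
    se = np.sqrt(va / len(a) + vb / len(b))
    return (ma - mb) / se

rng = np.random.default_rng(5)
# hypothetical round scores for a 'sober' table and a 'drinking' table
sober = rng.normal(7.0, 1.5, size=40)
drinking = rng.normal(5.0, 1.5, size=40)
t_stat = welch_t(sober, drinking)
```

A large positive t statistic here would be evidence that the sober table scores higher on average; in practice one converts it to a p-value using the t distribution with Welch's degrees-of-freedom correction.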

Equivalence of P-values - not what you expect 11:10 Tue 22 Oct, 2013 :: Ingkarni Wardli Level 5 Room 5.57 :: Dr Jono Tuke :: University of Adelaide


Group meeting 15:10 Fri 25 Oct, 2013 :: 5.58 (Ingkarni Wardli) :: Dr Ben Binder and Mr David Wilke :: University of Adelaide
Dr Ben Binder :: 'An inverse approach for solutions to free-surface flow problems'
:: Abstract: Surface water waves are familiar to most people, for example, the wave
pattern generated at the stern of a ship. The boundary or interface
between the air and water is called the free surface. When determining a
solution to a free-surface flow problem it is commonplace for the forcing
(e.g. shape of ship or water-bed topography) that creates the surface waves
to be prescribed, with the free surface coming as part of the solution.
Alternatively, one can choose to prescribe the shape of the free surface
and find the forcing inversely. In this talk I will discuss my ongoing
work using an inverse approach to discover new types of solutions to
free-surface flow problems in two and three dimensions, and how the
predictions of the method might be verified with experiments. ::
Mr David Wilke :: 'A Computational Fluid Dynamic Study of Blood Flow Within the Coiled Umbilical Arteries' ::
Abstract: The umbilical cord is the lifeline of the fetus throughout gestation. In a normal pregnancy it facilitates the supply of oxygen and nutrients from the placenta via a single vein, in addition to the return of deoxygenated blood from the developing embryo or fetus via two umbilical arteries. Despite the major role it plays in the growth of the fetus, pathologies of the umbilical cord are poorly understood. In particular, variations in the cord geometry, which typically forms a helical arrangement, have been correlated with adverse outcomes in pregnancy. Cords exhibiting either abnormally low or high levels of coiling have been associated with pathological results including growth restriction and fetal demise. Despite this, the methodology currently employed by clinicians to characterise umbilical pathologies can misdiagnose cords and is prone to error. In this talk a computational model of blood flow within rigid three-dimensional structures representative of the umbilical arteries will be presented. This study determined that the current characterisation was unable to differentiate between cords which exhibited clinically distinguishable flow properties, including the cord pressure drop, which provides a measure of the loading on the fetal heart.


All at sea with spectral analysis 11:10 Tue 19 Nov, 2013 :: Ingkarni Wardli Level 5 Room 5.56 :: A/Prof Andrew Metcalfe :: The University of Adelaide
The steady state response of a single degree of freedom damped linear system to a sinusoidal input is a sinusoidal function at the same frequency, but generally with a different amplitude and a phase shift. The analogous result for a random stationary input can be described in terms of input and response spectra and a transfer function description of the linear system.
The practical use of this result is that the parameters of a linear system can be estimated from the input and response spectra, and the response spectrum can be predicted if the transfer function and input spectrum are known.
I shall demonstrate these results with data from a small ship in the North Sea. The results from the sea trial raise the issue of nonlinearity, and second-order amplitude response functions are obtained using autoregressive estimators.
The possibility of using wavelets rather than spectra is considered in the context of single degree of freedom linear systems.
Everybody welcome to attend.
Please note the change of venue: we will be in room 5.56.
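The spectral-ratio idea in this talk can be sketched numerically. The following toy example (my own illustration, not the ship data) simulates a first-order linear system driven by white noise and recovers the squared transfer-function gain as the ratio of response spectrum to input spectrum:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a simple linear system: a first-order AR filter driven by
# white noise, y[n] = a*y[n-1] + x[n], standing in for a damped
# single-degree-of-freedom system.
a = 0.8
n = 1 << 16
x = rng.standard_normal(n)
y = np.empty(n)
y[0] = x[0]
for i in range(1, n):
    y[i] = a * y[i - 1] + x[i]

def mean_periodogram(z, nseg=64):
    # Crude Welch-style estimate: average periodograms over segments.
    segs = z.reshape(nseg, -1)
    return np.mean(np.abs(np.fft.rfft(segs, axis=1)) ** 2, axis=0)

Sx = mean_periodogram(x)
Sy = mean_periodogram(y)

# The squared transfer-function gain is estimated by the spectral ratio,
# and for this system the true gain is known in closed form.
gain_est = Sy / Sx
freqs = np.linspace(0, np.pi, len(gain_est))
gain_true = 1.0 / np.abs(1.0 - a * np.exp(-1j * freqs)) ** 2
```

In practice one would then estimate the system parameter (here `a`) by fitting the parametric form of `gain_true` to `gain_est`, which is the estimation step the talk describes.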

Weak Stochastic Maximum Principle (SMP) and Applications 15:10 Thu 12 Dec, 2013 :: B.21 Ingkarni Wardli :: Dr Harry Zheng :: Imperial College, London
Media...In this talk we discuss a weak necessary and sufficient SMP for Markov modulated optimal control problems. Instead of insisting on the maximum condition of the Hamiltonian, we show that 0 belongs to the sum of Clarke's generalized gradient of the Hamiltonian and Clarke's normal cone of the control constraint set at the optimal control. Under a joint concavity condition on the Hamiltonian the necessary condition becomes sufficient. We give examples to demonstrate the weak SMP and its applications in quadratic loss minimization. 

Hörmander's estimate, some generalizations and new applications 12:10 Mon 17 Feb, 2014 :: Ingkarni Wardli B20 :: Prof Zbigniew Blocki :: Jagiellonian University
Lars Hörmander proved his estimate for the d-bar equation in 1965. It is one of the most important results in several complex variables (SCV). New applications have
emerged recently, outside of SCV. We will present three of them: the Ohsawa-Takegoshi extension theorem with optimal constant, the one-dimensional Suita Conjecture, and Nazarov's approach to the Bourgain-Milman inequality from convex analysis. 

The effects of pre-existing immunity 15:10 Fri 7 Mar, 2014 :: B.18 Ingkarni Wardli :: Associate Professor Jane Heffernan :: York University, Canada
Media...Immune system memory, also called immunity, is gained as a result of primary infection or vaccination, and can be boosted after vaccination or secondary infections. Immunity is developed so that the immune system is primed to react and fight a pathogen earlier and more effectively in secondary infections. The effects of memory, however, on pathogen propagation in an individual host (in-host) and a population (epidemiology) are not well understood. Mathematical models of infectious diseases, employing dynamical systems, computer simulation and bifurcation analysis, can provide projections of pathogen propagation, show outcomes of infection and help inform public health interventions. In the Modelling Infection and Immunity (MI^2) lab, we develop and study biologically informed mathematical models of infectious diseases at both levels of infection, and combine these models into comprehensive multiscale models so that the effects of individual immunity in a population can be determined. In this talk we will discuss some of the interesting mathematical phenomena that arise in our models, and show how our results are directly applicable to what is known about the persistence of infectious diseases. 

Bayesian Indirect Inference 12:10 Mon 14 Apr, 2014 :: B.19 Ingkarni Wardli :: Brock Hermans :: University of Adelaide
Media...Bayesian likelihood-free methods saw the resurgence of Bayesian statistics through the use of computer sampling techniques. Since the resurgence, attention has focused on so-called 'summary statistics', that is, ways of summarising data that allow for accurate inference to be performed. However, it is not uncommon to find data sets in which the summary statistic approach is not sufficient.
In this talk, I will be summarising some of the likelihood-free methods most commonly used (don't worry if you've never seen any Bayesian analysis before), as well as looking at Bayesian Indirect Likelihood, a new way of implementing Bayesian analysis which combines new inference methods with some of the older computational algorithms. 
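The best-known likelihood-free method, ABC rejection sampling with a summary statistic, can be sketched on a toy problem (hypothetical data, prior and tolerance, chosen purely for illustration): draw parameters from the prior, simulate data, and keep draws whose summary lands close to the observed one.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "observed" data: Normal(mu=2, sd=1). We pretend the likelihood is
# unavailable and infer mu by ABC rejection, using the sample mean as
# the summary statistic.
obs = rng.normal(2.0, 1.0, size=100)
s_obs = obs.mean()

def abc_rejection(n_draws=20000, eps=0.05):
    # Prior: mu ~ Uniform(-5, 5).
    mu = rng.uniform(-5, 5, size=n_draws)
    accepted = []
    for m in mu:
        sim = rng.normal(m, 1.0, size=100)  # simulate under candidate mu
        if abs(sim.mean() - s_obs) < eps:   # keep if summaries match
            accepted.append(m)
    return np.array(accepted)

posterior = abc_rejection()
```

The accepted draws approximate the posterior for `mu`; when the sample mean is not a sufficient summary for the model at hand, this is exactly the situation where the indirect-likelihood ideas of the talk come in.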

A generalised Kac-Peterson cocycle 11:10 Thu 17 Apr, 2014 :: Ingkarni Wardli B20 :: Pedram Hekmati :: University of Adelaide
The Kac-Peterson cocycle appears in the study of highest weight modules of infinite dimensional Lie algebras and determines a central extension. The vanishing of its cohomology class is tied to the existence of a cubic Dirac operator whose square is a quadratic Casimir element. I will introduce a closely related Lie algebra cocycle that comes about when constructing spin representations and gives rise to a Banach Lie group with a highly non-trivial topology. I will also explain how to make sense of the cubic Dirac operator in this setting and discuss its relation to twisted K-theory. This is joint work with Jouko Mickelsson. 

Outlier removal using the Bayesian information criterion for group-based trajectory modelling 12:10 Mon 28 Apr, 2014 :: B.19 Ingkarni Wardli :: Chris Davies :: University of Adelaide
Media...Attributes measured longitudinally can be used to define discrete paths of measurements, or trajectories, for each individual in a given population. Group-based trajectory modelling methods can be used to identify subgroups of trajectories within a population, such that trajectories that are grouped together are more similar to each other than to trajectories in distinct groups. Existing methods generally allocate every individual trajectory into one of the estimated groups. However this does not allow for the possibility that some individuals may be following trajectories so different from the rest of the population that they should not be included in a group-based trajectory model. This results in these outlying trajectories being treated as though they belong to one of the groups, distorting the estimated trajectory groups and any subsequent analyses that use them.
We have developed an algorithm for removing outlying trajectories based on the maximum change in the Bayesian information criterion (BIC) due to removing a single trajectory. As well as deciding which trajectory to remove, the number of groups in the model can also change. The decision to remove an outlying trajectory is made by comparing the log-likelihood contributions of the observations to those of simulated samples from the estimated group-based trajectory model. In this talk the algorithm will be detailed and an application of its use will be demonstrated. 
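The BIC-based removal step can be illustrated in a drastically simplified setting: a single Gaussian in place of a full group-based trajectory model, with each "trajectory" summarised by one number. All names and numbers below are mine, not the talk's algorithm.

```python
import math

import numpy as np

rng = np.random.default_rng(2)

# 50 well-behaved observations plus one gross outlier.
data = np.concatenate([rng.normal(0, 1, 50), [12.0]])

def gaussian_bic(x):
    # BIC = -2*loglik + k*log(n), with k = 2 parameters (mean, sd).
    mu, sd = x.mean(), x.std()
    loglik = np.sum(-0.5 * math.log(2 * math.pi) - np.log(sd)
                    - 0.5 * ((x - mu) / sd) ** 2)
    return -2 * loglik + 2 * math.log(len(x))

def best_single_removal(x):
    # Leave-one-out refits: find the single removal that improves BIC
    # the most, loosely mirroring the "maximum change in BIC due to
    # removing a single trajectory" idea (note n changes between fits).
    base = gaussian_bic(x)
    drops = [gaussian_bic(np.delete(x, i)) for i in range(len(x))]
    i = int(np.argmin(drops))
    return i, base - drops[i]

idx, improvement = best_single_removal(data)
```

Here the candidate whose removal most improves the BIC is the planted outlier; the actual method additionally lets the number of groups change and calibrates the decision against simulated samples.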

Network-based approaches to classification and biomarker identification in metastatic melanoma 15:10 Fri 2 May, 2014 :: B.21 Ingkarni Wardli :: Associate Professor Jean Yee Hwa Yang :: The University of Sydney
Media...Finding prognostic markers has been a central question in much of current research in medicine and biology. In the last decade, approaches to prognostic prediction within a genomics setting are primarily based on changes in individual genes/proteins. Very recently, however, network-based approaches to prognostic prediction have begun to emerge which utilize interaction information between genes. This is based on the belief that large-scale molecular interaction networks are dynamic in nature and changes in these networks, rather than changes in individual genes/proteins, are often drivers of complex diseases such as cancer.
In this talk, I use data from stage III melanoma patients provided by Prof. Mann from the Melanoma Institute of Australia to discuss how network information can be utilized in the analysis of gene expression data to aid in biological interpretation. Here, we explore a number of novel and previously published network-based prediction methods, which we will then compare to the common single-gene and gene-set methods with the aim of identifying more biologically interpretable biomarkers in the form of networks. 

Ergodicity and loss of capacity: a stochastic horseshoe? 15:10 Fri 9 May, 2014 :: B.21 Ingkarni Wardli :: Professor Ami Radunskaya :: Pomona College, the United States of America
Media...Random fluctuations of an environment are common in ecological and
economical settings. The resulting processes can be described by a
stochastic dynamical system, where a family of maps parametrized by an
independent, identically distributed random variable forms the basis for a
Markov chain on a continuous state space. Random dynamical systems are a
beautiful combination of deterministic and random processes, and they have
received considerable interest since von Neumann and Ulam's seminal work in
the 1940s. Key questions in the study of a stochastic dynamical system
are: does the system have a well-defined average, i.e. is it ergodic?
How does this long-term behavior compare to that of the state
variable in a constant environment with the averaged parameter?
In this talk we answer these questions for a family of maps on the unit
interval that model self-limiting growth. The techniques used can be
extended to study other families of concave maps, and so we conjecture the
existence of a "stochastic horseshoe". 

Complexifications, Realifications, Real forms and Complex Structures 12:10 Mon 23 Jun, 2014 :: B.19 Ingkarni Wardli :: Kelli Francis-Staite :: University of Adelaide
Media...Italian mathematicians Niccolò Fontana Tartaglia and Gerolamo Cardano introduced complex numbers to solve polynomial equations such as x^2+1=0. Solving a standard real differential equation often uses complex eigenvalues and eigenfunctions. In both cases, the solution space is expanded to include the complex numbers, solved, and then translated back to the real case.
My talk aims to explain the process of complexification and related concepts. It will give vocabulary and some basic results about this important process. And it will contain cute cat pictures.


All's Fair in Love and Statistics 12:35 Mon 28 Jul, 2014 :: B.19 Ingkarni Wardli :: Annie Conway :: University of Adelaide
Media...Earlier this year Wired.com published an article about a "math genius" who found true love after scraping and analysing data from a dating site. In this talk I will be investigating the actual mathematics that he used, in particular methods for clustering categorical data, and whether or not the approach was successful. 

T-duality and the chiral de Rham complex 12:10 Fri 22 Aug, 2014 :: Ingkarni Wardli B20 :: Andrew Linshaw :: University of Denver
The chiral de Rham complex of Malikov, Schechtman, and Vaintrob is a sheaf of vertex algebras that exists on any smooth manifold M. It has a square-zero differential D, and contains the algebra of differential forms on M as a subcomplex. In this talk, I'll give an introduction to vertex algebras and sketch this construction. Finally, I'll discuss a notion of T-duality in this setting. This is based on joint work in progress with V. Mathai. 

Inferring absolute population and recruitment of southern rock lobster using only catch and effort data 12:35 Mon 22 Sep, 2014 :: B.19 Ingkarni Wardli :: John Feenstra :: University of Adelaide
Media...Abundance estimates from a data-limited version of catch survey analysis are compared to those from a novel one-parameter deterministic method. Bias of both methods is explored using simulation testing based on a more complex data-rich stock assessment population dynamics fishery operating model, exploring the impact of both varying levels of observation error in data as well as model process error. Recruitment was consistently better estimated than legal size population, the latter most sensitive to increasing observation errors. A hybrid of the data-limited methods is proposed as the most robust approach. A more statistically conventional errors-in-variables approach may also be touched upon if time permits. 

A Hybrid Markov Model for Disease Dynamics 12:35 Mon 29 Sep, 2014 :: B.19 Ingkarni Wardli :: Nicolas Rebuli :: University of Adelaide
Media...Modelling the spread of infectious diseases is fundamental to protecting ourselves from potentially devastating epidemics. Among other factors, two key indicators for the severity of an epidemic are the size of the epidemic and the time until the last infectious individual is removed. To estimate the distribution of the size and duration of an epidemic (within a realistic population) an epidemiologist will typically use Monte Carlo simulations of an appropriate Markov process. However, the number of states in the simplest Markov epidemic model, the SIR model, is quadratic in the population size and so Monte Carlo simulations are computationally expensive. In this talk I will discuss two methods for approximating the SIR Markov process and I will demonstrate the approximation error by comparing probability distributions and estimates of the distributions of the final size and duration of an SIR epidemic. 
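The kind of Monte Carlo simulation described above can be sketched with a standard Doob-Gillespie simulation of the SIR Markov chain, returning the final size and duration of each simulated epidemic (illustrative parameter values only):

```python
import numpy as np

rng = np.random.default_rng(3)

def sir_final_size(N=200, I0=1, beta=2.0, gamma=1.0):
    # Doob-Gillespie simulation of the SIR chain: jump to the next
    # event (infection or recovery) after an exponential holding time.
    S, I, t = N - I0, I0, 0.0
    while I > 0:
        inf_rate = beta * S * I / N
        rec_rate = gamma * I
        total = inf_rate + rec_rate
        t += rng.exponential(1.0 / total)
        if rng.random() < inf_rate / total:
            S -= 1
            I += 1
        else:
            I -= 1
    # Final size = total ever infected; t = epidemic duration.
    return N - S, t

sizes, durations = zip(*(sir_final_size() for _ in range(500)))
```

Repeating this many times gives the Monte Carlo estimates of the final-size and duration distributions that the approximation methods in the talk are compared against; the cost grows quickly with the population size, which is the talk's motivation.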

Exploration vs. Exploitation with Partially Observable Gaussian Autoregressive Arms 15:00 Mon 29 Sep, 2014 :: Engineering North N132 :: Julia Kuhn :: The University of Queensland & The University of Amsterdam
Media...We consider a restless bandit problem with Gaussian autoregressive arms, where the state of an arm is only observed when it is played and the state-dependent reward is collected. Since arms are only partially observable, a good decision policy needs to account for the fact that information about the state of an arm becomes more and more obsolete while the arm is not being played. Thus, the decision maker faces a trade-off between exploiting those arms that are believed to be currently the most rewarding (i.e. those with the largest conditional mean), and exploring arms with a high conditional variance. Moreover, one would like the decision policy to remain tractable despite the infinite state space and also in systems with many arms. A policy that gives some priority to exploration is the Whittle index policy, for which we establish structural properties. These motivate a parametric index policy that is computationally much simpler than the Whittle index but can still outperform the myopic policy. Furthermore, we examine the many-arm behavior of the system under the parametric policy, identifying equations describing its asymptotic dynamics. Based on these insights we provide a simple heuristic algorithm to evaluate the performance of index policies; the latter is used to optimize the parametric index. 
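The obsolescence of information about an unplayed arm has a simple closed form for a Gaussian AR(1) state X_t = phi*X_{t-1} + eps_t with eps_t ~ N(0, sigma2). A small sketch (my own illustration, not the paper's algorithm): while an arm goes unplayed, its conditional mean decays geometrically and its conditional variance relaxes toward the stationary variance sigma2/(1-phi^2).

```python
def propagate_belief(mean, var, phi=0.9, sigma2=1.0, steps=1):
    # Propagate the conditional distribution of an unobserved AR(1)
    # state forward in time: mean shrinks, variance grows.
    for _ in range(steps):
        mean = phi * mean
        var = phi * phi * var + sigma2
    return mean, var

# Just after observing the state (value 5.0, zero variance):
m1, v1 = propagate_belief(5.0, 0.0, steps=1)
m10, v10 = propagate_belief(5.0, 0.0, steps=10)
```

This growing conditional variance is exactly what makes a long-unplayed arm attractive for exploration, the trade-off the index policies in the talk are designed to manage.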

The Mathematics behind Simultaneous Localisation and Mapping 12:10 Mon 13 Oct, 2014 :: B.19 Ingkarni Wardli :: David Skene :: University of Adelaide
Media...Simultaneous localisation and mapping (or SLAM) is a process where individual images of an environment are taken and compared against one another. This comparison allows a map of the environment and changes in the location the images were taken to be determined.
This presentation discusses the relevance of SLAM in making a motorised platform autonomous, the process of a SLAM algorithm, and the all-important mathematics that makes a SLAM algorithm work. The resulting algorithm is then tested using a real-world motorised platform. 

Optimally Chosen Quadratic Forms for Partitioning Multivariate Data 13:10 Tue 14 Oct, 2014 :: Ingkarni Wardli 715 Conference Room :: Assoc. Prof. Inge Koch :: School of Mathematical Sciences
Media...Quadratic forms are commonly used in linear algebra. For d-dimensional vectors they have a matrix representation, Q(x) = x'Ax, for some symmetric matrix A. In statistics quadratic forms are defined for d-dimensional random vectors, and one of the best-known quadratic forms is the Mahalanobis distance between two random vectors.
In this talk we want to partition a quadratic form Q(X) = X'MX, where X is a random vector, and M a symmetric matrix, that is, we want to find a d-dimensional random vector W such that Q(X) = W'W. This problem has many solutions. We are interested in a solution or partition W of X such that pairs of corresponding variables (X_j, W_j) are highly correlated and such that W is simpler than the given X.
We will consider some natural candidates for W which turn out to be suboptimal in the sense of the above constraints, and we will then exhibit the optimal solution. Solutions of this type are useful in the well-known T-square statistic. We will see in examples what these solutions look like. 
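One natural candidate partition can be written down directly: taking W = M^{1/2} X with the symmetric square root of M gives W'W = X' M^{1/2} M^{1/2} X = Q(X). The sketch below verifies this for one candidate (not the optimal solution presented in the talk):

```python
import numpy as np

rng = np.random.default_rng(4)

def symmetric_sqrt(M):
    # Symmetric square root of a symmetric positive definite matrix
    # via its eigendecomposition M = V diag(vals) V'.
    vals, vecs = np.linalg.eigh(M)
    return vecs @ np.diag(np.sqrt(vals)) @ vecs.T

M = np.array([[2.0, 0.5],
              [0.5, 1.0]])  # symmetric positive definite
R = symmetric_sqrt(M)

x = rng.standard_normal(2)
w = R @ x  # candidate partition: w'w equals the quadratic form x'Mx
```

With M taken as an inverse covariance matrix, Q is the squared Mahalanobis distance mentioned above; the talk's point is that among the many valid W, one can optimise the correlations between (X_j, W_j).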

Modelling segregation distortion in multi-parent crosses 15:00 Mon 17 Nov, 2014 :: 5.57 Ingkarni Wardli :: Rohan Shah (joint work with B. Emma Huang and Colin R. Cavanagh) :: The University of Queensland
Construction of high-density genetic maps has been made feasible by low-cost high-throughput genotyping technology; however, the process is still complicated by biological, statistical and computational issues. A major challenge is the presence of segregation distortion, which can be caused by selection, difference in fitness, or suppression of recombination due to introgressed segments from other species. Alien introgressions are common in major crop species, where they have often been used to introduce beneficial genes from wild relatives.
Segregation distortion causes problems at many stages of the map construction process, including assignment to linkage groups and estimation of recombination fractions. This can result in incorrect ordering and estimation of map distances. While discarding markers will improve the resulting map, it may result in the loss of genomic regions under selection or containing beneficial genes (in the case of introgression).
To correct for segregation distortion we model it explicitly in the estimation of recombination fractions. Previously proposed methods introduce additional parameters to model the distortion, with a corresponding increase in computing requirements. This poses difficulties for large, densely genotyped experimental populations. We propose a method imposing minimal additional computational burden which is suitable for high-density map construction in large multi-parent crosses. We demonstrate its use modelling the known Sr36 introgression in wheat for an eight-parent complex cross.


Topology Tomography with Spatial Dependencies 15:00 Tue 25 Nov, 2014 :: Engineering North N132 :: Darryl Veitch :: The University of Melbourne
Media...There has been quite a lot of tomography inference work on measurement networks with a tree topology. Here observations are made, at the leaves of the tree, of `probes' sent down from the root and copied at each branch point. Inference can be performed based on loss or delay information carried by probes, and used in order to recover loss parameters, delay parameters, or the topology, of the tree. In all of these a strong assumption of spatial independence between links in the tree has been made in prior work. I will describe recent work on topology inference, based on loss measurement, which breaks that assumption. In particular I will introduce a new model class for loss with non-trivial spatial dependence, the `Jump Independent Models', which are well motivated, and prove that within this class the topology is identifiable. 

Multiscale modelling of multicellular biological systems: mechanics, development and disease 03:10 Fri 6 Mar, 2015 :: Lower Napier LG24 :: Dr James Osborne :: University of Melbourne
When investigating the development and function of multicellular biological systems it is not enough to only consider the behaviour of individual cells in isolation. For example when studying tissue development, how individual cells interact, both mechanically and biochemically, influences the resulting tissue's form and function. In this talk we present a multiscale modelling framework for simulating the development and function of multicellular biological systems (in particular tissues). Utilising the natural structural unit of the cell, the framework consists
of three main scales: the tissue level (macroscale); the cell level (mesoscale); and the subcellular level (microscale), with multiple interactions occurring between all scales. The cell level is central to the framework and cells are modelled as discrete interacting entities using one of a number of possible modelling paradigms, including lattice-based models (cellular automata and cellular Potts) and off-lattice models (cell-centre and vertex-based representations). The subcellular level concerns numerous metabolic and biochemical processes represented by interaction networks rendered stochastically or into ODEs. The outputs from such systems influence the behaviour of the cell level, affecting properties such as adhesion and also influencing cell mitosis and apoptosis. At the tissue level we consider factors or restraints that influence the cells, for example the distribution of a nutrient or messenger molecule, which is represented by field equations, on a growing domain, with individual cells functioning as
sinks and/or sources. The modular approach taken within the framework enables more realistic behaviour to be considered at each scale.
This framework is implemented within the Open Source Chaste library (Cancer, Heart and Soft Tissue Environment, http://www.cs.ox.ac.uk/chaste/)
and has been used to model biochemical and biomechanical interactions in various biological systems. In this talk we present the key ideas of the framework along with applications within the fields of development and disease. 

On the analyticity of CR-diffeomorphisms 12:10 Fri 13 Mar, 2015 :: Engineering North N132 :: Ilya Kossivskiy :: University of Vienna
One of the fundamental objects in several complex variables is CR-mappings. CR-mappings naturally occur in complex analysis as boundary values of mappings between domains, and as restrictions of holomorphic mappings onto real submanifolds. It was already observed by Cartan that smooth CR-diffeomorphisms between CR-submanifolds in C^N tend to be very regular, i.e., they are restrictions of holomorphic maps. However, in general smooth CR-mappings form a more restrictive class of mappings. Thus, since the inception of CR-geometry, the following general question has been of fundamental importance for the field: Are CR-equivalent real-analytic CR-structures also equivalent holomorphically? In joint work with Lamel, we answer this question in the negative, in any positive CR-dimension and CR-codimension. Our construction is based on a recent dynamical technique in CR-geometry, developed in my earlier work with Shafikov. 

Cricket and Maths 12:10 Mon 16 Mar, 2015 :: Napier LG29 :: Peter Ballard :: University of Adelaide
Media...Each game of international cricket has a scorecard. You don't need to know much maths to go through these scorecards and extract simple information, such as batting and bowling averages. However there is also the opportunity to use some more advanced maths. I will be using a bit of optimisation, probability and statistics to try to answer the questions: Which was the most dominant team ever? What scores are most likely? And are some players unlucky? 

Haven't I seen you before? Accounting for partnership duration in infectious disease modeling 15:10 Fri 8 May, 2015 :: Level 7 Conference Room Ingkarni Wardli :: Dr Joel Miller :: Monash University
Media...Our ability to accurately predict and explain the spread of an infectious disease is a significant factor in our ability to implement effective interventions. Our ability to accurately model disease spread depends on how accurately we capture the various effects. This is complicated by the fact that infectious disease spread involves a number of time scales. Four that are particularly relevant are: duration of infection in an individual, duration of partnerships between individuals, the time required for an epidemic to spread through the population, and the time required for the population structure to change (demographic or otherwise).
Mathematically simple models of disease spread usually make the implicit assumption that the duration of partnerships is by far the shortest time scale in the system. Thus they miss out on the tendency for infected individuals to deplete their local pool of susceptibles. Depending on the details of the disease in question, this effect may be significant.
I will discuss work done to reduce these assumptions for "SIR" (Susceptible-Infected-Recovered) diseases, which allows us to interpolate between populations which are static and populations which change partners rapidly in closed populations (no entry/exit). I will then discuss early results in applying these methods to diseases such as HIV in which the population time scales are relevant. 

People smugglers and statistics 12:10 Mon 25 May, 2015 :: Ingkarni Wardli 715 Conference Room :: Prof. Patty Solomon :: School of Mathematical Sciences
Media...In 2012 the Commonwealth Chief Scientist asked for my advice on the statistics being used in people smuggling prosecutions. Many defendants come from poor fishing villages in Indonesia, where births are not routinely recorded and the age of the defendant is not known. However mandatory jail sentences apply in Australia for individuals convicted of people smuggling which do not apply to children less than 18 years old, so assessing the age of each defendant is very important. Following an Australian Human Rights Commission inquiry into the treatment of individuals suspected of people smuggling, the Attorney-General's department sought advice from the Chief Scientist, which is where I come in. I'll present the methods used by the prosecution and defence, which are both wrong, and introduce the prosecutor's fallacy.


Hillary Clinton was liberal. Hillary Clinton is liberal. 12:10 Mon 1 Jun, 2015 :: Napier LG29 :: Brock Hermans :: University of Adelaide
Media...Didn't enjoy last week's talk? Thought it was a bit too complicated in some areas? Too much pure maths? Well even if your answer is no you should still come along to mine. I will be talking about the most uniting, agreeable area of our lives: politics. By using rudimentary statistics I'll be looking at three things. One, a method for poll aggregation as a tool to predict elections (using Bayesian statistics). Two, why the polls were so wrong in the recent U.K. election. And three, what claims (if any) we can make about the current 2016 U.S. presidential race. In one of the most exciting talks of the year so far, I'll be looking at 'Shy Tories' and 'freedom-loving libertarians', and answering the question "is Hillary Clinton the most liberal (that means left-wing in America) candidate in the race?". 
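A minimal sketch of Bayesian poll aggregation (entirely made-up poll numbers, and a conjugate Beta-Binomial model chosen purely for simplicity, not the talk's method): pooling several polls of one candidate's support rate under a flat prior gives a Beta posterior whose parameters just accumulate counts.

```python
import math

# Each hypothetical poll reports (respondents, supporters) for one
# candidate; with a Beta(1, 1) prior and a Binomial likelihood the
# posterior over the support rate is Beta(1 + supporters, 1 + others).
polls = [(1000, 520), (800, 410), (1200, 630)]  # made-up numbers

def aggregate(polls, a=1.0, b=1.0):
    for n, k in polls:
        a += k
        b += n - k
    mean = a / (a + b)
    sd = math.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
    return mean, sd

mean, sd = aggregate(polls)
```

The posterior standard deviation shrinks as polls accumulate, which is the sense in which aggregation beats any single poll; real aggregators additionally model house effects and drift over time.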

Non-crossing quantiles 15:10 Fri 14 Aug, 2015 :: Ingkarni Wardli B21 :: Dr Yanan Fan :: UNSW
Media...Quantile regression has received increased attention in the statistics community in recent years. However, since the quantile regression curves are estimated separately, the curves can cross, leading to an invalid response distribution. Many authors have proposed remedies for this in the context of frequentist estimation. In this talk, I will explain some of the existing approaches, and then describe a new Bayesian semiparametric approach for fitting non-crossing quantile regression models simultaneously. 

Equivariant bundle gerbes 12:10 Fri 21 Aug, 2015 :: Ingkarni Wardli B17 :: Michael Murray :: The University of Adelaide
Media...I will present the definitions of strong and weak group actions on a bundle gerbe and calculate the strongly equivariant
class of the basic bundle gerbe on a unitary group. This is joint work with David Roberts, Danny Stevenson and
Raymond Vozzo and forms part of arXiv:1506.07931. 

Queues and cooperative games 15:00 Fri 18 Sep, 2015 :: Ingkarni Wardli B21 :: Moshe Haviv :: Department of Statistics and the Federmann Center for the Study of Rationality, The Hebrew University
Media...The area of cooperative game theory deals with models in which a number of individuals, called players, can form coalitions so as to improve the utility of their members. In many cases, the formation of the grand coalition is a natural result of some negotiation or a bargaining procedure.
The main question then is how the players should split the gains due to their cooperation among themselves. Various solutions have been suggested, among them the Shapley value, the nucleolus and the core.
Servers in a queueing system can also join forces. For example, they can exchange service capacity among themselves or serve customers who originally seek service at their peers. The overall performance improves and the question is how they should split the gains, or,
equivalently, how much each one of them needs to pay or be paid in order to cooperate with the others. Our major focus is on the core of the resulting cooperative game and on showing that in many queueing games the core is not empty.
Finally, customers who are served by the same server can also be looked at as players who form a grand coalition, now inflicting damage on each other in the form of additional waiting time. We show how cooperative game theory, specifically the Aumann-Shapley prices, leads to a way in which this damage can be attributed to individual customers or groups of customers. 

Predicting the Winning Time of a Stage of the Tour de France 12:10 Mon 21 Sep, 2015 :: Benham Labs G10 :: Nic Rebuli :: University of Adelaide
Media...Sports can be lucrative, especially popular ones. But for all of us mere mortals, the only money we will ever glean from sporting events is through gambling (responsibly). When it comes to cycling, people generally choose their favourites based on individual and team performance throughout the world cycling calendar. But what can be said for the duration of a given stage or the winning time of the highly sought-after General Classification? In this talk I discuss a basic model for predicting the winning time of the Tour de France. I then apply this model to predicting the outcome of the 2012 and 2013 Tour de France and discuss the results in context. 

Analytic complexity of bivariate holomorphic functions and cluster trees 12:10 Fri 2 Oct, 2015 :: Ingkarni Wardli B17 :: Timur Sadykov :: Plekhanov University, Moscow
The Kolmogorov-Arnold theorem yields a representation of a multivariate continuous function in terms of a composition of functions which depend on at most two variables. In the analytic case, understanding the complexity of such a representation naturally leads to the notion of the analytic complexity of (a germ of) a bivariate multivalued analytic function. According to Beloshapka's local definition, the order of complexity of any univariate function is equal to zero while the n-th complexity class is defined recursively to consist of functions of the form a(b(x,y)+c(x,y)), where a is a univariate analytic function and b and c belong to the (n-1)-th complexity class. Such a representation is meant to be valid for suitable germs of multivalued holomorphic functions.
A randomly chosen bivariate analytic function will most likely have infinite analytic complexity. However, for a number of important families of special functions of mathematical physics their complexity is finite and can be computed or estimated. Using this, we introduce the notion of the analytic complexity of a binary tree, in particular, a cluster tree, and investigate its properties.


Modelling Coverage in RNA Sequencing 09:00 Mon 9 Nov, 2015 :: Ingkarni Wardli 5.57 :: Arndt von Haeseler :: Max F Perutz Laboratories, University of Vienna
Media...RNA sequencing (RNA-seq) is the method of choice for measuring the expression of RNAs in a cell population. In an RNA-seq experiment, sequencing the full length of larger RNA molecules requires fragmentation into smaller pieces to be compatible with limited read lengths of most deep-sequencing technologies. Unfortunately, the issue of non-uniform coverage across a genomic feature has been a concern in RNA-seq and is attributed to preferences for certain fragments in steps of library preparation and sequencing. However, the disparity between the observed non-uniformity of read coverage in RNA-seq data and the assumption of expected uniformity elicits a query on the read coverage profile one should expect across a transcript, if there are no biases in the sequencing protocol. We propose a simple model of unbiased fragmentation where we find that the expected coverage profile is not uniform and, in fact, depends on the ratio of fragment length to transcript length. To compare the non-uniformity proposed by our model with experimental data, we extended this simple model to incorporate empirical attributes matching that of the sequenced transcript in an RNA-seq experiment. In addition, we imposed an experimentally derived distribution on the frequency at which fragment lengths occur.
We used this model to compare our theoretical prediction with experimental data and with the uniform coverage model. If time permits, we will also discuss a potential application of our model. 
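A minimal Monte Carlo sketch of unbiased fragmentation (fixed fragment length, start positions uniform along the transcript; a simplification of the model above, with illustrative parameters) already shows that the expected coverage is not uniform:

```python
import random

def expected_coverage(transcript_len, frag_len, n_frags=20000, seed=1):
    """Monte Carlo coverage profile for fragments of fixed length placed
    uniformly at random along a transcript (unbiased fragmentation)."""
    rng = random.Random(seed)
    cov = [0] * transcript_len
    n_starts = transcript_len - frag_len + 1
    for _ in range(n_frags):
        s = rng.randrange(n_starts)
        for i in range(s, s + frag_len):
            cov[i] += 1
    return [c / n_frags for c in cov]

prof = expected_coverage(100, 20)
# Coverage tapers near the transcript ends: the first base can be covered by
# only one start position, an interior base by frag_len start positions.
assert prof[0] < prof[50]
```

The resulting profile is trapezoidal rather than flat, and the relative size of the tapered ends is set by the ratio of fragment length to transcript length.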

Use of epidemic models in optimal decision making 15:00 Thu 19 Nov, 2015 :: Ingkarni Wardli 5.57 :: Tim Kinyanjui :: School of Mathematics, The University of Manchester
Epidemic models have proved useful in a number of applications in epidemiology. In this work, I will present two areas in which we have used modelling to make informed decisions. Firstly, we used an age-structured mathematical model to describe the transmission of Respiratory Syncytial Virus (RSV) in a developed-country setting and to explore different vaccination strategies. We found that delayed infant vaccination has significant potential to reduce the number of hospitalisations in the most vulnerable group, and that most of the reduction is due to indirect protection. This suggests that marked public health benefit could be achieved by delivering an RSV vaccine to age groups not seen as most at risk of severe disease. The second application is the optimal design of studies aimed at collecting household-stratified infection data. The design decision involves a trade-off between the number of households to enrol and the sampling frequency. Two commonly used study designs are considered: cross-sectional and cohort. The search for an optimal design uses Bayesian methods to explore the joint parameter-design space, combined with the Shannon entropy of the posteriors to estimate the amount of information for each design. We found that for cross-sectional designs the amount of information increases with the sampling intensity, while the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing data collection studies.

A Semi-Markovian Modeling of Limit Order Markets 13:00 Fri 11 Dec, 2015 :: Ingkarni Wardli 5.57 :: Anatoliy Swishchuk :: University of Calgary
R. Cont and A. de Larrard (SIAM J. Financial Mathematics, 2013) introduced a tractable stochastic model for the dynamics of a limit order book, computing various quantities of interest such as the probability of a price increase or the diffusion limit of the price process. As suggested by empirical observations, we extend their framework so that 1) the distributions of inter-arrival times between book events are arbitrary (possibly non-exponential) and 2) both the nature of a new book event and its corresponding inter-arrival time depend on the nature of the previous book event. We do so by resorting to Markov renewal processes to model the dynamics of the bid and ask queues. We retain analytical tractability via explicit expressions for the Laplace transforms of various quantities of interest. Our approach is justified and illustrated by calibrating the model to five stocks (Amazon, Apple, Google, Intel and Microsoft) on June 21st, 2012. As in Cont and de Larrard, the bid-ask spread remains constant and equal to one tick, only the bid and ask queues are modelled (they are independent of each other and are reinitialised after a price change), and all orders have the same size. (This talk is based on joint work with Nelson Vadori (Morgan Stanley).)
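To illustrate the Markov renewal idea, here is a hypothetical two-event sketch, with made-up event types, transition probabilities and rates rather than the calibrated model, in which both the next book event and its inter-arrival time depend on the previous event:

```python
import random

# A Markov renewal process: the next event type is drawn from a transition
# matrix indexed by the previous event, and the sojourn time distribution
# also depends on the previous event. All numbers here are illustrative.
TRANS = {"limit": {"limit": 0.6}, "market": {"limit": 0.7}}
RATE = {"limit": 2.0, "market": 0.5}   # exponential rates; could be any law

def simulate(n_events, seed=42):
    rng = random.Random(seed)
    t, prev = 0.0, "limit"
    events = []
    for _ in range(n_events):
        t += rng.expovariate(RATE[prev])          # sojourn depends on prev
        prev = "limit" if rng.random() < TRANS[prev]["limit"] else "market"
        events.append((t, prev))
    return events

path = simulate(5)
assert len(path) == 5 and all(t > 0 for t, _ in path)
```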

Mathematical modelling of the immune response to influenza 15:00 Thu 12 May, 2016 :: Ingkarni Wardli B20 :: Ada Yan :: University of Melbourne
The immune response plays an important role in the resolution of primary influenza infection and prevention of subsequent infection in an individual. However, the relative roles of each component of the immune response in clearing infection, and the effects of interaction between components, are not well quantified.
We have constructed a model of the immune response to influenza based on data from viral interference experiments, where ferrets were exposed to two influenza strains within a short time period. The changes in the viral kinetics of the second virus due to the first depend on the strains used as well as the interval between exposures, enabling inference of the timing of innate and adaptive immune response components and of the role of cross-reactivity in resolving infection. Our model provides a mechanistic explanation for the observed variation in viruses' abilities to protect against subsequent infection at short inter-exposure intervals, either by delaying the second infection or by inducing stochastic extinction of the second virus. It also explains the decrease in recovery time for the second infection when the two strains elicit cross-reactive cellular adaptive immune responses. To account for inter-subject as well as inter-virus variation, the model is formulated using a hierarchical framework. We will fit the model to experimental data using Markov chain Monte Carlo methods; quantification of the model will enable a deeper understanding of the effects of potential new treatments.
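The hierarchical two-strain model itself is not reproduced here; as background, the standard single-strain target cell-limited (TIV) model of within-host viral dynamics can be integrated with forward Euler. Parameter values below are illustrative, not fitted to the ferret data:

```python
# Minimal target cell-limited (TIV) model of within-host viral dynamics:
#   dT/dt = -beta*T*V,  dI/dt = beta*T*V - delta*I,  dV/dt = p*I - c*V.
# Forward-Euler integration with illustrative (not fitted) parameters.
def tiv(T0=4e8, V0=1.0, beta=3e-8, delta=3.0, p=10.0, c=5.0,
        dt=0.001, days=10.0):
    T, I, V = T0, 0.0, V0
    V_peak = V
    for _ in range(int(days / dt)):
        dT = -beta * T * V
        dI = beta * T * V - delta * I
        dV = p * I - c * V
        T, I, V = T + dt * dT, I + dt * dI, V + dt * dV
        V_peak = max(V_peak, V)
    return V, V_peak

V_end, V_peak = tiv()
assert V_peak > 1.0 and V_end < V_peak   # viral load rises, then resolves
```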


Harmonic Analysis in Rough Contexts 15:10 Fri 13 May, 2016 :: Engineering South S112 :: Dr Pierre Portal :: Australian National University
In recent years, perspectives on what constitutes the "natural" framework within which to conduct various forms of mathematical analysis have shifted substantially. The common theme of these shifts can be described as a move towards roughness, i.e. the elimination of smoothness assumptions that had previously been considered fundamental. Examples include partial differential equations on domains with a boundary that is merely Lipschitz continuous, geometric analysis on metric measure spaces that do not have a smooth structure, and stochastic analysis of dynamical systems that have nowhere differentiable trajectories.
In this talk, aimed at a general mathematical audience, I describe some of these shifts towards roughness, placing an emphasis on harmonic analysis, and on my own contributions. This includes the development of heat kernel methods in situations where such a kernel is merely a distribution, and applications to deterministic and stochastic partial differential equations. 

Behavioural Microsimulation Approach to Social Policy and Behavioural Economics 15:10 Fri 20 May, 2016 :: S112 Engineering South :: Dr Drew Mellor :: Ernst & Young
SIMULAIT is a general-purpose behavioural microsimulation system designed to predict behavioural trends in human populations. This type of predictive capability grew out of original research initially conducted in conjunction with the Defence Science and Technology Group (DSTO) in South Australia, and has since been fully commercialised and is in current use by a global customer base. To our customers, the principal value of the system lies in its ability to predict likely outcomes of scenarios that challenge conventional approaches based on extrapolation or generalisation. Such scenarios include the impact of disruptive technologies, such as widespread adoption of autonomous vehicles for transportation or batteries for household energy storage, and the impact of implementing policy measures or interventions, such as imposing water usage restrictions.
SIMULAIT employs a multidisciplinary methodology, drawing from agent-based modelling, behavioural science and psychology, microeconomics, artificial intelligence, simulation, game theory, engineering, mathematics and statistics. In this seminar, we start with a high-level view of the system, followed by a look under the hood to see how the various elements come together to answer questions about behavioural trends. The talk will conclude with a case study of a recent application of SIMULAIT to a significant policy problem: how to address the shortage of STEM-skilled teachers in the Victorian teaching workforce. 

Student Performance Issues in First Year University Calculus 15:10 Fri 10 Jun, 2016 :: Engineering South S112 :: Dr Christine Mangelsdorf :: University of Melbourne
MAST10006 Calculus 2 is the largest subject in the School of Mathematics and Statistics at the University of Melbourne, accounting for about 2200 of 7400 first-year enrolments. Despite excellent and consistent feedback from students on lectures, tutorials and teaching materials, scaled failure rates in Calculus 2 averaged an unacceptably high 29.4% (with raw failure rates reaching 40%) by the end of 2014. To understand the issues behind the poor student performance, we studied the exam papers of students with grades of 40-49% over a three-year period. In this presentation, I will present data on areas of poor performance in the final exam, show samples of student work, and identify possible causes of the errors. Many of the performance issues are found to relate to basic weaknesses in the students' secondary school mathematical skills that inhibit their ability to successfully complete Calculus 2. Since 2015, we have employed a number of approaches to support students' learning that have significantly improved student performance in assessment. I will discuss the changes made to assessment practices and the extra support materials provided online and in person that are driving the improvement. 

SIR epidemics with stages of infection 12:10 Wed 28 Sep, 2016 :: EM218 :: Matthieu Simon :: Universite Libre de Bruxelles
This talk is concerned with a stochastic model for the spread of an epidemic in a closed homogeneously mixing population. The population is subdivided into three classes of individuals: the susceptibles, the infectives and the removed cases. In short, an infective remains infectious during a random period of time. While infected, it can contact all the susceptibles present, independently of the other infectives. At the end of the infectious period, it becomes a removed case and has no further part in the infection process.
We represent an infectious period as a set of different stages that an infective can go through before being removed. The transitions between stages are governed by either a Markov process or a semi-Markov process. In each stage, an infective makes contaminations at the epochs of a Poisson process with a stage-specific rate.
Our purpose is to derive closed expressions for a transform of different statistics related to the end of the epidemic, such as the final number of susceptibles and the area under the trajectories of all the infectives. The analysis is performed by using simple matrix analytic methods and martingale arguments. Numerical illustrations will be provided at the end of the talk. 
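A minimal stochastic sketch of such an epidemic, with two Markov stages and illustrative stage-specific contact and transition rates (a direct simulation, not the matrix-analytic machinery of the talk), can be run as follows:

```python
import random

def sir_stages(n_susc=50, stage_rates=(1.0, 2.0), contact_rates=(0.02, 0.05),
               seed=7):
    """Stochastic SIR epidemic in which each infective passes through Markov
    stages; in stage k it contacts each susceptible at Poisson rate
    contact_rates[k] and leaves the stage at rate stage_rates[k].
    All rates are illustrative. Returns the final number of susceptibles."""
    rng = random.Random(seed)
    S, infectives = n_susc, [0]          # stage index of each infective
    n_stages = len(stage_rates)
    while infectives:
        # Per-infective event rates: contacts (scaled by S) and stage moves.
        rates = []
        for k in infectives:
            rates.append(contact_rates[k] * S)   # contact event
            rates.append(stage_rates[k])         # stage-transition event
        u = rng.random() * sum(rates)
        idx, acc = 0, 0.0
        while acc + rates[idx] < u:
            acc += rates[idx]
            idx += 1
        who, is_transition = divmod(idx, 2)
        if is_transition:
            infectives[who] += 1
            if infectives[who] == n_stages:      # removed after last stage
                infectives.pop(who)
        else:                                    # a contact infects someone
            S -= 1
            infectives.append(0)                 # new infective in stage 0
    return S

assert 0 <= sir_stages() <= 50
```

The final number of susceptibles, here returned directly, is exactly the kind of end-of-epidemic statistic whose transform the talk derives in closed form.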

Transmission Dynamics of Visceral Leishmaniasis: designing a test and treat control strategy 12:10 Thu 29 Sep, 2016 :: EM218 :: Graham Medley :: London School of Hygiene & Tropical Medicine
Visceral Leishmaniasis (VL) is targeted for elimination from the Indian Sub-Continent. Progress has been much better in some areas than others. Current control is based on earlier diagnosis and treatment, and on insecticide spraying to reduce the density of the vector. There is a surprising dearth of specific information on the epidemiology of VL, which makes modelling more difficult. In this seminar, I describe a simple framework that gives some insight into the transmission dynamics. We conclude that the majority of infection comes from cases prior to diagnosis. If this is the case, then early diagnosis will be advantageous, but will require a test with high specificity. This is a paradox for many clinicians and public health workers, who tend to prioritise high sensitivity.
Medley, G.F., Hollingsworth, T.D., Olliaro, P.L. & Adams, E.R. (2015) Health-seeking, diagnostics and transmission in the control of visceral leishmaniasis. Nature 528, S102-S108 (3 December 2015). DOI: 10.1038/nature16042 

Measuring and mapping carbon dioxide from remote sensing satellite data 15:10 Fri 21 Oct, 2016 :: Napier G03 :: Prof Noel Cressie :: University of Wollongong
This talk is about environmental statistics for global remote sensing of atmospheric carbon dioxide, a leading greenhouse gas. An important compartment of the carbon cycle is atmospheric carbon dioxide (CO2), where it (and other gases) contributes to climate change through the greenhouse effect. There are a number of CO2 observational programs in which measurements are made around the globe at a small number of ground-based locations at somewhat regular time intervals. In contrast, satellite-based programs are spatially global but give up some of the temporal richness. The most recent satellite launched to measure CO2 is NASA's Orbiting Carbon Observatory-2 (OCO-2), whose principal objective is to retrieve a geographical distribution of CO2 sources and sinks. OCO-2's measurement of the column-averaged mole fraction, XCO2, is designed to achieve this through a data-assimilation procedure that is statistical at its basis. Consequently, uncertainty quantification is key, starting with the spectral radiances from an individual sounding through to the borrowing of strength via spatial-statistical modelling. 

Minimal surfaces and complex analysis 12:10 Fri 24 Mar, 2017 :: Napier 209 :: Antonio Alarcon :: University of Granada
A surface in the Euclidean space R^3 is said to be minimal if it is locally area-minimizing, meaning that every point in the surface admits a compact neighborhood with the least area among all the surfaces with the same boundary. Although the origin of minimal surfaces is in physics, since they can be realized locally as soap films, this family of surfaces lies at the intersection of many fields of mathematics. In particular, complex analysis in one and several variables plays a fundamental role in the theory. In this lecture we will discuss the influence of complex analysis on the study of minimal surfaces. 

Hodge theory on the moduli space of Riemann surfaces 12:10 Fri 5 May, 2017 :: Napier 209 :: Jesse GellRedman :: University of Melbourne
The Hodge theorem on a closed Riemannian manifold identifies the de Rham cohomology with the space of harmonic differential forms. There are various extensions of the Hodge theorem to singular or complete but non-compact spaces, but when there is an identification of L^2 harmonic forms with a topological feature of the underlying space, it is highly dependent on the nature of infinity (in the non-compact case) or on the locus of incompleteness; no unifying theorem treats all cases. We will discuss work toward extending the Hodge theorem to singular Riemannian manifolds where the singular locus is an incomplete cusp edge. These can be pictured locally as a bundle of horns, and they provide a model for the behavior of the Weil-Petersson metric on the compactified Riemann moduli space near the interior of a divisor. Joint with J. Swoboda and R. Melrose. 

Constructing differential string structures 14:10 Wed 7 Jun, 2017 :: EM213 :: David Roberts :: University of Adelaide
String structures on a manifold are analogous to spin structures, except that instead of lifting the structure group through the extension Spin(n) \to SO(n) of Lie groups, we need to lift through the extension String(n) \to Spin(n) of Lie *2-groups*. Such a lift exists if the first fractional Pontryagin class (1/2)p_1 vanishes in cohomology. A differential string structure also lifts connection data, but this is rather complicated, involving a number of locally defined differential forms satisfying cocycle-like conditions. This is an expansion of the geometric string structures of Stolz and Redden, which is, for a given connection A, merely a 3-form R on the frame bundle such that dR = tr(F^2) for F the curvature of A; in other words, a trivialisation of the de Rham class of (1/2)p_1. I will present work in progress on a framework (and specific results) that allows explicit calculation of the differential string structure for a large class of homogeneous spaces, which also yields formulas for the Stolz-Redden form. I will comment on the application to verifying the refined Stolz conjecture for our particular class of homogeneous spaces. Joint work with Ray Vozzo. 

Equivariant formality of homogeneous spaces 12:10 Fri 29 Sep, 2017 :: Engineering Sth S111 :: Alex Chi-Kwong Fok :: University of Adelaide
Equivariant formality, a notion in equivariant topology introduced by Goresky, Kottwitz and MacPherson, is a desirable property of spaces with group actions: it allows the application of the localisation formula to evaluate integrals of closed top forms and enables one to compute the equivariant cohomology easily. Broad classes of spaces of special interest are well known to be equivariantly formal, e.g. compact symplectic manifolds equipped with Hamiltonian compact Lie group actions, and projective varieties equipped with linear algebraic torus actions, of which flag varieties are examples. Less is known about compact homogeneous spaces G/K equipped with the isotropy action of K, which is not necessarily of maximal rank. In this talk we will review previous attempts to characterise equivariant formality of G/K, and present our recent results on this problem using an analogue of equivariant formality in K-theory. Part of the work presented in this talk is joint with Jeffrey Carlson. 

How oligomerisation impacts steady state gradient in a morphogen-receptor system 15:10 Fri 20 Oct, 2017 :: Ingkarni Wardli 5.57 :: Mr Phillip Brown :: University of Adelaide
In developmental biology an important process is cell fate determination, where cells start to differentiate their form and function. This is an element of the broader concept of morphogenesis. It has long been held that cell differentiation can occur by a chemical signal providing positional information to 'undecided' cells. This chemical produces a gradient of concentration that indicates to a cell what path it should develop along. More recently it has been shown that in a particular system of this type, the chemical (protein) does not exist purely as individual molecules, but can exist in multiprotein complexes known as oligomers.
Mathematical modelling has been performed on systems of oligomers to determine whether this concept can produce useful gradients of concentration. However, there is a wide range of possibilities when it comes to how oligomer systems can be modelled, and most of them have not been explored.
In this talk I will introduce a new monomer system and analyse it, before extending this model to include oligomers. A number of oligomer models are proposed based on the assumption that proteins are only produced in their oligomer form and can only break apart once they have left the producing cell. It will be shown that when oligomers are present under these conditions, but only monomers are permitted to bind with receptors, then the system can produce robust, biologically useful gradients for a significantly larger range of model parameters (for instance, degradation, production and binding rates) compared to the monomer system. We will also show that when oligomers are permitted to bind with receptors there is negligible difference compared to the monomer system. 
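For background, the textbook monomer picture (diffusion with linear degradation, not the oligomer models of the talk) has the well-known exponential steady-state gradient C(x) = C0 exp(-x/lambda), with decay length lambda = sqrt(D/k). The snippet below verifies this solution against the steady-state equation numerically, with illustrative parameters:

```python
import math

# Steady state of the classical monomer system: diffusion with linear
# degradation, D * C''(x) = k * C(x) on x > 0 with a source fixing C(0) = C0,
# solved by C(x) = C0 * exp(-x / lam) with lam = sqrt(D / k).
# Parameter values are illustrative.
D, k, C0 = 1.0, 0.25, 10.0
lam = math.sqrt(D / k)

def concentration(x):
    return C0 * math.exp(-x / lam)

# Check D * C'' = k * C by central finite differences at a few points.
h = 1e-4
for x in (0.5, 1.0, 3.0):
    c2 = (concentration(x + h) - 2 * concentration(x)
          + concentration(x - h)) / h**2
    assert abs(D * c2 - k * concentration(x)) < 1e-4
```

Robustness of a gradient, in this setting, amounts to how sensitive lambda is to the degradation, production and binding rates, which is exactly the comparison made between the monomer and oligomer models.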

Stochastic Modelling of Urban Structure 11:10 Mon 20 Nov, 2017 :: Engineering Nth N132 :: Mark Girolami :: Imperial College London, and The Alan Turing Institute
Urban systems are complex in nature and comprise a large number of individuals who act according to utility, a measure of net benefit pertaining to preferences. The actions of individuals give rise to an emergent behaviour, creating the so-called urban structure that we observe. In this talk, I develop a stochastic model of urban structure that formally accounts for uncertainty arising from the complex behaviour. We further use this stochastic model to infer the components of a utility function from observed urban structure. This is a more powerful modelling framework than the ubiquitous discrete choice models, which are of limited use for complex systems in which the overall preferences of individuals are difficult to ascertain. We model urban structure as a realisation of a Boltzmann distribution that is the invariant distribution of a related stochastic differential equation (SDE) describing the dynamics of the urban system. Our specification of the Boltzmann distribution assigns higher probability to stable configurations, in the sense that consumer surplus (demand) is balanced with running costs (supply), as characterised by a potential function. We specify a Bayesian hierarchical model to infer the components of a utility function from observed structure. Our model is doubly intractable and poses significant computational challenges, which we overcome using recent advances in Markov chain Monte Carlo (MCMC) methods. We demonstrate our methodology with case studies on the London retail system and airports in England. 
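As a toy illustration of sampling configurations from a Boltzmann distribution, the following Metropolis sampler draws zone sizes from pi(x) proportional to exp(-V(x)). The potential V here is a hypothetical stand-in for the utility-based one in the talk, balancing a logarithmic benefit against a linear running cost:

```python
import math
import random

# Hypothetical potential: log-utility of zone size (benefit) minus a linear
# running cost. Each coordinate then has density ~ z^2 * exp(-z) (mean 3).
ALPHA, KAPPA = 2.0, 1.0

def V(x):
    return -sum(ALPHA * math.log(z) - KAPPA * z for z in x)

def metropolis(n_steps=20000, seed=0):
    rng = random.Random(seed)
    x = [1.0, 1.0]                       # sizes of two hypothetical zones
    trace = []
    for _ in range(n_steps):
        prop = [max(1e-6, z + rng.gauss(0.0, 0.3)) for z in x]
        # Accept with probability min(1, exp(V(x) - V(prop))).
        if rng.random() < math.exp(min(0.0, V(x) - V(prop))):
            x = prop
        trace.append(x[0])
    return trace

trace = metropolis()
mean = sum(trace) / len(trace)
assert 1.5 < mean < 4.5                  # chain concentrates near the mean 3
```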

A Hecke module structure on the KK-theory of arithmetic groups 13:10 Fri 2 Mar, 2018 :: Barr Smith South Polygon Lecture theatre :: Bram Mesland :: University of Bonn
Let $G$ be a locally compact group, $\Gamma$ a discrete subgroup and $C_{G}(\Gamma)$ the commensurator of $\Gamma$ in $G$. The cohomology of $\Gamma$ is a module over the Shimura Hecke ring of the pair $(\Gamma,C_G(\Gamma))$. This construction recovers the action of the Hecke operators on modular forms for $SL(2,\mathbb{Z})$ as a particular case. In this talk I will discuss how the Shimura Hecke ring of a pair $(\Gamma, C_{G}(\Gamma))$ maps into the $KK$-ring associated to an arbitrary $\Gamma$-C*-algebra. From this we obtain a variety of $K$-theoretic Hecke modules. In the case of manifolds, the Chern character provides a Hecke-equivariant transformation into cohomology, which is an isomorphism in low dimensions. We discuss Hecke-equivariant exact sequences arising from possibly noncommutative compactifications of $\Gamma$-spaces. Examples include the Borel-Serre and geodesic compactifications of the universal cover of an arithmetic manifold, and the totally disconnected boundary of the Bruhat-Tits tree of $SL(2,\mathbb{Z})$. This is joint work with M.H. Sengun (Sheffield). 

Calculating optimal limits for transacting credit card customers 15:10 Fri 2 Mar, 2018 :: Horace Lamb 1022 :: Prof Peter Taylor :: University of Melbourne
Credit card users can roughly be divided into "transactors", who pay off their balance each month, and "revolvers", who maintain an outstanding balance on which they pay substantial interest.
In this talk, we focus on modelling the behaviour of an individual transactor customer. Our motivation is to calculate an optimal credit limit from the bank's point of view. This requires an expression for the expected outstanding balance at the end of a payment period.
We establish a connection with the classical newsvendor model. Furthermore, we derive the Laplace transform of the outstanding balance, assuming that purchases are made according to a marked point process and that there is a simplified balance control policy which prevents all purchases in the rest of the payment period when the credit limit is exceeded. We then use the newsvendor model and our modified model to calculate bounds on the optimal credit limit for the more realistic balance control policy that accepts all purchases that do not exceed the limit.
We illustrate our analysis using a compound Poisson process example and show that the optimal limit scales with the distribution of the purchasing process, while the probability of exceeding the optimal limit remains constant.
Finally, we apply our model to some real credit card purchase data. 
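A sketch of the quantile flavour of this result, with compound Poisson purchases (exponential amounts) and a newsvendor-style limit set at a fixed quantile of the end-of-period balance, all parameters illustrative:

```python
import random

def simulate_balances(rate=3.0, mean_purchase=50.0, period=30.0,
                      n_sim=20000, seed=11):
    """End-of-period balances for a transactor whose purchases follow a
    compound Poisson process: Poisson-many purchases in the period, with
    exponentially distributed amounts. Parameters are illustrative."""
    rng = random.Random(seed)
    balances = []
    for _ in range(n_sim):
        # Count purchases via exponential inter-arrival times.
        t, total = 0.0, 0.0
        while True:
            t += rng.expovariate(rate)
            if t > period:
                break
            total += rng.expovariate(1.0 / mean_purchase)
        balances.append(total)
    return balances

def quantile_limit(balances, q=0.95):
    """Newsvendor-style limit: the q-quantile of the balance distribution,
    so the limit is exceeded with probability 1 - q regardless of scale."""
    s = sorted(balances)
    return s[int(q * len(s))]

bal = simulate_balances()
limit = quantile_limit(bal)
assert limit > sum(bal) / len(bal)   # the 95% quantile exceeds the mean
```

Setting the limit as a quantile makes the two observations from the talk visible: the limit scales with the purchasing distribution, while the probability of exceeding it stays fixed at 1 - q.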

Quantum Airy structures and topological recursion 13:10 Wed 14 Mar, 2018 :: Ingkarni Wardli B17 :: Gaetan Borot :: MPI Bonn
Quantum Airy structures are Lie algebras of quadratic differential operators; their classical limit describes Lagrangian subvarieties in symplectic vector spaces which are tangent to the zero section and cut out by quadratic equations. Their partition function, the function annihilated by the collection of differential operators, can be computed by the topological recursion. I will explain how to obtain quantum Airy structures from spectral curves, and how we can retrieve from them correlation functions of semisimple cohomological field theories by exploiting the symmetries. This is based on joint work with Andersen, Chekhov and Orantin. 

Topological Data Analysis 15:10 Fri 31 Aug, 2018 :: Napier 208 :: Dr Vanessa Robins :: Australian National University
Topological Data Analysis has grown out of work focussed on deriving qualitative, and yet quantifiable, information about the shape of data. The underlying assumption is that knowledge of shape, that is, the way the data are distributed, permits high-level reasoning and modelling of the processes that created the data. The 0th-order aspect of shape is the number of pieces: "connected components" to a topologist, "clusters" to a statistician. Higher-order topological aspects of shape are holes, quantified as "non-bounding cycles" in homology theory. These signal the existence of some type of constraint on the data-generating process.
Homology lends itself naturally to computer implementation, but its naive application is not robust to noise. This inspired the development of persistent homology: an algebraic topological tool that measures changes in the topology of a growing sequence of spaces (a filtration). Persistent homology provides invariants called barcodes or persistence diagrams: sets of intervals recording the birth and death parameter values of each homology class in the filtration. It captures information about the shape of data over a range of length scales, and enables the identification of "noisy" topological structure.
Statistical analysis of persistent homology has been challenging because the raw information (the persistence diagrams) is provided as sets of intervals rather than as functions. Various approaches to converting persistence diagrams to functional forms have been developed recently, and have found application to data ranging from the distribution of galaxies to porous materials and cancer detection. 
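As a small illustration of such computations, the 0-dimensional barcode of a point set on a line can be obtained with a union-find implementation of the elder rule (a minimal sketch for the 1D case, not a general persistent homology package):

```python
# 0-dimensional persistent homology of points on a line: grow balls of
# radius r around each point; when two clusters meet, the younger component
# dies (the "elder rule"). Each merge contributes one finite bar.
def h0_barcode(points):
    pts = sorted(points)
    parent = list(range(len(pts)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    # Components are all born at r = 0; two neighbours merge when their
    # balls touch, i.e. at r = gap / 2. Process gaps in increasing order.
    gaps = sorted((pts[i + 1] - pts[i], i) for i in range(len(pts) - 1))
    bars = []
    for gap, i in gaps:
        a, b = find(i), find(i + 1)
        if a != b:
            parent[b] = a
            bars.append((0.0, gap / 2))     # one component dies per merge
    bars.append((0.0, float("inf")))        # the last component never dies
    return bars

bars = h0_barcode([0.0, 1.0, 1.5, 4.0])
# Gaps 0.5, 1.0, 2.5 give deaths at 0.25, 0.5, 1.25, plus one infinite bar.
deaths = sorted(d for _, d in bars)
assert deaths[:3] == [0.25, 0.5, 1.25]
```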
News matching "Quadratic Forms in Statistics: Evaluating Contribu" 
The Armitage Lecture Associate Professor Patty Solomon (Statistics) has been invited to present the prestigious Armitage Lecture for 2007 in Cambridge. Posted Thu 18 Jan 07. 

Potts Medal Winner Professor Charles Pearce, the Elder Professor of Mathematics, was awarded the Ren Potts Medal by the Australian Society for Operations Research at its annual meeting in December. This is a national award for outstanding contributions to Operations Research in Australia. Posted Tue 22 Jan 08. 

Positions available in the School (5) The School is currently seeking a Professor of Statistics, an Associate Professor of Statistics, a Lecturer/Senior Lecturer in Applied Mathematics, a Lecturer in Applied Mathematics and a Lecturer in Pure Mathematics. See the University's jobs website for full details, including the selection criteria. Posted Fri 23 May 08. 

Teaching Fellow Position Visiting Teaching Fellow, School of Mathematical Sciences (Ref: 3808). We are seeking a Visiting Teaching Fellow (Associate Lecturer) who will be responsible for developing better links between the University of Adelaide and secondary schools, and for developing new approaches to first-year undergraduate teaching. You will be required to conduct tutorials in first year mathematics and statistics subjects for up to 16 hours per week, and assist in subject assessment and curriculum development. This position would suit an experienced mathematics teacher with strong mathematical training and an interest and recent involvement in teaching advanced mathematics units in years 11 and 12. Fixed-term position available from 19 January 2009 to 31 December 2009. Salary: (Level A) $49,053 - $66,567 per annum, plus an employer superannuation contribution of 17%. (Closing date 14/11/08.) Please see the University web site for further details. Posted Wed 17 Sep 08. 

ARC Future Fellowship success Associate Professor Zudi Lu has been awarded an ARC Future Fellowship. Associate Professor Lu, an Associate Professor in Statistics, will use the support provided by his Future Fellowship to further improve the theory and practice of econometric modelling of nonlinear spatial time series. Congratulations Zudi. Posted Thu 12 May 11. 

IGA-AMSI Workshop: Group-valued moment maps with applications to mathematics and physics (5–9 September 2011). Lecture series by Eckhard Meinrenken, University of Toronto. Titles of individual lectures: 1) Introduction to G-valued moment maps. 2) Dirac geometry and Witten's volume formulas. 3) Dixmier-Douady theory and prequantization. 4) Quantization of group-valued moment maps. 5) Application to Verlinde formulas. These lectures will be supplemented by additional talks by invited speakers. For more details, please see the conference webpage. Posted Wed 27 Jul 11. 

Two contract positions are available As a result of the School's success in securing two prestigious Australian Research Council Future Fellowships, we now have two limited term positions available, one in Pure Mathematics and one in Statistics. Posted Wed 14 Dec 11. 

A/Prof Joshua Ross, 2017 Moran Medal recipient Congratulations to Associate Professor Joshua Ross, who has won the 2017 Moran Medal, awarded by the Australian Academy of Science. The Moran Medal recognises outstanding research by scientists up to 10 years post-PhD in applied probability, biometrics, mathematical genetics, psychometrics and statistics. Associate Professor Ross has made influential contributions to public health and conservation biology, using mathematical modelling and statistics to help in decision making. Posted Fri 23 Dec 16. 
