
Events matching "Epidemiological consequences of household-based an"
Mathematical modelling of multidimensional tissue growth 16:10 Tue 24 Oct, 2006 :: Benham Lecture Theatre :: Prof John King
Some simple continuum-mechanics-based models for the
growth of biological tissue will be formulated and their properties
(particularly with regard to stability) described. 

A Bivariate Zero-inflated Poisson Regression Model and application to some Dental Epidemiological data 14:10 Fri 27 Oct, 2006 :: G08 Mathematics Building University of Adelaide :: University Prof Sudhir Paul
Data in the form of paired (pre-treatment, post-treatment) counts arise in the study of the effects of several treatments after accounting for possible covariate effects. An example of such a data set comes from a dental epidemiological study in Belo Horizonte (the Belo Horizonte caries prevention study) which evaluated various programmes for reducing caries. These data may also show more pairs of zeros than can be accounted for by a simpler model, such as a bivariate Poisson regression model. In such situations we propose to use a zero-inflated bivariate Poisson regression (ZIBPR) model for the paired (pre-treatment, post-treatment) count data. We develop an EM algorithm to obtain maximum likelihood estimates of the parameters of the ZIBPR model. Further, we obtain the exact Fisher information matrix of the maximum likelihood estimates and develop a procedure for testing treatment effects. The procedure to detect treatment effects based on the ZIBPR model is compared, in terms of size, by simulations, with an earlier procedure using a zero-inflated Poisson regression (ZIPR) model of the post-treatment count with the pre-treatment count treated as a covariate. The procedure based on the ZIBPR model holds its level most effectively. A further simulation study indicates good power properties of the procedure based on the ZIBPR model. We then compare our analysis of the decayed, missing and filled teeth (DMFT) index data from the caries prevention study, based on the ZIBPR model, with the analysis using a zero-inflated Poisson regression model in which the pre-treatment DMFT index is taken to be a covariate.
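The bivariate ZIBPR model itself is not specified in the abstract, but the EM idea it relies on can be illustrated on the simpler univariate zero-inflated Poisson with no covariates. This is a hedged sketch: the function name, starting values and simulation parameters below are invented for illustration, not taken from the talk.

```python
import numpy as np

def fit_zip_em(y, n_iter=200):
    """EM for a univariate zero-inflated Poisson: with probability pi an
    observation is a structural zero, otherwise it is Poisson(lam)."""
    y = np.asarray(y, dtype=float)
    pi, lam = 0.5, max(y.mean(), 0.1)              # crude starting values
    for _ in range(n_iter):
        # E-step: posterior probability that each observed zero is structural
        p_zero = pi / (pi + (1.0 - pi) * np.exp(-lam))
        z = np.where(y == 0, p_zero, 0.0)
        # M-step: update the mixing weight and the Poisson mean
        pi = z.mean()
        lam = (y * (1.0 - z)).sum() / (1.0 - z).sum()
    return pi, lam

rng = np.random.default_rng(0)
n = 5000
structural = rng.random(n) < 0.3                   # 30% excess zeros
counts = np.where(structural, 0, rng.poisson(2.0, size=n))
pi_hat, lam_hat = fit_zip_em(counts)
print(pi_hat, lam_hat)                             # near the true 0.3 and 2.0
```

The bivariate version in the talk additionally couples the two counts and adds regression terms, but the E-step/M-step alternation has the same shape.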

Good and Bad Vibes 15:10 Fri 23 Feb, 2007 :: G08 Mathematics Building University of Adelaide :: Prof. Maurice Dodson
Collapsing bridges and exploding rockets have been associated with vibrations in resonance with natural frequencies. As well, the stability of the solar system and the existence of solutions of Schrödinger's equation and the wave equation are problematic in the presence of resonances. Such resonances can be avoided, or at least mitigated, by using ideas from Diophantine approximation, a branch of number theory. Applications of Diophantine approximation to these problems will be given and will include a connection with LISA (Laser Interferometer Space Antenna), a space-based gravitational wave detector under construction.

Insights into the development of the enteric nervous system and Hirschsprung's disease 15:10 Fri 24 Aug, 2007 :: G08 Mathematics building University of Adelaide :: Assoc. Prof. Kerry Landman :: Department of Mathematics and Statistics, University of Melbourne
During the development of the enteric nervous system, neural crest (NC) cells must first migrate into and colonise the entire gut from stomach to anal end. The migratory precursor NC cells change type and differentiate into neurons and glia cells. These cells form the enteric nervous system, which gives rise to normal gut function and peristaltic contraction. Failure of the NC cells to invade the whole gut results in a lack of neurons in a length of the terminal intestine. This potentially fatal condition, marked by intractable constipation, is called Hirschsprung's Disease. The interplay between cell migration, cell proliferation and embryonic gut growth is important to the success of the NC cell colonisation process.
Multiscale models are needed in order to model the different spatiotemporal scales of the NC invasion. For example, the NC invasion wave moves into unoccupied regions of the gut with a wave speed of around 40 microns per hour. New time-lapse techniques have shown that there is a web-like network structure within the invasion wave. Furthermore, within this network, individual cell trajectories vary considerably.
We have developed a population-scale model for basic rules governing NC cell invasive behaviour, incorporating the important mechanisms. The model predictions were tested experimentally, and mathematical and experimental results agreed. The results provide an understanding of why many of the genes implicated in Hirschsprung's Disease influence NC population size. Our recently developed individual cell-based model also produces an invasion wave with a well-defined wave speed; in addition, individual cell trajectories within the invasion wave can be extracted. Further challenges in modelling the various scales of the developmental system will be discussed.
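The speaker's actual cell-based model is not given in the abstract. As a hedged sketch of the general idea, a toy exclusion process on a 1D lattice (all parameters invented for illustration) already produces an invasion front that advances at a roughly constant speed:

```python
import numpy as np

def simulate_invasion(length=400, steps=400, p_prolif=0.05, seed=0):
    """Toy 1D exclusion process: each step, every cell (in random order)
    picks a neighbouring site; if it is empty, the cell proliferates into
    it with probability p_prolif, otherwise it hops there."""
    rng = np.random.default_rng(seed)
    occ = np.zeros(length, dtype=bool)
    occ[:20] = True                            # colonise from the left end
    fronts = []
    for _ in range(steps):
        for i in rng.permutation(np.flatnonzero(occ)):
            j = i + rng.choice([-1, 1])        # candidate neighbouring site
            if 0 <= j < length and not occ[j]:
                if rng.random() < p_prolif:
                    occ[j] = True              # daughter cell fills the site
                else:
                    occ[i], occ[j] = False, True   # motility: hop
        fronts.append(np.flatnonzero(occ).max())
    return np.array(fronts)

fronts = simulate_invasion()
# A roughly linear front position vs. time indicates a well-defined speed.
speed = np.polyfit(np.arange(len(fronts)), fronts, 1)[0]
print(speed)
```

Individual trajectories can be recorded in the same loop, which is exactly the extra information the abstract says the cell-based model provides over the population-scale one.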

Statistical Critique of the Intergovernmental Panel on Climate Change's work on Climate Change 18:00 Wed 17 Oct, 2007 :: Union Hall University of Adelaide :: Mr Dennis Trewin
Climate change is one of the most important issues facing us today. Many governments have introduced or are developing appropriate policy interventions to (a) reduce the growth of greenhouse gas emissions in order to mitigate future climate change, or (b) adapt to future climate change.
This important work deserves a high-quality statistical data base, but there are statistical shortcomings in the work of the Intergovernmental Panel on Climate Change (IPCC). There has been very little involvement of qualified statisticians in the very important work of the IPCC, which appears to be scientifically meritorious in most other ways.
Mr Trewin will explain these shortcomings and outline his views on likely future climate change, taking into account the statistical deficiencies.
His conclusions suggest climate change is still an important issue that needs to be addressed but the range of likely outcomes is a lot lower than has been suggested by the IPCC.
This presentation will be based on an invited paper presented at the OECD World Forum.


Values of transcendental entire functions at algebraic points. 15:10 Fri 28 Mar, 2008 :: LG29 Napier Building University of Adelaide :: Prof. Eugene Poletsky :: Syracuse University, USA
Algebraic numbers are roots of polynomials with integer coefficients, so their set is countable. All other numbers are called transcendental. Although most numbers are transcendental, it was only in 1873 that Hermite proved that the base $e$ of natural logarithms is not algebraic. The proof was based on the fact that $e$ is the value at 1 of the exponential function $e^z$ which is entire and does not change under differentiation.
This achievement raised two questions: What entire functions take only transcendental values at algebraic points? Also, given an entire transcendental function $f$, describe, or at least find properties of, the set of algebraic numbers where the values of $f$ are also algebraic. The first question, developed by Siegel, Shidlovsky, and others, led to the notion of $E$-functions, which have controlled derivatives. Answering the second question, Polya and Gelfond obtained restrictions for entire functions that have integer values at integer points (Polya) or Gaussian integer values at Gaussian integer points (Gelfond). For more general sets of points only counterexamples were known.
Recently D. Coman and the speaker developed new tools for the second question, which give an answer, at least partially, for general entire functions and their values at general sets of algebraic points.
In this talk I will discuss old and new results in this direction. All relevant definitions will be provided and the talk will be accessible to postgraduates and honours students.

Global and Local stationary modelling in finance: Theory and empirical evidence 14:10 Thu 10 Apr, 2008 :: G04 Napier Building University of Adelaide :: Prof. Dominique Guégan :: Université Paris 1 Panthéon-Sorbonne
To model real data sets using second-order stochastic processes requires that the data verify the second-order stationarity condition. This stationarity condition concerns the unconditional moments of the process. It is in that context that most of the models developed since the 1960s have been studied; we refer to the ARMA processes (Brockwell and Davis, 1988), the ARCH, GARCH and EGARCH models (Engle, 1982, Bollerslev, 1986, Nelson, 1990), the SETAR process (Lim and Tong, 1980 and Tong, 1990), the bilinear model (Granger and Andersen, 1978, Guégan, 1994), the EXPAR model (Haggan and Ozaki, 1980), the long memory process (Granger and Joyeux, 1980, Hosking, 1981, Gray, Zhang and Woodward, 1989, Beran, 1994, Giraitis and Leipus, 1995, Guégan, 2000), and the switching process (Hamilton, 1988). For all these models, we get an invertible causal solution under specific conditions on the parameters; the forecast points and forecast intervals are then available.
Thus, the stationarity assumption is the basis for a general asymptotic theory for identification, estimation and forecasting. It guarantees that increasing the sample size leads to more and more information of the same kind, which is essential for an asymptotic theory to make sense.
Non-stationary modelling also has a long tradition in econometrics, based on the conditional moments of the data generating process. It appears mainly in the heteroscedastic and volatility models, like the GARCH and related models, and stochastic volatility processes (Ghysels, Harvey and Renault, 1997). This non-stationarity also appears in a different way with structural change models like the switching models (Hamilton, 1988), the stop-break model (Diebold and Inoue, 2001, Breidt and Hsu, 2002, Granger and Hyung, 2004) and the SETAR models, for instance. It can also be observed in linear models with time-varying coefficients (Nicholls and Quinn, 1982, Tsay, 1987).
Thus, using stationary unconditional moments suggests a global stationarity for the model, but using non-stationary unconditional moments, non-stationary conditional moments, or assuming the existence of states suggests that this global stationarity fails and that we only observe locally stationary behaviour.
The growing evidence of instability in the stochastic behaviour of stocks, exchange rates and some economic data sets such as growth rates, characterised by volatility or by jumps in the variance or in the levels of prices, forces us to question the assumption of global stationarity and its consequences for modelling, particularly forecasting. These remarks raise several questions.
1. What kinds of non-stationarity affect the major financial and economic data sets? How can we detect them?
2. Local and global stationarity: how are they defined?
3. What is the impact of evidence of non-stationarity on the statistics computed from globally non-stationary data sets?
4. How can we analyse data sets in the globally non-stationary framework? Does the asymptotic theory work in a non-stationary framework?
5. What kinds of models create local stationarity instead of global stationarity? How can we use them to develop a modelling and forecasting strategy?
These questions have begun to be discussed in some papers in the economic literature. For some of these questions the answers are known; for others, very few works exist. In this talk I will discuss all these problems and will propose two new strategies and models to solve them. Several interesting topics in empirical finance awaiting future research will also be discussed.
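The local-versus-global distinction above can be made concrete in a few lines: a process that is stationary within each regime but has a structural break in the mean is only locally stationary, and its global moments describe neither regime. The AR(1) parameters and break point below are invented purely for illustration.

```python
import numpy as np

def ar1(n, phi, mu, sigma, rng):
    """Simulate a stationary AR(1) process around mean mu."""
    x = np.empty(n)
    x[0] = mu
    for t in range(1, n):
        x[t] = mu + phi * (x[t-1] - mu) + sigma * rng.standard_normal()
    return x

rng = np.random.default_rng(1)
# Two locally stationary regimes joined by a structural break in the mean
x = np.concatenate([ar1(2000, 0.5, 0.0, 1.0, rng),
                    ar1(2000, 0.5, 3.0, 1.0, rng)])

first, second = x[:2000].mean(), x[2000:].mean()
print(first, second, x.mean())
```

The global sample mean sits near 1.5, between the two regime means of 0 and 3, which is precisely the sense in which statistics computed from globally non-stationary data can be misleading (question 3 above).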


The Mathematics of String Theory 15:10 Fri 2 May, 2008 :: LG29 Napier Building University of Adelaide :: Prof. Peter Bouwknegt :: Department of Mathematics, ANU
String Theory has had, and continues to have, a profound impact on
many areas of mathematics and vice versa. In this talk I want to
address some relatively recent developments. In particular I will
argue, following Witten and others, that D-brane charges take values
in the K-theory of spacetime, rather than in integral cohomology as
one might have expected. I will also explore the mathematical
consequences of a particular symmetry, called T-duality, in this context.
I will give an intuitive introduction to D-branes and K-theory.
No prior knowledge of String Theory, D-branes or K-theory
is required. 

Puzzle-based learning: Introduction to mathematics 15:10 Fri 23 May, 2008 :: LG29 Napier Building University of Adelaide :: Prof. Zbigniew Michalewicz :: School of Computer Science, University of Adelaide
The talk addresses a gap in the educational curriculum for first-year students by proposing a new course that aims at getting students to think about how to frame and solve unstructured problems. The idea is to increase students' mathematical awareness and problem-solving skills by discussing a variety of puzzles. The talk makes an argument that this approach, called Puzzle-Based Learning, is very beneficial for introducing mathematics, critical thinking, and problem-solving skills.
The new course has been approved by the University of Adelaide for the Faculty of Engineering, Computer Science, and Mathematics. Many other universities are in the process of introducing such a course. The course will be offered in two versions: (a) a full-semester course, and (b) a unit within a general course (e.g. Introduction to Engineering). All teaching materials (PowerPoint slides, assignments, etc.) are being prepared. A new textbook (Puzzle-Based Learning: Introduction to Critical Thinking, Mathematics, and Problem Solving) will be available from June 2008. The talk provides additional information on this development.
For further information see http://www.PuzzleBasedlearning.edu.au/ 

Computational Methods for Phase Response Analysis of Circadian Clocks 15:10 Fri 18 Jul, 2008 :: G04 Napier Building University of Adelaide. :: Prof. Linda Petzold :: Dept. of Mechanical and Environmental Engineering, University of California, Santa Barbara
Circadian clocks govern daily behaviors of organisms in all kingdoms of life. In mammals, the master clock resides in the suprachiasmatic nucleus (SCN) of the hypothalamus. It is composed of thousands of neurons, each of which contains a sloppy oscillator: a molecular clock governed by a transcriptional feedback network. Via intercellular signaling, the cell population synchronizes spontaneously, forming a coherent oscillation. This multi-oscillator is then entrained to its environment by the daily light/dark cycle.
At both the cellular and tissue levels, the most important feature of the clock is its ability not simply to keep time, but to adjust its time, or phase, to signals. We present the parametric impulse phase response curve (pIPRC), an analytical analog to the phase response curve (PRC) used experimentally. We use the pIPRC to understand both the consequences of intercellular signaling and the light entrainment process. Further, we determine which model components determine the phase response behavior of a single oscillator by using a novel model reduction technique. We reduce the number of model components while preserving the pIPRC and then incorporate the resultant model into a coupled SCN tissue model. Emergent properties, including the ability of the population to synchronize spontaneously, are preserved in the reduction. Finally, we present some mathematical tools for the study of synchronization in a network of coupled, noisy oscillators.
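The SCN model itself is not reproduced in the abstract. As a hedged illustration of the final point, synchronization in a network of coupled, noisy oscillators, a minimal mean-field Kuramoto sketch (all parameters invented) shows coupling driving a heterogeneous population from incoherence to a coherent oscillation:

```python
import numpy as np

def kuramoto(n=200, K=2.0, noise=0.1, dt=0.01, steps=5000, seed=2):
    """Mean-field coupled noisy phase oscillators. Returns the order
    parameter r (r ~ 0: incoherent, r ~ 1: synchronized)."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(1.0, 0.1, n)          # heterogeneous natural frequencies
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    for _ in range(steps):
        z = np.exp(1j * theta).mean()        # complex order parameter
        r, psi = np.abs(z), np.angle(z)
        theta += dt * (omega + K * r * np.sin(psi - theta))
        theta += np.sqrt(dt) * noise * rng.standard_normal(n)  # phase noise
    return np.abs(np.exp(1j * theta).mean())

r_coupled = kuramoto(K=2.0)      # strong coupling: population locks together
r_uncoupled = kuramoto(K=0.0)    # no coupling: phases stay spread out
print(r_coupled, r_uncoupled)
```

The pIPRC machinery in the talk goes further, quantifying how such an oscillator's phase shifts in response to a signal, but this captures the spontaneous-synchronization phenomenon the abstract describes.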


Assisted reproduction technology: how maths can contribute 13:10 Wed 22 Oct, 2008 :: Napier 210 :: Dr Yvonne Stokes
Most people will have heard of IVF (in vitro fertilisation), a
technology for helping infertile couples have a baby. Although there are
many IVF babies, many will also know that the success rate is still low
for the cost and inconvenience involved. The fact that some women
cannot make use of IVF because of life-threatening consequences is less
well known but motivates research into other technologies, including
IVM (in vitro maturation).
What has all this to do with maths? Come along and find out how
mathematical modelling is contributing to understanding and
improvement in this important and interesting field.


Oceanographic Research at the South Australian Research and Development Institute: opportunities for collaborative research 15:10 Fri 21 Nov, 2008 :: Napier G04 :: Associate Prof John Middleton :: South Australian Research and Development Institute
Increasing threats to S.A.'s fisheries and marine environment have underlined the growing need for soundly based research into the ocean circulation and ecosystems (phyto/zooplankton) of the shelf and gulfs. With the support of Marine Innovation SA, the Oceanography Program has, within two years, grown to include 6 FTEs and a budget of over $4.8M. The program currently leads two major research projects, both of which involve numerical and applied mathematical modelling of oceanic flow and ecosystems as well as statistical techniques for the analysis of data. The first is the implementation of the Southern Australian Integrated Marine Observing System (SAIMOS), which is providing data to understand the dynamics of shelf boundary currents, monitor for climate change and understand the phyto/zooplankton ecosystems that underpin SA's wild fisheries and aquaculture. SAIMOS involves the use of ship-based sampling, the deployment of underwater marine moorings, underwater gliders, HF Ocean RADAR, acoustic tracking of tagged fish and Autonomous Underwater Vehicles.
The second major project involves measuring and modelling the ocean circulation and biological systems within Spencer Gulf and the impact on prawn larval dispersal and on the sustainability of existing and proposed aquaculture sites. The discussion will focus on opportunities for collaborative research with both faculty and students in this exciting growth area of S.A. science.


Key Predistribution in GridBased Wireless Sensor Networks 15:10 Fri 12 Dec, 2008 :: Napier G03 :: Dr Maura Paterson :: Information Security Group at Royal Holloway, University of London.
Wireless sensors are small, battery-powered devices that are deployed to
measure quantities such as temperature within a given region, then form
a wireless network to transmit and process the data they collect.
We discuss the problem of distributing symmetric cryptographic keys to
the nodes of a wireless sensor network in the case where the sensors are
arranged in a square or hexagonal grid, and we propose a key
predistribution scheme for such networks that is based on Costas arrays.
We introduce more general structures known as distinct-difference
configurations, and show that they provide a flexible choice of
parameters in our scheme, leading to more efficient performance than
that achieved by prior schemes from the literature. 
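The abstract does not spell out the key predistribution scheme itself, but the combinatorial property it exploits is easy to state and check: a Costas array is a permutation array in which all displacement vectors between pairs of dots are distinct (distinct-difference configurations generalise this). A small checker, with an illustrative 4x4 example:

```python
from itertools import combinations

def is_costas(perm):
    """perm[i] = row of the dot in column i. Costas property: all vectors
    (j - i, perm[j] - perm[i]) between pairs of dots are distinct."""
    vectors = set()
    for i, j in combinations(range(len(perm)), 2):
        v = (j - i, perm[j] - perm[i])
        if v in vectors:
            return False
        vectors.add(v)
    return True

# A known 4x4 Costas array, and a permutation that is not one
print(is_costas([2, 1, 3, 0]))   # True
print(is_costas([0, 1, 2, 3]))   # False: every step repeats the vector (1, 1)
```

In the key-predistribution setting, the distinctness of these vectors is what limits how often any two sensors' key sets can collide across the grid; the efficiency claims in the talk rest on the more general configurations.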

Nonlinear diffusion-driven flow in a stratified viscous fluid 15:00 Fri 26 Jun, 2009 :: Macbeth Lecture Theatre :: Associate Prof Michael Page :: Monash University
In 1970, two independent studies (by Wunsch and Phillips) of the behaviour of a linearly density-stratified viscous fluid in a closed container demonstrated that a slow flow can be generated simply because the container has a sloping boundary surface. This remarkable motion is generated as a result of the curvature of the lines of constant density near any sloping surface, which in turn enables a zero normal-flux condition on the density to be satisfied along that boundary. When the Rayleigh number is large (or equivalently Wunsch's parameter $R$ is small) this motion is concentrated in the near vicinity of the sloping surface, in a thin `buoyancy layer' that has many similarities to an Ekman layer in a rotating fluid.
A number of studies have since considered the consequences of this type of `diffusively-driven' flow in a semi-infinite domain, including in the deep ocean and with turbulent effects included. More recently, Page & Johnson (2008) described a steady linear theory for the broader-scale mass recirculation in a closed container and demonstrated that, unlike in previous studies, it is possible for the buoyancy layer to entrain fluid from that recirculation. That work has since been extended (Page & Johnson, 2009) to the nonlinear regime of the problem, and some of the similarities to and differences from the linear case will be described in this talk. Simple and elegant analytical solutions in the limit as $R \to 0$ still exist in some situations, and they will be compared with numerical simulations in a tilted square container at small values of $R$. Further work on both the unsteady flow properties and the flow for other geometrical configurations will also be described.

Predicting turbulence 12:10 Wed 12 Aug, 2009 :: Napier 210 :: Dr Trent Mattner :: University of Adelaide
Turbulence is characterised by three-dimensional unsteady fluid motion over a wide range of spatial and temporal scales. It is important in many problems of technological and scientific interest, such as drag reduction, energy production and climate prediction. In this talk, I will explain why turbulent flows are difficult to predict and describe a modern mathematical model of turbulence based on a random collection of fluid vortices.


Statistical analysis for harmonized development of systemic organs in human fetuses 11:00 Thu 17 Sep, 2009 :: School Board Room :: Prof Kanta Naito :: Shimane University
The growth processes of human babies have been studied extensively, but many aspects of the development of the human fetus remain unclear. The aim of this research is to investigate the developing process of the systemic organs of human fetuses based on a data set of measurements of fetuses' bodies and organs. Specifically, this talk is concerned with giving a mathematical understanding of the harmonized development of the organs of human fetuses. A method to evaluate such harmony is proposed, based on the maximal dilatation that appears in the theory of quasiconformal mappings.

Eigenanalysis of fluid-loaded compliant panels 15:10 Wed 9 Dec, 2009 :: Santos Lecture Theatre :: Prof Tony Lucey :: Curtin University of Technology
This presentation concerns the fluid-structure interaction (FSI) that occurs between a fluid flow and an arbitrarily deforming flexible boundary, considered to be a flexible panel or a compliant coating that comprises the wetted surface of a marine vehicle. We develop and deploy an approach that is a hybrid of computational and theoretical techniques. The system studied is two-dimensional and linearised disturbances are assumed. Of particular novelty in the present work is the ability of our methods to extract a full set of fluid-structure eigenmodes for systems that have strong spatial inhomogeneity in the structure of the flexible wall.
We first present the approach and some results for the system in which an ideal, zero-pressure-gradient flow interacts with a flexible plate held at both its ends. We use a combination of boundary-element and finite-difference methods to express the FSI system as a single matrix equation in the interfacial variable. This is then couched in state-space form and standard methods used to extract the system eigenvalues. It is then shown how the incorporation of spatial inhomogeneity in the stiffness of the plate can be either stabilising or destabilising. We also show that adding a further restraint within the streamwise extent of a homogeneous panel can trigger an additional type of hydroelastic instability at low flow speeds. The mechanism for the fluid-to-structure energy transfer that underpins this instability can be explained in terms of the pressure-signal phase relative to that of the wall motion and the effect on this relationship of the added wall restraint.
We then show how the ideal-flow approach can be conceptually extended to include boundary-layer effects. The flow field is now modelled by the continuity equation and the linearised perturbation momentum equation written in velocity-velocity form. The near-wall flow field is spatially discretised into rectangular elements on an Eulerian grid and a variant of the discrete-vortex method is applied. The entire fluid-structure system can again be assembled as a linear system for a single set of unknowns (the flow-field vorticity and the wall displacements) that admits the extraction of eigenvalues. We then show how stability diagrams for the fully coupled finite flow-structure system can be assembled, in doing so identifying classes of wall-based or fluid-based and spatio-temporal wave behaviour.


Integrable systems: noncommutative versus commutative 14:10 Thu 4 Mar, 2010 :: School Board Room :: Dr Cornelia Schiebold :: Mid Sweden University
After a general introduction to integrable systems, we will explain an
approach to their solution theory, which is based on Banach space theory. The
main point is first to shift attention to noncommutative integrable systems and
then to extract information about the original setting via projection techniques.
The resulting solution formulas turn out to be particularly well-suited to the
qualitative study of certain solution classes. We will show how one can obtain
a complete asymptotic description of the so-called multiple pole solutions, a
problem that was only treated for special cases before. 

The fluid mechanics of gels used in tissue engineering 15:10 Fri 9 Apr, 2010 :: Santos Lecture Theatre :: Dr Edward Green :: University of Western Australia
Tissue engineering could be called 'the science of spare parts'.
Although currently in its infancy, its long-term aim is to grow
functional tissues and organs in vitro to replace those which have
become defective through age, trauma or disease. Recent experiments
have shown that mechanical interactions between cells and the materials
in which they are grown have an important influence on tissue
architecture, but in order to understand these effects, we first need to
understand the mechanics of the gels themselves.
Many biological gels (e.g. collagen) used in tissue engineering have a
fibrous microstructure which affects the way forces are transmitted
through the material, and which in turn affects cell migration and other
behaviours. I will present a simple continuum model of gel mechanics,
based on treating the gel as a transversely isotropic viscous material.
Two canonical problems are considered involving thin two-dimensional
films: extensional flow, and squeezing flow of the fluid between two
rigid plates. Neglecting inertia, gravity and surface tension, in each
regime we can exploit the thin geometry to obtain a leading-order
problem which is sufficiently tractable to allow the use of analytical
methods. I discuss how these results could be exploited practically to
determine the mechanical properties of real gels. If time permits, I
will also talk about work currently in progress which explores the
interaction between gel mechanics and cell behaviour. 

Loop groups and characteristic classes 13:10 Fri 23 Apr, 2010 :: School Board Room :: Dr Raymond Vozzo :: University of Adelaide
Suppose $G$ is a compact Lie group, $LG$ its (free) loop group and $\Omega G \subseteq LG$ its based loop group. Let $P \to M$ be a principal bundle with structure group one of these loop groups. In general, differential form representatives of characteristic classes for principal bundles can be easily obtained using the Chern-Weil homomorphism; however, for infinite-dimensional bundles such as $P$ this runs into analytical problems and classes are more difficult to construct. In this talk I will explain some new results on characteristic classes for loop group bundles which demonstrate how to construct certain classes (which we call string classes) for such bundles. These are obtained by making heavy use of a certain $G$-bundle associated to any loop group bundle (which allows us to avoid the problems of dealing with infinite-dimensional bundles). We shall see that the free loop group case naturally involves equivariant cohomology.

Mathematical epidemiology with a focus on households 15:10 Fri 23 Apr, 2010 :: Napier G04 :: Dr Joshua Ross :: University of Adelaide
Mathematical models are now used routinely to inform national and global policymakers on issues that threaten human health or which have an adverse impact on the economy. In the first part of this talk I will provide an overview of mathematical epidemiology, starting with the classical deterministic model and leading to some of the current challenges. I will then present some of my recently published work, which provides computationally efficient methods for studying a mathematical model incorporating household structure. We will conclude by briefly discussing some work-in-progress which utilises these methods to address the issues of inference, and mixing pattern and contact structure, for emerging infections.
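The classical deterministic model the overview starts from is the SIR system of Kermack and McKendrick. A minimal forward-Euler sketch (parameter values invented for illustration) reproduces its basic behaviour: for R0 = beta/gamma > 1 the epidemic takes off, peaks, and burns out leaving a fraction of the population never infected.

```python
import numpy as np

def sir(beta=0.3, gamma=0.1, i0=1e-3, days=300, dt=0.1):
    """Classic deterministic SIR model in proportions, forward-Euler."""
    s, i, r = 1.0 - i0, i0, 0.0
    traj = []
    for _ in range(int(days / dt)):
        ds = -beta * s * i               # susceptibles becoming infected
        di = beta * s * i - gamma * i    # infection minus recovery
        dr = gamma * i                   # recoveries
        s, i, r = s + dt * ds, i + dt * di, r + dt * dr
        traj.append((s, i, r))
    return np.array(traj)

traj = sir()                             # R0 = 0.3 / 0.1 = 3
peak_prevalence = traj[:, 1].max()
final_size = traj[-1, 2]
print(peak_prevalence, final_size)
```

The household-structured models in the talk replace this homogeneous-mixing assumption with within- and between-household transmission, which is where the computational challenges arise.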

Estimation of sparse Bayesian networks using a score-based approach 15:10 Fri 30 Apr, 2010 :: School Board Room :: Dr Jessica Kasza :: University of Copenhagen
The estimation of Bayesian networks given high-dimensional data sets, with more variables than there are observations, has been the focus of much recent research. These structures provide a flexible framework for the representation of the conditional independence relationships of a set of variables, and can be particularly useful in the estimation of genetic regulatory networks given gene expression data.
In this talk, I will discuss some new research on learning sparse networks, that is, networks with many conditional independence restrictions, using a score-based approach. In the case of genetic regulatory networks, such sparsity reflects the view that each gene is regulated by relatively few other genes. The presented approach allows prior information about the overall sparsity of the underlying structure to be included in the analysis, as well as the incorporation of prior knowledge about the connectivity of individual nodes within the network.


Understanding convergence of meshless methods: Vortex methods and smoothed particle hydrodynamics 15:10 Fri 14 May, 2010 :: Santos Lecture Theatre :: A/Prof Lou Rossi :: University of Delaware
Meshless methods such as vortex methods (VMs) and smoothed particle
hydrodynamics (SPH) schemes offer many advantages in fluid flow computations.
Particle-based computations naturally adapt to complex flow geometries
and so provide a high degree of computational efficiency. Also, particle-based
methods avoid CFL conditions because flow quantities are
integrated along characteristics. There are many approaches to
improving numerical methods, but one of the most effective routes
is quantifying the error through the direct estimate of residual
quantities. Understanding the residual for particle schemes requires
a different approach than for mesh-based schemes, but the rewards are
significant. In this seminar, I will outline a general approach to
understanding convergence that has been effective in creating high
spatial accuracy vortex methods, and then I will discuss some recent
investigations in the accuracy of diffusion operators used in SPH
computations. Finally, I will provide some sample Navier-Stokes
computations of high Reynolds number flows using BlobFlow, an open
source implementation of the high precision vortex method. 

Whole genome analysis of repetitive DNA 15:10 Fri 21 May, 2010 :: Napier 209 :: Prof David Adelson :: University of Adelaide
The interspersed repeat content of mammalian genomes has been best characterized in human, mouse and cow. We carried out de novo identification of repeated elements in the equine genome and identified previously unknown elements present at low copy number. The equine genome contains typical eutherian mammal repeats. We analysed both interspersed and simple sequence repeats (SSR) genome-wide, finding that some repeat classes are spatially correlated with each other as well as with G+C content and gene density. Based on these spatial correlations, we have confirmed recently described ancestral vs. clade-specific genome territories defined by repeat content. Territories enriched for ancestral repeats tended to be contiguous domains. To determine if these territories were evolutionarily conserved, we compared these results with a similar analysis of the human genome, and observed similar ancestral-repeat-enriched domains. These results indicate that ancestral, evolutionarily conserved mammalian genome territories can be identified on the basis of repeat content alone. Interspersed repeats of different ages appear to be analogous to geologic strata, allowing identification of ancient vs. newly remodelled regions of mammalian genomes.

The mathematics of theoretical inference in cognitive psychology 15:10 Fri 11 Jun, 2010 :: Napier LG24 :: Prof John Dunn :: University of Adelaide
The aim of psychology in general, and of cognitive psychology in particular, is to construct theoretical accounts of mental processes based on observed changes in performance on one or more cognitive tasks. The fundamental problem faced by the researcher is that these mental processes are not directly observable but must be inferred from changes in performance between different experimental conditions. This inference is further complicated by the fact that performance measures may only be monotonically related to the underlying psychological constructs. State-trace analysis provides an approach to this problem which has gained increasing interest in recent years. In this talk, I explain state-trace analysis and discuss the set of mathematical issues that flow from it. Principal among these are the challenges of statistical inference and an unexpected connection to the mathematics of oriented matroids. 

A polyhedral model for boron nitride nanotubes 15:10 Fri 3 Sep, 2010 :: Napier G04 :: Dr Barry Cox :: University of Adelaide
The conventional rolled-up model of nanotubes does not apply to tubes of very small radii, for which curvature effects become significant. In this talk an existing geometric model for carbon nanotubes proposed by the authors, which accommodates this deficiency and which is based on the exact polyhedral cylindrical structure, is extended to a nanotube structure involving two species of atoms in equal proportion, in particular boron nitride nanotubes. This generalisation allows the principal features to be included as the fundamental assumptions of the model, such as equal bond lengths but distinct bond angles and radii between the two species. The polyhedral model is based on five simple geometric assumptions: (i) all bonds are of equal length, (ii) all bond angles for the boron atoms are equal, (iii) all boron atoms lie at an equal distance from the nanotube axis, (iv) all nitrogen atoms lie at an equal distance from the nanotube axis, and (v) there exists a fixed ratio of pyramidal height H between the boron species and the corresponding height in a symmetric single-species nanotube.
Working from these postulates, expressions are derived for the various structural parameters, such as radii and bond angles for the two species, for specific values of the chiral vector numbers (n,m). The new model incorporates an additional constant of proportionality H, which we assume applies to all nanotubes comprising the same elements and is such that H = 1 for a single-species nanotube. Comparison with `ab initio' studies suggests that this assumption is entirely reasonable, and in particular we determine the value H = 0.56 ± 0.04 for boron nitride, based on computational results in the literature.
This talk relates to work which is a couple of years old; given time at the end, we will discuss some newer results in geometric models developed with our former student Richard Lee (now also at the University of Adelaide as a postdoc) and some work-in-progress on carbon nanocones.
Note: pyramidal height is our own terminology and will be explained in the talk.
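As a point of comparison for the polyhedral model, the radius in the conventional rolled-up model is easy to compute from the chiral numbers. The sketch below uses the standard rolled-up formula r = (√3 b / 2π)√(n² + nm + m²); the bond length of 1.45 Å is an assumed illustrative value, not a result from the talk.

```python
import math

# Conventional rolled-up model radius for an (n, m) nanotube:
# the chiral vector C = n*a1 + m*a2 on the flat hexagonal sheet has
# length |C| = sqrt(3) * b * sqrt(n^2 + n*m + m^2), and r = |C| / (2*pi).
def rolled_up_radius(n, m, bond_length=1.45):  # bond length in Angstroms (assumed)
    circumference = math.sqrt(3) * bond_length * math.sqrt(n**2 + n * m + m**2)
    return circumference / (2 * math.pi)

# Small chiral numbers give radii of only a few bond lengths, which is
# exactly where the curvature corrections of a polyhedral model matter.
r_small = rolled_up_radius(3, 3)
r_large = rolled_up_radius(10, 10)
```

For the narrow (3,3) tube the radius is barely two bond lengths, so treating the surface as a smoothly rolled sheet is a poor approximation there.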


Explicit numerical simulation of multiphase and confined flows 15:10 Fri 8 Oct, 2010 :: Napier G04 :: Prof Mark Biggs :: University of Adelaide
Simulations in which the system of interest is essentially mimicked are termed explicit numerical simulations (ENS). Direct numerical simulation (DNS) of turbulence is a well known and longstanding example of ENS. Such simulations provide a basis for elucidating fundamentals in a way that is impossible experimentally, and for formulating and parameterizing engineering models with reduced experimentation. In this presentation, I will first outline the concept of ENS. I will then report a number of ENS-based studies of various multiphase fluid systems and flows in porous media. In the first of these studies, which is concerned with flow of suspensions in porous media accompanied by deposition, ENS is used to demonstrate the significant inadequacies of the classical trajectory models typically used for the study of such problems. In the second study, which is concerned with elucidating the change in binary droplet collision behaviour with capillary number (Ca) and Reynolds number (Re), a range of collision scenarios are revealed as a function of Ca and Re, and it appears that the boundaries between these scenarios in the Ca-Re space are not distinct but, rather, smeared. In the final study, it is shown that ENS can be used to predict ab initio the hydrodynamic properties of single phase flow through porous media from the Darcy to the turbulent regimes. 

Principal Component Analysis Revisited 15:10 Fri 15 Oct, 2010 :: Napier G04 :: Assoc. Prof Inge Koch :: University of Adelaide
Since the beginning of the 20th century, Principal Component Analysis (PCA) has been an important tool in the analysis of multivariate data. The principal components summarise data in fewer than the original number of variables without losing essential information, and thus allow a split of the data into signal and noise components. PCA is a linear method, based on elegant mathematical theory.
The increasing complexity of data, together with the emergence of fast computers in the later part of the 20th century, has led to a renaissance of PCA. The growing numbers of variables (in particular, high-dimensional low sample size problems), non-Gaussian data, and functional data (where the data are curves) are posing exciting challenges to statisticians, and have resulted in new research which extends the classical theory.
I begin with the classical PCA methodology and illustrate the challenges presented by the complex data that we are now able to collect. The main part of the talk focuses on extensions of PCA: the duality of PCA and the Principal Coordinates of Multidimensional Scaling, Sparse PCA, and consistency results relating to principal components, as the dimension grows. We will also look at newer developments such as Principal Component Regression and Supervised PCA, nonlinear PCA and Functional PCA.
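The classical methodology can be illustrated in a few lines: PCA of a toy data set via the singular value decomposition of the centred data matrix. The data set below (a low-rank signal plus isotropic noise) is invented for illustration and is not from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 observations on 5 variables in which only two
# directions carry signal; the rest is isotropic noise.
n = 200
scores = rng.normal(size=(n, 2)) * np.array([3.0, 1.5])
loadings = rng.normal(size=(2, 5))
X = scores @ loadings + 0.1 * rng.normal(size=(n, 5))

# Classical PCA: centre the data, then take the SVD; the squared
# singular values give the variance explained by each component.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

# The first two components capture nearly all the variance,
# splitting the data into signal and noise components.
signal_share = explained[:2].sum()
```

The rows of `Vt` are the principal component directions; projecting `Xc` onto the first two rows recovers the signal subspace while discarding the noise.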


Queues with skill-based routing under FCFS–ALIS regime 15:10 Fri 11 Feb, 2011 :: B17 Ingkarni Wardli :: Prof Gideon Weiss :: The University of Haifa, Israel
We consider a system where jobs of several types are served by servers
of several types, and a bipartite graph between server types and job types
describes feasible assignments. This is a common situation in manufacturing,
call centers with skill-based routing, matching of parent-child in adoption or
matching in kidney transplants, etc. We consider the case of the first come first
served policy: jobs are assigned to the first available feasible server in
order of their arrivals. We consider two types of policies for assigning
customers to idle servers: a random assignment, and assignment to the longest
idle server (ALIS). We survey some results for four different situations:
- For a loss system we find conditions for reversibility and insensitivity.
- For a manufacturing type system, in which there is enough capacity to serve
all jobs, we discuss a product form solution and waiting times.
- For an infinite matching model, in which an infinite sequence of customers of
IID types and an infinite sequence of servers of IID types are matched
according to first come first served, we obtain a product form stationary
distribution for this system, which we use to calculate matching rates.
- For a call center model with overload and abandonments we make some plausible
observations.
This talk surveys joint work with Ivo Adan, Rene Caldentey, Cor Hurkens, Ed
Kaplan and Damon Wischik, as well as work by Jeremy Visschers, Rishy Talreja and
Ward Whitt.
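The infinite matching model mentioned above can be made concrete with a small simulation sketch: two i.i.d. sequences, one of jobs and one of servers, are matched first come first served, with each server in arrival order taking the earliest still-unmatched job it can serve. The compatibility graph and type probabilities below are invented for illustration, not taken from the talk.

```python
import random

random.seed(1)

# Two server types with a bipartite compatibility graph over three job types.
compatible = {"s1": {"a", "b"}, "s2": {"b", "c"}}

n = 2000
jobs = [random.choice("abc") for _ in range(n)]
servers = [random.choice(["s1", "s2"]) for _ in range(n)]

# FCFS matching of the two sequences: each server, in arrival order,
# takes the earliest still-unmatched job that it is able to serve.
unmatched = list(range(n))
matches = {}
for srv in servers:
    for idx in unmatched:
        if jobs[idx] in compatible[srv]:
            matches[idx] = srv
            unmatched.remove(idx)
            break

match_rate = len(matches) / n
```

Counting how often each (job type, server type) pair occurs in `matches` gives empirical matching rates, the quantity the product form stationary distribution is used to compute analytically.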


Nanotechnology: The mathematics of gas storage in metal-organic frameworks. 12:10 Mon 28 Mar, 2011 :: 5.57 Ingkarni Wardli :: Wei Xian Lim :: University of Adelaide
Have you thought about what sort of car you would be driving in the future? Would it be a hybrid, solar, hydrogen or electric car? I would like to be driving a hydrogen car because my field of research may aid in their development! In my presentation I will introduce you to the world of metal-organic frameworks, which are an exciting new class of materials with great potential in applications such as hydrogen gas storage. I will also discuss the mathematical model that I am using to model the performance of metal-organic frameworks based on beryllium. 

Classification for highdimensional data 15:10 Fri 1 Apr, 2011 :: Conference Room Level 7 Ingkarni Wardli :: Associate Prof Inge Koch :: The University of Adelaide
For two-class classification problems Fisher's discriminant rule performs
well in many scenarios provided the dimension, d, is much smaller than the sample
size n. As the dimension increases, Fisher's rule may no longer be
adequate, and can perform as poorly as random guessing.
In this talk we look at new ways of overcoming this poor performance for
high-dimensional data by suitably modifying Fisher's rule, and in particular
we describe the 'Features Annealed Independence Rule' (FAIR) of Fan and Fan
(2008) and a rule based on canonical correlation analysis. I describe some
theoretical developments, and also show analyses of data which illustrate the
performance of these modified rules. 
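The degradation of Fisher's rule as the dimension grows can be seen in a small simulation. This is a hedged sketch with invented Gaussian data; using a pseudo-inverse in place of the singular pooled covariance once d exceeds n is one common convention, not necessarily the one in the talk.

```python
import numpy as np

rng = np.random.default_rng(2)

def fisher_accuracy(d, n_per_class=25, delta=2.0, n_test=400):
    # Two Gaussian classes differing by delta in the first coordinate only.
    mu = np.zeros(d)
    mu[0] = delta
    X0 = rng.normal(size=(n_per_class, d))
    X1 = rng.normal(size=(n_per_class, d)) + mu
    # Fisher's rule with a pooled covariance estimate; the pseudo-inverse
    # stands in for the inverse once d exceeds the sample size.
    centred = np.vstack([X0 - X0.mean(0), X1 - X1.mean(0)])
    S = np.cov(centred.T)
    w = np.linalg.pinv(S) @ (X1.mean(0) - X0.mean(0))
    mid = (X0.mean(0) + X1.mean(0)) / 2
    # Classify fresh test data by the sign of the projection onto w.
    T0 = rng.normal(size=(n_test, d))
    T1 = rng.normal(size=(n_test, d)) + mu
    correct = np.sum((T0 - mid) @ w < 0) + np.sum((T1 - mid) @ w > 0)
    return correct / (2 * n_test)

acc_low = fisher_accuracy(d=2)     # d << n: close to the Bayes rate
acc_high = fisher_accuracy(d=500)  # d >> n: drifts towards random guessing
```

With only 50 training points, at d = 500 the estimated direction w is dominated by noise coordinates, so accuracy falls well below the low-dimensional case.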

Algebraic hypersurfaces arising from Gorenstein algebras 15:10 Fri 8 Apr, 2011 :: 7.15 Ingkarni Wardli :: Associate Prof Alexander Isaev :: Australian National University
To every Gorenstein algebra of finite dimension greater than 1 over a field of characteristic zero, and a projection on its maximal ideal with range equal to the annihilator of the ideal, one can associate a certain algebraic hypersurface lying in the ideal. Such hypersurfaces possess remarkable properties. They can be used, for instance, to help decide whether two given Gorenstein algebras are isomorphic, which for the case of complex numbers leads to interesting consequences in singularity theory. Also, for the case of real numbers such hypersurfaces naturally arise in CR geometry. In my talk I will discuss these hypersurfaces and some of their applications. 

Comparison of Spectral and Wavelet Estimation of the Dynamic Linear System of a Wave Energy Device 12:10 Mon 2 May, 2011 :: 5.57 Ingkarni Wardli :: Mohd Aftar :: University of Adelaide
Renewable energy is one of the main issues of our time. The implications of fossil and nuclear energy, along with their limited supply, have prompted researchers and industries to seek other sources of renewable energy, for example hydro, wind and wave energy. In this seminar, I will talk about spectral and wavelet estimation of a linear dynamical system of motion for a heaving buoy wave energy device. The spectral estimates were based on the Fourier transform, while the wavelet estimate was based on the wavelet transform. Comparisons of two spectral estimates with a wavelet estimate of the amplitude response operator (ARO) for the dynamical system of the wave energy device show that the wavelet estimate of the ARO is much better for data both with and without noise. 

Statistical challenges in molecular phylogenetics 15:10 Fri 20 May, 2011 :: Mawson Lab G19 lecture theatre :: Dr Barbara Holland :: University of Tasmania
This talk will give an introduction to the ways that mathematics and statistics get used in the inference of evolutionary (phylogenetic) trees. Taking a model-based approach to estimating the relationships between species has proven enormously effective; however, some tricky statistical challenges remain. The increasingly plentiful amount of DNA sequence data is a boon, but it is also throwing a spotlight on some of the shortcomings of current best practice, particularly in how we (1) assess the reliability of our phylogenetic estimates, and (2) choose appropriate models. This talk will aim to give a general introduction to this area of research and will also highlight some results from two of my recent PhD students. 

Inference and optimal design for percolation and general random graph models (Part I) 09:30 Wed 8 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge
The problem of optimal arrangement of nodes of a random weighted graph
is discussed in this workshop. The nodes of the graphs under study are fixed, but
their edges are random and established according to a so-called
edge-probability function. This function is assumed to depend on the weights
attributed to the pairs of graph nodes (or distances between them) and a
statistical parameter. It is the purpose of experimentation to make inference on
the statistical parameter and thus to extract as much information about it as
possible. We also distinguish between two different experimentation scenarios:
progressive and instructive designs.
We adopt a utility-based Bayesian framework to tackle the optimal design problem
for random graphs of this kind. Simulation-based optimisation methods, mainly
Monte Carlo and Markov chain Monte Carlo, are used to obtain the solution. We
study the optimal design problem for inference based on partial observations of
random graphs by employing a data augmentation technique. We prove that
infinitely growing or diminishing node configurations asymptotically represent
the worst node arrangements. We also obtain the exact solution to the optimal
design problem for proximity (geometric) graphs and a numerical solution for
graphs with threshold edge-probability functions.
We consider inference and optimal design problems for finite clusters from bond
percolation on the integer lattice $\mathbb{Z}^d$ and derive a range of both
numerical and analytical results for these graphs. We introduce inner-outer
plots by deleting some of the lattice nodes and show that the "mostly populated"
designs are not necessarily optimal in the case of incomplete observations under
both progressive and instructive design scenarios. Some of the obtained results
may generalise to other lattices. 
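The role of the edge-probability function can be illustrated with a tiny simulation. The exponential-decay form and the node positions below are illustrative assumptions, not the workshop's actual model: edges form independently with probability decaying in distance, governed by a statistical parameter theta.

```python
import math
import random

random.seed(3)

# Nodes at fixed positions; an edge between a pair forms independently
# with probability exp(-theta * distance), an illustrative
# edge-probability function governed by the parameter theta.
def sample_graph(positions, theta):
    edges = []
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            d = abs(positions[i] - positions[j])
            if random.random() < math.exp(-theta * d):
                edges.append((i, j))
    return edges

positions = [0.0, 0.5, 1.0, 2.0, 4.0]
sparse = sample_graph(positions, theta=5.0)    # strong decay: few edges
dense = sample_graph(positions, theta=0.01)    # weak decay: most pairs link
```

Because the realised edge set is so sensitive to theta, the observed graph carries information about the parameter, and the node arrangement (the design) controls how much information an experiment can extract.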

Inference and optimal design for percolation and general random graph models (Part II) 10:50 Wed 8 Jun, 2011 :: 7.15 Ingkarni Wardli :: Dr Andrei Bejan :: The University of Cambridge
The problem of optimal arrangement of nodes of a random weighted graph
is discussed in this workshop. The nodes of the graphs under study are fixed, but
their edges are random and established according to a so-called
edge-probability function. This function is assumed to depend on the weights
attributed to the pairs of graph nodes (or distances between them) and a
statistical parameter. It is the purpose of experimentation to make inference on
the statistical parameter and thus to extract as much information about it as
possible. We also distinguish between two different experimentation scenarios:
progressive and instructive designs.
We adopt a utility-based Bayesian framework to tackle the optimal design problem
for random graphs of this kind. Simulation-based optimisation methods, mainly
Monte Carlo and Markov chain Monte Carlo, are used to obtain the solution. We
study the optimal design problem for inference based on partial observations of
random graphs by employing a data augmentation technique. We prove that
infinitely growing or diminishing node configurations asymptotically represent
the worst node arrangements. We also obtain the exact solution to the optimal
design problem for proximity (geometric) graphs and a numerical solution for
graphs with threshold edge-probability functions.
We consider inference and optimal design problems for finite clusters from bond
percolation on the integer lattice $\mathbb{Z}^d$ and derive a range of both
numerical and analytical results for these graphs. We introduce inner-outer
plots by deleting some of the lattice nodes and show that the "mostly populated"
designs are not necessarily optimal in the case of incomplete observations under
both progressive and instructive design scenarios. Some of the obtained results
may generalise to other lattices. 

Modelling computer network topologies through optimisation 12:10 Mon 1 Aug, 2011 :: 5.57 Ingkarni Wardli :: Mr Rhys Bowden :: University of Adelaide
The core of the Internet is made up of many different computers (called routers) in many different interconnected networks, owned and operated by many different organisations. A popular and important field of study in the past has been "network topology": for instance, understanding which routers are connected to which other routers, or which networks are connected to which other networks; that is, studying and modelling the connection structure of the Internet. Previous study in this area has been plagued by unreliable or flawed experimental data and debate over appropriate models to use. The Internet Topology Zoo is a new source of network data created from the information that network operators make public. In order to better understand this body of network information we would like the ability to randomly generate network topologies resembling those in the zoo. Leveraging previous wisdom on networks produced as a result of optimisation processes, we propose a simple objective function based on possible economic constraints. By changing the relative costs in the objective function we can change the form of the resulting networks, and we compare these optimised networks to a variety of networks found in the Internet Topology Zoo. 
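The idea that changing the relative costs in an optimisation objective changes the form of the resulting network can be sketched with a toy objective. The function below, trading off total cable length against mean hop count for four nodes on a unit square, is entirely illustrative; the study's actual objective is not reproduced here.

```python
import math
from itertools import combinations

# Four nodes on a unit square; an illustrative objective (an assumption):
# total cost = link_cost * total cable length + delay_cost * mean hop count.
pos = {0: (0, 0), 1: (1, 0), 2: (1, 1), 3: (0, 1)}

def dist(a, b):
    return math.hypot(pos[a][0] - pos[b][0], pos[a][1] - pos[b][1])

def mean_hops(edges):
    # All-pairs shortest path lengths by breadth-first search.
    adj = {v: set() for v in pos}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    total, pairs = 0, 0
    for s in pos:
        seen = {s: 0}
        frontier = [s]
        while frontier:
            nxt = []
            for u in frontier:
                for w in adj[u]:
                    if w not in seen:
                        seen[w] = seen[u] + 1
                        nxt.append(w)
            frontier = nxt
        for t in pos:
            if t != s:
                total += seen[t]
                pairs += 1
    return total / pairs

def cost(edges, link_cost, delay_cost):
    length = sum(dist(a, b) for a, b in edges)
    return link_cost * length + delay_cost * mean_hops(edges)

ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
mesh = list(combinations(pos, 2))

# Cheap links favour the fully meshed network (shorter paths);
# expensive links favour the sparse ring.
cheap_links = cost(mesh, 1, 10) < cost(ring, 1, 10)
dear_links = cost(ring, 10, 1) < cost(mesh, 10, 1)
```

Sweeping the two cost coefficients moves the optimum through a family of topologies between these extremes, which is the mechanism used to mimic the variety seen in the Internet Topology Zoo.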

The real thing 12:10 Wed 3 Aug, 2011 :: Napier 210 :: Dr Paul McCann :: School of Mathematical Sciences
Let x be a real number. This familiar and seemingly innocent assumption opens up a world of infinite variety and information. We use some simple techniques (powers of two, geometric series) to examine some interesting consequences of generating random real numbers, and encounter both the best flash drive and the worst flash drive you will ever meet. Come "hold infinity in the palm of your hand", and contemplate eternity for about half an hour. Almost nothing is assumed, almost everything is explained, and absolutely all are welcome. 
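The "powers of two, geometric series" toolkit can be played with directly: a real number in [0, 1) is an infinite sequence of binary digits, and truncating the geometric-series expansion after n digits pins it down to within 2^-n. A small illustrative sketch, not material from the talk:

```python
import random

random.seed(4)

# A real in [0, 1) as a sum of binary digits: x = sum of b_k / 2^(k+1).
def real_from_bits(bits):
    return sum(b / 2 ** (k + 1) for k, b in enumerate(bits))

# All-ones digits give the geometric series 1/2 + 1/4 + ... = 1 - 2^-n.
near_one = real_from_bits([1] * 20)

# A "random real" truncated to 53 fair coin flips, matching the
# precision of a double-precision float.
x = real_from_bits([random.randint(0, 1) for _ in range(53)])
```

Generating each digit with a fair coin flip gives a uniform random number, and the partial sums are exact in binary floating point as long as no more than 53 digits are used.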

Spectra alignment/matching for the classification of cancer and control patients 12:10 Mon 8 Aug, 2011 :: 5.57 Ingkarni Wardli :: Mr Tyman Stanford :: University of Adelaide
Proteomic time-of-flight mass spectrometry produces a spectrum based on the peptides (chains of amino acids) in each patient's serum sample. The spectra contain data points for an x-axis (peptide weight) and a y-axis (peptide frequency/count/intensity). Our end goal is to differentiate cancer (and subtype) patients from control patients using these spectra. Before we can do this, peaks in these data must be found, and peptides common to different spectra must be identified. The data are noisy because of biotechnological variation and calibration error; data points for different peptide weights may in fact represent the same peptide. An algorithm needs to be employed to find common peptides between spectra, as performing alignment "by hand" is almost infeasible. We borrow methods suggested in the metabolomic gas chromatography-mass spectrometry literature and extend them for our purposes. In this talk I will go over the basic tenets of what we hope to achieve and the process towards this.


Statistical analysis of metagenomic data from the microbial community involved in industrial bioleaching 12:10 Mon 19 Sep, 2011 :: 5.57 Ingkarni Wardli :: Ms Susana Soto-Rojo :: University of Adelaide
In the last two decades heap bioleaching has become established as a successful commercial option for recovering copper from low-grade secondary sulfide ores. Genetics-based approaches have recently been employed in the task of characterizing mineral processing bacteria. Data analysis is a key issue, and thus the implementation of adequate mathematical and statistical tools is of fundamental importance for drawing reliable conclusions. In this talk I will give an account of two specific problems that we have been working on: the first regarding experimental design, and the second regarding modelling the composition and activity of the microbial consortium. 

Can statisticians do better than random guessing? 12:10 Tue 20 Sep, 2011 :: Napier 210 :: A/Prof Inge Koch :: School of Mathematical Sciences
In the finance or credit risk area, a bank may want to assess whether a client is going to default, or be able to meet the repayments. In the assessment of benign or malignant tumours, a correct diagnosis is required. In these and similar examples, we make decisions based on data. The classical t-tests provide a tool for making such decisions. However, many modern data sets have more variables than observations, and the classical rules may not be any better than random guessing. We consider Fisher's rule for classifying data into two groups, and show that it can break down for high-dimensional data. We then look at ways of overcoming some of the weaknesses of the classical rules, and I show how these "post-modern" rules perform in practice. 

Estimating transmission parameters for the swine flu pandemic 15:10 Fri 23 Sep, 2011 :: 7.15 Ingkarni Wardli :: Dr Kathryn Glass :: Australian National University
Following the onset of a new strain of influenza with pandemic potential, policy makers need specific advice on how fast the disease is spreading, who is at risk, and what interventions are appropriate for slowing transmission. Mathematical models play a key role in comparing interventions and identifying the best response, but models are only as good as the data that inform them. In the early stages of the 2009 swine flu outbreak, many researchers estimated transmission parameters (particularly the reproduction number) from outbreak data. These estimates varied, and were often biased by data collection methods, misclassification of imported cases, or early stochasticity in case numbers. I will discuss a number of the pitfalls in achieving good quality parameter estimates from early outbreak data, and outline how best to avoid them.
One of the early indications from swine flu data was that children were disproportionately responsible for disease spread. I will introduce a new method for estimating age-specific transmission parameters from both outbreak and seroprevalence data. This approach allows us to take account of empirical data on human contact patterns, and highlights the need to allow for asymmetric mixing matrices in modelling disease transmission between age groups. Applied to swine flu data from a number of different countries, it presents a consistent picture of higher transmission from children. 
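A textbook back-of-envelope link between early exponential growth and the reproduction number gives a feel for the estimation problem (this is a generic illustration, not the speaker's estimator; the case counts and the generation interval Tg are invented):

```python
import math

# Early in an outbreak, case counts grow roughly exponentially,
# C(t) ~ exp(r t). Two standard links from growth rate r to the
# reproduction number R, depending on the generation-interval model.
def growth_rate(cases, dt=1.0):
    # Least-squares slope of log-cases against time.
    logs = [math.log(c) for c in cases]
    n = len(logs)
    t = [i * dt for i in range(n)]
    tbar = sum(t) / n
    lbar = sum(logs) / n
    num = sum((ti - tbar) * (li - lbar) for ti, li in zip(t, logs))
    den = sum((ti - tbar) ** 2 for ti in t)
    return num / den

cases = [10, 14, 19, 28, 40, 55, 78]  # synthetic daily early-outbreak counts
r = growth_rate(cases)
Tg = 2.8                               # assumed mean generation interval (days)
R_fixed = math.exp(r * Tg)             # fixed generation time
R_exp = 1 + r * Tg                     # exponentially distributed generation time
```

The two formulas give noticeably different answers from the same data, which is one illustration of why early estimates of the reproduction number varied between research groups.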

Estimating disease prevalence in hidden populations 14:05 Wed 28 Sep, 2011 :: B.18 Ingkarni Wardli :: Dr Amber Tomas :: The University of Oxford
Estimating disease prevalence in "hidden" populations such as injecting
drug users or men who have sex with men is an important public health
issue. However, traditional design-based estimation methods are
inappropriate because they assume that a list of all members of the
population is available from which to select a sample. Respondent Driven
Sampling (RDS) is a method developed over the last 15 years for sampling
from hidden populations. Similarly to snowball sampling, it leverages the
fact that members of hidden populations are often socially connected to
one another. Although RDS is now used around the world, there are several
common population characteristics which are known to cause estimates
calculated from such samples to be significantly biased. In this talk I'll
discuss the motivation for RDS, as well as some of the recent developments
in methods of estimation. 

Statistical analysis of school-based student performance data 12:10 Mon 10 Oct, 2011 :: 5.57 Ingkarni Wardli :: Ms Jessica Tan :: University of Adelaide
Join me in the journey of being a statistician for 15 minutes of your day (if you are not already one) and experience the task of data cleaning without having to get your own hands dirty. Most of you may have sat the Basic Skills Tests when at school or know someone who currently has to do the NAPLAN (National Assessment Program: Literacy and Numeracy) tests. Tests like these assess student progress and can be used to accurately measure school performance. In trying to answer the research question: "What conclusions about student progress and school performance can be drawn from NAPLAN data, or data of a similar nature, using mathematical and statistical modelling and analysis techniques?", I have uncovered some interesting results in my initial data analysis, which I shall explain in this talk. 

On the role of mixture distributions in the modelling of heterogeneous data 15:10 Fri 14 Oct, 2011 :: 7.15 Ingkarni Wardli :: Prof Geoff McLachlan :: University of Queensland
We consider the role that finite mixture distributions have played in the modelling of heterogeneous data, in particular for clustering continuous data via mixtures of normal distributions. A very brief history is given, starting with the seminal papers by Day and Wolfe in the sixties, before the appearance of the EM algorithm. It was the publication in 1977 of the latter algorithm by Dempster, Laird, and Rubin that greatly stimulated interest in the use of finite mixture distributions to model heterogeneous data. This is because the fitting of mixture models by maximum likelihood is a classic example of a problem that is simplified considerably by the EM's conceptual unification of maximum likelihood estimation from data that can be viewed as being incomplete. In recent times there has been a proliferation of applications in which the number of experimental units n is comparatively small but the underlying dimension p is extremely large, as, for example, in microarray-based genomics and other high-throughput experimental approaches. Hence increasing attention has been given, not only in bioinformatics and machine learning but also in mainstream statistics, to the analysis of complex data in this situation where n is small relative to p. The latter part of the talk shall focus on the modelling of such high-dimensional data using mixture distributions. 
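The incomplete-data formulation behind EM can be seen in a minimal example: fitting a two-component univariate normal mixture with known unit variances, where the "missing" data are the component labels. This is a hedged textbook sketch with synthetic data, not the speaker's software.

```python
import math
import random

random.seed(5)

# Synthetic heterogeneous data: two normal components, means 0 and 4.
data = [random.gauss(0.0, 1.0) for _ in range(300)] + \
       [random.gauss(4.0, 1.0) for _ in range(300)]

def em(data, iters=50):
    pi, mu1, mu2 = 0.5, min(data), max(data)
    for _ in range(iters):
        # E-step: posterior probability each point came from component 1.
        resp = []
        for x in data:
            p1 = pi * math.exp(-0.5 * (x - mu1) ** 2)
            p2 = (1 - pi) * math.exp(-0.5 * (x - mu2) ** 2)
            resp.append(p1 / (p1 + p2))
        # M-step: update the mixing weight and means from the responsibilities.
        s = sum(resp)
        pi = s / len(data)
        mu1 = sum(r * x for r, x in zip(resp, data)) / s
        mu2 = sum((1 - r) * x for r, x in zip(resp, data)) / (len(data) - s)
    return pi, mu1, mu2

pi, mu1, mu2 = em(data)
```

Each iteration increases the observed-data likelihood; were the component labels observed, the M-step updates would just be the complete-data maximum likelihood estimates, which is the simplification EM exploits.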

Likelihood-free Bayesian inference: modelling drug resistance in Mycobacterium tuberculosis 15:10 Fri 21 Oct, 2011 :: 7.15 Ingkarni Wardli :: Dr Scott Sisson :: University of New South Wales
A central pillar of Bayesian statistical inference is Monte Carlo integration, which is based on obtaining random samples from the posterior distribution. There are a number of standard ways to obtain these samples, provided that the likelihood function can be numerically evaluated. In the last 10 years, there has been a substantial push to develop methods that permit Bayesian inference in the presence of computationally intractable likelihood functions. These methods, termed "likelihood-free" or approximate Bayesian computation (ABC), are now being applied extensively across many disciplines.
In this talk, I'll present a brief, nontechnical overview of the ideas behind likelihoodfree methods. I'll motivate and illustrate these ideas through an analysis of the epidemiological fitness cost of drug resistance in Mycobacterium tuberculosis. 
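The basic idea can be shown with the simplest likelihood-free method, rejection ABC, on a toy problem where the likelihood is in fact tractable (the binomial example, prior, and tolerance below are illustrative assumptions, not the talk's tuberculosis model): simulate data from candidate parameters and keep those whose summary statistic lands close to the observed one.

```python
import random

random.seed(6)

# Toy inference target: a binomial success probability p,
# with an "intractable" likelihood replaced by forward simulation.
n_trials, observed_successes = 100, 37

def simulate(p):
    return sum(random.random() < p for _ in range(n_trials))

# Rejection ABC: draw p from the prior, simulate data, and accept p
# when the simulated summary is within a tolerance of the observed one.
accepted = []
for _ in range(10000):
    p = random.random()                             # uniform prior on [0, 1]
    if abs(simulate(p) - observed_successes) <= 2:  # summary-statistic tolerance
        accepted.append(p)

posterior_mean = sum(accepted) / len(accepted)
```

The accepted draws approximate samples from the posterior; shrinking the tolerance improves the approximation at the cost of more rejections, which is the central trade-off in ABC.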

Metric geometry in data analysis 13:10 Fri 11 Nov, 2011 :: B.19 Ingkarni Wardli :: Dr Facundo Memoli :: University of Adelaide
The problem of object matching under invariances can be
studied using certain tools from metric geometry. The central idea is
to regard
objects as metric spaces (or metric measure spaces). The type of
invariance that one wishes to have in the matching is encoded by the
choice of the metrics with which one endows the objects. The standard
example is matching objects in Euclidean space under rigid isometries:
in this
situation one would endow the objects with the Euclidean metric. More
general scenarios are possible in which the desired invariance cannot
be reflected by the preservation of an ambient space metric. Several
ideas due to M. Gromov are useful for approaching this problem. The
GromovHausdorff distance is a natural candidate for doing this.
However, this metric leads to very hard combinatorial optimization
problems and it is difficult to relate to previously reported
practical approaches to the problem of object matching. I will discuss
different variations of these ideas, and in particular will show a
construction of an L^p version of the GromovHausdorff metric, called
the GromovWassestein distance, which is based on mass transportation
ideas. This new metric directly leads to quadratic optimization
problems on continuous variables with linear constraints.
As a consequence of establishing several lower bounds, it turns out
that several invariants of metric measure spaces turn out to be
quantitatively stable in the GW sense. These invariants provide
practical tools for the discrimination of shapes and connect the GW
ideas to a number of preexisting approaches. 

Collision and instability in a rotating fluid-filled torus 15:10 Mon 12 Dec, 2011 :: Benham Lecture Theatre :: Dr Richard Clarke :: The University of Auckland
The simple experiment discussed in this talk, first conceived by Madden and
Mullin (JFM, 1994) as part of their investigations into the non-uniqueness
of decaying turbulent flow, consists of a fluid-filled torus which is
rotated in a horizontal plane. Turbulence within the contained flow is
triggered through a rapid change in its rotation rate. The flow
instabilities which transition the flow to this turbulent state, however,
are truly fascinating in their own right, and form the subject of this
presentation. Flow features observed in both UK and Auckland-based
experiments will be highlighted, and explained through both boundary-layer
analysis and full DNS. In concluding, we argue that this flow regime, with
its compact geometry and lack of cumbersome flow entry effects, presents an
ideal regime in which to study many prototype flow behaviours, very much in
the same spirit as Taylor-Couette flow. 

Fluid mechanics: what's maths got to do with it? 13:10 Tue 20 Mar, 2012 :: 7.15 Ingkarni Wardli :: A/Prof Jim Denier :: School of Mathematical Sciences
We've all heard about the grand challenges in mathematics. There was the Poincare Conjecture, which has now been resolved. There is the Riemann Hypothesis, which many are seeking to prove. But one of the most intriguing is the so-called "Navier-Stokes equations" problem, intriguing because it not only involves some wickedly difficult mathematics but also involves questions about our deep understanding of nature as encountered in the flow of fluids. This talk will introduce the problem (without the wickedly difficult mathematics) and discuss some of the consequences of its resolution. 

New examples of totally disconnected, locally compact groups 13:10 Fri 20 Apr, 2012 :: B.20 Ingkarni Wardli :: Dr Murray Elder :: University of Newcastle
I will attempt to explain what a totally disconnected,
locally compact group is, and then describe some new work with George
Willis on an attempt to create new examples based on Baumslag-Solitar
groups, which are well known, tried and tested
examples/counterexamples in geometric/combinatorial group theory. I
will describe how to compute invariants of scale and flat rank for
these groups. 

Multiscale models of collective cell behaviour: Linear or nonlinear diffusion? 15:10 Fri 4 May, 2012 :: B.21 Ingkarni Wardli :: Dr Matthew Simpson :: Queensland University of Technology
Media...Continuum diffusion models are often used to represent the collective motion of cell populations. Most previous studies have simply used linear diffusion to represent collective cell spreading, while others found that degenerate nonlinear diffusion provides a better match to experimental cell density profiles. There is no guidance available in the mathematical biology literature with regard to which approach is more appropriate. Furthermore, there is no knowledge of particular experimental measurements that can be made to distinguish between situations where these two models are appropriate. We provide a link between individual-based and continuum models using a multiscale approach in which we analyse the collective motion of a population of interacting agents in a generalized lattice-based exclusion process. For round agents that occupy a single lattice site, we find that the relevant continuum description is a linear diffusion equation, whereas for elongated rod-shaped agents that occupy L adjacent lattice sites we find that the relevant continuum description is a nonlinear diffusion equation related to the porous media equation. We show that there are several reasonable approaches for dealing with agent size effects, and that these different approaches are related mathematically through the concept of mean action time. We extend our results to consider proliferation and travelling waves where greater care must be taken to ensure that the continuum model replicates the discrete process. This is joint work with Dr Ruth Baker (Oxford) and Dr Scott McCue (QUT). 

Modelling protective anti-tumour immunity using a hybrid agent-based and delay differential equation approach 15:10 Fri 11 May, 2012 :: B.21 Ingkarni Wardli :: Dr Peter Kim :: University of Sydney
Media...Although cancers seem to consistently evade current medical treatments, the body's immune defences seem quite effective at controlling incipient tumours. Understanding how our immune systems provide such protection against early-stage tumours and how this protection could be lost will provide insight into designing next-generation immune therapies against cancer. To engage this problem, we formulate a mathematical model of the immune response against small, incipient tumours. The model considers the initial stimulation of the immune response in lymph nodes and the resulting immune attack on the tumour and is formulated as a hybrid agent-based and delay differential equation model. 

On the full holonomy group of special Lorentzian manifolds 13:10 Fri 25 May, 2012 :: Napier LG28 :: Dr Thomas Leistner :: University of Adelaide
The holonomy group of a semi-Riemannian manifold is defined as the group of parallel transports along loops based at a point. Its connected component, the `restricted holonomy group', is given by restricting in this definition to contractible loops. The restricted holonomy can essentially be described by its Lie algebra and many classification results are obtained in this way. In contrast, the `full' holonomy group is a more global object and classification results are out of reach.
In the talk I will describe recent results with H. Baum and K. Laerz (both HU Berlin) about the full holonomy group of so-called `indecomposable' Lorentzian manifolds.
I will explain a construction method that arises from analysing the effects on holonomy when dividing the manifold by the action of a properly discontinuous group of isometries and present several examples of Lorentzian manifolds with disconnected holonomy groups.


Model turbulent floods based upon the Smagorinski large eddy closure 12:10 Mon 4 Jun, 2012 :: 5.57 Ingkarni Wardli :: Mr Meng Cao :: University of Adelaide
Media...Rivers, floods and tsunamis are often very turbulent. Conventional models of such environmental fluids are typically based on depth-averaged inviscid irrotational flow equations. We explore changing such a base to the turbulent Smagorinski large eddy closure. The aim is to more appropriately model the fluid dynamics of such complex environmental fluids by using such a turbulent closure. Large changes in fluid depth are allowed. Computer algebra constructs the slow manifold of the flow in terms of the fluid depth h and the mean turbulent lateral velocities u and v. The major challenge is to deal with the nonlinear stress tensor in the Smagorinski closure. The model integrates the effects of inertia, self-advection, bed drag, gravitational forcing and turbulent dissipation with minimal assumptions. Although the resultant model is close to established models, the real outcome is creating a sound basis for the modelling so others, in their modelling of more complex situations, can systematically include more complex physical processes. 

Epidemiological consequences of household-based antiviral prophylaxis for pandemic influenza 14:10 Fri 8 Jun, 2012 :: 7.15 Ingkarni Wardli :: Dr Joshua Ross :: The University of Adelaide
Media...Antiviral treatment offers a fast-acting alternative to vaccination. It is viewed as a first line of defence against pandemic influenza, protecting families and household members once infection has been detected. In clinical trials antiviral treatment has been shown to be efficacious in preventing infection, limiting disease and reducing transmission, yet its impact in containing the 2009 influenza A(H1N1)pdm outbreak was limited. I will describe some of our work, which attempts to understand this seeming discrepancy, through the development of a general model and computationally efficient methodology for studying household-based interventions.
This is joint work with Dr Andrew Black (Adelaide), and Prof. Matt Keeling and Dr Thomas House (Warwick, U.K.). 

Comparison of spectral and wavelet estimators of transfer function for linear systems 12:10 Mon 18 Jun, 2012 :: B.21 Ingkarni Wardli :: Mr Mohd Aftar Abu Bakar :: University of Adelaide
Media...We compare spectral and wavelet estimators of the response amplitude operator (RAO) of a linear system, with various input signals and added noise scenarios. The comparison is based on a model of a heaving buoy wave energy device (HBWED), which oscillates vertically as a linear system with a single mode of vibration.
HBWEDs and other single degree of freedom wave energy devices such as the oscillating wave surge convertors (OWSC) are currently deployed in the ocean, making single degree of freedom wave energy devices important systems to both model and analyse in some detail. However, the results of the comparison relate to any linear system.
It was found that the wavelet estimator of the RAO offers no advantage over the spectral estimators if both input and response time series data are noise free and long time series are available. If there is noise on only the response time series, only the wavelet estimator or the spectral estimator that uses the cross-spectrum of the input and response signals in the numerator should be used. For the case of noise on only the input time series, only the spectral estimator that uses the cross-spectrum in the denominator gives a sensible estimate of the RAO. If both the input and response signals are corrupted with noise, a modification to both the input and response spectrum estimates can provide a good estimator of the RAO. However, a combination of wavelet and spectral methods is introduced as an alternative RAO estimator.
The conclusions apply for autoregressive emulators of sea surface elevation, impulse, and pseudorandom binary sequences (PRBS) inputs. However, a wavelet estimator is needed in the special case of a chirp input where the signal has a continuously varying frequency. 
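The distinction between the two cross-spectrum estimators can be illustrated with a short sketch (not from the talk; the Butterworth filter, the noise level and the SciPy calls are illustrative assumptions). With noise on the response only, the estimator that puts the cross-spectrum in the numerator remains essentially unbiased:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 100.0
n = 200_000
x = rng.standard_normal(n)            # broadband input signal
b, a = signal.butter(2, 0.2)          # stand-in linear "device"
y = signal.lfilter(b, a, x)
y = y + 0.1 * rng.standard_normal(n)  # noise on the response only

f, Pxx = signal.welch(x, fs, nperseg=1024)
_, Pxy = signal.csd(x, y, fs, nperseg=1024)
H1 = Pxy / Pxx                        # cross-spectrum in the numerator

# compare with the exact frequency response of the filter
_, H_true = signal.freqz(b, a, worN=f, fs=fs)
err = np.max(np.abs(np.abs(H1) - np.abs(H_true)))
```

With noise on the input instead, the roles reverse and the estimator using the cross-spectrum in the denominator is the appropriate one, as the abstract notes.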

Inquiry-based learning: yesterday and today 15:30 Mon 9 Jul, 2012 :: Ingkarni Wardli B19 :: Prof Ron Douglas :: Texas A&M University
Media...The speaker will report on a project to develop and promote approaches to mathematics instruction closely related to the Moore method (methods which are called inquiry-based learning), as well as on his personal experience of the Moore method. For background, see the speaker's article in the May 2012 issue of the Notices of the American Mathematical Society. To download the article, click on "Media" above. 

2012 AMSI-SSAI Lecture: Approximate Bayesian computation (ABC): advances and limitations 11:00 Fri 13 Jul, 2012 :: Engineering South S112 :: Prof Christian Robert :: Université Paris-Dauphine
Media...The lack of closed-form likelihoods has been the bane of Bayesian computation for many years and, prior to the introduction of MCMC methods, a strong impediment to the propagation of the Bayesian paradigm. We are now facing models where an MCMC completion of the model towards closed-form likelihoods seems unachievable and where a further degree of approximation appears unavoidable. In this talk, I will present the motivation for approximate Bayesian computation (ABC) methods, the consistency results already available, the various Monte Carlo implementations found in the current literature, as well as the inferential, rather than computational, challenges set by these methods. A recent advance based on empirical likelihood will also be discussed. 
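The basic ABC rejection scheme behind these methods fits in a few lines. The sketch below is a generic textbook illustration, not the speaker's implementation: it pretends the Gaussian likelihood is unavailable and recovers the posterior of a mean by accepting only prior draws whose simulated summary statistic lands close to the observed one.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Observed" data: 100 draws from N(3, 1); we pretend the likelihood is
# unavailable and only a forward simulator exists.
observed = rng.normal(3.0, 1.0, size=100)
s_obs = observed.mean()                 # summary statistic

def simulate(theta, rng, n=100):
    return rng.normal(theta, 1.0, size=n)

# ABC rejection: draw from the prior, simulate, keep draws whose simulated
# summary lands within eps of the observed summary.
eps, accepted = 0.05, []
for _ in range(100_000):
    theta = rng.uniform(-10.0, 10.0)    # flat prior
    if abs(simulate(theta, rng).mean() - s_obs) < eps:
        accepted.append(theta)

posterior_mean = float(np.mean(accepted))
```

Shrinking eps trades acceptance rate against approximation error, which is one source of the computational challenges mentioned above.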

Infectious diseases modelling: from biology to public health policy 15:10 Fri 24 Aug, 2012 :: B.20 Ingkarni Wardli :: Dr James McCaw :: The University of Melbourne
Media...The mathematical study of human-to-human transmissible pathogens has
established itself as a complementary methodology to the traditional
epidemiological approach. The classic susceptible-infectious-recovered
model paradigm has been used to great effect to gain insight into the
epidemiology of endemic diseases such as influenza and pertussis, and
the emergence of novel pathogens such as SARS and pandemic influenza.
The modelling paradigm has also been taken within the host and used to
explain the within-host dynamics of viral (or bacterial or parasite)
infections, with implications for our understanding of infection, the
emergence of drug resistance and optimal drug interventions.
In this presentation I will provide an overview of the mathematical
paradigm used to investigate both biological and epidemiological
infectious diseases systems, drawing on case studies from influenza,
malaria and pertussis research. I will conclude with a summary of how
infectious diseases modelling has assisted the Australian government in
developing its pandemic preparedness and response strategies.
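As a concrete illustration of the susceptible-infectious-recovered paradigm mentioned above (a generic textbook sketch; the rate values are illustrative, not from the talk):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Classic SIR model in proportions; beta (transmission) and gamma (recovery)
# are illustrative values giving R0 = beta/gamma = 2.
beta, gamma = 0.4, 0.2

def sir(t, y):
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

sol = solve_ivp(sir, (0, 200), [0.999, 0.001, 0.0])
s_inf = sol.y[0, -1]   # fraction escaping infection entirely
```

For R0 = 2 the final-size relation 1 - s = exp(-R0 (1 - s)) predicts that roughly 80% of the population is eventually infected, which the numerical solution reproduces.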


Introduction to pairings in cryptography 13:10 Fri 21 Sep, 2012 :: Napier 209 :: Dr Naomi Benger :: University of Adelaide
From cryptanalysis to a powerful tool which made identity-based cryptography possible, pairings have a range of applications in cryptography. I will present basic background (algebraic geometry) needed to understand pairings, hard problems associated with pairings and protocols which use pairings. 

Probability, what can it tell us about health? 13:10 Tue 9 Oct, 2012 :: 7.15 Ingkarni Wardli :: Prof Nigel Bean :: School of Mathematical Sciences
Media...Clinical trials are the way in which modern medical systems test whether individual treatments are worthwhile. Sophisticated statistics is used to try and make the conclusions from clinical trials as meaningful as possible. What can a very simple probability model then tell us about the worth of multiple treatments? What might the implications of this be for the whole health system?
This talk is based on research currently being conducted with a physician at a major Adelaide hospital. It requires no health knowledge and was not tested on animals. All you need is an enquiring and open mind.


Epidemic models in socially structured populations: when are simple models too simple? 14:00 Thu 25 Oct, 2012 :: 5.56 Ingkarni Wardli :: Dr Lorenzo Pellis :: The University of Warwick
Both age and household structure are recognised as important heterogeneities affecting epidemic spread of infectious pathogens, and many models exist nowadays that include either or both forms of heterogeneity. However, different models may fit aggregate epidemic data equally well and nevertheless lead to different predictions of public health interest. I will here present an overview of stochastic epidemic models with increasing complexity in their social structure, focusing in particular on households models. For these models, I will present recent results about the definition and computation of the basic reproduction number R0 and its relationship with other threshold parameters. Finally, I will use these results to compare models with no, either or both age and household structure, with the aim of quantifying the conditions under which each form of heterogeneity is relevant and therefore providing some criteria that can be used to guide model design for realtime predictions. 
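For structured models of this kind, R0 is typically computed as the dominant eigenvalue of a next-generation matrix. A minimal sketch (the two-group matrix below is invented for illustration, not taken from the talk):

```python
import numpy as np

# Toy next-generation matrix for two groups; entry K[i, j] is the mean number
# of new cases in group i generated by a typical case in group j.
K = np.array([[1.2, 0.5],
              [0.4, 0.8]])
R0 = np.max(np.abs(np.linalg.eigvals(K)))  # dominant eigenvalue
```

An epidemic can invade when R0 exceeds 1; with household structure the analogous threshold quantities require more care, which is part of what the talk addresses.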


Colour 12:10 Mon 13 May, 2013 :: B.19 Ingkarni Wardli :: Lyron Winderbaum :: University of Adelaide
Media...Colour is a powerful tool in presenting data, but it can be tricky to choose just the right colours to represent your data honestly. Do the colours used in your heatmap over-emphasise the differences between particular values over others? Does your choice of colours over-emphasise one when they should be represented as equal? All these questions are fundamentally based in how we perceive colour. There has been a lot of research into how we perceive colour in the past century, and some interesting results. I will explain how a `standard observer' was found empirically and used to develop an absolute reference standard for colour in 1931; how, although the common Red-Green-Blue representation of colour is useful and intuitive, distances between colours in this space do not reflect our perception of the difference between colours; and how alternative, perceptually focused colour spaces were introduced in 1976. I will go on to explain how these results can be used to provide simple mechanisms by which to choose colours that satisfy particular properties, such as being equally different from each other, or being linearly more different in sequence, or maintaining such properties when transferred to greyscale, or for a colour-blind person. 
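A sketch of the sRGB-to-CIELAB conversion underlying those 1976 perceptual colour spaces (these are the standard published formulas; the matrix and white point are the usual sRGB/D65 values):

```python
import numpy as np

def srgb_to_lab(rgb):
    """Convert an sRGB triple in [0, 1] to CIELAB (D65 white point)."""
    rgb = np.asarray(rgb, dtype=float)
    # Undo the sRGB gamma encoding
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    # Linear RGB -> XYZ (sRGB/D65 matrix)
    M = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = M @ lin
    # Normalise by the D65 white point, then apply the CIE f() nonlinearity
    xyz /= np.array([0.9505, 1.0, 1.089])
    f = np.where(xyz > (6/29) ** 3, xyz ** (1/3), xyz / (3 * (6/29) ** 2) + 4/29)
    L = 116 * f[1] - 16
    a = 500 * (f[0] - f[1])
    b = 200 * (f[1] - f[2])
    return L, a, b
```

Euclidean distance in Lab (the 1976 Delta-E) then approximates perceived colour difference far better than distance taken directly in RGB.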

The search for the exotic - subfactors and conformal field theory 13:10 Fri 26 Jul, 2013 :: Engineering-Maths 212 :: Prof David E. Evans :: Cardiff University
Subfactor theory provides a framework for studying modular invariant partition functions in conformal field theory,
and candidates for exotic modular tensor categories. I will describe work with Terry Gannon on the search for exotic theories
beyond those from symmetries based on loop groups, Wess-Zumino-Witten models and finite groups. 

The Löwenheim-Skolem theorem 12:10 Mon 26 Aug, 2013 :: B.19 Ingkarni Wardli :: William Crawford :: University of Adelaide
Media...For those of us who didn't do an undergrad course in logic, the foundations of set theory are pretty daunting. I will give a rundown of some of the basics and then talk about a lesser-known but interesting result: the Löwenheim-Skolem theorem. One of the consequences of the theorem is that a set can be countable in one model of set theory, while being uncountable in another. 

Medical Decision Analysis 12:10 Mon 2 Sep, 2013 :: B.19 Ingkarni Wardli :: Eka Baker :: University of Adelaide
Doctors make life-changing decisions every day based on clinical trial data. However, this data is often obtained from studies on healthy individuals or on patients with only the disease that a treatment is targeting. Outside of these studies, many patients will have other conditions that may affect the predicted benefit of receiving a certain treatment. I will talk about what clinical trials are, how to measure the benefit of treatments, and how having multiple conditions (comorbidities) will affect the benefit of treatments. 

Thin-film flow in helical channels 12:10 Mon 9 Sep, 2013 :: B.19 Ingkarni Wardli :: David Arnold :: University of Adelaide
Media...Spiral particle separators are used in the mineral processing industry to refine ores. A slurry, formed by mixing crushed ore with a fluid, is run down a helical channel and at the end of the channel, the particles end up sorted in different sections of the channel. Design of such devices is largely experimentally based, and mathematical modelling of flow in helical channels is relatively limited. In this talk, I will outline some of the work that I have been doing on thin-film flow in helical channels. 

Symmetry gaps for geometric structures 15:10 Fri 20 Sep, 2013 :: B.18 Ingkarni Wardli :: Dr Dennis The :: Australian National University
Media...Klein's Erlangen program classified geometries based on their (transitive) groups of symmetries, e.g. Euclidean geometry is the quotient of the rigid motion group by the subgroup of rotations. While this perspective is homogeneous, Riemann's generalization of Euclidean geometry is in general very "lumpy" - i.e. there exist Riemannian manifolds that have no symmetries at all. A common generalization where a group still plays a dominant role is Cartan geometry, which first arose in Cartan's solution to the equivalence problem for geometric structures, and which articulates what a "curved version" of a flat (homogeneous) model means. Parabolic geometries are Cartan geometries modelled on (generalized) flag varieties (e.g. projective space, isotropic Grassmannians) which are well-known objects from the representation theory of semisimple Lie groups. These curved versions encompass a zoo of interesting geometries, including conformal, projective, CR, systems of 2nd order ODE, etc. This interaction between differential geometry and representation theory has proved extremely fruitful in recent years. My talk will be an example-based tour of various types of parabolic geometries, which I'll use to outline some of the main aspects of the theory (suppressing technical details). The main thread throughout the talk will be the symmetry gap problem: For a given type of Cartan geometry, the maximal symmetry dimension is realized by the flat model, but what is the next possible ("submaximal") symmetry dimension? I'll sketch a recent solution (in joint work with Boris Kruglikov) for a wide class of parabolic geometries which gives a combinatorial recipe for reading the submaximal symmetry dimension from a Dynkin diagram. 

Controlling disease, one household at a time. 12:10 Mon 23 Sep, 2013 :: B.19 Ingkarni Wardli :: Michael Lydeamore :: University of Adelaide
Pandemics and epidemics have always caused significant disruption to society. Attempting to model each individual in any reasonably sized population is infeasible at best, but we can get surprisingly good results just by looking at a single household in a population. In this talk, I'll try to guide you through the logic I've discovered this year, and present some of the key results we've obtained so far, as well as provide a brief indication of what's to come. 

Gravitational slingshot and space mission design 15:10 Fri 11 Oct, 2013 :: B.18 Ingkarni Wardli :: Prof Pawel Nurowski :: Polish Academy of Sciences
Media...When planning a space mission, the weight of the spacecraft is the main issue. Every gram sent into outer space costs a lot. A considerable part of the overall weight of the spaceship consists of the fuel needed to control it. I will explain how space agencies reduce the amount of fuel needed to go to a given place in the Solar System by using the gravity of celestial bodies encountered along the trip. I will start with an explanation of an old trick called the `gravitational slingshot', and end with a modern technique based on the analysis of a 3-body problem appearing in Newtonian mechanics. 
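The arithmetic behind the idealised slingshot is simple enough to sketch (the speeds below are illustrative, with U roughly Jupiter's orbital speed):

```python
# In the planet's frame an idealised head-on encounter merely reverses the
# spacecraft's velocity, leaving its speed unchanged; transforming back to
# the Sun's frame yields a gain of up to twice the planet's orbital speed.
v_in = 10.0   # spacecraft speed toward the planet in the Sun's frame (km/s)
U = 13.1      # planet's orbital speed (km/s)

v_planet_frame = v_in + U    # approach speed as seen from the planet
v_out = v_planet_frame + U   # departing speed back in the Sun's frame
```

Real flybys deflect rather than reverse the trajectory, so the actual gain depends on the turning angle, but the two-frame bookkeeping is the same.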

Modelling the South Australian garfish population slice by slice. 12:10 Mon 14 Oct, 2013 :: B.19 Ingkarni Wardli :: John Feenstra :: University of Adelaide
Media...In this talk I will provide a taste of how South Australian garfish populations are modelled. The role and importance of garfish 'slices' will be explained, along with how these help produce important reporting quantities of yearly recruitment, legal-size biomass, and exploitation rate within the framework of an age- and length-based population model. 

Geometric quantisation in the noncompact setting 12:10 Fri 7 Mar, 2014 :: Ingkarni Wardli B20 :: Peter Hochs :: University of Adelaide
Geometric quantisation is a way to construct quantum mechanical phase spaces (Hilbert spaces) from classical mechanical phase spaces (symplectic manifolds). In the presence of a group action, the `quantisation commutes with reduction' principle states that geometric quantisation should be compatible with the ways the group action can be used to simplify (reduce) the classical and quantum phase spaces. This has deep consequences for the link between symplectic geometry and representation theory.
The `quantisation commutes with reduction' principle has been given explicit meaning, and been proved, in cases where the symplectic manifold and the group acting on it are compact. There have also been results where just the group, or the orbit space of the action, is assumed to be compact. These are important and difficult, but it is somewhat frustrating that they do not even apply to the simplest example from the physics point of view: a free particle in R^n. This talk is about a joint result with Mathai Varghese where the group, manifold and orbit space may all be noncompact. 

Outlier removal using the Bayesian information criterion for group-based trajectory modelling 12:10 Mon 28 Apr, 2014 :: B.19 Ingkarni Wardli :: Chris Davies :: University of Adelaide
Media...Attributes measured longitudinally can be used to define discrete paths of measurements, or trajectories, for each individual in a given population. Group-based trajectory modelling methods can be used to identify subgroups of trajectories within a population, such that trajectories that are grouped together are more similar to each other than to trajectories in distinct groups. Existing methods generally allocate every individual trajectory into one of the estimated groups. However this does not allow for the possibility that some individuals may be following trajectories so different from the rest of the population that they should not be included in a group-based trajectory model. This results in these outlying trajectories being treated as though they belong to one of the groups, distorting the estimated trajectory groups and any subsequent analyses that use them.
We have developed an algorithm for removing outlying trajectories based on the maximum change in Bayesian information criterion (BIC) due to removing a single trajectory. As well as deciding which trajectory to remove, the number of groups in the model can also change. The decision to remove an outlying trajectory is made by comparing the log-likelihood contributions of the observations to those of simulated samples from the estimated group-based trajectory model. In this talk the algorithm will be detailed and an application of its use will be demonstrated. 
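A stripped-down version of the idea, using a single Gaussian rather than a trajectory mixture (the data, threshold and model are illustrative, not those of the talk): repeatedly delete the observation whose removal yields the largest BIC improvement, stopping when no removal helps much.

```python
import numpy as np

rng = np.random.default_rng(2)
data = np.concatenate([rng.normal(0, 1, 50), [25.0]])  # one gross outlier

def bic_normal(x):
    """BIC of a maximum-likelihood single-Gaussian fit (2 parameters)."""
    n, sig2 = len(x), x.var()
    loglik = -0.5 * n * (np.log(2 * np.pi * sig2) + 1)
    return 2 * np.log(n) - 2 * loglik

removed = []
while True:
    base = bic_normal(data)
    # BIC of the model after deleting each observation in turn
    bics = np.array([bic_normal(np.delete(data, i)) for i in range(len(data))])
    i = int(bics.argmin())
    if base - bics[i] < 20:   # stop: no single removal improves BIC enough
        break
    removed.append(float(data[i]))
    data = np.delete(data, i)
```

In the talk's setting the refitted model is a group-based trajectory mixture and the number of groups may also change at each step, but the greedy remove-and-refit loop has the same shape.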

Network-based approaches to classification and biomarker identification in metastatic melanoma 15:10 Fri 2 May, 2014 :: B.21 Ingkarni Wardli :: Associate Professor Jean Yee Hwa Yang :: The University of Sydney
Media...Finding prognostic markers has been a central question in much of current research in medicine and biology. In the last decade, approaches to prognostic prediction within a genomics setting have been primarily based on changes in individual genes/proteins. Very recently, however, network-based approaches to prognostic prediction have begun to emerge which utilize interaction information between genes. This is based on the belief that large-scale molecular interaction networks are dynamic in nature, and that changes in these networks, rather than changes in individual genes/proteins, are often drivers of complex diseases such as cancer.
In this talk, I use data from stage III melanoma patients provided by Prof. Mann from the Melanoma Institute of Australia to discuss how network information can be utilized in the analysis of gene expression data to aid in biological interpretation. Here, we explore a number of novel and previously published network-based prediction methods, which we will then compare to the common single-gene and gene-set methods with the aim of identifying more biologically interpretable biomarkers in the form of networks. 

Fast computation of eigenvalues and eigenfunctions on bounded plane domains 15:10 Fri 1 Aug, 2014 :: B.18 Ingkarni Wardli :: Professor Andrew Hassell :: Australian National University
Media...I will describe a new method for numerically computing eigenfunctions and eigenvalues on certain plane domains, derived from the so-called "scaling method" of Vergini and Saraceno. It is based on properties of the Dirichlet-to-Neumann map on the domain, which relates a function f on the boundary of the domain to the normal derivative (at the boundary) of the eigenfunction with boundary data f. This is a topic of independent interest in pure mathematics. In my talk I will try to emphasize the interplay between theory and applications, which is very rich in this situation. This is joint work with numerical analyst Alex Barnett (Dartmouth). 

Modelling the mean-field behaviour of cellular automata 12:10 Mon 4 Aug, 2014 :: B.19 Ingkarni Wardli :: Kale Davies :: University of Adelaide
Media...Cellular automata (CA) are lattice-based models in which agents fill the lattice sites and behave according to some specified rule. CA are particularly useful when modelling cell behaviour, and as such many people consider CA models in which agents undergo motility and proliferation type events. We are particularly interested in predicting the average behaviour of these models. In this talk I will show how a system of differential equations can be derived for the system and discuss the difficulties that arise in even the seemingly simple case of a CA with motility and proliferation. 
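One of the difficulties alluded to can be seen directly in a toy simulation (a generic 1-D sketch, not the model from the talk): for a proliferation-only exclusion process, the naive logistic mean-field prediction overshoots the simulation, because occupied sites cluster and the independence assumption behind the mean-field closure breaks down.

```python
import numpy as np

rng = np.random.default_rng(3)

# 1-D lattice CA: each occupied site attempts, at rate lam, to place a
# daughter on a random neighbour; births into occupied sites fail (exclusion).
# Naive mean-field for the density C(t):  dC/dt = lam * C * (1 - C).
L, lam, dt, steps = 10_000, 1.0, 0.1, 60
occ = rng.random(L) < 0.01           # initial occupancy around 1%

density = [occ.mean()]
for _ in range(steps):
    parents = np.flatnonzero(occ)
    attempt = parents[rng.random(parents.size) < lam * dt]
    targets = (attempt + rng.choice([-1, 1], size=attempt.size)) % L
    occ[targets] = True              # failed births change nothing
    density.append(occ.mean())

# logistic (mean-field) solution from the same initial density
c0 = density[0]
t = np.arange(steps + 1) * dt
mean_field = c0 * np.exp(lam * t) / (1 - c0 + c0 * np.exp(lam * t))
gap = mean_field[-1] - density[-1]   # positive: mean-field overshoots
```

A cluster can only grow at its ends, so its growth is roughly linear in time, while the logistic closure predicts early exponential growth; correcting for such pair correlations is exactly where the derivation becomes delicate.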

T-duality and the chiral de Rham complex 12:10 Fri 22 Aug, 2014 :: Ingkarni Wardli B20 :: Andrew Linshaw :: University of Denver
The chiral de Rham complex of Malikov, Schechtman, and Vaintrob is a sheaf of vertex algebras that exists on any smooth manifold M. It has a square-zero differential D, and contains the algebra of differential forms on M as a subcomplex. In this talk, I'll give an introduction to vertex algebras and sketch this construction. Finally, I'll discuss a notion of T-duality in this setting. This is based on joint work in progress with V. Mathai. 

Software and protocol verification using Alloy 12:10 Mon 25 Aug, 2014 :: B.19 Ingkarni Wardli :: Dinesha Ranathunga :: University of Adelaide
Media...Reliable software isn't achieved by trial and error. It requires tools to support verification. Alloy is a tool based on set theory that allows expression of a logic-based model of software or a protocol, and hence allows checking of this model. In this talk, I will cover its key concepts, language syntax and analysis features. 

Problems in pandemic preparedness 15:10 Fri 12 Sep, 2014 :: N132 Engineering North :: Dr Joshua Ross :: The University of Adelaide
Media...The emergence of novel strains of viruses poses an ever-present
threat to our health and wellbeing. In this talk, I will provide an
overview of work I have done, or am doing, in collaboration with
colleagues and students on two topics related to pandemic preparedness:
the first being antiviral usage for pre- and post-exposure prophylaxis;
and the second being estimating transmissibility and severity from First
Few Hundred (FF100) studies. 

Translating solitons for mean curvature flow 12:10 Fri 19 Sep, 2014 :: Ingkarni Wardli B20 :: Julie Clutterbuck :: Monash University
Mean curvature flow gives a deformation of a submanifold in the direction of its mean curvature vector. Singularities may arise, and can be modelled by special solutions of the flow. I will describe the special solutions that move by only a translation under the flow, and give some explicit constructions of such surfaces. This is based on joint work with Oliver Schnuerer and Felix Schulze. 

Inferring absolute population and recruitment of southern rock lobster using only catch and effort data 12:35 Mon 22 Sep, 2014 :: B.19 Ingkarni Wardli :: John Feenstra :: University of Adelaide
Media...Abundance estimates from a data-limited version of catch survey analysis are compared to those from a novel one-parameter deterministic method. Bias of both methods is explored using simulation testing based on a more complex data-rich stock assessment population dynamics fishery operating model, exploring the impact of varying levels of observation error in the data as well as model process error. Recruitment was consistently better estimated than legal-size population, the latter being most sensitive to increasing observation errors. A hybrid of the data-limited methods is proposed as the most robust approach. A more statistically conventional errors-in-variables approach may also be touched upon if time permits. 

Exploration vs. Exploitation with Partially Observable Gaussian Autoregressive Arms 15:00 Mon 29 Sep, 2014 :: Engineering North N132 :: Julia Kuhn :: The University of Queensland & The University of Amsterdam
Media...We consider a restless bandit problem with Gaussian autoregressive arms, where the state of an arm is only observed when it is played and the state-dependent reward is collected. Since arms are only partially observable, a good decision policy needs to account for the fact that information about the state of an arm becomes more and more obsolete while the arm is not being played. Thus, the decision maker faces a trade-off between exploiting those arms that are believed to be currently the most rewarding (i.e. those with the largest conditional mean), and exploring arms with a high conditional variance. Moreover, one would like the decision policy to remain tractable despite the infinite state space and also in systems with many arms. A policy that gives some priority to exploration is the Whittle index policy, for which we establish structural properties. These motivate a parametric index policy that is computationally much simpler than the Whittle index but can still outperform the myopic policy. Furthermore, we examine the many-arm behavior of the system under the parametric policy, identifying equations describing its asymptotic dynamics. Based on these insights we provide a simple heuristic algorithm to evaluate the performance of index policies; the latter is used to optimize the parametric index. 
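The growing uncertainty on an unplayed arm can be made concrete with the standard conditional-moment recursion for a Gaussian AR(1) state (a generic sketch; the parameter values are illustrative, not from the talk):

```python
import numpy as np

# Conditional moments of an unobserved Gaussian AR(1) arm,
#   X_{t+1} = phi * X_t + eps_t,   eps_t ~ N(0, sigma2).
# While the arm is not played, the conditional mean decays toward zero and
# the conditional variance grows toward sigma2 / (1 - phi^2): the longer an
# arm is ignored, the stronger the incentive to explore it.
phi, sigma2 = 0.9, 1.0
mean, var = 5.0, 0.0       # arm just played: state observed exactly
variances = []
for _ in range(100):
    mean = phi * mean
    var = phi * phi * var + sigma2
    variances.append(var)

stationary = sigma2 / (1 - phi * phi)
```

An index policy then scores each arm by a function of its conditional mean and variance, with the variance term supplying the exploration bonus that the purely myopic policy lacks.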

Micro Magnetofluidics - Wireless Manipulation for Microfluidics 15:10 Fri 24 Oct, 2014 :: N.132 Engineering North :: Professor Nam-Trung Nguyen :: Griffith University
Media...Microfluidics is rich in multiphysics phenomena, which offer fundamentally new capabilities in the manipulation and detection of biological particles. Most current microfluidic applications are based on hydrodynamic, electrokinetic, acoustic and optic actuation. Implementing these concepts requires bulky external pumping/valving systems and energy supplies. The required wires and connectors make their fabrication and handling difficult. Most of the conventional approaches induce heat that may affect sensitive bioparticles such as cells. There is a need for a technology for fluid handling in microfluidic devices that is low-cost, simple, wireless, free of induced heat and independent of pH level or ion concentration. The use of magnetism provides a wireless solution for this need. Micro magnetofluidics is a newly established research field that links magnetism and microfluidics to gain new capabilities. Magnetism provides a convenient and wireless way to control and manipulate fluid flow at the microscale. Investigation of magnetism-induced phenomena in a microfluidic device has the advantage of well-defined experimental conditions such as temperature and magnetic field because of the system size. This talk presents recent interesting phenomena in both continuous-flow and digital micro magnetofluidics. 

Topology Tomography with Spatial Dependencies 15:00 Tue 25 Nov, 2014 :: Engineering North N132 :: Darryl Veitch :: The University of Melbourne
Media...There has been quite a lot of tomography inference work on measurement networks with a tree topology. Here observations are made, at the leaves of the tree, of `probes' sent down from the root and copied at each branch point. Inference can be performed based on loss or delay information carried by probes, and used in order to recover the loss parameters, delay parameters, or topology of the tree. In all of these a strong assumption of spatial independence between links in the tree has been made in prior work. I will describe recent work on topology inference, based on loss measurement, which breaks that assumption. In particular I will introduce a new model class for loss with non-trivial spatial dependence, the `Jump Independent Models', which are well motivated, and prove that within this class the topology is identifiable. 

Boundary behaviour of Hitchin and hypo flows with left-invariant initial data 12:10 Fri 27 Feb, 2015 :: Ingkarni Wardli B20 :: Vicente Cortes :: University of Hamburg
Hitchin and hypo flows constitute a system of first-order PDEs for the construction of Ricci-flat Riemannian metrics of special holonomy in dimensions 6, 7 and 8. Assuming that the initial geometric structure is left-invariant, we study whether the resulting Ricci-flat manifolds can be extended in a natural way to complete Ricci-flat manifolds. This talk is based on joint work with Florin Belgun, Marco Freibert and Oliver Goertsches, see arXiv:1405.1866 (math.DG). 

Multiscale modelling of multicellular biological systems: mechanics, development and disease 03:10 Fri 6 Mar, 2015 :: Lower Napier LG24 :: Dr James Osborne :: University of Melbourne
When investigating the development and function of multicellular biological systems it is not enough to consider only the behaviour of individual cells in isolation. For example, when studying tissue development, how individual cells interact, both mechanically and biochemically, influences the resulting tissue's form and function. In this talk we present a multiscale modelling framework for simulating the development and function of multicellular biological systems (in particular tissues). Utilising the natural structural unit of the cell, the framework consists of three main scales: the tissue level (macro-scale); the cell level (meso-scale); and the subcellular level (micro-scale), with multiple interactions occurring between all scales. The cell level is central to the framework, and cells are modelled as discrete interacting entities using one of a number of possible modelling paradigms, including lattice-based models (cellular automata and cellular Potts) and off-lattice models (cell-centre and vertex-based representations). The subcellular level concerns numerous metabolic and biochemical processes represented by interaction networks, modelled stochastically or as systems of ODEs. The outputs of such systems influence the behaviour of the cell level, affecting properties such as adhesion and also influencing cell mitosis and apoptosis. At the tissue level we consider factors or restraints that influence the cells, for example the distribution of a nutrient or messenger molecule, which is represented by field equations on a growing domain, with individual cells functioning as sinks and/or sources. The modular approach taken within the framework enables more realistic behaviour to be considered at each scale. This framework is implemented within the open-source Chaste library (Cancer, Heart and Soft Tissue Environment, http://www.cs.ox.ac.uk/chaste/) and has been used to model biochemical and biomechanical interactions in various biological systems. In this talk we present the key ideas of the framework along with applications within the fields of development and disease. 

On the analyticity of CR-diffeomorphisms 12:10 Fri 13 Mar, 2015 :: Engineering North N132 :: Ilya Kossivskiy :: University of Vienna
One of the fundamental objects in several complex variables is CR-mappings. CR-mappings naturally occur in complex analysis as boundary values of mappings between domains, and as restrictions of holomorphic mappings onto real submanifolds. It was already observed by Cartan that smooth CR-diffeomorphisms between CR-submanifolds in C^N tend to be very regular, i.e., they are restrictions of holomorphic maps. However, in general smooth CR-mappings form a more restrictive class of mappings. Thus, since the inception of CR-geometry, the following general question has been of fundamental importance for the field: Are CR-equivalent real-analytic CR-structures also equivalent holomorphically? In joint work with Lamel, we answer this question in the negative, in any positive CR-dimension and CR-codimension. Our construction is based on a recent dynamical technique in CR-geometry, developed in my earlier work with Shafikov. 

Did the Legend of Zelda unfold in our Solar System? 12:10 Mon 27 Apr, 2015 :: Napier LG29 :: Adam Rohrlach :: University of Adelaide
Media...Well, obviously not. We can see the other planets, and they're not terribly conducive to Elven-based life. Still, I aim to exhaustively explore the topic, all the while avoiding conventional logic and reasoning. Clearly, one could roll out any number of 'telescope'-based proofs, and 'video game characters aren't really real, even after a million wishes' arguments, but I want to tackle this hotly debated issue using physics (the ugly cousin of actual mathematics). Armed with a remedial understanding of year 12 physics, from the acclaimed 2000 South Australian syllabus, I can think of no one better qualified, or possibly willing, to give this talk. 

Identifying the Missing Aspects of the ANSI/ISA Best Practices for Security Policy 12:10 Mon 27 Apr, 2015 :: Napier LG29 :: Dinesha Ranathunga :: University of Adelaide
Media...Firewall configuration is a critical activity but it is often conducted manually, which often results in inaccurate, unreliable configurations that leave networks vulnerable to cyber attack. Firewall misconfigurations can have severe consequences in the context of critical infrastructure plants. Internal networks within these plants interconnect valuable industrial control equipment, which often controls safety-critical processes. Security breaches here can result in disruption of critical services, cause severe environmental damage and, at worst, loss of human lives.
Automation can make designing firewall configurations less tedious and their deployment more reliable and cost-effective. In this talk I will discuss our efforts to arrive at a high-level security policy description based on the ANSI/ISA standard, suitable for automation. In doing so, we identify the missing aspects of the existing best practices and propose solutions. We then apply the corrected best-practice specifications to real SCADA firewall configurations and evaluate their usefulness in describing SCADA policies accurately. 

Multivariate regression in quantitative finance: sparsity, structure, and robustness 15:10 Fri 1 May, 2015 :: Engineering North N132 :: A/Prof Mark Coates :: McGill University
Many quantitative hedge funds around the world strive to predict future equity and futures returns based on many sources of information, including historical returns and economic data. This leads to a multivariate regression problem. Compared to many regression problems, the signal-to-noise ratio is extremely low, and profits can be realized if even a small fraction of the future returns can be accurately predicted. The returns generally have heavy-tailed distributions, further complicating the regression procedure.
In this talk, I will describe how we can impose structure into the regression problem in order to make detection and estimation of the very weak signals feasible. Some of this structure consists of an assumption of sparsity; some of it involves identification of common factors to reduce the dimension of the problem. I will also describe how we can formulate alternative regression problems that lead to more robust solutions that better match the performance metrics of interest in the finance setting. 
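One common way to impose the sparsity structure the abstract mentions is an l1 penalty on the regression coefficients. The following is a minimal sketch under that assumption, not the speaker's actual method: lasso regression solved by iterative soft-thresholding, with `lasso_ista` and its parameters as illustrative names.

```python
import numpy as np

def lasso_ista(X, y, lam=0.1, n_iter=500):
    """Sparse regression: minimise ||y - X b||^2 / (2n) + lam * ||b||_1
    by iterative soft-thresholding (ISTA)."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n        # Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n         # gradient of the least-squares term
        z = b - grad / L
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return b
```

With a low signal-to-noise ratio, the penalty `lam` controls how aggressively weak coefficients are zeroed out; a robust loss (e.g. Huber) could replace the squared error to cope with heavy-tailed returns.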

Indefinite spectral triples and foliations of spacetime 12:10 Fri 8 May, 2015 :: Napier 144 :: Koen van den Dungen :: Australian National University
Motivated by Dirac operators on Lorentzian manifolds, we propose a new framework to deal with non-symmetric and non-elliptic operators in noncommutative geometry. We provide a definition of indefinite spectral triples, which correspond bijectively with certain pairs of spectral triples.
Next, we will show how a special case of indefinite spectral triples can be constructed from a family of spectral triples. In particular, this construction provides a convenient setting to study the Dirac operator on a spacetime with a foliation by spacelike hypersurfaces.
This talk is based on joint work with Adam Rennie (arXiv:1503.06916). 

Medical Decision Making 12:10 Mon 11 May, 2015 :: Napier LG29 :: Eka Baker :: University of Adelaide
Media...Practicing physicians make treatment decisions based on clinical trial data every day. This data is based on trials primarily conducted on healthy volunteers, or on those with only the disease in question. In reality, patients do have existing conditions that can affect the benefits and risks associated with receiving these treatments.
In this talk, I will explain how we modified an existing Markov model to show the progression of treatment of a single condition over time. I will then explain how we adapted this to a different condition, and then created a combined model, which demonstrated how both diseases and treatments progressed in the same patient over their lifetime. 
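A Markov cohort model of the kind described can be sketched in a few lines. This toy example uses invented states and transition probabilities, not the model from the talk: a cohort moves through well/sick/dead states one cycle at a time, and expected years alive are accumulated.

```python
import numpy as np

# States: 0 = well, 1 = sick, 2 = dead (absorbing).
# Illustrative annual transition probabilities (rows sum to 1).
P = np.array([[0.90, 0.08, 0.02],
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])

def life_years(P, start=0, cycles=100):
    """Expected years alive for a cohort starting in state `start`,
    accumulated over annual Markov cycles."""
    dist = np.zeros(P.shape[0])
    dist[start] = 1.0
    total = 0.0
    for _ in range(cycles):
        total += dist[:2].sum()   # probability of being alive this cycle
        dist = dist @ P           # advance the cohort one cycle
    return total
```

A combined model for two conditions would enlarge the state space (e.g. pairs of disease states), but the cycle-by-cycle bookkeeping stays the same.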

Dynamics on Networks: The role of local dynamics and global networks on hypersynchronous neural activity 15:10 Fri 31 Jul, 2015 :: Ingkarni Wardli B21 :: Prof John Terry :: University of Exeter, UK
Media...Graph theory has evolved into a useful tool for studying complex brain networks inferred from a variety of measures of neural activity, including fMRI, DTI, MEG and EEG. In the study of neurological disorders, recent work has discovered differences in the structure of graphs inferred from patient and control cohorts. However, most of these studies pursue a purely observational approach; identifying correlations between properties of graphs and the cohort which they describe, without consideration of the underlying mechanisms. To move beyond this necessitates the development of mathematical modelling approaches to appropriately interpret network interactions and the alterations in brain dynamics they permit.
In the talk we introduce some of these concepts with application to epilepsy, introducing a dynamic network approach to study resting-state EEG recordings from a cohort of 35 people with epilepsy and 40 adult controls. Using this framework we demonstrate a strongly significant difference between networks inferred from the background activity of people with epilepsy in comparison to normal controls. Our findings demonstrate that a mathematical model-based analysis of routine clinical EEG provides significant additional information beyond standard clinical interpretation, which may ultimately enable a more appropriate mechanistic stratification of people with epilepsy, leading to improved diagnostics and therapeutics. 

A relaxed introduction to resampling-based multiple testing 12:10 Mon 10 Aug, 2015 :: Benham Labs G10 :: Ngoc Vo :: University of Adelaide
Media...P-values and false positives are two phrases that you commonly see thrown around in scientific literature. More often than not, experimenters and analysts are required to quote p-values as a measure of statistical significance: how strongly does your evidence support your hypothesis? But what happens when this "strong evidence" is just a coincidence? What happens if you have lots of these hypotheses, up to tens of thousands, to test all at the same time and most of your significant findings end up being just "coincidences"? 
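To make the idea concrete, here is a hedged sketch of one standard resampling approach, the Westfall-Young max-T permutation adjustment, which controls the family-wise error rate across many simultaneous tests. The function name and data layout are invented for this example, and this need not be the method covered in the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

def maxT_adjusted_pvalues(A, B, n_perm=2000):
    """Permutation-adjusted p-values for testing, feature by feature,
    whether groups A and B (samples x features) have equal means."""
    obs = np.abs(A.mean(axis=0) - B.mean(axis=0))
    pooled = np.vstack([A, B])
    nA = A.shape[0]
    max_null = np.empty(n_perm)
    for i in range(n_perm):
        idx = rng.permutation(pooled.shape[0])      # randomly relabel samples
        diff = np.abs(pooled[idx[:nA]].mean(axis=0) -
                      pooled[idx[nA:]].mean(axis=0))
        max_null[i] = diff.max()    # max over features controls the FWER
    return np.array([(max_null >= o).mean() for o in obs])
```

Comparing each observed statistic against the permutation distribution of the maximum is what protects against the "tens of thousands of coincidences" problem.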

Predicting the Winning Time of a Stage of the Tour de France 12:10 Mon 21 Sep, 2015 :: Benham Labs G10 :: Nic Rebuli :: University of Adelaide
Media...Sports can be lucrative, especially popular ones. But for all of us mere mortals, the only money we will ever glean from sporting events is through gambling (responsibly). When it comes to cycling, people generally choose their favourites based on individual and team performance throughout the world cycling calendar. But what can be said for the duration of a given stage or the winning time of the highly sought-after General Classification? In this talk I discuss a basic model for predicting the winning time of the Tour de France. I then apply this model to predicting the outcome of the 2012 and 2013 Tour de France and discuss the results in context. 

Weak globularity in homotopy theory and higher category theory 12:10 Thu 12 Nov, 2015 :: Ingkarni Wardli B19 :: Simona Paoli :: University of Leicester
Media...Spaces and homotopy theories are fundamental objects of study of algebraic topology. One way to study these objects is to break them into smaller components with the Postnikov decomposition. To describe such decomposition purely algebraically we need higher categorical structures. We describe one approach to modelling these structures based on a new paradigm to build weak higher categories, which is the notion of weak globularity. We describe some of their connections to both homotopy theory and higher category theory. 

Use of epidemic models in optimal decision making 15:00 Thu 19 Nov, 2015 :: Ingkarni Wardli 5.57 :: Tim Kinyanjui :: School of Mathematics, The University of Manchester
Media...Epidemic models have proved useful in a number of applications in epidemiology. In this work, I will present two areas where we have used modelling to make informed decisions. Firstly, we have used an age-structured mathematical model to describe the transmission of Respiratory Syncytial Virus in a developed country setting and to explore different vaccination strategies. We found that delayed infant vaccination has significant potential in reducing the number of hospitalisations in the most vulnerable group and that most of the reduction is due to indirect protection. It also suggests that marked public health benefit could be achieved through an RSV vaccine delivered to age groups not seen as most at risk of severe disease. The second application is in the optimal design of studies aimed at collection of household-stratified infection data. A design decision involves making a trade-off between the number of households to enrol and the sampling frequency. Two commonly used study designs are considered: cross-sectional and cohort. The search for an optimal design uses Bayesian methods to explore the joint parameter-design space, combined with the Shannon entropy of the posteriors to estimate the amount of information for each design. We found that for cross-sectional designs the amount of information increases with the sampling intensity, while the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing data collection studies. 

A Semi-Markovian Modeling of Limit Order Markets 13:00 Fri 11 Dec, 2015 :: Ingkarni Wardli 5.57 :: Anatoliy Swishchuk :: University of Calgary
Media...R. Cont and A. de Larrard (SIAM J. Financial Mathematics, 2013) introduced a tractable stochastic model for the dynamics of a limit order book, computing various quantities of interest such as the probability of a price increase or the diffusion limit of the price process. As suggested by empirical observations, we extend their framework by 1) allowing arbitrary distributions for book event inter-arrival times (possibly non-exponential) and 2) letting both the nature of a new book event and its corresponding inter-arrival time depend on the nature of the previous book event. We do so by resorting to Markov renewal processes to model the dynamics of the bid and ask queues. We keep analytical tractability via explicit expressions for the Laplace transforms of various quantities of interest. Our approach is justified and illustrated by calibrating the model to the five stocks Amazon, Apple, Google, Intel and Microsoft on June 21st, 2012. As in Cont and de Larrard, the bid-ask spread remains constant equal to one tick, only the bid and ask queues are modelled (they are independent from each other and get re-initialized after a price change), and all orders have the same size. (This talk is based on our joint paper with Nelson Vadori (Morgan Stanley)). 
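As a hedged toy version of this queue-driven price mechanism, grossly simplified relative to the talk (exponential inter-arrival times instead of the general Markov renewal structure, independent unit-size events, invented function name), one might simulate:

```python
import random

random.seed(42)

def simulate_mid_price(horizon=10_000.0, q0=10, tick=0.01, start=100.0):
    """Cont-de Larrard-style toy order book: the bid and ask queues move up
    or down by one unit at random event times; when a queue empties, the
    price moves one tick and both queues are re-initialised."""
    price, t = start, 0.0
    bid, ask = q0, q0
    while t < horizon:
        t += random.expovariate(1.0)        # event inter-arrival time
        side = random.choice(("bid", "ask"))
        delta = random.choice((1, -1))      # limit order vs cancellation/trade
        if side == "bid":
            bid += delta
            if bid <= 0:                    # bid queue depleted: price ticks down
                price -= tick
                bid, ask = q0, q0
        else:
            ask += delta
            if ask <= 0:                    # ask queue depleted: price ticks up
                price += tick
                bid, ask = q0, q0
    return price
```

The talk's model generalises this along the two axes listed in the abstract: arbitrary inter-arrival distributions and dependence of each book event on the previous one, handled via Markov renewal processes.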

T-duality for elliptic curve orientifolds 12:10 Fri 4 Mar, 2016 :: Ingkarni Wardli B17 :: Jonathan Rosenberg :: University of Maryland
Media...Orientifold string theories are quantum field theories based on the geometry of a space with an involution. T-dualities are certain relationships between such theories that look different on the surface but give rise to the same observable physics. In this talk I will not assume any knowledge of physics but will concentrate on the associated geometry, in the case where the underlying space is a (complex) elliptic curve and the involution is either holomorphic or antiholomorphic. The results blend algebraic topology and algebraic geometry. This is mostly joint work with Chuck Doran and Stefan Mendez-Diez. 

Connecting within-host and between-host dynamics to understand how pathogens evolve 15:10 Fri 1 Apr, 2016 :: Engineering South S112 :: A/Prof Mark Tanaka :: University of New South Wales
Media...Modern molecular technologies enable a detailed examination of the extent of genetic variation among isolates of bacteria and viruses. Mathematical models can help make inferences about pathogen evolution from such data. Because the evolution of pathogens ultimately occurs within hosts, it is influenced by dynamics within hosts, including interactions between pathogens and hosts. Most models of pathogen evolution focus on either the within-host or the between-host level. Here I describe steps towards bridging the two scales. First, I present a model of influenza virus evolution that incorporates within-host dynamics to obtain the between-host rate of molecular substitution as a function of the mutation rate, the within-host reproduction number and other factors. Second, I discuss a model of viral evolution in which some hosts are immunocompromised, thereby extending opportunities for within-host virus evolution which then affects population-level evolution. Finally, I describe a model of Mycobacterium tuberculosis in which multidrug resistance evolves within hosts and spreads by transmission between hosts. 

Mathematical modelling of the immune response to influenza 15:00 Thu 12 May, 2016 :: Ingkarni Wardli B20 :: Ada Yan :: University of Melbourne
Media...The immune response plays an important role in the resolution of primary influenza infection and prevention of subsequent infection in an individual. However, the relative roles of each component of the immune response in clearing infection, and the effects of interaction between components, are not well quantified.
We have constructed a model of the immune response to influenza based on data from viral interference experiments, where ferrets were exposed to two influenza strains within a short time period. The changes in viral kinetics of the second virus due to the first virus depend on the strains used as well as the interval between exposures, enabling inference of the timing of innate and adaptive immune response components and the role of cross-reactivity in resolving infection. Our model provides a mechanistic explanation for the observed variation in viruses' abilities to protect against subsequent infection at short inter-exposure intervals, either by delaying the second infection or inducing stochastic extinction of the second virus. It also explains the decrease in recovery time for the second infection when the two strains elicit cross-reactive cellular adaptive immune responses. To account for inter-subject as well as inter-virus variation, the model is formulated using a hierarchical framework. We will fit the model to experimental data using Markov Chain Monte Carlo methods; quantification of the model will enable a deeper understanding of the effects of potential new treatments.
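Models of this kind typically build on target-cell-limited viral kinetics. The following is a generic, hedged sketch with invented parameter values, not the ferret model or its fitted parameters: target cells T are infected at rate beta*T*V, infected cells I die at rate delta and shed virus at rate p, and free virus V is cleared at rate c.

```python
def viral_load_curve(beta=4e-6, delta=2.0, p=10.0, c=5.0,
                     T0=1e6, V0=1.0, dt=0.001, days=10.0):
    """Euler-integrate the target-cell-limited model
    T' = -beta*T*V,  I' = beta*T*V - delta*I,  V' = p*I - c*V,
    returning (peak viral load, final viral load)."""
    T, I, V = T0, 0.0, V0
    peak = V
    for _ in range(int(days / dt)):
        dT = -beta * T * V
        dI = beta * T * V - delta * I
        dV = p * I - c * V
        T = max(T + dT * dt, 0.0)   # clamp: populations cannot go negative
        I = max(I + dI * dt, 0.0)
        V = max(V + dV * dt, 0.0)
        peak = max(peak, V)
    return peak, V
```

An immune response enters by making delta or c time-dependent; a second exposure then re-uses the depleted target-cell pool and primed immune terms, which is one way the interference effects described above can arise.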


Behavioural Microsimulation Approach to Social Policy and Behavioural Economics 15:10 Fri 20 May, 2016 :: S112 Engineering South :: Dr Drew Mellor :: Ernst & Young
SIMULAIT is a general-purpose, behavioural microsimulation system designed to predict behavioural trends in human populations. This type of predictive capability grew out of original research initially conducted in conjunction with the Defence Science and Technology Group (DSTO) in South Australia, and has been fully commercialised and is in current use by a global customer base. To our customers, the principal value of the system lies in its ability to predict likely outcomes to scenarios that challenge conventional approaches based on extrapolation or generalisation. These types of scenarios include: the impact of disruptive technologies, such as the impact of widespread adoption of autonomous vehicles for transportation or batteries for household energy storage; and the impact of effecting policy elements or interventions, such as the impact of imposing water usage restrictions.
SIMULAIT employs a multidisciplinary methodology, drawing from agent-based modelling, behavioural science and psychology, microeconomics, artificial intelligence, simulation, game theory, engineering, mathematics and statistics. In this seminar, we start with a high-level view of the system followed by a look under the hood to see how the various elements come together to answer questions about behavioural trends. The talk will conclude with a case study of a recent application of SIMULAIT to a significant policy problem: how to address the deficiency of STEM-skilled teachers in the Victorian teaching workforce. 

Probabilistic Meshless Methods for Bayesian Inverse Problems 15:10 Fri 5 Aug, 2016 :: Engineering South S112 :: Dr Chris Oates :: University of Technology Sydney
Media...This talk deals with statistical inverse problems that involve partial differential equations (PDEs) with unknown parameters. Our goal is to account, in a rigorous way, for the impact of discretisation error that is introduced at each evaluation of the likelihood due to numerical solution of the PDE. In the context of meshless methods, the proposed, model-based approach to discretisation error encourages statistical inferences to be more conservative in the presence of significant solver error. In addition, (i) a principled learning-theoretic approach to minimise the impact of solver error is developed, and (ii) the challenge of nonlinear PDEs is considered. The method is applied to parameter inference problems in which non-negligible solver error must be accounted for in order to draw valid statistical conclusions. 

Calculus on symplectic manifolds 12:10 Fri 12 Aug, 2016 :: Ingkarni Wardli B18 :: Mike Eastwood :: University of Adelaide
Media...One can use the symplectic form to construct an elliptic complex replacing the de Rham complex. Then, under suitable curvature conditions, one can form coupled versions of this complex. Finally, on complex projective space, these constructions give rise to a series of elliptic complexes with geometric consequences for the Fubini-Study metric and its X-ray transform. This talk, which will start from scratch, is based on the work of many authors but, especially, current joint work with Jan Slovak. 

Mathematical modelling of social spreading processes 15:10 Fri 19 Aug, 2016 :: Napier G03 :: Prof Hans De Sterck :: Monash University
Media...Social spreading processes are intriguing manifestations of how humans interact and shape each others' lives. There is great interest in improving our understanding of these processes, and the increasing availability of empirical information in the era of big data and online social networks, combined with mathematical and computational modelling techniques, offer compelling new ways to study these processes.
I will first discuss mathematical models for the spread of political revolutions on social networks. The influence of online social networks and social media on the dynamics of the Arab Spring revolutions of 2011 is of particular interest in our work. I will describe a hierarchy of models, starting from agent-based models realized on empirical social networks, and ending up with population-level models that summarize the dynamical behaviour of the spreading process. We seek to understand quantitatively how political revolutions may be facilitated by the modern online social networks of social media.
The second part of the talk will describe a population-level model for the social dynamics that cause cigarette smoking to spread in a population. Our model predicts that more individualistic societies will show faster adoption and cessation of smoking. Evidence from a newly composed century-long composite data set on smoking prevalence in 25 countries supports the model, with potential implications for public health interventions around the world.
Throughout the talk, I will argue that important aspects of social spreading processes can be revealed and understood via quantitative mathematical and computational models matched to empirical data.
This talk describes joint work with John Lang and Danny Abrams. 

Modelling evolution of postmenopausal human longevity: The Grandmother Hypothesis 15:10 Fri 2 Sep, 2016 :: Napier G03 :: Dr Peter Kim :: University of Sydney
Media...Human postmenopausal longevity makes us unique among primates, but how did it evolve? One explanation, the Grandmother Hypothesis, proposes that as grasslands spread in ancient Africa displacing foods ancestral youngsters could effectively exploit, older females whose fertility was declining left more descendants by subsidizing grandchildren and allowing mothers to have new babies sooner. As more robust elders could help more descendants, selection favoured increased longevity while maintaining the ancestral end of female fertility.
We develop a probabilistic agent-based model that incorporates two sexes and mating, fertility-longevity trade-offs, and the possibility of grandmother help. Using this model, we show how the grandmother effect could have driven the evolution of human longevity. Simulations reveal two stable life-histories, one human-like and the other like our nearest cousins, the great apes. The probabilistic formulation shows how stochastic effects can slow down and prevent escape from the ancestral condition, and it allows us to investigate the effect of mutation rates on the trajectory of evolution. 

Transmission Dynamics of Visceral Leishmaniasis: designing a test-and-treat control strategy 12:10 Thu 29 Sep, 2016 :: EM218 :: Graham Medley :: London School of Hygiene & Tropical Medicine
Media...Visceral Leishmaniasis (VL) is targeted for elimination from the Indian SubContinent. Progress has been much better in some areas than others. Current control is based on earlier diagnosis and treatment and on insecticide spraying to reduce the density of the vector. There is a surprising dearth of specific information on the epidemiology of VL, which makes modelling more difficult. In this seminar, I describe a simple framework that gives some insight into the transmission dynamics. We conclude that the majority of infection comes from cases prior to diagnosis. If this is the case then, early diagnosis will be advantageous, but will require a test with high specificity. This is a paradox for many clinicians and public health workers, who tend to prioritise high sensitivity.
Medley, G.F., Hollingsworth, T.D., Olliaro, P.L. & Adams, E.R. (2015) Health-seeking, diagnostics and transmission in the control of visceral leishmaniasis. Nature 528, S102-S108 (3 December 2015), DOI: 10.1038/nature16042 

Symmetric functions and quantum integrability 15:10 Fri 30 Sep, 2016 :: Napier G03 :: Dr Paul Zinn-Justin :: University of Melbourne/Universite Pierre et Marie Curie
Media...We'll discuss an approach to studying families of symmetric polynomials which is based on "quantum integrability", that is, on the use of exactly solvable two-dimensional lattice models. We'll first explain the general strategy on the simplest case, namely Schur polynomials, with the introduction of a model of lattice paths (a.k.a. the five-vertex model). We'll then discuss recent work (in collaboration with M. Wheeler) that extends this approach to Hall-Littlewood polynomials and Grothendieck polynomials, and some applications of it. 

Parahoric bundles, invariant theory and the Kazhdan-Lusztig map 12:10 Fri 21 Oct, 2016 :: Ingkarni Wardli B18 :: David Baraglia :: University of Adelaide
Media...In this talk I will introduce the notion of parahoric groups, a loop group analogue of parabolic subgroups. I will also discuss a global version of this, namely parahoric bundles on a complex curve. This leads us to a problem concerning the behaviour of invariant polynomials on the dual of the Lie algebra, a kind of "parahoric invariant theory". The key to solving this problem turns out to be the Kazhdan-Lusztig map, which assigns to each nilpotent orbit in a semisimple Lie algebra a conjugacy class in the Weyl group. Based on joint work with Masoud Kamgarpour and Rohith Varma. 

Measuring and mapping carbon dioxide from remote sensing satellite data 15:10 Fri 21 Oct, 2016 :: Napier G03 :: Prof Noel Cressie :: University of Wollongong
Media...This talk is about environmental statistics for global remote sensing of atmospheric carbon dioxide, a leading greenhouse gas. An important compartment of the carbon cycle is atmospheric carbon dioxide (CO2), where it (and other gases) contribute to climate change through a greenhouse effect. There are a number of CO2 observational programs where measurements are made around the globe at a small number of ground-based locations at somewhat regular time intervals. In contrast, satellite-based programs are spatially global but give up some of the temporal richness. The most recent satellite launched to measure CO2 was NASA's Orbiting Carbon Observatory-2 (OCO-2), whose principal objective is to retrieve a geographical distribution of CO2 sources and sinks. OCO-2's measurement of column-averaged mole fraction, XCO2, is designed to achieve this, through a data-assimilation procedure that is statistical at its basis. Consequently, uncertainty quantification is key, starting with the spectral radiances from an individual sounding to borrowing of strength through spatial-statistical modelling. 

What is index theory? 12:10 Tue 21 Mar, 2017 :: Ingkarni Wardli 5.57 :: Dr Peter Hochs :: School of Mathematical Sciences
Index theory is a link between topology, geometry and analysis. A typical theorem in index theory says that two numbers are equal: an analytic index and a topological index. The first theorem of this kind was the index theorem of Atiyah and Singer, which they proved in 1963. Index theorems have many applications in maths and physics. For example, they can be used to prove that a differential equation must have a solution. Also, they imply that the topology of a space like a sphere or a torus determines in what ways it can be curved. Topology is the study of geometric properties that do not change if we stretch or compress a shape without cutting or gluing. Curvature does change when we stretch something out, so it is surprising that topology can say anything about curvature. Index theory has many surprising consequences like this.


Quaternionic Kaehler manifolds of cohomogeneity one 12:10 Fri 16 Jun, 2017 :: Ligertwood 231 :: Vicente Cortes :: Universitat Hamburg
Quaternionic Kaehler manifolds form an important class of Riemannian manifolds of special holonomy. They provide examples of Einstein manifolds of nonzero scalar curvature. I will show how to construct explicit examples of complete quaternionic Kaehler manifolds of negative scalar curvature beyond homogeneous spaces. In particular, I will present a series of examples of cohomogeneity one, based on arXiv:1701.07882. 

Compact pseudo-Riemannian homogeneous spaces 12:10 Fri 18 Aug, 2017 :: Engineering Sth S111 :: Wolfgang Globke :: University of Adelaide
A pseudo-Riemannian homogeneous space $M$ of finite volume can be presented as $M=G/H$, where $G$ is a Lie group acting transitively and isometrically on $M$, and $H$ is a closed subgroup of $G$.
The condition that $G$ acts isometrically and thus preserves a finite measure on $M$ leads to strong algebraic restrictions on $G$. In the special case where $G$ has no compact semisimple normal subgroups, it turns out that the isotropy subgroup $H$ is a lattice, and that the metric on $M$ comes from a bi-invariant metric on $G$.
This result allows us to recover Zeghib's classification of Lorentzian compact homogeneous spaces, and to move towards a classification for metric index 2.
As an application we can investigate which pseudo-Riemannian homogeneous spaces of finite volume are Einstein spaces. Through the existence questions for lattice subgroups, this leads to an interesting connection with the theory of transcendental numbers, which allows us to characterize the Einstein cases in low dimensions.
This talk is based on joint work with Oliver Baues, Yuri Nikolayevsky and Abdelghani Zeghib. 

On the fundamentals of Rayleigh-Taylor instability and interfacial mixing 15:10 Fri 15 Sep, 2017 :: Ingkarni Wardli B17 :: Prof Snezhana Abarzhi :: University of Western Australia
Rayleigh-Taylor instability (RTI) develops when fluids of different densities are accelerated against their density gradient. Extensive interfacial mixing of the fluids ensues with time. Rayleigh-Taylor (RT) mixing controls a broad variety of processes in fluids, plasmas and materials, in high and low energy density regimes, at astrophysical and atomistic scales. Examples include the formation of hot spots in inertial confinement fusion, supernova explosions, stellar and planetary convection, flows in the atmosphere and ocean, reactive and supercritical fluids, material transformation under impact, and light-material interaction. In some of these cases (e.g. inertial confinement fusion) RT mixing should be tightly mitigated; in others (e.g. turbulent combustion) it should be strongly enhanced. Understanding the fundamentals of RTI is crucial for achieving better control of non-equilibrium processes in nature and technology.
Traditionally, it was presumed that RTI leads to uncontrolled growth of small-scale imperfections, single-scale nonlinear dynamics, and extensive mixing similar to canonical turbulence. The recent success of theory and experiments in fluids and plasmas suggests an alternative scenario of RTI evolution. It finds that the interface is necessary for RT mixing to accelerate, that the acceleration effects are strong enough to suppress the development of turbulence, and that the RT dynamics is multiscale with a significant degree of order.
This talk presents a physics-based consideration of the fundamentals of RTI and RT mixing, and summarizes what is certain, and what is not so certain, in our knowledge of RTI. The focus question is: how can we influence the regularization process in RT mixing? We also discuss new opportunities for improving predictive modelling capabilities, physical description, and control of RT mixing in fluids, plasmas and materials. 
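As background to the abstract above, classical linear stability theory gives the growth rate of a small interface perturbation of wavenumber k as sigma = sqrt(A g k), where A is the Atwood number. A minimal sketch of this textbook inviscid, incompressible result (not the speaker's analysis):

```python
import numpy as np

def rt_growth_rate(rho_heavy, rho_light, g, k):
    """Linear Rayleigh-Taylor growth rate sigma = sqrt(A * g * k),
    where A = (rho_h - rho_l) / (rho_h + rho_l) is the Atwood number.
    Valid for inviscid, incompressible, semi-infinite fluid layers."""
    atwood = (rho_heavy - rho_light) / (rho_heavy + rho_light)
    return np.sqrt(atwood * g * k)
```

Equal densities give A = 0 and no growth; a larger density contrast or shorter wavelength (larger k) grows faster, which is why small-scale imperfections were traditionally expected to dominate.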

How oligomerisation impacts steady state gradient in a morphogen-receptor system 15:10 Fri 20 Oct, 2017 :: Ingkarni Wardli 5.57 :: Mr Phillip Brown :: University of Adelaide
In developmental biology an important process is cell fate determination, where cells start to differentiate their form and function. This is an element of the broader concept of morphogenesis. It has long been held that cell differentiation can occur by a chemical signal providing positional information to 'undecided' cells. This chemical produces a gradient of concentration that indicates to a cell what path it should develop along. More recently it has been shown that in a particular system of this type, the chemical (protein) does not exist purely as individual molecules, but can exist in multiprotein complexes known as oligomers.
Mathematical modelling has been performed on systems of oligomers to determine whether this concept can produce useful gradients of concentration. However, there is a wide range of possibilities when it comes to how oligomer systems can be modelled, and most of them have not been explored.
In this talk I will introduce a new monomer system and analyse it, before extending this model to include oligomers. A number of oligomer models are proposed based on the assumption that proteins are only produced in their oligomer form and can only break apart once they have left the producing cell. It will be shown that when oligomers are present under these conditions, but only monomers are permitted to bind with receptors, then the system can produce robust, biologically useful gradients for a significantly larger range of model parameters (for instance, degradation, production and binding rates) compared to the monomer system. We will also show that when oligomers are permitted to bind with receptors there is negligible difference compared to the monomer system. 
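For orientation, the simplest monomer picture is diffusion with linear degradation, whose steady state is an exponential gradient with decay length sqrt(D/k). A sketch of this textbook baseline (the speaker's monomer and oligomer models are richer; the parameter names here are illustrative):

```python
import numpy as np

def steady_gradient(x, c0, diff, deg):
    """Steady state of D c'' = k c on a half-line with c(0) = c0:
    an exponential gradient c(x) = c0 * exp(-x / lambda),
    where lambda = sqrt(D / k) is the decay length."""
    lam = np.sqrt(diff / deg)
    return c0 * np.exp(-x / lam)

def readout_boundary(c0, thresh, diff, deg):
    """Position where the gradient crosses a response threshold,
    i.e. where cells switch fate: x* = lambda * ln(c0 / thresh)."""
    return np.sqrt(diff / deg) * np.log(c0 / thresh)
```

The robustness question in the abstract is whether this readout position stays in a useful range as the degradation, production and binding rates vary.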

Springer correspondence for symmetric spaces 12:10 Fri 17 Nov, 2017 :: Engineering Sth S111 :: Ting Xue :: University of Melbourne
The Springer theory for reductive algebraic groups plays an important role in representation theory. It relates nilpotent orbits in the Lie algebra to irreducible representations of the Weyl group. We develop a Springer theory in the case of symmetric spaces using Fourier transform, which relates nilpotent orbits in this setting to irreducible representations of Hecke algebras of various Coxeter groups with specified parameters. This in turn gives rise to character sheaves on symmetric spaces, which we describe explicitly in the case of classical symmetric spaces. A key ingredient in the construction is the nearby cycle sheaves associated to the adjoint quotient map. The talk is based on joint work with Kari Vilonen and partly based on joint work with Misha Grinberg and Kari Vilonen. 

Quantum Airy structures and topological recursion 13:10 Wed 14 Mar, 2018 :: Ingkarni Wardli B17 :: Gaetan Borot :: MPI Bonn
Quantum Airy structures are Lie algebras of quadratic differential operators; their classical limit describes Lagrangian subvarieties in symplectic vector spaces which are tangent to the zero section and cut out by quadratic equations. Their partition function, the function annihilated by the collection of differential operators, can be computed by the topological recursion. I will explain how to obtain quantum Airy structures from spectral curves, and how we can retrieve from them correlation functions of semisimple cohomological field theories by exploiting the symmetries. This is based on joint work with Andersen, Chekhov and Orantin. 

Models, machine learning, and robotics: understanding biological networks 15:10 Fri 16 Mar, 2018 :: Horace Lamb 1022 :: Prof Steve Oliver :: University of Cambridge
The availability of complete genome sequences has enabled the construction of computer models of metabolic networks that may be used to predict the impact of genetic mutations on growth and survival. Both logical and constraint-based models of the metabolic network of the model eukaryote, the ale yeast Saccharomyces cerevisiae, have been available for some time and are continually being improved by the research community. While such models are very successful at predicting the impact of deleting single genes, predicting the impact of higher-order genetic interactions is a greater challenge. Initial studies of limited gene sets provided encouraging results. However, the availability of comprehensive experimental data for the interactions between genes involved in metabolism demonstrated that, while the models were able to predict the general properties of the genetic interaction network, their ability to predict interactions between specific pairs of metabolic genes was poor. I will examine the reasons for this poor performance and demonstrate ways of improving the accuracy of the models by exploiting the techniques of machine learning and robotics.
The utility of these metabolic models rests on the firm foundations of genome sequencing data. However, there are two major problems with these kinds of network models: they have no dynamics, and they do not deal with the uncertain and incomplete nature of much biological data. To deal with these problems, we have developed the Flexible Nets (FNs) modelling formalism. FNs were inspired by Petri Nets and can deal with missing or uncertain data, incorporate both dynamics and regulation, and also have the potential for model predictive control of biotechnological processes.
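Constraint-based metabolic models of the kind described above are commonly analysed by flux balance analysis: maximise a target flux subject to steady-state mass balance S v = 0 and capacity bounds on each reaction. A toy three-reaction sketch (an illustrative network, not the yeast model):

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis.
# Reactions: v1 = uptake of metabolite A (capacity 10),
#            v2 = conversion A -> B, v3 = export of B (the objective).
S = np.array([[1, -1, 0],    # steady-state balance for A: v1 - v2 = 0
              [0, 1, -1]])   # steady-state balance for B: v2 - v3 = 0
bounds = [(0, 10), (0, None), (0, None)]

# Maximising v3 is the same as minimising -v3.
res = linprog(c=[0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds)
optimal_flux = -res.fun
```

Simulating a gene deletion then amounts to forcing the corresponding reaction's bounds to zero and re-solving; predicting double deletions (the genetic interactions discussed above) composes such constraints.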


Quantifying language change 15:10 Fri 1 Jun, 2018 :: Horace Lamb 1022 :: A/Prof Eduardo Altmann :: University of Sydney
Mathematical methods to study natural language are increasingly important because of the ubiquity of textual data on the Internet. In this talk I will discuss mathematical models and statistical methods to quantify the variability of language, with a focus on two problems: (i) how has the vocabulary of languages changed over the last centuries? (ii) how do the languages of scientific disciplines relate to each other, and how have they evolved in the last decades? One of the main challenges of these analyses stems from universal properties of word frequencies, which show high temporal variability and are fat-tailed distributed. The latter feature dramatically affects the statistical properties of entropy-based estimators, which motivates us to compare vocabularies using a generalized Jensen-Shannon divergence (obtained from entropies of order alpha). 
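A sketch of the kind of divergence mentioned in the abstract: a Jensen-Shannon divergence built from generalized entropies of order alpha. This is one common construction (entropy of the mixture minus the mean entropy); the speaker's exact estimator may differ.

```python
import numpy as np

def entropy_alpha(p, alpha):
    """Generalized (Tsallis-type) entropy of order alpha; reduces to
    Shannon entropy in the limit alpha -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** alpha)) / (alpha - 1.0)

def jsd_alpha(p, q, alpha=1.0):
    """Generalized Jensen-Shannon divergence between two word-frequency
    distributions: entropy of the mixture minus the mean entropy."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return entropy_alpha(m, alpha) - 0.5 * (
        entropy_alpha(p, alpha) + entropy_alpha(q, alpha))
```

Choosing alpha > 1 downweights the fat tail of rare words, which mitigates the estimator bias that the abstract alludes to.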

Twisted K-theory of compact Lie groups and extended Verlinde algebras 11:10 Fri 12 Oct, 2018 :: Barr Smith South Polygon Lecture theatre :: Chi-Kwong Fok :: University of Adelaide
In a series of recent papers, Freed, Hopkins and Teleman put forth a deep result which identifies the twisted K-theory of a compact Lie group G with the representation theory of its loop group LG. Under suitable conditions, both objects can be enhanced to the Verlinde algebra, which appears in mathematical physics as the Frobenius algebra of a certain topological quantum field theory, and in algebraic geometry as the algebra encoding information about moduli spaces of G-bundles over Riemann surfaces. The Verlinde algebra for G with nice connectedness properties has been well known. However, explicit descriptions for disconnected G are lacking. In this talk, I will discuss various aspects of the Freed-Hopkins-Teleman theorem and partial results on an extension of the Verlinde algebra arising from a disconnected G. The talk is based on joint work in progress with David Baraglia and Varghese Mathai. 

Bayesian Synthetic Likelihood 15:10 Fri 26 Oct, 2018 :: Napier 208 :: A/Prof Chris Drovandi :: Queensland University of Technology
Complex stochastic processes are of interest in many applied disciplines. However, the likelihood function associated with such models is often computationally intractable, prohibiting standard statistical inference frameworks for estimating model parameters based on data. Currently, the most popular simulation-based parameter estimation method is approximate Bayesian computation (ABC). Despite the widespread applicability and success of ABC, it has some limitations. This talk will describe an alternative approach, called Bayesian synthetic likelihood (BSL), which overcomes some limitations of ABC and can be much more effective in certain classes of applications. The talk will also describe various extensions to the standard BSL approach. This project has been a joint effort with several academic collaborators, postdocs and PhD students. 
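The core of synthetic likelihood can be sketched in a few lines: simulate summary statistics at a parameter value, fit a multivariate Gaussian to them, and evaluate the observed summaries under that Gaussian. The following is a minimal illustration with an assumed toy model, not the BSL extensions discussed in the talk.

```python
import numpy as np
from scipy import stats

def synthetic_loglik(theta, observed_summary, simulate_fn, n_sims=200, rng=None):
    """Estimate the synthetic log-likelihood at theta: fit a multivariate
    Gaussian to simulated summary statistics and evaluate the observed
    summaries under it. This estimate is then used inside an MCMC sampler."""
    if rng is None:
        rng = np.random.default_rng(0)
    sims = np.array([simulate_fn(theta, rng) for _ in range(n_sims)])
    mu = sims.mean(axis=0)
    cov = np.cov(sims, rowvar=False)
    return stats.multivariate_normal.logpdf(observed_summary, mu, cov)

# Toy intractable-likelihood stand-in: data are N(theta, 1);
# summaries are the sample mean and variance.
def simulate(theta, rng, n=50):
    x = rng.normal(theta, 1.0, size=n)
    return np.array([x.mean(), x.var()])
```

Unlike ABC, there is no tolerance to tune; the Gaussian assumption on the summaries is what can fail, which motivates the robustified extensions.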

The role of microenvironment in regulation of cell infiltration and bortezomib-OV therapy in glioblastoma 15:10 Fri 11 Jan, 2019 :: IW 5.57 :: Professor Yangjin Kim :: Konkuk University, South Korea
The tumor microenvironment (TME) plays a critical role in regulation of tumor cell invasion in glioblastoma. Many microenvironmental factors such as the extracellular matrix, microglia and astrocytes can either block or enhance this critical infiltration step in the brain [4]. Oncolytic viruses such as herpes simplex virus-1 (oHSV) are genetically modified to target and kill cancer cells while not harming healthy normal cells, and are currently under multiple clinical trials for safety and efficacy [1]. Bortezomib is a peptide-based proteasome inhibitor and an FDA-approved drug for myeloma and mantle cell lymphoma. Yoo et al [2] previously demonstrated that bortezomib-induced unfolded protein response (UPR) in many tumor cell lines (glioma, ovarian, and head and neck) upregulated expression of heat shock protein 90 (HSP90), which then enhanced viral replication through promotion of nuclear localization of the viral polymerase in vitro. This led to synergistic tumor cell killing in vitro, and a combination treatment of mice with oHSV and bortezomib showed improved antitumor efficacy in vivo [2]. This combination therapy also increased the surface expression levels of NK cell activating markers and enhanced pro-inflammatory cytokine secretion. These findings demonstrated that the synergistic interaction between oHSV and bortezomib, a clinically relevant proteasome inhibitor, augments the cancer cell killing and promotes overall therapeutic efficacy. We investigated the role of NK cells in combination therapy with oncolytic virus (OV) and bortezomib. NK cells display rapid and potent immunity to metastasis and hematological cancers, and they overcome immunosuppressive effects of the tumor microenvironment. We developed a mathematical model, a system of PDEs, to address the question of how the density of NK cells affects the growth of the tumor [3]. 
We found that the antitumor efficacy increases when the endogenous NK cells are depleted, and also when exogenous NK cells are injected into the tumor. We also show that the TME plays a significant role in antitumor efficacy in OV combination therapy, and illustrate the effect of different spatial patterns of OV injection [5]. The results illustrate a possible phenotypic switch within tumor populations in a given microenvironment, and suggest new anti-invasion therapies. These predictions were validated by our in vivo and in vitro experiments.
References
[1] Kanai R, … Rabkin SD, "Oncolytic herpes simplex virus vectors and chemotherapy: are combinatorial strategies more effective for cancer?", Future Oncology, 6(4), 619–634, 2010.
[2] Yoo J, et al., "Bortezomib-induced unfolded protein response increases oncolytic HSV-1 replication resulting in synergistic antitumor effect", Clin Cancer Res, 20(14), pp. 3787–3798, 2014.
[3] Yangjin Kim, … Balveen Kaur and Avner Friedman, "Complex role of NK cells in regulation of oncolytic virus-bortezomib therapy", PNAS, 115(19), pp. 4927–4932, 2018.
[4] Yangjin Kim, … Sean Lawler, and Mark Chaplain, "Role of extracellular matrix and microenvironment in regulation of tumor growth and LAR-mediated invasion in glioblastoma", PLoS One, 13(10):e0204865, 2018.
[5] Yangjin Kim, …, Hans G. Othmer, "Synergistic effects of bortezomib-OV therapy and anti-invasive strategies in glioblastoma: A mathematical model", special issue, submitted, 2018. 
News matching "Epidemiological consequences of household-based an" 
ARC Grant successes Congratulations to Tony Roberts, Charles Pearce, Robert Elliott, Andrew Metcalfe and all their collaborators on their success in the current round of ARC grants. The projects are "Development of innovative technologies for oil production based on the advanced theory of suspension flows in porous media" (Tony Roberts et al.), "Perturbation and approximation methods for linear operators with applications to train control, water resource management and evolution of physical systems" (Charles Pearce et al.), "Risk Measures and Management in Finance and Actuarial Science Under Regime-Switching Models" (Robert Elliott et al.) and "A new flood design methodology for a variable and changing climate" (Andrew Metcalfe et al.). Posted Mon 26 Oct 09. 
Publications matching "Epidemiological consequences of household-based an" 

Evidence-Based Medicine Evaluation of Electrophysiological Studies of the Anxiety Disorders - Clark, C; Galletly, Cherrie; Ash, David; Moores, K; Penrose, R; McFarlane, Alexander, Clinical EEG and Neuroscience 40 (84–112) 2009 
A high resolution spatio-temporal model for single storm events based on radar images - Qin, J; Leonard, Michael; Kuczera, George; Thyer, M; Metcalfe, Andrew; Lambert, Martin, Water Down Under 2008, Adelaide 14/04/08 
Synchronization of neural networks based on parameter identification and via output or state coupling - Lou, X; Cui, B, Journal of Computational and Applied Mathematics 222 (440–457) 2008 
Goodness-of-fit tests based on characterizations involving moments of order statistics - Morris, Kerwin; Szynal, D, International Journal of Pure and Applied Mathematics 38 (83–121) 2007 
The Mekong - applications of value at risk (VAR) and conditional value at risk (CVAR) simulation to the benefits, costs and consequences of water resources development in a large river basin - Webby, Roger; Adamson, Peter; Boland, J; Howlett, P; Metcalfe, Andrew; Piantadosi, J, Ecological Modelling 201 (89–96) 2007 
Nonlinear analysis of rubber-based polymeric materials with thermal relaxation models - Melnik, R; Strunin, D; Roberts, Anthony John, Numerical Heat Transfer Part A: Applications 47 (549–569) 2005 
Class-of-service mapping for QoS: A statistical signature-based approach to IP traffic classification - Roughan, Matthew; Sen, S; Spatscheck, O; Duffield, N, ACM SIGCOMM 2004, Taormina, Sicily, Italy 25/10/04 
The effect of World War 1 and the 1918 influenza pandemic on cohort life expectancy of South Australian males born in 1881-1900 - Leppard, Phillip; Tallis, George; Pearce, Charles, Journal of Population Research 21 (161–176) 2004 
A recursive filter-based algorithm for maximum likelihood localisation of narrowband autoregressive sources - Malcolm, William; Elliott, Robert, Thirty-eighth Asilomar Conference on Signals, Systems & Computers, Pacific Grove, California USA 07/11/04 
A genetic algorithm based on nearest neighbour classification to breast cancer diagnosis - Jain, R; Mazumdar, Jagan, Australasian Physical and Engineering Sciences in Medicine 26 (6–11) 2003 
Resampling-based multiple testing for microarray data analysis (Invited discussion of paper by Ge, Dudoit and Speed) - Glonek, Garique; Solomon, Patricia, Test 12 (50–53) 2003 
Goodness-of-fit tests based on characterizations in terms of moments of order statistics - Morris, Kerwin; Szynal, D, Applicationes Mathematicae 29 (251–283) 2002 
A goodness-of-fit test for the uniform distribution based on a characterization - Morris, Kerwin; Szynal, D, Journal of Mathematical Sciences 106 (2719–2724) 2001 
Goodness-of-fit tests based on characterizations of continuous distributions - Morris, Kerwin; Szynal, D, Applicationes Mathematicae 27 (475–488) 2000 
Advanced search options
You may be able to improve your search results by using the following syntax:
Query  Matches the following 

Asymptotic Equation  Anything with "Asymptotic" or "Equation". 
+Asymptotic +Equation  Anything with "Asymptotic" and "Equation". 
+Stokes -"Navier-Stokes"  Anything containing "Stokes" but not "Navier-Stokes". 
Dynam*  Anything containing "Dynamic", "Dynamical", "Dynamicist" etc. 
