
# Search the School of Mathematical Sciences


## People matching "Inferring absolute population and recruitment of s"

 Associate Professor Gary Glonek, Associate Professor in Statistics
 Associate Professor Inge Koch, Associate Professor in Statistics
 Professor Matthew Roughan, Professor of Applied Mathematics
 Professor Patty Solomon, Professor of Statistical Bioinformatics
 Dr Simon Tuke, Lecturer in Statistics

## Courses matching "Inferring absolute population and recruitment of s"

 Analysis of multivariable and high dimensional data
Multivariate analysis of data is performed with three aims: 1. to understand the structure in data and summarise the data in simpler ways; 2. to understand the relationship of one part of the data to another; and 3. to make decisions or draw inferences based on data. The statistical analyses of multivariate data extend those of univariate data, and in doing so require more advanced mathematical theory and computational techniques. The course begins with a discussion of the three classical methods corresponding to these aims: Principal Component Analysis, Canonical Correlation Analysis and Discriminant Analysis. We also learn about Cluster Analysis, Factor Analysis and newer methods including Independent Component Analysis. For most real data the underlying distribution is not known, but if the assumption of multivariate normality holds, extra properties can be derived. Our treatment combines ideas, theoretical properties and a strong computational component for each of the methods we discuss. For the computational part, in Matlab, we make use of real data and learn to use simulations to assess the performance of different methods in practice.
Topics covered:
1. Introduction to multivariate data and the multivariate normal distribution
2. Principal Component Analysis, theory and practice
3. Canonical Correlation Analysis, theory and practice
4. Discriminant Analysis: Fisher's LDA, linear and quadratic DA
5. Cluster Analysis: hierarchical and k-means methods
6. Factor Analysis and latent variables
7. Independent Component Analysis, including an introduction to Information Theory
The course will be based on my forthcoming monograph, Analysis of Multivariate and High-Dimensional Data: Theory and Practice, to be published by Cambridge University Press.
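The course's computational work is in Matlab, but the mechanics of the first classical method, Principal Component Analysis, can be sketched in a few lines of any numerical language. Below is an illustrative NumPy version (not course material; the function name and toy data are our own), computing principal components via the singular value decomposition of the centred data matrix:

```python
import numpy as np

def pca(X, k):
    """Project an n x p data matrix X onto its first k principal components."""
    Xc = X - X.mean(axis=0)                  # centre each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                      # k x p matrix of loading vectors
    scores = Xc @ components.T               # n x k projected data
    explained = s[:k]**2 / np.sum(s**2)      # proportion of variance explained
    return scores, components, explained

# toy example: two strongly correlated variables
rng = np.random.default_rng(0)
x = rng.normal(size=200)
X = np.column_stack([x, 2 * x + 0.1 * rng.normal(size=200)])
scores, comps, expl = pca(X, 1)
print(expl[0])  # close to 1: one component captures almost all the variance
```

Simulations of this kind, applied to data with known structure, are exactly how the course proposes to assess how the methods behave in practice.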

## Events matching "Inferring absolute population and recruitment of s"

 Watching evolution in real time; problems and potential research areas 15:10 Fri 26 May, 2006 :: G08 Mathematics Building, University of Adelaide :: Prof Alan Cooper (Federation Fellow)
Abstract: Recent studies (1) have indicated problems with our ability to use the genetic distances between species to estimate the time since their divergence (so-called molecular clocks). An exponential decay curve has been detected in comparisons of closely related taxa in mammal and bird groups, and rough approximations suggest that molecular clock calculations may be problematic for the recent past (e.g. <1 million years). Unfortunately, this period encompasses a number of key evolutionary events where estimates of timing are critical, such as modern human evolutionary history, the domestication of animals and plants, and most issues in conservation biology. A solution (formulated at UA) will be briefly outlined. A second area of active interest is the recent suggestion (2) that mitochondrial DNA diversity does not track population size in several groups, in contrast to standard thinking. This finding has been interpreted as showing that mtDNA may not be evolving neutrally, as has long been assumed. Large ancient DNA datasets provide a means to examine these issues, by revealing evolutionary processes in real time (3). The data also provide a rich area for mathematical investigation, as the temporal information constrains several parameters that are unknown in serial coalescent calculations (4).
References: (1) Ho SYW et al., Time dependency of molecular rate estimates and systematic overestimation of recent divergence times, Mol. Biol. Evol. 22, 1561-1568 (2005); Penny D, Nature 436, 183-184 (2005). (2) Bazin E et al., Population size does not influence mitochondrial genetic diversity in animals, Science 312, 570 (2006); Eyre-Walker A, Size does not matter for mitochondrial DNA, Science 312, 537 (2006). (3) Shapiro B et al., Rise and fall of the Beringian steppe bison, Science 306, 1561-1565 (2004); Chan et al., Bayesian estimation of the timing and severity of a population bottleneck from ancient DNA, PLoS Genetics 2, e59 (2006). (4) Drummond et al., Measurably evolving populations, Trends Ecol. Evol. 18, 481-488 (2003); Drummond et al., Bayesian coalescent inference of past population dynamics from molecular sequences, Mol. Biol. Evol. 22, 1185-1192 (2005).
 A bivariate zero-inflated Poisson regression model and application to some dental epidemiological data 14:10 Fri 27 Oct, 2006 :: G08 Mathematics Building, University of Adelaide :: University Professor Sudhir Paul
Abstract: Data in the form of paired (pre-treatment, post-treatment) counts arise in the study of the effects of several treatments after accounting for possible covariate effects. An example of such a data set comes from a dental epidemiological study in Belo Horizonte (the Belo Horizonte caries prevention study), which evaluated various programmes for reducing caries. These data may show more pairs of zeros than can be accounted for by a simpler model, such as a bivariate Poisson regression model. In such situations we propose a zero-inflated bivariate Poisson regression (ZIBPR) model for the paired (pre-treatment, post-treatment) count data. We develop an EM algorithm to obtain maximum likelihood estimates of the parameters of the ZIBPR model. Further, we obtain the exact Fisher information matrix of the maximum likelihood estimates and develop a procedure for testing treatment effects. The procedure to detect treatment effects based on the ZIBPR model is compared, in terms of size, by simulation, with an earlier procedure based on a zero-inflated Poisson regression (ZIPR) model of the post-treatment count with the pre-treatment count treated as a covariate; the procedure based on the ZIBPR model holds its level most effectively. A further simulation study indicates good power properties of the procedure based on the ZIBPR model. We then compare our analysis of the decayed, missing and filled teeth (DMFT) index data from the caries prevention study, based on the ZIBPR model, with the analysis using a zero-inflated Poisson regression model in which the pre-treatment DMFT index is taken to be a covariate.
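The bivariate ZIBPR model above is specialised, but the zero-inflation idea and the EM strategy behind it can be illustrated on the simpler univariate zero-inflated Poisson (ZIP) model. The sketch below is our own minimal implementation, not the speaker's: the E-step attributes each observed zero to the inflation component, and the M-step updates are in closed form.

```python
import math

def zip_pmf(y, lam, pi):
    """P(Y = y) under ZIP: a structural zero with prob pi, else Poisson(lam)."""
    pois = math.exp(-lam) * lam ** y / math.factorial(y)
    return pi + (1 - pi) * pois if y == 0 else (1 - pi) * pois

def em_zip(ys, iters=200):
    """Maximum likelihood estimates of (pi, lam) for a ZIP sample, by EM."""
    pi, lam = 0.5, max(sum(ys) / len(ys), 1e-6)
    for _ in range(iters):
        # E-step: posterior probability that each zero is an "extra" zero
        z = [pi / (pi + (1 - pi) * math.exp(-lam)) if y == 0 else 0.0
             for y in ys]
        # M-step: closed-form updates given the posterior weights
        pi = sum(z) / len(ys)
        lam = sum(ys) / sum(1 - zi for zi in z)
    return pi, lam

# toy data: 40 structural zeros mixed with 60 low counts of mean 2.3
ys = [0] * 40 + [0, 1, 1, 2, 2, 2, 3, 3, 4, 5] * 6
pi_hat, lam_hat = em_zip(ys)
print(pi_hat, lam_hat)  # pi_hat near 0.40, lam_hat near 2.30
```

The bivariate version adds a second count, regression structure on both means, and a correlation term, but the E/M alternation is the same.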
 Modelling gene networks: the case of the quorum sensing network in bacteria 15:10 Fri 1 Jun, 2007 :: G08 Mathematics Building, University of Adelaide :: Dr Adrian Koerber
Abstract: The quorum-sensing regulatory gene network is employed by bacteria to provide a measure of their population density and switch their behaviour accordingly. I will present an overview of quorum sensing in bacteria, together with some of the modelling approaches I've taken to describe this system. I will also discuss how this system relates to virulence and medical treatment, and the insights gained from the mathematics.
 Likelihood inference for a problem in particle physics 15:10 Fri 27 Jul, 2007 :: G04 Napier Building, University of Adelaide :: Prof. Anthony Davison
Abstract: The Large Hadron Collider (LHC), a particle accelerator located at CERN, near Geneva, is (currently!) expected to start operation in early 2008. It is located in an underground tunnel 27 km in circumference and, when fully operational, will be the world's largest and highest-energy particle accelerator. It is hoped that it will provide evidence for the existence of the Higgs boson, the last remaining particle of the so-called Standard Model of particle physics. The quantity of data that will be generated by the LHC is roughly equivalent to that of the European telecommunications network, but this will be boiled down to just a few numbers. After a brief introduction, this talk will outline elements of the statistical problem of detecting the presence of a particle, and then sketch how higher-order likelihood asymptotics may be used for signal detection in this context. The work is joint with Nicola Sartori, of the Università Ca' Foscari, Venice.
 Insights into the development of the enteric nervous system and Hirschsprung's disease 15:10 Fri 24 Aug, 2007 :: G08 Mathematics Building, University of Adelaide :: Assoc. Prof. Kerry Landman :: Department of Mathematics and Statistics, University of Melbourne
Abstract: During the development of the enteric nervous system, neural crest (NC) cells must first migrate into and colonise the entire gut from stomach to anal end. The migratory precursor NC cells change type and differentiate into neurons and glial cells. These cells form the enteric nervous system, which gives rise to normal gut function and peristaltic contraction. Failure of the NC cells to invade the whole gut results in a lack of neurons in a length of the terminal intestine. This potentially fatal condition, marked by intractable constipation, is called Hirschsprung's disease. The interplay between cell migration, cell proliferation and embryonic gut growth is important to the success of the NC cell colonisation process. Multiscale models are needed to capture the different spatiotemporal scales of the NC invasion. For example, the NC invasion wave moves into unoccupied regions of the gut with a wave speed of around 40 microns per hour. New time-lapse techniques have shown that there is a web-like network structure within the invasion wave, and within this network individual cell trajectories vary considerably. We have developed a population-scale model for the basic rules governing NC cell invasive behaviour, incorporating the important mechanisms. The model predictions were tested experimentally, and the mathematical and experimental results agreed. The results provide an understanding of why many of the genes implicated in Hirschsprung's disease influence NC population size. Our recently developed individual cell-based model also produces an invasion wave with a well-defined wave speed; in addition, individual cell trajectories within the invasion wave can be extracted. Further challenges in modelling the various scales of the developmental system will be discussed.
 Regression: a backwards step? 13:10 Fri 7 Sep, 2007 :: Maths G08 :: Dr Gary Glonek
Abstract: Most students of high school mathematics will have encountered the technique of fitting a line to data by least squares. Those who have taken a university statistics course will also have heard this method referred to as regression. However, it is not obvious from common dictionary definitions, for example "reversion to an earlier or less advanced state or form", why this should be the case. In this talk, the mathematical phenomenon that gave regression its name will be explained and shown to have implications in some unexpected contexts.
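The phenomenon the talk's title alludes to, regression toward the mean, is easy to demonstrate numerically: if two standardised variables are correlated with coefficient r < 1, the average of one given an extreme value of the other is pulled back toward zero. A minimal simulation (our own illustration, not from the talk), in the spirit of Galton's parent and child heights:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
r = 0.5  # assumed parent-child correlation

parent = rng.normal(0.0, 1.0, n)
# child has the same marginal N(0, 1) distribution, correlation r with parent
child = r * parent + np.sqrt(1 - r**2) * rng.normal(0.0, 1.0, n)

tall = parent > 1.5                  # select parents well above the mean
parent_tall = parent[tall].mean()    # about 1.9 standard deviations
child_tall = child[tall].mean()      # about r * 1.9: regressed toward the mean
print(parent_tall, child_tall)
```

Children of unusually tall parents are, on average, closer to the population mean than their parents, even though the child distribution as a whole is identical to the parent distribution.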
 Statistical critique of the Intergovernmental Panel on Climate Change's work on climate change 18:00 Wed 17 Oct, 2007 :: Union Hall, University of Adelaide :: Mr Dennis Trewin
Abstract: Climate change is one of the most important issues facing us today. Many governments have introduced or are developing policy interventions to (a) reduce the growth of greenhouse gas emissions in order to mitigate future climate change, or (b) adapt to future climate change. This important work deserves a high-quality statistical data base, but there are statistical shortcomings in the work of the Intergovernmental Panel on Climate Change (IPCC), which appears to be scientifically meritorious in most other ways: there has been very little involvement of qualified statisticians in its very important work. Mr Trewin will explain these shortcomings and outline his views on likely future climate change, taking the statistical deficiencies into account. His conclusions suggest that climate change is still an important issue that needs to be addressed, but that the range of likely outcomes is considerably lower than the IPCC has suggested. This presentation is based on an invited paper presented at the OECD World Forum.
 Moderated Statistical Tests for Digital Gene Expression Technologies 15:10 Fri 19 Oct, 2007 :: G04 Napier Building, University of Adelaide :: Dr Gordon Smyth :: Walter and Eliza Hall Institute of Medical Research, Melbourne, Australia
Abstract: Digital gene expression (DGE) technologies measure gene expression by counting sequence tags. They are sensitive technologies for measuring gene expression on a genomic scale, without the need for prior knowledge of the genome sequence. As the cost of DNA sequencing decreases, the number of DGE datasets is expected to grow dramatically. Various tests of differential expression have been proposed for replicated DGE data using over-dispersed binomial or Poisson models for the counts, but none of these is usable when the number of replicates is very small. We develop tests using the negative binomial distribution to model overdispersion relative to the Poisson, and use conditional weighted likelihood to moderate the level of overdispersion across genes. A heuristic empirical Bayes algorithm is developed which is applicable to very general likelihood estimation contexts. Not only is our strategy applicable even with the smallest number of replicates, it also proves more powerful than previous strategies when more replicates are available. The methodology is applicable to other counting technologies, such as proteomic spectral counts.
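The moderated conditional-weighted-likelihood machinery of the talk is beyond a short example, but the negative binomial model of overdispersion it builds on is simple to illustrate. Under the Gamma-Poisson mixture representation, counts with mean mu and dispersion phi have variance mu + phi*mu^2 rather than the Poisson's mu. A minimal simulation (our own, with made-up parameter values):

```python
import numpy as np

rng = np.random.default_rng(2)
mu, phi = 10.0, 0.3   # NB mean and dispersion: Var = mu + phi * mu^2 = 40

# negative binomial counts via the Gamma-mixed Poisson representation:
# lam ~ Gamma(shape=1/phi, scale=mu*phi) has mean mu and variance phi*mu^2
lam = rng.gamma(shape=1 / phi, scale=mu * phi, size=50_000)
counts = rng.poisson(lam)

# method-of-moments estimate of the dispersion from the sample
phi_hat = (counts.var() - counts.mean()) / counts.mean() ** 2
print(counts.mean(), counts.var(), phi_hat)  # near 10, 40 and 0.3
```

With only a handful of replicates per gene, per-gene estimates like `phi_hat` are hopelessly noisy, which is precisely why the talk moderates the dispersion estimates across genes.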
 Global and local stationary modelling in finance: theory and empirical evidence 14:10 Thu 10 Apr, 2008 :: G04 Napier Building, University of Adelaide :: Prof. Dominique Guégan :: Université Paris 1 Panthéon-Sorbonne
Abstract: To model real data sets using second-order stochastic processes requires that the data verify the second-order stationarity condition, which concerns the unconditional moments of the process. It is in this context that most of the models developed since the 1960s have been studied: the ARMA processes (Brockwell and Davis, 1988), the ARCH, GARCH and EGARCH models (Engle, 1982; Bollerslev, 1986; Nelson, 1990), the SETAR process (Lim and Tong, 1980; Tong, 1990), the bilinear model (Granger and Andersen, 1978; Guégan, 1994), the EXPAR model (Haggan and Ozaki, 1980), the long memory process (Granger and Joyeux, 1980; Hosking, 1981; Gray, Zhang and Woodward, 1989; Beran, 1994; Giraitis and Leipus, 1995; Guégan, 2000), and the switching process (Hamilton, 1988). For all these models we obtain an invertible causal solution under specific conditions on the parameters, so that forecast points and forecast intervals are available. The stationarity assumption is thus the basis for a general asymptotic theory of identification, estimation and forecasting: it guarantees that increasing the sample size yields more and more information of the same kind, which is essential for an asymptotic theory to make sense. Non-stationary modelling also has a long tradition in econometrics, based on the conditional moments of the data generating process. It appears mainly in heteroscedastic and volatility models, such as GARCH and related models and stochastic volatility processes (Ghysels, Harvey and Renault, 1997). This non-stationarity also appears in a different way with structural change models such as the switching models (Hamilton, 1988), the stopbreak model (Diebold and Inoue, 2001; Breidt and Hsu, 2002; Granger and Hyung, 2004) and the SETAR models, and it can be observed as well in linear models with time-varying coefficients (Nicholls and Quinn, 1982; Tsay, 1987). Thus, using stationary unconditional moments suggests global stationarity for the model, whereas non-stationary unconditional moments, non-stationary conditional moments, or the existence of states suggest that global stationarity fails and that we observe only locally stationary behaviour. The growing evidence of instability in the stochastic behaviour of stocks, exchange rates and some economic data sets such as growth rates, characterised by volatility or by jumps in the variance or in the level of prices, forces us to question the assumption of global stationarity and its consequences for modelling, particularly for forecasting. Several questions can therefore be addressed: 1. What kinds of non-stationarity affect the major financial and economic data sets, and how can we detect them? 2. How are local and global stationarity defined? 3. What is the impact of evidence of non-stationarity on statistics computed from globally non-stationary data sets? 4. How can we analyse data sets in the globally non-stationary framework, and does the asymptotic theory work in that framework? 5. What kinds of models create local rather than global stationarity, and how can we use them to develop modelling and forecasting strategies? These questions have begun to be discussed in the economics literature; for some the answers are known, while for others very few works exist. In this talk I will discuss all these problems and propose two new strategies and models to address them. Several interesting topics in empirical finance awaiting future research will also be discussed.
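Of the models catalogued in the abstract above, GARCH(1,1) (Bollerslev, 1986) is the canonical example of a process that is conditionally heteroscedastic yet globally second-order stationary when alpha + beta < 1, with unconditional variance omega / (1 - alpha - beta). A minimal simulation (our own, with parameter values chosen purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
omega, alpha, beta = 0.05, 0.1, 0.85   # stationarity: alpha + beta < 1
n = 10_000

eps = np.empty(n)
sigma2 = omega / (1 - alpha - beta)    # start at the unconditional variance
for t in range(n):
    z = rng.normal()
    eps[t] = np.sqrt(sigma2) * z       # return with conditional variance sigma2
    sigma2 = omega + alpha * eps[t]**2 + beta * sigma2  # GARCH(1,1) recursion

# sample variance is close to the unconditional variance
# omega / (1 - alpha - beta) = 1, despite pronounced volatility clustering
print(eps.var())
```

The conditional variance changes at every step (local behaviour), while the unconditional variance exists and is constant (global stationarity); this is exactly the distinction the talk examines.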
 Computational Methods for Phase Response Analysis of Circadian Clocks 15:10 Fri 18 Jul, 2008 :: G04 Napier Building, University of Adelaide :: Prof. Linda Petzold :: Dept. of Mechanical and Environmental Engineering, University of California, Santa Barbara
Abstract: Circadian clocks govern daily behaviors of organisms in all kingdoms of life. In mammals, the master clock resides in the suprachiasmatic nucleus (SCN) of the hypothalamus. It is composed of thousands of neurons, each of which contains a sloppy oscillator: a molecular clock governed by a transcriptional feedback network. Via intercellular signaling, the cell population synchronizes spontaneously, forming a coherent oscillation. This multi-oscillator is then entrained to its environment by the daily light/dark cycle. At both the cellular and tissue levels, the most important feature of the clock is its ability not simply to keep time, but to adjust its time, or phase, in response to signals. We present the parametric impulse phase response curve (pIPRC), an analytical analog to the phase response curve (PRC) used experimentally. We use the pIPRC to understand both the consequences of intercellular signaling and the light entrainment process. Further, we determine which model components determine the phase response behavior of a single oscillator by using a novel model reduction technique: we reduce the number of model components while preserving the pIPRC, and then incorporate the resultant model into a coupled SCN tissue model. Emergent properties, including the ability of the population to synchronize spontaneously, are preserved in the reduction. Finally, we present some mathematical tools for the study of synchronization in a network of coupled, noisy oscillators.
 Elliptic equation for diffusion-advection flows 15:10 Fri 15 Aug, 2008 :: G03 Napier Building, University of Adelaide :: Prof. Pavel Bedrikovetsky :: Australian School of Petroleum Science, University of Adelaide
Abstract: The standard diffusion equation is obtained by Einstein's method and its generalisation, the Fokker-Planck-Kolmogorov-Feller theory. The time between jumps in Einstein's derivation is constant. We discuss random walks with a residence time distribution, which occur in flows of solutes and suspensions/colloids in porous media, CO2 sequestration in coal mines, and several processes in chemical, petroleum and environmental engineering. Rigorous application of Einstein's method results in a new equation containing time and mixed dispersion terms that express the dispersion of the particle time steps. Usually, adding a second time derivative requires additional initial data; for the equation derived here, the condition that the solution remain bounded as time tends to infinity ensures uniqueness of the solution of the Cauchy problem. The solution of the pulse injection problem, describing a common tracer injection experiment, is studied in greater detail. The new theory predicts a delay of the tracer maximum relative to the flow velocity, while the forward "tail" contains many more particles than in the solution of the classical parabolic (advection-dispersion) equation. This is in agreement with experimental observations and the predictions of direct simulation.
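For reference, the classical parabolic advection-dispersion model against which the abstract's new equation is compared can be written, in standard notation (not taken from the talk), for concentration $c(x,t)$, flow velocity $u$ and dispersion coefficient $D$, as:

```latex
\frac{\partial c}{\partial t} + u\,\frac{\partial c}{\partial x}
  = D\,\frac{\partial^{2} c}{\partial x^{2}}
```

The abstract's derivation augments this with a second-order time term and a mixed time-space dispersion term; the exact form of that equation is the subject of the talk and is not reproduced here.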

## Publications matching "Inferring absolute population and recruitment of s"

CleanBGP: Verifying the consistency of BGP data
Flavel, Ashley; Maennel, Olaf; Chiera, Belinda; Roughan, Matthew; Bean, Nigel, International Network Management Workshop, Orlando, Florida 19/10/08
Energy balanced data gathering in WSNs with grid topologies
Chen, J; Shen, Hong; Tian, Hui, 7th International Conference on Grid and Cooperative Computing, China 24/10/08
Evolving gene frequencies in a population with three possible alleles at a locus
Hajek, Bronwyn; Broadbridge, P; Williams, G, Mathematical and Computer Modelling 47 (210–217) 2008
The importance of calculating absolute rather than relative fracture risk (vol 41, pg 937, 2007)
Tucker, G; Metcalfe, Andrew; Pearce, Charles; Need, Allan; Dick, I; Prince, R; Nordin, Borje, Bone 42 (1241–1241) 2008
Data fusion without data fusion: localization and tracking without sharing sensitive information
Roughan, Matthew; Arnold, Jonathan, Information, Decision and Control 2007, Adelaide, Australia 12/02/07
Optimal multilinear estimation of a random vector under constraints of causality and limited memory
Howlett, P; Torokhti, Anatoli; Pearce, Charles, Computational Statistics & Data Analysis 52 (869–878) 2007
Statistics in review; Part 1: graphics, data summary and linear models
 Moran, John; Solomon, Patricia, Critical Care and Resuscitation 9 (81–90) 2007
The importance of calculating absolute rather than relative fracture risk
Tucker, Graeme; Metcalfe, Andrew; Pearce, Charles; Need, Allan; Dick, I; Prince, R; Nordin, Borje, Bone 41 (937–941) 2007
Experimental Design and Analysis of Microarray Data
Wilson, C; Tsykin, Anna; Wilkinson, Christopher; Abbott, C, chapter in Bioinformatics (Elsevier Ltd) 1–36, 2006
Is BGP update storm a sign of trouble: Observing the internet control and data planes during internet worms
Roughan, Matthew; Li, J; Bush, R; Mao, Z; Griffin, T, SPECTS 2006, Calgary, Canada 31/07/06
Watching data streams toward a multi-homed sink under routing changes introduced by a BGP beacon
Li, J; Bush, R; Mao, Z; Griffin, T; Roughan, Matthew; Stutzbach, D; Purpus, E, PAM2006, Adelaide, Australia 30/03/06
Data-recursive smoother formulae for partially observed discrete-time Markov chains
Elliott, Robert; Malcolm, William, Stochastic Analysis and Applications 24 (579–597) 2006
Optimal linear estimation and data fusion
Elliott, Robert; Van Der Hoek, John, IEEE Transactions on Automatic Control 51 (686–689) 2006
Secure distributed data-mining and its application to large-scale network measurements
Roughan, Matthew; Zhang, Y, Computer Communication Review 36 (7–14) 2006
Optimal estimation of a random signal from partially missed data
Torokhti, Anatoli; Howlett, P; Pearce, Charles, EUSIPCO 2006, Florence, Italy 04/09/06
Optimal recursive estimation of raw data
Torokhti, Anatoli; Howlett, P; Pearce, Charles, Annals of Operations Research 133 (285–302) 2005
Combining routing and traffic data for detection of IP forwarding anomalies
Roughan, Matthew; Griffin, T; Mao, M; Greenberg, A; Freeman, B, Sigmetrics - Performance 2004, New York, USA 12/06/04
IP forwarding anomalies and improving their detection using multiple data sources
Roughan, Matthew; Griffin, T; Mao, M; Greenberg, A; Freeman, B, SIGCOMM 2004, Oregon, USA 30/08/04
Optimal quantization for energy-efficient information transfer in a population of neuron-like devices
McDonnell, Mark; Stocks, N; Pearce, Charles; Abbott, Derek, Fluctuations and Noise 2004, Gran Canaria Islands, Spain 26/05/04
Relationships between the El-Nino southern oscillation and spate flows in southern Africa and Australia
Whiting, Julian; Lambert, Martin; Metcalfe, Andrew; Adamson, Peter; Franks, S; Kuczera, George, Hydrology and Earth System Sciences 8 (1118–1128) 2004
The effect of World War 1 and the 1918 influenza pandemic on cohort life expectancy of South Australian males born in 1881-1900
Leppard, Phillip; Tallis, George; Pearce, Charles, Journal of Population Research 21 (161–176) 2004
The data processing inequality and stochastic resonance
McDonnell, Mark; Stocks, N; Pearce, Charles; Abbott, Derek, Noise in Complex Systems and Stochastic Dynamics, Santa Fe, New Mexico, USA 01/06/03
Stochastic resonance and data processing inequality
McDonnell, Mark; Stocks, N; Pearce, Charles; Abbott, Derek, Electronics Letters 39 (1287–1288) 2003
Resampling-based multiple testing for microarray data analysis (Invited discussion of paper by Ge, Dudoit and Speed)
Glonek, Garique; Solomon, Patricia, Test 12 (50–53) 2003
Best estimators of second degree for data analysis
Howlett, P; Pearce, Charles; Torokhti, Anatoli, ASMDA 2001, Compiegne, France 12/06/01
Optimal successive estimation of observed data
Torokhti, Anatoli; Howlett, P; Pearce, Charles, International Conference on Optimization: Techniques and Applications (5th: 2001), Hong Kong, China 15/12/01
Statistical analysis of medical data: New developments - Book review
Solomon, Patricia, Biometrics 57 (327–328) 2001
Disease surveillance and data collection issues in epidemic modelling
Solomon, Patricia; Isham, V, Statistical Methods in Medical Research 9 (259–277) 2000