
Events matching "Alberta Power Prices"
A Bivariate Zero-inflated Poisson Regression Model and application to some Dental Epidemiological data 14:10 Fri 27 Oct, 2006 :: G08 Mathematics Building University of Adelaide :: Prof Sudhir Paul
Data in the form of paired (pre-treatment, post-treatment) counts arise in the study of the effects of several treatments after accounting for possible covariate effects. An example of such a data set comes from a dental epidemiological study in Belo Horizonte (the Belo Horizonte caries prevention study), which evaluated various programmes for reducing caries. These data may also show more (0, 0) pairs than can be accounted for by a simpler model, such as a bivariate Poisson regression model. In such situations we propose a zero-inflated bivariate Poisson regression (ZIBPR) model for the paired (pre-treatment, post-treatment) count data. We develop an EM algorithm to obtain maximum likelihood estimates of the parameters of the ZIBPR model. Further, we obtain the exact Fisher information matrix of the maximum likelihood estimates and develop a procedure for testing treatment effects. The procedure to detect treatment effects based on the ZIBPR model is compared, in terms of size, by simulation, with an earlier procedure that uses a zero-inflated Poisson regression (ZIPR) model of the post-treatment count with the pre-treatment count treated as a covariate; the ZIBPR-based procedure holds its nominal level most effectively. A further simulation study indicates that the ZIBPR-based procedure also has good power. We then compare our analysis of the decayed, missing and filled teeth (DMFT) index data from the caries prevention study, based on the ZIBPR model, with an analysis using a zero-inflated Poisson regression model in which the pre-treatment DMFT index is taken as a covariate.
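The zero-inflation mechanism the abstract describes can be sketched in a few lines. The following is an illustrative simulation (parameter values assumed, not from the Belo Horizonte study) using the common-shock construction of a bivariate Poisson:

```python
import numpy as np

def simulate_zibp(n, lam1, lam2, lam0, p_zero, rng):
    """Simulate n paired counts from a zero-inflated bivariate Poisson.

    The bivariate Poisson part uses the common-shock construction
    X = U + W, Y = V + W with U, V, W independent Poisson, so that
    cov(X, Y) = lam0.  With probability p_zero the pair is replaced
    by an 'excess' (0, 0) observation.
    """
    u = rng.poisson(lam1, n)
    v = rng.poisson(lam2, n)
    w = rng.poisson(lam0, n)
    x, y = u + w, v + w
    zero = rng.random(n) < p_zero
    x[zero] = 0
    y[zero] = 0
    return x, y

rng = np.random.default_rng(0)
x, y = simulate_zibp(100_000, lam1=1.5, lam2=1.0, lam0=0.5, p_zero=0.3, rng=rng)

# The zero-inflated model produces far more (0, 0) pairs than the bivariate
# Poisson alone would predict: exp(-(lam1 + lam2 + lam0)) is only about 5%.
frac00 = np.mean((x == 0) & (y == 0))
```

Fitting such a model by the EM algorithm treats both the inflation indicator and the common shock W as missing data; the simulation above is only the data-generating side of that story.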

Alberta Power Prices 15:10 Fri 9 Mar, 2007 :: G08 Mathematics Building University of Adelaide :: Prof. Robert Elliott
The pricing of electricity involves several interesting features. Apart from daily, weekly and seasonal fluctuations, power prices often exhibit large spikes; to some extent this is because electricity cannot be stored. We propose a model for power prices in the Alberta market. This involves a diffusion process modified by a factor related to a Markov chain which describes the number of large generators on line. The model is calibrated and futures contracts are priced.
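A minimal sketch of the kind of model described: a mean-reverting diffusion for the log-price, multiplied by a spike factor driven by a two-state Markov chain standing in for a large generator going off line. All parameter values below are assumed for illustration, not calibrated to Alberta data:

```python
import numpy as np

# Illustrative regime-switching spike model (assumed parameters):
# log-price follows an Ornstein-Uhlenbeck diffusion; a two-state Markov
# chain (0 = all large generators on line, 1 = one off line) multiplies
# the price by a spike factor while in state 1.
rng = np.random.default_rng(1)
dt, n = 1 / 24, 24 * 365                     # hourly steps over one year
kappa, mu, sigma = 5.0, np.log(50.0), 0.5    # OU parameters (assumed)
p_up, p_down = 0.01, 0.2                     # chain transition prob. per step
spike_factor = 4.0                           # multiplier while a unit is off

x = np.empty(n)
s = np.empty(n, dtype=int)
x[0], s[0] = mu, 0
for t in range(1, n):
    x[t] = x[t-1] + kappa * (mu - x[t-1]) * dt \
           + sigma * np.sqrt(dt) * rng.standard_normal()
    flip = rng.random() < (p_up if s[t-1] == 0 else p_down)
    s[t] = 1 - s[t-1] if flip else s[t-1]

price = np.exp(x) * np.where(s == 1, spike_factor, 1.0)
```

The asymmetric transition probabilities make spikes rare but short-lived, reproducing the spiky sample paths typical of power prices.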

Global and local stationary modelling in finance: theory and empirical evidence 14:10 Thu 10 Apr, 2008 :: G04 Napier Building University of Adelaide :: Prof. Dominique Guégan :: Université Paris 1 Panthéon-Sorbonne
Modelling real data sets with second-order stochastic processes requires that the data satisfy the second-order stationarity condition, which concerns the unconditional moments of the process. It is in this context that most of the models developed since the sixties have been studied: the ARMA processes (Brockwell and Davis, 1988), the ARCH, GARCH and EGARCH models (Engle, 1982; Bollerslev, 1986; Nelson, 1990), the SETAR process (Lim and Tong, 1980; Tong, 1990), the bilinear model (Granger and Andersen, 1978; Guégan, 1994), the EXPAR model (Haggan and Ozaki, 1980), long memory processes (Granger and Joyeux, 1980; Hosking, 1981; Gray, Zhang and Woodward, 1989; Beran, 1994; Giraitis and Leipus, 1995; Guégan, 2000) and switching processes (Hamilton, 1988). For all these models we get an invertible causal solution under specific conditions on the parameters, and then forecast points and forecast intervals are available.
Thus, the stationarity assumption is the basis of a general asymptotic theory for identification, estimation and forecasting: it guarantees that increasing the sample size yields more and more information of the same kind, which is essential for an asymptotic theory to make sense.
Non-stationary modelling also has a long tradition in econometrics, based on the conditional moments of the data-generating process. It appears mainly in heteroscedastic and volatility models, like GARCH and related models, and in stochastic volatility processes (Ghysels, Harvey and Renault, 1997). Non-stationarity also appears in a different way in structural-change models such as the switching models (Hamilton, 1988), the stop-break model (Diebold and Inoue, 2001; Breidt and Hsu, 2002; Granger and Hyung, 2004) and the SETAR models, for instance. It can also be observed in linear models with time-varying coefficients (Nicholls and Quinn, 1982; Tsay, 1987).
Thus, using stationary unconditional moments suggests global stationarity for the model, whereas using non-stationary unconditional moments, non-stationary conditional moments, or assuming the existence of states suggests that this global stationarity fails and that we only observe locally stationary behaviour.
The growing evidence of instability in the stochastic behaviour of stocks, of exchange rates, and of some economic data sets such as growth rates, characterised by volatility or by jumps in the variance or in the levels of prices, forces us to question the assumption of global stationarity and its consequences for modelling, particularly for forecasting. Several questions then arise from these remarks.
1. What kinds of non-stationarity affect the major financial and economic data sets, and how can we detect them?
2. Local and global stationarity: how are they defined?
3. What is the impact of evidence of non-stationarity on statistics computed from globally non-stationary data sets?
4. How can we analyse data sets in a globally non-stationary framework? Does the asymptotic theory work in a non-stationary framework?
5. What kinds of models create local stationarity instead of global stationarity, and how can we use them to develop modelling and forecasting strategies?
These questions have begun to be discussed in the economic literature. For some of them the answers are known; for others very little work exists. In this talk I will discuss these problems and propose two new strategies and models to address them. Several interesting topics in empirical finance awaiting future research will also be discussed.
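The distinction between conditional heteroscedasticity and global second-order stationarity can be illustrated with the GARCH(1,1) model mentioned above: the conditional variance changes at every step, yet when alpha + beta < 1 the unconditional variance is the finite constant omega / (1 - alpha - beta). A minimal simulation checking this:

```python
import numpy as np

# GARCH(1,1): r_t = sigma_t * z_t,
#             sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2.
# The conditional variance varies every step, but with alpha + beta < 1
# the process is covariance (globally second-order) stationary with
# unconditional variance omega / (1 - alpha - beta).
def simulate_garch(n, omega, alpha, beta, rng):
    r = np.empty(n)
    var = omega / (1 - alpha - beta)   # start at the stationary variance
    for t in range(n):
        r[t] = np.sqrt(var) * rng.standard_normal()
        var = omega + alpha * r[t]**2 + beta * var
    return r

rng = np.random.default_rng(2)
omega, alpha, beta = 0.1, 0.1, 0.8
r = simulate_garch(200_000, omega, alpha, beta, rng)
theoretical_var = omega / (1 - alpha - beta)   # = 1.0 here
```

With alpha + beta close to or above 1 the unconditional variance grows without bound, which is precisely the failure of global stationarity discussed in the abstract.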


Puzzle-based learning: Introduction to mathematics 15:10 Fri 23 May, 2008 :: LG29 Napier Building University of Adelaide :: Prof. Zbigniew Michalewicz :: School of Computer Science, University of Adelaide
The talk addresses a gap in the educational curriculum for first-year students by proposing a new course that aims to get students to think about how to frame and solve unstructured problems. The idea is to increase students' mathematical awareness and problem-solving skills by discussing a variety of puzzles. The talk argues that this approach, called Puzzle-Based Learning, is very beneficial for introducing mathematics, critical thinking, and problem-solving skills.
The new course has been approved by the University of Adelaide for the Faculty of Engineering, Computer Science, and Mathematics. Many other universities are in the process of introducing such a course. The course will be offered in two versions: (a) a full-semester course and (b) a unit within a general course (e.g. Introduction to Engineering). All teaching materials (PowerPoint slides, assignments, etc.) are being prepared. A new textbook (Puzzle-Based Learning: An Introduction to Critical Thinking, Mathematics, and Problem Solving) will be available from June 2008. The talk provides additional information on this development.
For further information see http://www.PuzzleBasedlearning.edu.au/ 

Impulsively generated drops 15:00 Fri 27 Feb, 2009 :: Napier LG29 :: Prof William Phillips :: Swinburne University of Technology
This talk is concerned with the evolution of an unbounded inviscid fluid-fluid interface subject to an axisymmetric impulse in pressure, and with how inertial, interfacial and gravitational forces affect that evolution. The construct was motivated by the occurrence of lung hemorrhage resulting from ultrasonic imaging and pursues the notion that bursts of ultrasound act to expel droplets that puncture the soft air-filled sacs on the lung's pleural surface, allowing them to fill with blood. The evolution of the free surface is described by a boundary-integral formulation which is integrated forward in time numerically. As the interface evolves it is seen, depending upon the levels of gravity and surface tension, to form either axisymmetric surface jets, waves or droplets. Moreover, the droplets may be spherical, inverted tear-shaped or pancake-like. Also of interest is the finite-time singularity which occurs when the drop pinches off; this is seen to be of power-law type with an exponent of 2/3.
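The quoted 2/3 exponent describes how the neck radius vanishes near pinch-off, roughly r(t) = C (t0 - t)^(2/3). As an illustration of how such an exponent is read off numerical data (synthetic data here, not the talk's boundary-integral output), a log-log fit recovers it:

```python
import numpy as np

# Illustrative only: recover a power-law exponent from (synthetic, slightly
# noisy) neck-radius data r(t) = C * (t0 - t)**(2/3) by a log-log fit, as
# one would for boundary-integral output approaching pinch-off at time t0.
rng = np.random.default_rng(3)
t0 = 1.0
t = np.linspace(0.5, 0.999, 200)
r = 2.0 * (t0 - t) ** (2 / 3) * (1 + 0.01 * rng.standard_normal(t.size))

# log r = log C + p * log(t0 - t), so the fitted slope estimates p.
slope, intercept = np.polyfit(np.log(t0 - t), np.log(r), 1)
```

In practice the fit window must stop before the numerics break down at the singularity; here the synthetic data make that moot.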


Quadrature domains, p-Laplacian growth, and bubbles contracting in Hele-Shaw cells with a power-law fluid 15:10 Mon 15 Jun, 2009 :: Napier LG24 :: Dr Scott McCue :: Queensland University of Technology
The classical Hele-Shaw flow problem is related to Laplacian growth and null quadrature domains. A generalisation is constructed for power-law fluids, governed by the p-Laplace equation, and a number of results are established that are analogous to the classical case. Both fluid clearance and bubble extinction are considered, and by considering two extremes of extinction behaviour a rather complete asymptotic description of the possible behaviours is found.

Upper bounds for the essential dimension of the moduli stack of SL_n-bundles over a curve 11:10 Mon 14 Dec, 2009 :: School Board Room :: Dr Nicole Lemire :: University of Western Ontario, Canada
In joint work with Ajneet Dhillon, we find upper bounds for the essential dimension of various moduli stacks of SL_n-bundles over a curve. When n is a prime power, our calculation computes the essential dimension of the moduli stack of stable bundles exactly; the essential dimension is not equal to the dimension in this case.


Exploratory experimentation and computation 15:10 Fri 16 Apr, 2010 :: Napier LG29 :: Prof Jonathan Borwein :: University of Newcastle
The mathematical research community is facing a great challenge to re-evaluate the role of proof in light of the growing power of current computer systems, of modern mathematical computing packages, and of the growing capacity to data-mine on the Internet. Add to that the enormous complexity of many modern capstone results such as the Poincaré conjecture, Fermat's last theorem, and the classification of finite simple groups. As the need and prospects for inductive mathematics blossom, the requirement to ensure the role of proof is properly founded remains undiminished. I shall look at the philosophical context with examples and then offer five benchmarking examples of the opportunities and challenges we face.

Arbitrage bounds for weighted variance swap prices 15:05 Fri 3 Dec, 2010 :: Napier LG28 :: Prof Mark Davis :: Imperial College London
This paper builds on earlier work by Davis and Hobson (Mathematical Finance, 2007) giving model-free (except for a 'frictionless markets' assumption) necessary and sufficient conditions for absence of arbitrage given a set of current-time put and call options on some underlying asset. Here we suppose that the prices of a set of put options, all maturing at the same time, are given and satisfy the conditions for consistency with absence of arbitrage. We now add a path-dependent option, specifically a weighted variance swap, to the set of traded assets and ask what conditions on its time-0 price maintain consistency with absence of arbitrage. In the present work we make the extra modelling assumption that the underlying asset price process has continuous paths. In general, we find that there is always a non-trivial lower bound to the range of arbitrage-free prices, but only in the case of a corridor swap do we obtain a finite upper bound. In the case of, say, the vanilla variance swap, a finite upper bound exists when there are additional traded European options which constrain the left wing of the volatility surface in appropriate ways.
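For context, the vanilla variance swap mentioned here admits the classical log-contract (Carr-Madan) replication: with zero interest rates its fair strike is 2 times the integral over all strikes of the out-of-the-money option price divided by K squared, which under Black-Scholes with constant volatility equals sigma^2 T. A numerical sanity check of that identity (of the replication formula, not of the paper's bounds):

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Check the log-contract replication of a vanilla variance swap: with zero
# rates the fair strike is
#     2 * int_0^inf  OTM_option_price(K) / K**2 dK,
# which under Black-Scholes with constant volatility sigma equals sigma**2 * T.
S0, sigma, T = 100.0, 0.3, 1.0

def bs_price(K, call):
    """Black-Scholes option price with zero interest rate."""
    d1 = (np.log(S0 / K) + 0.5 * sigma**2 * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    if call:
        return S0 * norm.cdf(d1) - K * norm.cdf(d2)
    return K * norm.cdf(-d2) - S0 * norm.cdf(-d1)

puts, _ = quad(lambda K: bs_price(K, call=False) / K**2, 1e-6, S0, limit=200)
calls, _ = quad(lambda K: bs_price(K, call=True) / K**2, S0, np.inf, limit=200)
fair_strike = 2 * (puts + calls)      # should equal sigma**2 * T = 0.09
```

The paper's point is precisely what happens to this picture when only finitely many puts trade: the left wing of the strike integral is then unconstrained, which is why the upper bound can be infinite.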

To what extent can the Black-Scholes model be applied in the financial market? 12:10 Mon 21 Mar, 2011 :: 5.57 Ingkarni Wardli :: Ahmed Hamada :: University of Adelaide
Black and Scholes introduced a new approach to modelling stock price dynamics about three decades ago. The so-called Black-Scholes model seems well adapted to the nature of market prices, mainly because of its use of Brownian motion and the mathematical properties that follow from it. Like every theoretical model put into practice, it is not flawless, which means that adaptations and extensions are needed so that engineers and marketers can use the Black-Scholes model to trade and hedge risk on the market. A more detailed description, with applications, will be given in the talk.
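For reference, the model prices a European call as C = S N(d1) - K e^{-rT} N(d2). A minimal implementation:

```python
import numpy as np
from scipy.stats import norm

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call on a non-dividend-paying stock.

    S: spot, K: strike, T: time to maturity (years),
    r: risk-free rate, sigma: volatility.
    """
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Example: at-the-money one-year call
price = black_scholes_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2)
```

The "adaptations and extensions" the abstract alludes to (stochastic or local volatility, jumps) all modify the constant-sigma assumption baked into this formula.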

Statistical modelling in economic forecasting: a semiparametric spatio-temporal approach 12:10 Mon 23 May, 2011 :: 5.57 Ingkarni Wardli :: Dawlah Alsulami :: University of Adelaide
How to model spatio-temporal variation in housing prices is an important and challenging problem, as it is of vital importance for both investors and policy makers to assess any movement in housing prices. In this seminar I will talk about a proposed model to estimate movements in housing prices and measure the risk more accurately.

Where is the best place in Australia to build an enhanced geothermal system? 12:10 Mon 30 May, 2011 :: 5.57 Ingkarni Wardli :: Ms Josephine Varney :: University of Adelaide
This week my parents will join around 185,000 other Australians in a significant move towards renewable energy and install solar panels on the roof of their house. While solar energy is an important and useful form of renewable energy, it cannot provide power all the time.
Opponents of renewable energy maintain that until renewable energy can provide energy all the time, traditional fossil-fuel generated power will be required to produce our baseload power.
Geothermal energy is a renewable energy that can provide power all the time. However, due to its special geological requirements, it can be produced in only a very small number of places in the world.
An Enhanced Geothermal System (EGS) is a new technology which allows geothermal energy to be produced in a much wider range of places than traditional geothermal energy. Currently, ten different companies are investigating possible EGS sites within Australia. This seminar investigates the question that all these companies hope they have answered well: 'Where is the best place in Australia for an EGS facility?'

The change of probability measure for jump processes 12:10 Mon 28 May, 2012 :: 5.57 Ingkarni Wardli :: Mr Ahmed Hamada :: University of Adelaide
In financial derivatives pricing theory it is very common to change the probability measure from the historical ('real world') measure to a risk-neutral measure, as a development of the no-arbitrage condition.
Girsanov's theorem is the best-known example of this technique and is used when price randomness is modelled by Brownian motion. Other genuine candidates for modelling market randomness that have proved efficient in the recent literature are jump processes; so how can a change of measure be performed for such processes?
This talk will address this question by introducing the no-arbitrage condition, discussing Girsanov's theorem for diffusion and jump processes, and presenting a concrete example.
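A concrete change of measure for a pure-jump process can be sketched as follows (an illustrative setup, not necessarily the talk's own example): if N_T is Poisson with intensity lambda under P, then L_T = (mu/lambda)^{N_T} e^{(lambda - mu)T} is a density process that shifts the intensity to mu. A Monte Carlo check of the two defining identities:

```python
import numpy as np

# Change of measure for a Poisson jump process: under P, N_T ~ Poisson(lam*T).
# The density L_T = (mu/lam)**N_T * exp((lam - mu)*T) defines a measure Q
# under which N_T has intensity mu.  Two checks:
#   E_P[L_T] = 1            (L_T is a valid density),
#   E_P[L_T * N_T] = mu*T   (expected jump count under Q).
rng = np.random.default_rng(4)
lam, mu, T = 2.0, 5.0, 1.0
n_paths = 1_000_000

N = rng.poisson(lam * T, n_paths)               # terminal jump counts under P
L = (mu / lam) ** N * np.exp((lam - mu) * T)    # Radon-Nikodym density

mean_L = L.mean()           # ~ 1
mean_N_Q = (L * N).mean()   # ~ mu * T = 5
```

The same recipe, with N_T replaced by a compound Poisson sum and the density adjusted for the jump-size distribution, is the jump-process analogue of Girsanov's theorem that the talk develops.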

Air-cooled binary Rankine cycle performance with varying ambient temperature 12:10 Mon 13 Aug, 2012 :: B.21 Ingkarni Wardli :: Ms Josephine Varney :: University of Adelaide
Next month I have to give a presentation in Reno, Nevada to a group of geologists, engineers and geophysicists. So for this talk I am going to ask you to pretend you know very little about maths (and perhaps a lot about geology) and give me some feedback on my proposed talk.
The presentation itself is about the effect of air-cooling on geothermal power plant performance. Air-cooling is necessary for geothermal plays in dry areas, and ambient air temperature significantly affects the power output of air-cooled geothermal power plants. Hence, a method for determining the effect of ambient air temperature on geothermal power plants is presented. Using the ambient air temperature distribution from Leigh Creek, South Australia, this analysis shows that an optimally designed plant produces 6% more energy annually than a plant designed using the mean ambient temperature.

Principal Component Analysis (PCA) 12:30 Mon 3 Sep, 2012 :: B.21 Ingkarni Wardli :: Mr Lyron Winderbaum :: University of Adelaide
Principal Component Analysis (PCA) has become something of a buzzword recently in a number of disciplines, including gene expression analysis and facial recognition. It is a classical, and fundamentally simple, concept that has been around since the early 1900s. Its recent popularity is largely due to the need for dimension-reduction techniques when analysing the high-dimensional data that have become common in the last decade, and to the availability of the computing power to implement them. I will explain the concept, prove a result, and give a couple of examples. The talk should be accessible to all disciplines, as it (should?) only assume first-year linear algebra, the concept of a random variable, and covariance.
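The concept really is only a few lines of linear algebra: centre the data, diagonalise the sample covariance matrix, and project onto the leading eigenvectors. A small illustration on synthetic two-dimensional data whose variance lies mostly along the direction (1, 1):

```python
import numpy as np

# PCA by eigendecomposition of the sample covariance matrix.  The synthetic
# 2-D data vary mostly along (1, 1), so the first principal component should
# capture nearly all the variance and point along (1, 1)/sqrt(2).
rng = np.random.default_rng(5)
n = 5000
t = rng.standard_normal(n)
data = np.column_stack([t, t]) + 0.1 * rng.standard_normal((n, 2))

centred = data - data.mean(axis=0)
cov = np.cov(centred, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)        # eigh returns ascending order
order = np.argsort(eigvals)[::-1]             # sort descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()           # variance explained per component
scores = centred @ eigvecs[:, :1]             # projection onto the first PC
```

For high-dimensional data one computes the same quantities via the SVD of the centred data matrix rather than forming the covariance matrix explicitly.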


What would happen if geothermal energy was used to preheat the feedwater for a traditional steam power plant? 12:10 Mon 25 Mar, 2013 :: B.19 Ingkarni Wardli :: Jo Varney :: University of Adelaide
In our effort to determine the most effective way to use geothermal energy, this is a left-field yet enticing idea. Would this produce more 'extra' power than a geothermal plant on its own? Would there be sufficient benefit to interest traditional power generators?
We investigated retrofitting two different geothermal preheating options to a 500 MW supercritical steam power plant. We then compared the 'extra' power produced using geothermal preheating to the power produced by using geothermal energy on its own.
We think the results are interesting and promising, but come along and judge for yourself. 

Filtering Theory in Modelling the Electricity Market 12:10 Mon 6 May, 2013 :: B.19 Ingkarni Wardli :: Ahmed Hamada :: University of Adelaide
In mathematical finance, as in many other fields where applied mathematics is a powerful tool, we assume that a model is good enough when it captures the different sources of randomness affecting the quantity of interest, which in this case is the electricity price. The power market is very different from other markets in terms of the sources of randomness that can be observed in the behaviour and evolution of prices. We start by suggesting a new model that simulates electricity prices, constructed by adding a periodic term, a jump term and a positive mean-reverting term. The latter term is driven by a non-observable Markov process, so in order to price financial products we have to use filtering theory to deal with the non-observable process. These techniques are attracting much interest from practitioners and researchers in financial mathematics.
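As a toy illustration of the filtering step (a discrete-time stand-in, not the talk's continuous-time model), the forward HMM filter below recursively computes the conditional distribution of a hidden two-state Markov chain from noisy observations; all parameters are assumed for illustration:

```python
import numpy as np

# Forward (HMM) filter for a hidden two-state Markov chain observed through
# Gaussian noise: at each step, predict with the transition matrix, then
# update with the observation likelihood, giving P(state_t | obs_1..t).
rng = np.random.default_rng(6)
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])                 # transition matrix (assumed)
means, sd = np.array([0.0, 5.0]), 0.5        # state-dependent observation law

n = 500
states = np.empty(n, dtype=int)
states[0] = 0
for t in range(1, n):
    states[t] = rng.choice(2, p=P[states[t-1]])
obs = means[states] + sd * rng.standard_normal(n)

def gauss(y, m, s):
    return np.exp(-0.5 * ((y - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

pi = np.array([0.5, 0.5])                    # prior over the hidden state
filtered = np.empty((n, 2))
for t in range(n):
    pred = pi @ P if t > 0 else pi           # predict step
    post = pred * gauss(obs[t], means, sd)   # update with the likelihood
    pi = post / post.sum()
    filtered[t] = pi

accuracy = np.mean(filtered.argmax(axis=1) == states)
```

The continuous-time analogue for a Markov-modulated price model is the Wonham filter; the predict-update structure is the same.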

What Technical Performance Measures are Critical to Evaluate Geothermal Developments? 12:10 Mon 17 Mar, 2014 :: B.19 Ingkarni Wardli :: Jo Varney :: University of Adelaide
Josephine Varney, Nigel Bean and Betina Bendall.
When geologists, geophysicists and engineers study geothermal developments, each group has its own set of technical performance measures. While these performance measures tell each group something important about the geothermal development, there is often difficulty in translating them into financial performance measures for investors. In this paper we argue that brine effectiveness is the best simple financial performance measure for a geothermal investor: it is a good yet simple indicator of return on investment (ROI) and, importantly, links well production to power-plant production, hence describing the geothermal development in a holistic sense.

A model for the BitCoin block chain that takes propagation delays into account 15:10 Fri 28 Mar, 2014 :: B.21 Ingkarni Wardli :: Professor Peter Taylor :: The University of Melbourne
Unlike cash transactions, most electronic transactions require the presence of a trusted authority to verify that the payer has sufficient funds to make the transaction and to adjust the account balances of the payer and payee. In recent years BitCoin has been proposed as an "electronic equivalent of cash". The general idea is that transactions are verified in a coded form in a block chain, which is maintained by the community of participants. Problems can arise when the block chain splits, that is, when different participants have different versions of the block chain, something which can happen only when there are propagation delays, at least if all participants are behaving according to the protocol.
In this talk I shall present a preliminary model for the splitting behaviour of the block chain. I shall then go on to perform a similar analysis for a situation where a group of participants has adopted a recently proposed strategy for gaining a greater advantage from BitCoin processing than its combined computing power should warrant.
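A back-of-the-envelope version of the splitting mechanism (with assumed parameters, not the talk's model): if blocks are found at the points of a Poisson process with rate lambda and take d seconds to propagate, then a block found within d of its predecessor is mined on a stale chain tip, so the per-block split probability is 1 - e^{-lambda d}. A quick simulation agrees:

```python
import numpy as np

# Toy model of block-chain splits: blocks arrive as a Poisson process with
# rate lam; a block found within the propagation delay d of its predecessor
# creates a (temporary) split.  Since inter-block gaps are Exponential(lam),
# the per-block split probability is 1 - exp(-lam * d).
rng = np.random.default_rng(7)
lam = 1 / 600.0        # one block per 600 s on average (Bitcoin-like)
d = 15.0               # propagation delay in seconds (assumed)
n_blocks = 1_000_000

gaps = rng.exponential(1 / lam, n_blocks)   # inter-block times
split_rate = np.mean(gaps < d)
theory = 1 - np.exp(-lam * d)               # about 0.0247 here
```

Real splits also need the two blocks to be found by *different* miners on either side of the delay, so this is an upper-bound caricature of the fuller model presented in the talk.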

Visualising the diversity of benchmark instances and generating new test instances to elicit insights into algorithm performance 15:10 Fri 10 Oct, 2014 :: Napier 102 :: Professor Kate SmithMiles :: Monash University
Objective assessment of optimization algorithm performance is notoriously difficult, with conclusions often inadvertently biased towards the chosen test instances. Rather than reporting average performance of algorithms across a set of chosen instances, we discuss a new methodology to enable the strengths and weaknesses of different optimization algorithms to be compared across a broader instance space. Results will be presented on timetabling, graph colouring and the TSP to demonstrate: (i) how pockets of the instance space can be found where algorithm performance varies significantly from the average performance of an algorithm; (ii) how the properties of the instances can be used to predict algorithm performance on previously unseen instances with high accuracy; (iii) how the relative strengths and weaknesses of each algorithm can be visualized and measured objectively; and (iv) how new test instances can be generated to fill the instance space and provide the desired insights into algorithmic power.

Group Meeting 15:10 Fri 29 May, 2015 :: EM 213 :: Dr Judy Bunder :: University of Adelaide
Talk: Patch dynamics for efficient exascale simulations
Abstract
Massive parallelisation has led to a dramatic increase in available computational power. However, data transfer speeds have failed to keep pace and are the major limiting factor in the development of exascale computing. New algorithms must be developed which minimise the transfer of data. Patch dynamics is a computational macroscale modelling scheme which provides a coarse macroscale solution of a problem defined on a fine microscale by dividing the domain into many non-overlapping, coupled patches. Patch dynamics is readily adaptable to massive parallelisation, as each processor core can evaluate the dynamics on one, or a few, patches. However, the patch coupling conditions interpolate across the unevaluated parts of the domain between patches and require almost continuous data transfer. We propose a modified patch dynamics scheme which minimises data transfer by re-evaluating the patch coupling conditions only at 'mesoscale' time scales which are significantly larger than the time scale of the microscale problem. We analyse and quantify the error arising from patch dynamics with mesoscale temporal coupling.

Queues and cooperative games 15:00 Fri 18 Sep, 2015 :: Ingkarni Wardli B21 :: Moshe Haviv :: Department of Statistics and the Federmann Center for the Study of Rationality, The Hebrew University
The area of cooperative game theory deals with models in which a number of individuals, called players, can form coalitions so as to improve the utility of their members. In many cases the formation of the grand coalition is the natural result of some negotiation or bargaining procedure.
The main question then is how the players should split the gains from their cooperation among themselves. Various solutions have been suggested, among them the Shapley value, the nucleolus and the core.
Servers in a queueing system can also join forces. For example, they can exchange service capacity among themselves or serve customers who originally sought service at their peers. The overall performance improves, and the question is how they should split the gains or, equivalently, how much each of them needs to pay or be paid in order to cooperate with the others. Our major focus is on the core of the resulting cooperative game and on showing that in many queueing games the core is not empty.
Finally, customers who are served by the same server can also be viewed as players who form a grand coalition, now inflicting damage on each other in the form of additional waiting time. We show how cooperative game theory, specifically Aumann-Shapley prices, leads to a way in which this damage can be attributed to individual customers or groups of customers.
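The Shapley value mentioned above can be computed directly for small games as each player's average marginal contribution over all orders in which the grand coalition might form. Here is a toy three-player example with made-up coalition values (not from the talk):

```python
from itertools import permutations

# Shapley value of a toy 3-player cooperative game: v(S) is the gain
# coalition S can secure on its own; each player's Shapley value is its
# average marginal contribution over all 3! arrival orders.
v = {frozenset(): 0, frozenset({1}): 10, frozenset({2}): 20, frozenset({3}): 30,
     frozenset({1, 2}): 40, frozenset({1, 3}): 50, frozenset({2, 3}): 60,
     frozenset({1, 2, 3}): 90}

players = [1, 2, 3]
shapley = {p: 0.0 for p in players}
orders = list(permutations(players))
for order in orders:
    coalition = frozenset()
    for p in order:
        shapley[p] += v[coalition | {p}] - v[coalition]
        coalition = coalition | {p}
shapley = {p: s / len(orders) for p, s in shapley.items()}

# Efficiency: the Shapley values split v(grand coalition) exactly.
total = sum(shapley.values())
```

This brute-force enumeration is exponential in the number of players; for the queueing games in the talk one exploits structure (or uses the Aumann-Shapley continuum limit) instead.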
Publications matching "Alberta Power Prices"

On the predictive power of shortest-path weight inference, Coyle, Andrew; Kraetzl, Miro; Maennel, Olaf; Roughan, Matthew, Internet Measurement Conference 08, Greece, 20/10/08
Computer algebra models the inertial dynamics of a thin film flow of power-law fluids and other non-Newtonian fluids (Unpublished), Roberts, Anthony John
Asymptotic matching constraints for a boundary-layer flow of a power-law fluid, Denier, James; Hewitt, R, Journal of Fluid Mechanics 518 (261–279) 2004
On the boundary-layer equations for power-law fluids, Denier, James; Dabrowski, Paul, Proceedings of the Royal Society of London Series A: Mathematical, Physical and Engineering Sciences 460 (3143–3158) 2004
Critical care trials: Sample size, power and interpretation, Moran, John; Solomon, Patricia, Critical Care and Resuscitation 6 (239–247) 2004
Advanced search options
You may be able to improve your search results by using the following syntax:
Query  Matches the following

Asymptotic Equation  Anything with "Asymptotic" or "Equation".
+Asymptotic +Equation  Anything with "Asymptotic" and "Equation".
+Stokes -"Navier-Stokes"  Anything containing "Stokes" but not "Navier-Stokes".
Dynam*  Anything containing "Dynamic", "Dynamical", "Dynamicist", etc.
