Recent Results in Modelling National Economic Time Series Data
Brian D. O. Anderson, The Australian National University and National ICT Australia
Statistics offices in most advanced countries collect large numbers of economic time series—over 300 individual series is a typical number. Such series are, among other purposes, used for building models of a nation’s economy, for use by agencies like central banks. The latter bodies are interested in short to medium term forecasting, and formulate policy in the light of those forecasts.
The task of model building is challenging, and has received considerable attention in recent years. The favoured class of models is known as Generalized Dynamic Factor Models; these take the form of a linear discrete-time system excited by vector white noise, with the vector of measured series corresponding to the model’s vector output. The input vector dimension and state dimension are virtually always much less than the output vector dimension. A major, but not yet fully resolved, challenge is to handle the fact that the various economic time series are not all collected with the same period; most commonly, some series are available monthly and others quarterly. This fact substantially complicates the task of model building.
This talk will review recent progress in the area, including the major conclusion that there is generically no loss of generality in working with autoregressive models, and that with mixed frequency data, under reasonable assumptions a high frequency (monthly period) state-variable model can be constructed, the outputs of which are formed from the states at the two different sampling periods.
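The setting described above can be sketched numerically. The toy simulation below (all dimensions, matrices and the 3:1 sampling ratio are illustrative assumptions, not taken from the talk) shows a low-dimensional state evolving at the monthly period and driving a larger output vector, with some series observed every month and others only every quarter:

```python
# Hypothetical sketch of the mixed-frequency setting: a low-dimensional
# monthly state drives a larger output vector; some outputs are observed
# monthly, others only quarterly. All matrices here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_state, n_monthly, n_quarterly, T = 2, 5, 3, 24  # state dim << output dim

A = 0.8 * np.eye(n_state)                          # stable monthly transition
B = rng.standard_normal((n_state, n_state))        # noise input map
C_m = rng.standard_normal((n_monthly, n_state))    # monthly output map
C_q = rng.standard_normal((n_quarterly, n_state))  # quarterly output map

x = np.zeros(n_state)
monthly, quarterly = [], []
for t in range(T):
    x = A @ x + B @ rng.standard_normal(n_state)   # white-noise excitation
    monthly.append(C_m @ x)                        # observed every month
    if (t + 1) % 3 == 0:
        quarterly.append(C_q @ x)                  # observed every third month

monthly = np.array(monthly)      # shape (24, 5): one row per month
quarterly = np.array(quarterly)  # shape (8, 3): one row per quarter
```

Both output blocks are driven by the same high-frequency state, which is what makes joint model building from the mixed-frequency data possible in principle.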
Brian Anderson was born in Sydney, Australia, and educated at Sydney University in mathematics and electrical engineering, with a PhD in electrical engineering from Stanford University in 1966. He joined ANU as its first engineering professor in 1982 and is now a Distinguished Professor in the College of Engineering and Computer Science and Distinguished Researcher in National ICT Australia (NICTA). His research work has focused on problems of automatic control and signal processing. For this work he has won a number of international prizes and medals, and is a Fellow of the Australian Academy of Science, the Australian Academy of Technological Sciences and Engineering, and the Royal Society, an Honorary Fellow of the Institution of Engineers, Australia, and a Foreign Associate of the US National Academy of Engineering.
He holds honorary doctorates from a number of universities, including Université Catholique de Louvain, Belgium, and ETH, Zürich and five Australian universities. He is a past president of the International Federation of Automatic Control and the Australian Academy of Science. He served as the first President of National ICT Australia (NICTA), and was a member of company boards, including Cochlear Ltd, the world’s major supplier of bionic ears, and a member of the Prime Minister’s Science Council under three prime ministers. He received an AO in 1993, the Centennial Medal in 2001 and the Order of the Rising Sun from Japan in 2007.
Coherence as an Organizing Principle in Statistical Signal Processing
Louis Scharf, Colorado State University
The concepts of coherence and interference are central to optics, electromagnetics, communication, and control. Perhaps they are central to statistical signal processing, as well. In this plenary talk we shall examine this suggestion by exploring the extent to which Generalized Coherence may be used as an organizing principle in detection, estimation, and time series analysis. In so doing, we shall establish the geometries and invariances of generalized coherence, and then apply it to the analysis of several new and old problems in statistical signal processing. Of particular note is a logical derivation of what may be called broadband multi-channel coherence, a statistic whose finite sample distribution is the distribution of a product of independent beta random variables. In the main, this talk will report results and insights gained in collaboration with Doug Cochran, David Ramirez, Ignacio Santamaria, Javier Via, Peter Schreier, Nick Klausner, and Mahmood Azimi-Sadjadi.
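As a concrete (if much simpler) instance of the kind of quantity the abstract organizes around, the sketch below computes a determinant-based multi-channel generalized coherence statistic; the broadband statistic and its beta-product distribution discussed in the talk are more elaborate, and the data here are synthetic assumptions:

```python
# Illustrative sketch of a determinant-based generalized coherence for
# several channels: the statistic is 1 - det(G), where G is the normalized
# Gram (sample coherence) matrix; it is near 0 for independent channels
# and near 1 when a strong common component is present.
import numpy as np

def generalized_coherence(X):
    """X: (channels, samples) array; returns 1 - det of the normalized Gram."""
    U = X / np.linalg.norm(X, axis=1, keepdims=True)
    G = U @ U.T.conj()
    return 1.0 - np.linalg.det(G).real

rng = np.random.default_rng(1)
noise = rng.standard_normal((3, 256))   # three independent channels
s = rng.standard_normal(256)
signal = noise + 5.0 * s                # strong common component added

g_noise = generalized_coherence(noise)    # near 0: channels incoherent
g_signal = generalized_coherence(signal)  # near 1: channels coherent
```

Since G has unit diagonal and is positive semidefinite, the statistic always lies in [0, 1], which is part of what gives coherence its clean geometric and invariance properties.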
Louis Scharf is Research Professor of Mathematics and Emeritus Professor of Electrical and Computer Engineering at Colorado State University, Fort Collins, CO. His research interests are in statistical signal processing, as it applies to wireless communication, adaptive radar and sonar, electric power, and cyber security. He has made original contributions to matched and adaptive subspace detection, invariance theories for signal processing, and reduced-rank signal processing in canonical coordinate systems. He has authored three books: L.L. Scharf, "Statistical Signal Processing: Detection, Estimation, and Time Series Analysis," Addison-Wesley, 1991; L.L. Scharf, "A First Course in Electrical and Computer Engineering," Addison-Wesley, 1998; and P.J. Schreier and L.L. Scharf, "Statistical Signal Processing of Complex-Valued Data: The Theory of Improper and Noncircular Signals," Cambridge University Press, 2010.
Professor Scharf was Technical Program Chair for the 1980 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Tutorials Chair for ICASSP 2001; and Technical Program Chair for the Asilomar Conference on Signals, Systems, and Computers, 2002. He is past Chair of the Fellow Committee for the IEEE Signal Processing Society. He has received several awards for his professional service and his contributions to statistical signal processing, including an IEEE Distinguished Lectureship; an IEEE Third Millennium Medal; the Technical Achievement Award from the IEEE Signal Processing Society (SPS); the Society Award from the IEEE SPS; and the Donald W. Tufts Award for Underwater Acoustic Signal Processing. He is a Life Fellow of IEEE.
Enabling Functional Neuro-imaging with Statistical Signal Processing
Victor Solo, University of New South Wales
Functional Magnetic Resonance Imaging (fMRI) has caused a revolution in cognitive neuroscience since its advent in the early 1990s because of its ability to show the brain 'in action'. Its dominant position is due to several features: it is non-invasive and so when properly used is nearly harmless; it provides spatial and temporal resolution relevant to brain dynamics; and it is very flexible allowing the kind of image sequences produced to be tailored to the scientific questions of interest. But fMRI is useless without some fundamentally enabling methodologies.
Chief amongst these is statistical signal processing, and we give a brief survey of fMRI from this point of view. We give some physics background; we identify the basic scientific and statistical signal processing issues; and, based on our own work, we show advanced techniques in action: sparsity for activation map production, problems with Granger causality, and brain network analysis using multivariate mutual information. We also briefly discuss multi-modal brain imaging involving EEG, MEG and fMRI.
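The idea behind "sparsity for activation map production" can be caricatured in a few lines. The toy example below (synthetic signals, a made-up on/off task paradigm, and a simple soft-thresholding step standing in for a proper sparse estimator) fits a voxel-wise regression of BOLD-like signals on a task regressor and keeps only coefficients that clear a threshold:

```python
# Toy sketch: voxel-wise regression on a task regressor, followed by
# soft-thresholding, so that only genuinely active voxels survive in the
# activation map. All sizes and signals are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(2)
n_voxels, n_scans = 100, 120
task = (np.arange(n_scans) % 20 < 10).astype(float)  # toy on/off paradigm

beta_true = np.zeros(n_voxels)
beta_true[:5] = 2.0                                  # 5 truly active voxels
Y = np.outer(beta_true, task) + rng.standard_normal((n_voxels, n_scans))

# least-squares fit per voxel, then soft-threshold (sparsity-inducing step)
beta_hat = Y @ task / (task @ task)
lam = 0.8
activation_map = np.sign(beta_hat) * np.maximum(np.abs(beta_hat) - lam, 0.0)
```

With this setup the estimated map is exactly zero on inactive voxels, which is the practical appeal of sparse methods over plain per-voxel thresholded t-maps.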
Victor Solo was born and educated in Australia. He has a BSc.(1971) in mathematics from the University of Queensland; a BSc.(1973, first class honours) in Statistics from the University of New South Wales, Sydney and a BE(1974, first class honours) in Mechanical Engineering from the University of New South Wales, Sydney. His PhD (1979) in Statistics is from the Australian National University.
He is currently a Professor of Electrical Engineering and Telecommunications at the University of New South Wales in Sydney, Australia (2000–2003, 2006–). He was previously Professor of Statistics at Macquarie University (1991–2000) in Sydney, Australia.
He has spent about half his academic career in the United States. From 1980–1985 he was Assistant and then Associate Professor of Statistics, at Harvard University. From 1985–1991 he was Associate Professor of Electrical Engineering at Cornell and then at Johns Hopkins. Most recently (January 2004– July 2006) he was a Professor of Electrical Engineering and Computer Science (EECS) at the University of Michigan, Ann Arbor, USA where he also had a quarter appointment in the Statistics Department. From 1999–2004 he was also a visiting Professor of Radiology at Harvard Medical School.
He has been an Associate Editor of significant Engineering, Statistics and Economics journals: IEEE Transactions on Automatic Control (1993–1995); IEEE Transactions on Pattern Analysis and Machine Intelligence (2001–2003); Journal of the American Statistical Association (1991–2002); and Econometric Theory (1987–2004).
He is currently on the Editorial board of the IEEE Signal Processing Magazine (2012–) and the IEEE Journal of Selected Topics in Signal Processing (2012–). He is a Fellow of the IEEE.
In April 2011 he was a Program for Economic Research (PER) invited visitor (for one week) to the Economics Department at Columbia University. This is an honour usually only accorded to distinguished Economists.
His research work (funded by the NSF, NIH, ONR and the ARC) spans three core areas of fundamental research and two applied areas, and PhD students have graduated in each of the areas below except the last. He has served on NSF panels and the ARC College of Experts.
Electrical Engineering — Stability analysis in adaptive signal processing and adaptive control using martingale and also averaging methods, computer vision, communication networks, system identification, inverse problems, sparse signal processing, geometric signal processing, information theory.
Statistics — Asymptotic analysis and algorithm development, inverse problems/nonparametrics, model selection, functional data analysis, Granger causality.
Econometrics — Asymptotic analysis e.g. using weak convergence; dynamic factor models.
Neuro-imaging — A collaboration on functional Magnetic Resonance Imaging (fMRI) carried on since 1995 with groups at the Martinos Center for Biomedical Imaging, Harvard Medical School and supported by NIH grants. This work involves e.g. inverse problems, sparse signal processing, spatio-temporal modelling, brain network connectivity.
Neural Coding — A collaboration 1996–2005 with groups at Harvard and MIT and more recently funded by the ARC. This work involves multivariate point process theory, computation and asymptotics.
Particle Methods for Inference in Non-Linear Non-Gaussian State-Space Models
Arnaud Doucet, University of Oxford
State-space models are a very popular class of time series models which have found thousands of applications in electrical engineering, robotics, tracking, vision, econometrics, etc. Except for linear Gaussian models, where the Kalman filter can be used, state and parameter inference in non-linear, non-Gaussian models is analytically intractable. Particle methods are a class of flexible and easily parallelizable simulation-based algorithms which provide consistent approximations to these inference problems. The aim of this talk is to present the most recent developments in this active research area.
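The simplest member of the family the abstract refers to is the bootstrap particle filter, sketched below on a standard scalar non-linear benchmark model (the model, noise levels and particle count are assumptions chosen for illustration, not taken from the talk):

```python
# Minimal bootstrap particle filter on a scalar non-linear, non-Gaussian
# benchmark: propagate particles through the state transition, weight by
# the observation likelihood, and resample. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(3)

def transition(x, t, size=None):
    noise = rng.normal(scale=np.sqrt(10), size=size)
    return 0.5 * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * t) + noise

def simulate(T):
    x, xs, ys = rng.standard_normal(), [], []
    for t in range(T):
        x = transition(x, t)
        xs.append(x)
        ys.append(x**2 / 20 + rng.standard_normal())  # non-linear observation
    return np.array(xs), np.array(ys)

def bootstrap_pf(ys, N=500):
    particles, means = rng.standard_normal(N), []
    for t, y in enumerate(ys):
        particles = transition(particles, t, size=N)     # proposal = prior
        w = np.exp(-0.5 * (y - particles**2 / 20) ** 2) + 1e-300
        w /= w.sum()                                     # likelihood weights
        means.append(w @ particles)                      # filtering estimate
        particles = particles[rng.choice(N, size=N, p=w)]  # resampling step
    return np.array(means)

xs, ys = simulate(50)
est = bootstrap_pf(ys)
```

The resampling step is what keeps the weights from degenerating, and because each particle is propagated and weighted independently, the inner loops parallelize naturally, as the abstract notes.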
Arnaud Doucet is a Statutory Professor in the Department of Statistics of Oxford University, an EPSRC Established Career Fellow and a Professorial Fellow of Hertford College. He received his PhD in Information Engineering in 1997 from University Paris XI. Before joining Oxford in 2011, he held faculty positions at the University of Melbourne, Cambridge University, the University of British Columbia and the Institute of Statistical Mathematics in Tokyo. He is primarily interested in the development and study of novel Monte Carlo methods for inference in complex stochastic systems.
The Square Kilometre Array: new paradigms required for signal processing in astronomy
Steven Tingay, Curtin University
I will describe the Square Kilometre Array (SKA), a multi-billion dollar radio telescope being designed and built by a large international consortium. The SKA will be built in Western Australia and Southern Africa and will produce vast amounts of data, driving new approaches to signal processing in order to achieve the performance specifications required by the scientific goals. I'll briefly outline the science and engineering of the SKA, discuss prototype telescopes currently operating, and focus on the Big Data problem that needs to be addressed from algorithmic and computational points of view.
Steven Tingay is a Western Australian Premier’s Research Fellow, Director of the Curtin Institute of Radio Astronomy, Deputy Director of the International Centre for Radio Astronomy Research, and Director of the Murchison Widefield Array (MWA) project. Steven has authored or co-authored over 130 papers in international refereed journals and has attracted over $80m of research funding over the last decade. His main interests are in radio astronomy and astrophysics. He has been responsible for the development of instrumentation and software that is now used around the world. Steven currently leads the MWA project, a $50m international radio telescope recently completed and brought into its operational phase in the remote Murchison region of Western Australia. The MWA is the low frequency Precursor for the multi-billion dollar Square Kilometre Array (SKA) and he has been an active contributor to the international SKA project for the last decade. Steven is an alumnus of The University of Melbourne and of the Australian National University.
Sparse stochastic processes, matched wavelet expansions and ICA
Michael Unser, École Polytechnique Fédérale De Lausanne (EPFL)
We introduce an extended family of continuous-domain sparse processes that are specified by a generic (non-Gaussian) innovation model or, equivalently, as solutions of linear stochastic differential equations driven by white Lévy noise. We present the functional tools for their characterization. We show that their probability distributions are infinitely divisible, which induces two distinct types of behavior—Gaussian versus sparse—at the exclusion of any other. This is the key to proving that the non-Gaussian members of the family admit a sparse representation in a matched wavelet basis.
We use the characteristic form of these processes to deduce their transform-domain statistics and to precisely assess residual dependencies. These ideas are illustrated with examples of sparse processes for which operator-like wavelets outperform the classical KLT (or DCT) and result in an independent component analysis. Finally, for the case of self-similar processes, we show that the wavelet-domain probability laws are ruled by a diffusion-like equation that describes the evolution across scale.
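The Gaussian-versus-sparse dichotomy described above can be glimpsed in a toy experiment (the processes, the single-level Haar analysis, and the kurtosis criterion are illustrative assumptions, not the talk's machinery): a random walk driven by Gaussian increments versus one driven by heavy-tailed Laplace increments, compared through the tail behaviour of their wavelet detail coefficients:

```python
# Toy illustration of the Gaussian-versus-sparse dichotomy: Haar detail
# coefficients of a Gaussian-driven walk have near-zero excess kurtosis,
# while those of a Laplace-driven (heavy-tailed) walk are markedly
# leptokurtic, i.e. sparser. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(4)
n = 4096

def haar_details(x):
    """One level of Haar detail coefficients (paired differences)."""
    return (x[0::2] - x[1::2]) / np.sqrt(2)

def excess_kurtosis(d):
    d = d - d.mean()
    return np.mean(d**4) / np.mean(d**2) ** 2 - 3.0

gauss_walk = np.cumsum(rng.standard_normal(n))   # Gaussian innovations
sparse_walk = np.cumsum(rng.laplace(size=n))     # heavy-tailed innovations

k_gauss = excess_kurtosis(haar_details(gauss_walk))   # roughly 0
k_sparse = excess_kurtosis(haar_details(sparse_walk)) # clearly positive
```

The differencing in the Haar analysis whitens the walk back to its innovations, which is a crude stand-in for the operator-like, matched wavelets of the talk: the non-Gaussian innovations reappear as a heavy-tailed, compressible detail sequence.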
Michael Unser is professor and director of EPFL’s Biomedical Imaging Group, Lausanne, Switzerland. His primary area of investigation is biomedical image processing. He is internationally recognized for his research contributions to sampling theory, wavelets, the use of splines for image processing, and stochastic processes. He has published over 200 journal papers on those topics. He is the author with P. Tafti of the book “An introduction to sparse stochastic processes” to be published by Cambridge University Press.
From 1985 to 1997, he was with the Biomedical Engineering and Instrumentation Program, National Institutes of Health, Bethesda USA, conducting research on bioimaging.
Dr Unser has held the position of associate Editor-in-Chief (2003–2005) of the IEEE Transactions on Medical Imaging. He is currently a member of the editorial boards of SIAM J. Imaging Sciences, IEEE J. Selected Topics in Signal Processing, Foundations and Trends in Signal Processing, and the Proceedings of the IEEE. He is the founding chair of the technical committee on Bio Imaging and Signal Processing (BISP) of the IEEE Signal Processing Society.
Prof Unser is a Fellow of the IEEE (1999), a EURASIP Fellow (2009), and a member of the Swiss Academy of Engineering Sciences. He is the recipient of several international prizes, including three IEEE-SPS Best Paper Awards and two IEEE Technical Achievement Awards (SPS 2008 and EMBS 2010).
Questions and comments can be directed to the following email address: