Cosmological Image Processing (Dr Jason McEwen)
We have recently entered an era of precision cosmology. The Big Bang cosmological model that describes our Universe explains many cosmological observations to exquisite accuracy, including the relic radiation of the infant Universe, the so-called cosmic microwave background (CMB). However, we remain ignorant of many of the components of this model. We know very little about dark matter and dark energy, which together constitute approximately 95% of the energy content of the Universe. Furthermore, we know very little about the first moments after the birth of our Universe and, in particular, about the mechanism that seeded the structure we observe in the Universe today, such as galaxies and clusters of galaxies. I will explain how the era of precision cosmology has emerged, thanks in particular to the coupling of large and precise cosmological datasets with sophisticated signal and image processing techniques. I will also discuss how we can make further progress towards going beyond the standard cosmological model by developing and exploiting yet more advanced analysis methods.
Dr Jason McEwen is a lecturer in the Mullard Space Science Laboratory at University College London and a Core Team member of the European Space Agency Planck Surveyor satellite mission. After graduating with a B.E. (Hons) in Electrical and Electronic Engineering from the University of Canterbury in 2002, he completed a Ph.D. in Astrophysics at the University of Cambridge in 2006. Following his Ph.D. he held a Research Fellowship at Clare College, Cambridge, before working as a Quantitative Analyst at Credit Suisse. He then held a position as a Scientist at the École Polytechnique Fédérale de Lausanne, Switzerland, followed by a Leverhulme Trust Early Career Fellowship and then a Newton International Fellowship, both at University College London. His research interests are focused on astroinformatics, combining his interests in signal processing, including sampling theory, wavelet theory, compressed sensing and Bayesian statistics, with applications in cosmology and radio interferometry.
Natural Interaction for Augmented Reality Applications (Professor Mark Billinghurst)
Augmented Reality (AR) is technology that overlays computer graphics on the real world so that real and virtual content are seamlessly merged together. Over the past decade Augmented Reality has begun to enter the commercial arena, and there are now thousands of applications available and tens of millions of users. However, most AR experiences are delivered on mobile phones or handheld devices using limited forms of interaction. Using computer vision technology, there is an opportunity to create more natural AR interfaces. This presentation will show how methods can be developed to support tangible object interaction, free-hand gestures, and whole-body input in AR systems. Work will be presented from the HIT Lab NZ and other leading research groups. An overview will also be given of opportunities for future work, such as gesture interaction with Google Glass and other wearable computers, and the integration of speech and gesture into multimodal systems.
Professor Mark Billinghurst is a researcher developing innovative computer interfaces that explore how virtual and real worlds can be merged to enhance face-to-face and remote collaboration. Director of the Human Interface Technology Laboratory (New Zealand) at the University of Canterbury in Christchurch, New Zealand, he has produced over 250 technical publications and his work has been demonstrated at a wide variety of conferences. He is active in several research areas, including Augmented and Virtual Reality, mobile user interfaces and collaborative computer interfaces. He has previously worked at ATR Research Labs in Japan, British Telecom’s Advanced Perception Unit, the MIT Media Laboratory and on the Google Glass project team. He is well known for his work as co-developer of the popular ARToolKit AR tracking library, for producing the first collaborative AR applications, and for some of the first mobile AR applications. One of his research projects, the MagicBook, won the 2001 Discover award for best Entertainment application, and his work on ARTennis was awarded the 2005 Independent Mobile Game award for best independent mobile phone game. In 2012 he received the ISMAR lasting impact award for the most influential paper published in AR over the previous 10 years, and in 2013 the IEEE VR Technical Achievement award.