School of Computer Science and Software Engineering

Seminars 2009

School Research Seminars Presented in 2009.

  1. Visual tracking and SLAM at sea
  2. A computational investigation of bone biology
  3. Fluctuation-induced forces in and out of equilibrium
  4. Heuristic RNA pseudoknot detection in long sequences based on stem-loop correlated energy modelling
  5. Bringing engineering to life
  6. Tracking mobile targets using energy-constrained sensor networks
  7. New insights into therapeutic drug interventions for catabolic bone diseases using an in-silico modelling approach
  8. Identifying conceptual similarities using distributed symbols and backpropagation-to-representation
  9. Information fusion techniques in multimodal biometric systems
  10. Toxicity evidence integration
  11. An approach to improve the quality of Java Swing GUI programs written by first-year computing students
  12. Using genetic algorithms to automate the optimisation of a web application interface
  13. Method for automated feedback to improve student code quality in software engineering education
  14. Tool support for learning programming style
  15. A model of parallel computation to guide programming
  16. Measuring the stakeholders' agreement level in negotiation through experiment
  17. A traceability model using requirement interaction to support change impact analysis in software development activity
  18. From wavelets to contourlets: Evolution and applications
  19. Quantified Boolean formula solvers and their applications in CAD for VLSI
  20. Adaptive approaches to electronic fraud detection in highly dynamic environments
  21. Boolean SAT, the respective solvers and their applications to CAD for VLSI verification, test and debug
  22. Query optimisation for sensor networks
  23. Semantic and social networking technologies to analyse the online media coverage on the US election 2008
  24. Restricted natural language and RDF for preference specification

Visual Tracking and SLAM at Sea

  • Ian Reid - University of Oxford
  • 2.30pm Friday 14th August
Abstract

I will discuss work with my DPhil student Charles Bibby, which considers the problem of enhancing the navigational capabilities, the collision avoidance and security systems, and the search and rescue capabilities of ships and other sea-going vessels.

Time permitting, I will describe three novel algorithms which we have developed to this end:

(i) SLAMIDE, a SLAM algorithm based on a recursive sliding-window filter that allows reversible data-association decisions and motion-model selection, so that dynamic objects can be incorporated into the environment map. We show that in simulation this algorithm achieves excellent robustness to poor initial data association (comparable to JCBB), is also very robust in cluttered environments, and can handle a high proportion of dynamic objects in the map. We also show results from challenging real radar data in a busy harbour.

(ii) An extension of SLAMIDE that incorporates an occupancy grid to model landmasses and represents the trajectories of both the ego-motion and of the dynamic objects in the map using splines. This greatly reduces the size of the map and, by replacing a discrete representation of the trajectory with a continuous one, allows for completely asynchronous measurements.

(iii) A new visual tracking and segmentation algorithm based on the evolution of a level-set contour. The key idea that differentiates this tracker from other similar ones is that we marginalise out the uncertain foreground/background membership probabilities of all pixels in the image and solve only for the two variables of interest, namely the pose and shape of the target. The algorithm is fast, requiring less than 50 ms per frame on even modest hardware, is robust to illumination and shape changes, and can track agile motion. We have deployed it to control a high-performance custom-built PTZ platform and have recently begun trials at sea.

The work is sponsored by Servowatch Systems Ltd.

About the Presenter

Ian Reid is a Reader in Engineering Science and a Fellow of Exeter College at the University of Oxford, where he jointly heads the Active Vision Group. He obtained a BSc from the University of Western Australia in 1987 and came to Oxford University on a Rhodes Scholarship in 1988, where he completed a DPhil in 1991. His research has touched on many aspects of computer vision, concentrating on algorithms for visual tracking, control of active head/eye robotic platforms (for surveillance and navigation), SLAM, visual geometry, novel view synthesis and human motion capture. He has published over 100 papers on these and related topics. He serves on the editorial boards of the Image and Vision Computing journal and the recently formed IPSJ Transactions on Computer Vision Applications.



A Computational Investigation of Bone Biology

  • Devin Sullivan
  • 11:00am Friday 7th August
Abstract

The bone cycle is a continuous and dynamic system in which old or damaged bone is constantly removed and replaced by new bone. This system consists of two basic cell types: osteoclasts and osteoblasts. The osteoclasts are, generally speaking, responsible for the catabolic effect of bone resorption, while the osteoblasts cause bone formation, an anabolic process. The coordination between these two cell types is crucial in maintaining appropriate strength in one’s bones. These cells work together in highly coordinated groups called basic multicellular units (BMUs).

This work is a summary of my three-month research scholarship visit to UWA. During that time I had the opportunity to work on a variety of problems related to bone regulation, ranging from in-vitro to in-vivo applications. Despite the obvious and extensive self-regulation of bone tissue, biologists have so far been able to do little to unravel the complexities of these cell-cell interactions. Several computational models attempt to shed light on these interactions, covering a wide range of biological questions related to the bone system. This presentation aims to highlight a few of these models and their potential applications. The first model examines the balance in the differentiation of mesenchymal stem cells into osteoblasts versus adipocytes. The next model investigates the effects of parathyroid hormone on overall bone activity and resorption. Lastly, a spatial model of a single BMU demonstrates the spatial movement of these cells as they interact. The discussion will focus on the potential therapeutic applications of each model. Of particular interest will be the predictive power of the models beyond that of traditional biological approaches.



Fluctuation-induced Forces in and Out of Equilibrium

  • Pascal Buenzli
  • 11:00am Friday 24th July
Abstract

Fluctuation-induced forces are forces that arise between objects due to the fluctuations of their surrounding media. Their presence reveals the incessant jiggling motion that otherwise averages into what seems still. Probably the best-known example of such a force is the London--van der Waals force between neutral atoms in a fluid: the atoms' fluctuating dipole moments average to zero individually, yet nevertheless produce a nonvanishing attraction between them. In 1948, H. B. G. Casimir predicted the existence of a similar attractive force between two neutral metallic plates due to the quantum fluctuations of the electromagnetic field. This so-called "Casimir force", aside from thrilling science fiction, has been the subject of extensive research since the late 1990s, when experimental setups became able to probe its strength. Many other fluctuation-induced phenomena have since entered the field.

This talk will give an overview of two distinct studies performed in this field. The first concerns a microscopic (bottom-up) approach to understanding the Casimir force between metallic plates. This calculation resolves a long-standing and heavily debated controversy about the value of the force at large separations, which dates back to the work of E. M. Lifshitz and J. Schwinger. The second topic studies various interesting properties that fluctuation-induced forces acquire when they arise out of nonequilibrium fluctuations. In particular, forces and torques can be induced on single (asymmetric) objects, and they can be tuned in both strength and sign. These properties are illustrated for various objects immersed in a basic reaction-diffusion fluid, and the calculations are compared with a Boundary Element Method (BEM) scheme.
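For reference (and not part of the speaker's results), the textbook zero-temperature benchmark is Casimir's prediction for two perfectly conducting plates at separation \(d\): an attractive pressure

\[
\frac{F}{A} \;=\; -\,\frac{\pi^{2}\hbar c}{240\,d^{4}},
\]

so the force falls off with the fourth power of the plate separation. The controversy mentioned above concerns how the force between real metallic plates behaves at large separations, where this idealised result must be corrected.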



Heuristic RNA Pseudoknot Detection in Long Sequences Based on Stem-loop Correlated Energy Modelling

  • Jana Sperschneider
  • 2pm Friday 17th July
Abstract

There are two types of nucleic acids in the living cell: deoxyribonucleic acid (DNA) and ribonucleic acid (RNA). RNA is a versatile macromolecule which is no longer seen as merely the passive intermediate between DNA and proteins. An astonishing variety of functional RNA molecules has been uncovered in the last decade. A macromolecule's function is closely connected to its three-dimensional folding; therefore, structure prediction from the base sequence is of great importance.

When only a single sequence is given, the most popular approach to RNA structure prediction is free energy minimization using dynamic programming. However, the inability to predict crossing structure elements, so-called pseudoknots, is a major drawback. Pseudoknots are functional structure elements which occur in most classes of RNA and in many viruses. From a theoretical point of view, general pseudoknot prediction is not an easy task and has been shown to be an NP-complete problem. Most practical methods reported in the literature suffer from low accuracy on longer sequences and from high running times.
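To make the dynamic-programming approach concrete, the sketch below shows the classic Nussinov-style recurrence, which maximises the number of nested base pairs rather than minimising free energy (a simplification, and not the speaker's method; notably, this nested recurrence is exactly what cannot represent pseudoknots):

```python
def max_base_pairs(seq, min_loop=3):
    """Nussinov-style O(n^3) dynamic programme over subsequences [i, j].

    Maximises the number of nested (pseudoknot-free) base pairs; practical
    predictors minimise free energy instead, but the recurrence has the same
    shape: pair (i, j), leave i or j unpaired, or split at k.
    """
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
             ("G", "U"), ("U", "G")}
    n = len(seq)
    if n == 0:
        return 0
    N = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):       # hairpin loops need >= min_loop bases
        for i in range(n - span):
            j = i + span
            best = max(N[i + 1][j], N[i][j - 1])          # i or j unpaired
            if (seq[i], seq[j]) in pairs:
                best = max(best, N[i + 1][j - 1] + 1)     # pair i with j
            for k in range(i + 1, j):                     # bifurcation
                best = max(best, N[i][k] + N[k + 1][j])
            N[i][j] = best
    return N[0][n - 1]

print(max_base_pairs("GGGAAAUCC"))  # prints 3 for this toy sequence
```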

This talk will give an introduction to a different algorithmic framework which aims to detect pseudoknots in a sequence with high confidence. The importance of the underlying folding model for pseudoknots will be discussed and recent progress in pseudoknot structure prediction will be presented.



Bringing Engineering to Life

  • Bruce Gardiner
  • 11am Friday 19th June
Abstract

A new research group, Engineering Computational Biology, has recently joined the School of Computer Science and Software Engineering. The two main aims of this seminar are to give an introductory overview of the research conducted by this group and to invite interactions with other members of CSSE. Over the past five or so years, the engineering computational biology group has been developing mechanistic mathematical/computational models of biological systems at the tissue, cellular and sub-cellular levels.

Current projects include:

  • intercellular communication in bone regulation, developmental biology and prostate cancer progression;
  • intracellular communication in colorectal cancer, cell adhesion and programmed cell death;
  • cell-tissue interactions regulating cartilage health and kidney oxygenation;
  • clinical applications such as the design of glaucoma drainage implants and strategies to predict preterm birth.

Some of these projects will be discussed to illustrate the different aspects of this research and what this engineering approach may offer to the biomedical community.



Tracking Mobile Targets Using Energy-constrained Sensor Networks

  • Prof Bijendra Jain - Gledden Senior Visiting Fellow at UWA
  • 11am Friday 25th May
Abstract

In this talk we first give a general overview of sensor networks and their applications. For the most part of the talk, however, we consider the problem of tracking mobile targets using energy-constrained sensor networks.

In particular, we consider the problem of estimating the location of a moving target ‘T’ in a 2D plane. We assume that sensors can detect the presence of the target in their vicinity and (possibly) measure the distance from/to the target. Given that the available energy in sensors is at a premium, we have proposed protocols for target detection and route activation that require sensors to conserve energy by switching between ‘inactive’ and ‘active’ modes of operation, while waking up frequently in the inactive mode to evaluate the need to become active. Yet another way to save energy is to reduce the number of measurements and, as a result, the number of transmissions. We therefore propose that energy be conserved by (a) requiring that a sensor switch to the ‘inactive’ mode whenever feasible, and (b) selecting a smaller but adequate number of sensors to measure distance and communicate with the central tracker.

Given an adequate spread of sensors, each able only to detect the presence or absence of the target within its vicinity in a timely manner, it is feasible to obtain an approximate trajectory of a mobile target as a function of time. Alternatively, distance measurements from several such sensors may be used to estimate the location of the target T at a point in time. Clearly, the latter approach is expected to track the target more accurately; however, it requires at least three sensors within the vicinity of the target and therefore a significantly greater density of sensors. The error in estimating the location of the target from the distance measurements of multiple sensors is shown to depend on two measures, namely the proximity of the sensors to the target and the collinearity of the sensors. We also propose a new measure, the ideal direction, for selecting a third sensor given the locations of two sensors and the location of the target. We propose algorithms to estimate the track of the target using distance measurements from sensors selected on the basis of the above measures, and we evaluate all the protocols and algorithms using simulations.
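As a small illustration of the distance-based location estimation mentioned above (hypothetical data and a plain linearised least-squares solve; not the proposed protocols or sensor-selection measures), three or more range measurements can be combined as follows:

```python
import numpy as np

def estimate_location(sensors, distances):
    """Least-squares 2D position estimate from >= 3 range measurements.

    Each range equation (x - x_i)^2 + (y - y_i)^2 = d_i^2 is linearised by
    subtracting the first equation, giving
        2(x_i - x_0) x + 2(y_i - y_0) y
            = d_0^2 - d_i^2 + (x_i^2 + y_i^2) - (x_0^2 + y_0^2).
    """
    sensors = np.asarray(sensors, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (sensors[1:] - sensors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(sensors[1:] ** 2, axis=1) - np.sum(sensors[0] ** 2))
    estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
    return estimate

# Three non-collinear sensors and (noisy) distance readings to the target:
print(estimate_location([(0, 0), (10, 0), (0, 10)], [5.0, 7.1, 7.1]))
```

The accuracy of such an estimate degrades when the selected sensors are nearly collinear or far from the target, which is exactly what the proximity and collinearity measures above are meant to capture.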

About the Presenter

Professor Bijendra Jain obtained a B.Tech. from IIT Kanpur in 1970 and a Ph.D. from SUNY Stony Brook in 1975, both in Electrical Engineering. Since 1975 he has been with IIT Delhi, where he is presently Deputy Director (Faculty) and Professor of Computer Science. In the past he has held visiting assignments with the Universities of Texas and Maryland, Bell Labs, and Cisco Systems. Presently, he is a Gledden Senior Visiting Fellow at the University of Western Australia.

His interest is in computer networks and systems, including network models and analysis, algorithms for large sparse matrix operations, scheduling algorithms for hard real-time systems, and fault-tolerant routing. His recent interest, however, is in ad hoc and sensor networks. His research is funded in part by the Government of India, UNDP, the US Army, Sun Microsystems, Microsoft and Media Lab Asia. As early as 1989, Prof. Jain, together with developers from other institutions in India, built and launched India’s first data network, ERNet. Today, ERNet is a thriving not-for-profit company connecting over a million users spread across more than 2000 institutions in India. He is a co-inventor on seven US patents assigned to Cisco Systems; these cover methods to speed up access to Web pages and to monitor IP network performance efficiently. He has co-authored "OSI: Its Architecture and Protocols", a book published by McGraw Hill, New York.

He is an active industry consultant and a member of several government committees, including the Naval Research Board. Lastly, Professor Jain is a co-founder and past Chairman of Kritikal Solutions, a technology start-up incubated on the IIT Delhi campus that focuses on computer vision, embedded systems and networks.



New Insights into Therapeutic Drug Interventions for Catabolic Bone Diseases Using an In-silico Modelling Approach

  • Peter Pivonka
  • 11am Friday 1st May
Abstract

The conceptual model employed by bone scientists today is based on the dynamic balance between two main bone cell types, the osteoclasts and osteoblasts, which continuously resorb bone and form new bone. This process is referred to as bone remodelling. This conceptual approach has given many new insights into how bone structure and function are influenced by a variety of factors, including hormones, cytokines, mechanical loading and gene mutations, to name only a few. These two cell types do not work independently; rather, they work in a coordinated way in so-called basic multi-cellular units (BMUs), with cross-talk between the cell types coordinating their functional behaviours. Given this interaction, it is very difficult to predict what might happen following some change in the bone microenvironment. To date there have been few attempts to integrate all the key observations into a theoretical framework that allows theoretical predictions and subsequent experimental investigation. Mathematical modelling provides the basis for the "translation" of conceptual models into theoretical models, which can then be employed to quantitatively investigate various hypotheses and to study the behaviour of the system as a whole rather than of single components. This presentation will discuss how in-silico modelling can be applied to advance our current knowledge of bone biology, with particular emphasis on therapeutic interventions in catabolic bone diseases.
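As a purely generic illustration of what such a "translation" looks like (a sketch of the common cell-population ODE style, not the specific model presented in this talk), osteoblast (OB) and osteoclast (OC) densities and the bone volume (BV) can be written as coupled rate equations:

\[
\frac{d\,\mathrm{OB}}{dt} = D_{\mathrm{OB}}\,\pi_{\mathrm{OB}}(\mathbf{c}) - A_{\mathrm{OB}}\,\mathrm{OB},
\qquad
\frac{d\,\mathrm{OC}}{dt} = D_{\mathrm{OC}}\,\pi_{\mathrm{OC}}(\mathbf{c}) - A_{\mathrm{OC}}\,\mathrm{OC},
\qquad
\frac{d\,\mathrm{BV}}{dt} = k_{\mathrm{form}}\,\mathrm{OB} - k_{\mathrm{res}}\,\mathrm{OC},
\]

where \(D\) and \(A\) are differentiation and apoptosis rates, the activation functions \(\pi(\mathbf{c})\) encode the cell-to-cell cross-talk through the concentrations \(\mathbf{c}\) of signalling molecules, and \(k_{\mathrm{form}}\), \(k_{\mathrm{res}}\) scale formation and resorption. A therapeutic intervention is then represented as a change to one of these parameters or concentrations, and its long-term effect on bone volume can be simulated.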



Identifying Conceptual Similarities Using Distributed Symbols and Backpropagation-to-representation

  • Peter Dreisiger
  • 11am Friday 24th April
Abstract

The tasks of concept formation and concept recognition underlie our ability to generalise knowledge and cope with uncertainty; these tasks, however, also assume that we are able to identify shared traits and conceptual similarities. While researchers within the cognitive sciences continue to study our ability to form concepts and estimate similarities, the growing need for data mining and analysis tools has led to the development of more heuristic forms of statistical pattern recognition and conceptual clustering; some of these techniques have even drawn upon advances in the area of dimensional reduction.

Non-linear Principal Component Analysis, self-organising feature maps and the auto-encoder neural network allow us to discover new trends in numerical data; while these techniques are able to discover many hidden patterns and relationships, they are restricted to the analysis of entities, or observations, that have a natural, metrical representation. For symbolic data, there are several forms of co-occurrence and correlation analysis, and techniques such as Latent Semantic Analysis (LSA) have been developed to discover similarities in textual documents; LSA, in particular, can discover implicit relationships based upon the terms' shared neighbourhoods. What these techniques lack, however, is the ability to capture syntax and usage.

One approach which has received very little attention from the data mining community is a variant of the auto-encoder that uses an extended form of back-propagation to capture regularities in the properties and roles of terms and other, non-numerical entities. While this technique, called FGREP, has been used to model the process of concept formation in humans, it has not, as yet, been used to discover patterns and conceptual clusters in larger sets of data; nor has it been used within the intelligent agent community to classify or group entities based upon their perceivable attributes.

The aims of this project, then, are three-fold: (1) to investigate FGREP's ability to discover meaningful concepts in larger sets of symbolic data; (2) to identify its limitations and investigate how we can improve its average performance; and (3) to compare it to other existing techniques. In this talk, we will introduce FGREP and our implementation of this technique, we will present the results of our first set of experiments, and we will see how its conceptual clusters compare to those found using LSA.
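To give a flavour of the "backpropagation-to-representation" idea (a minimal toy sketch under our own assumptions, not the project's implementation or the original FGREP system), the snippet below trains an auto-encoder on distributed symbol vectors and pushes the error signal one layer further, so that the lexicon entries themselves are adjusted:

```python
import numpy as np

rng = np.random.default_rng(0)
symbols = ["dog", "cat", "car", "truck"]          # toy lexicon
dim, hidden, lr, epochs = 8, 3, 0.5, 2000
lexicon = {s: rng.uniform(0.1, 0.9, dim) for s in symbols}

W1 = rng.normal(0, 0.1, (hidden, dim)); b1 = np.zeros(hidden)
W2 = rng.normal(0, 0.1, (dim, hidden)); b2 = np.zeros(dim)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(epochs):
    for s in symbols:
        x = lexicon[s]
        h = sigmoid(W1 @ x + b1)            # encode
        y = sigmoid(W2 @ h + b2)            # reconstruct (auto-association)
        d2 = (y - x) * y * (1 - y)          # output-layer delta
        d1 = (W2.T @ d2) * h * (1 - h)      # hidden-layer delta
        W2 -= lr * np.outer(d2, h); b2 -= lr * d2
        W1 -= lr * np.outer(d1, x); b1 -= lr * d1
        # The extra step: propagate the error into the representation itself.
        lexicon[s] = np.clip(x - lr * (W1.T @ d1), 0.0, 1.0)

# Representations of symbols that are *used* alike drift together; in FGREP
# proper this happens because the network is trained on contextual
# (sentence-processing) tasks, not on isolated reconstruction as here.
print(np.linalg.norm(lexicon["dog"] - lexicon["cat"]))
```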



Information Fusion Techniques in Multimodal Biometric Systems

  • Maryam Mehdizadeh
  • 11am Friday 17th April
Abstract

Most biometric systems deployed in real-world applications are unimodal, i.e., they rely on the evidence of a single source of information for authentication (e.g., a single fingerprint or face). These systems have to contend with a variety of problems such as: (a) noise in sensed data: a fingerprint image with a scar, or a voice sample altered by a cold, are examples of noisy data. Multibiometrics is expected to overcome some of the limitations of unibiometric systems by combining the evidence presented by multiple biometric sources. This integration of information is known as information fusion and, if done appropriately, can enhance matching accuracy, increase population coverage and deter spoofing activities.

The aim of this research is to investigate how methods of semi-supervised learning and non-linear dimensionality reduction can be used in multimodal biometric systems for the purpose of feature fusion. We intend to explore how feature selection and feature fusion can be made part of the learning algorithm and automated as much as possible. We will learn the feature spaces of face and hand biometric data with Locally Linear Embedding (LLE), a nonlinear feature descriptor, and consolidate (fuse) the two spaces in an adaptive feature-fusion framework using semi-supervised learning methods.
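As a baseline illustration of the intended pipeline (random placeholder data and a simple concatenation fusion; the adaptive, semi-supervised fusion is the research question itself), LLE can be applied to each modality before fusing the low-dimensional embeddings:

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(0)
face_features = rng.random((200, 1024))   # e.g. flattened face images (placeholder)
hand_features = rng.random((200, 512))    # e.g. hand-shape descriptors (placeholder)

# Nonlinear dimensionality reduction per modality.
face_embedded = LocallyLinearEmbedding(n_neighbors=12, n_components=10).fit_transform(face_features)
hand_embedded = LocallyLinearEmbedding(n_neighbors=12, n_components=10).fit_transform(hand_features)

# Naive feature-level fusion: concatenate the two embeddings and pass the
# result to any (semi-)supervised classifier.
fused = np.hstack([face_embedded, hand_embedded])
print(fused.shape)   # (200, 20)
```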



Toxicity Evidence Integration

  • Alison Anderson
  • 11am Friday 3rd April
Abstract

The environment is increasingly contaminated with thousands of chemical compounds whose health effects are not well understood. Toxicity arises from complex interactions between environmental agents, genes and the mechanisms that control gene expression, collectively known as the epigenome. Advances in bioinformatics are driving new approaches to investigating and evaluating these complex toxicity pathways. Fundamental to these objectives is the need to integrate information, in particular evidence of chemical toxicities, from disparate biological resources. For this project, information from disparate publicly available resources has been downloaded and serialised as triple graphs, whose statements comprise three components: (subject, predicate, object) or (entity, attribute, value). An ontology-driven method has been developed to extract toxicity-specific subgraphs of information from the aggregated triplestore. Toxicity information is mapped to the ontology via N3Logic, a W3C specification which allows rules to be expressed in a Web environment. This process provides a framework that facilitates the integration and manipulation of toxicity evidence arising from multi-disciplinary research. It is the first step towards the main objective of this PhD: to provide a holistic weight-of-evidence approach to assessing toxicity evidence.
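For illustration, a triple graph of this kind can be built and queried with the rdflib package (hypothetical namespace and terms; the project's actual ontology, data sources and N3Logic rules are not shown):

```python
from rdflib import Graph, Literal, Namespace

TOX = Namespace("http://example.org/tox#")   # hypothetical vocabulary

g = Graph()
benzene = TOX["benzene"]
g.add((benzene, TOX["hasToxicEffect"], TOX["haematotoxicity"]))
g.add((benzene, TOX["classifiedAs"], TOX["carcinogen"]))
g.add((benzene, TOX["casNumber"], Literal("71-43-2")))

# Extract a toxicity-specific subgraph from the aggregated triplestore.
results = g.query("""
    PREFIX tox: <http://example.org/tox#>
    SELECT ?chemical ?effect
    WHERE { ?chemical tox:hasToxicEffect ?effect . }
""")
for chemical, effect in results:
    print(chemical, effect)
```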



An Approach to Improve the Quality of Java Swing GUI Programs Written by First-year Computing Students

  • Rieky Barady - Master of Computer Science proposal
  • 12pm Thursday 26th March
Abstract

Graphical User Interfaces (GUIs) have become the most popular means of interacting with today's software. However, testing and learning to correctly implement a GUI program is difficult for several reasons: students often encounter problems with error handling, navigation, finding bugs and using the language libraries. We propose to develop learning tools that support students in improving the quality of their Java Swing GUI programs: a semi-automated support system containing a mixture of development tools, test cases, guided tutorial exercises and checklists. The system will be evaluated by analysing the quality of student GUI programming assignments developed both with and without the tool support.



Using Genetic Algorithms to Automate the Optimisation of a Web Application Interface

  • Simon Geoghegan - Honours proposal
  • 12pm Thursday 26th March
Abstract

One major problem in the design of the human-computer interface of a web application is deciding on a layout that will maximise ease of use, maximise the usage of the available functions and be the most aesthetically pleasing. A good interface design can maximise the number of return visitors and the usage of the system. For these reasons, this project aims to build an automated system which uses genetic algorithms to find the optimal interface for a web application. The project hopes to increase user retention rates and the usability of the "Virtual Health Monitor(tm)" system.
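As an indicative sketch of the genetic-algorithm loop (hypothetical layout encoding and a placeholder fitness function; in practice fitness would come from measured user behaviour such as return visits or task completion rates):

```python
import random

# Each individual is a candidate interface layout encoded as discrete design choices.
CHOICES = {
    "menu_position": ["top", "left", "right"],
    "colour_scheme": ["light", "dark", "high_contrast"],
    "font_size": ["small", "medium", "large"],
    "buttons_per_row": [2, 3, 4],
}
GENES = list(CHOICES)

def random_layout():
    return {g: random.choice(CHOICES[g]) for g in GENES}

def fitness(layout):
    # Placeholder: in a real system this would be measured from users
    # interacting with the generated interface (A/B-style trials).
    return random.random()

def crossover(a, b):
    return {g: random.choice([a[g], b[g]]) for g in GENES}

def mutate(layout, rate=0.1):
    return {g: (random.choice(CHOICES[g]) if random.random() < rate else v)
            for g, v in layout.items()}

population = [random_layout() for _ in range(20)]
for generation in range(50):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[:10]                       # truncation selection
    population = parents + [mutate(crossover(random.choice(parents),
                                             random.choice(parents)))
                            for _ in range(10)]
```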



Method for Automated Feedback to Improve Student Code Quality in Software Engineering Education

  • Lesley Zhang - Masters Research proposal
  • 12pm Thursday 26th March
Abstract

Open problems in software engineering education include the lack of effective techniques for improving code quality, students' lack of experience with the testing process, and the fact that not all students are challenged by existing curricula. To address these points, automated assessment approaches and test-driven development (TDD) can be used, alongside a flexible assessment approach that supports the different abilities of students.


 


Tool Support for Learning Programming Style

  • You Hai Lim - Honours proposal
  • 12pm Thursday 26th March
Abstract

Maintenance is an important part of the software development life cycle. Poor programming style can greatly affect the readability of code, which makes maintenance harder and lowers the quality of the software. New programming students usually overlook the importance of programming style, as it forms only a small part of programming units. This project aims to develop a combination of code-styling tools and exercises to help students learn good programming style.



A Model of Parallel Computation to Guide Programming

  • Prof Larry Snyder - University of Washington
  • 11am Friday 13th March
Abstract

Most computers sold now are parallel, that is, they have multiple processors. If the parallel computing research of the last 30 years had been completely successful, we would have the perfect parallel language to write programs for these machines. It was not completely successful and so the tools available -- threading, message passing, OpenMP, etc. -- are inadequate. As a result, these machines mostly run sequential programs and so are being underutilized. In the talk I will quickly explain the difficulties with the current approaches, and then show how it is possible to write quality parallel programs with presently available resources. Key to the approach will be a model of parallel computation that achieves both parallel performance and portability. Numerous examples will illustrate the ideas.



Measuring the Stakeholders' Agreement Level in Negotiation Through Experiment

  • Sabrina Ahmed
  • 11am Friday 6th March
Abstract

Determining the right requirements for developing a system is crucial, as it involves a variety of people and affects the quality of the end product. Different stakeholders have different requirements, which they may express in different ways. Naturally, they will express requirements in their own terms and with implicit knowledge of their own work. Generally, stakeholders are not sure of what they want from the computer system except in the most general terms; hence, conflicts are inevitable. Negotiation is applied to resolve these conflicts and is becoming a popular way to improve the requirements engineering process. This work therefore aims to empirically confirm the effectiveness of negotiation in improving the level of agreement among all the stakeholders. Agreed requirements are believed to represent all the stakeholders' perspectives and perceptions and to yield a set of unambiguous, correct, complete, consistent and achievable requirements.



A Traceability Model Using Requirement Interaction to Support Change Impact Analysis in Software Development Activity

  • Nazri Kama
  • 11am Friday 27th February
Abstract

Software traceability is the ability to relate the requirements specification to the other software artefacts created in the development life-cycle of a software system. Typically, traceability denotes satisfiability, dependency and evolution relations between software artefacts. It provides important insight for software change impact analysis, giving essential support in understanding the impacted artefacts within and across software phases. This research proposes a traceability model to support software change impact analysis through a requirements interaction analysis approach. This approach is based on the precept that requirements are not individualistic in nature; rather, they interact with each other in order to fulfil stakeholders’ needs. In normal practice, if a particular requirement in the requirements document changes, the change will also affect the other requirements it interacts with. This talk presents some progress towards improving change impact analysis results through a traceability model that exploits the capability of requirements interaction analysis.



From Wavelets to Contourlets: Evolution and Applications

  • Assoc. Professor Atif Mansoor - National University of Sciences and Technology, Pakistan
  • 11am Friday 20th February
Abstract

The Fourier transform is well known for its signal processing applications. However, it provides only frequency resolution and lacks time resolution. To overcome this limitation, the Short-Time Fourier Transform and subsequently the Wavelet Transform were developed. The Wavelet Transform, with its strength in multi-resolution analysis, has found applications in diverse domains. However, the Wavelet Transform represents discontinuities in the vertical or horizontal directions only and is thus unable to effectively represent discontinuities in two-dimensional signals such as images. In this “beyond wavelets” era, researchers have proposed various new transforms addressing this limitation, including curvelets, brushlets, directionlets and contourlets. This talk discusses this evolution, followed by the details of the Contourlet Transform. Finally, some of the research performed by the Computer Vision and Biometrics Group at NUST on palmprint identification and image steganalysis using the Contourlet Transform will be discussed.
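As a small illustration of the limitation being described (a hand-rolled single-level 2-D Haar decomposition, the simplest wavelet; not the Contourlet Transform itself), the detail bands below separate only horizontal, vertical and diagonal variation, which is why directional transforms such as contourlets were introduced:

```python
import numpy as np

def haar_dwt2(image):
    """Single-level 2-D Haar wavelet decomposition (minimal sketch)."""
    img = np.asarray(image, dtype=float)
    img = img[: img.shape[0] // 2 * 2, : img.shape[1] // 2 * 2]  # even size
    lo_x = (img[:, 0::2] + img[:, 1::2]) / 2.0   # low-pass along x
    hi_x = (img[:, 0::2] - img[:, 1::2]) / 2.0   # high-pass along x
    cA  = (lo_x[0::2, :] + lo_x[1::2, :]) / 2.0  # approximation (low/low)
    dLH = (lo_x[0::2, :] - lo_x[1::2, :]) / 2.0  # detail: high-pass along y only
    dHL = (hi_x[0::2, :] + hi_x[1::2, :]) / 2.0  # detail: high-pass along x only
    dHH = (hi_x[0::2, :] - hi_x[1::2, :]) / 2.0  # detail: high-pass along both axes
    return cA, (dHL, dLH, dHH)

cA, details = haar_dwt2(np.random.rand(64, 64))
print(cA.shape, [d.shape for d in details])      # (32, 32) sub-bands
```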

About the Presenter

Associate Professor Atif Mansoor graduated from NED University with a BE degree in Avionics. After winning a prestigious national scholarship from the Ministry of Science and Technology, Pakistan, he went to Germany for his postgraduate studies and was awarded the best foreign student award by the Hamburg University of Technology. He subsequently completed an MPhil at the University of Engineering and Technology, Taxila. He is currently the head of the Computer Vision and Biometrics Group at the College of Aeronautical Engineering, National University of Sciences and Technology, Pakistan. His research interests are wavelets and their variants, signal processing, image processing, computer vision, pattern recognition and biometrics.



Quantified Boolean Formula Solvers and their Applications in CAD for VLSI

  • Andreas Veneris - University of Toronto
  • 11am Monday 9th February
Abstract

Formal CAD tools operate on mathematical models describing the sequential behaviour of a VLSI design. With the growing size and state-space of modern digital hardware designs, the conciseness of this mathematical model is of paramount importance in extending the scalability and applicability of formal CAD tools, provided that the compression does not come at the cost of reduced performance. An Iterative Logic Array (ILA) representation, which replicates the circuit for a bounded number of time-frames, is required in many Boolean satisfiability (SAT) based encodings of CAD problems involving sequential designs, and can often exceed available memory resources. Quantified Boolean Formula satisfiability (QBF) is a powerful generalisation of SAT which belongs to the same complexity class as many sequential CAD problems, making it a natural candidate as a formal platform for encoding these problems.

In this talk, we first present a succinct QBF encoding of the ILA using a single copy of the design. We then show how to parametrise this encoding to achieve further compression, using a novel technique called time-frame windowing. This technique generates a family of ILA encodings by balancing the roles of explicit circuit unrolling and of universal quantification in reasoning about sequential behaviour. The generated QBF-based ILA encodings are shown to admit a non-trivial minimal-size member, whose use proves to be empirically vital in the experiments. Comprehensive hardware constructions are used to illustrate the proposed encodings.

Three notable CAD problems, namely bounded model checking, design debugging and sequential test pattern generation, are encoded as QBF instances to demonstrate the robustness and practicality of the proposed approach. Extensive experiments on OpenCores circuits show memory reductions of the order of 90 per cent and demonstrate competitive and often superior run-times compared with state-of-the-art SAT techniques. As an added advantage, the total number of solved instances is increased by 16 per cent. Furthermore, we contrast search-based and resolution-based QBF solving strategies and analyse their performance on our encodings. The theoretical and practical contributions presented in this talk encourage further research into effective QBF encodings and QBF solvers for performance-driven CAD solutions to intractable VLSI design problems.
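To make the Iterative Logic Array concrete, the sketch below shows plain explicit time-frame unrolling for a SAT encoding (hypothetical clause and variable layout; this is the memory-hungry baseline that the talk's single-copy QBF encoding is designed to avoid):

```python
def unroll_ila(trans_clauses, n_state, n_in, k):
    """Replicate a transition relation for k time-frames (an Iterative Logic Array).

    trans_clauses: CNF clauses (lists of signed ints) over variables
        1 .. n_state                      current-state bits
        n_state+1 .. n_state+n_in         primary-input bits
        n_state+n_in+1 .. 2*n_state+n_in  next-state bits
    The next-state variables of frame t are identified with the state
    variables of frame t+1, chaining the circuit copies together.
    """
    block = n_state + n_in

    def rename(lit, t):
        v = abs(lit)
        if v <= n_state + n_in:                # state or input bit of frame t
            g = t * block + v
        else:                                  # next-state bit -> frame t+1
            g = (t + 1) * block + (v - n_state - n_in)
        return g if lit > 0 else -g

    return [[rename(lit, t) for lit in clause]
            for t in range(k) for clause in trans_clauses]

# Toy example: a 1-bit toggle register s' = s XOR i, as CNF over (s=1, i=2, s'=3).
toggle = [[-1, -2, -3], [-1, 2, 3], [1, -2, 3], [1, 2, -3]]
print(unroll_ila(toggle, n_state=1, n_in=1, k=3))
```

The number of clauses grows linearly with the bound k, which is exactly the memory pressure that motivates replacing explicit replication with universal quantification over a single circuit copy.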



Adaptive Approaches to Electronic Fraud Detection in Highly Dynamic Environments

  • Mohammad Behdad
  • 11am Friday 6th February
Abstract

As more and more companies and government agencies move towards electronic processing, they become more vulnerable to large-scale and systematic fraud. This trend has led to a significant research effort towards providing algorithms and methods for fraud detection. These efforts have not been fully successful due to the unique characteristics of fraud detection, the most important of which is its adaptive environment: as fraud detection techniques improve, fraudsters change their behaviour. Other challenging characteristics of fraud detection are imbalanced data, unequal misclassification costs, concept drift, overwhelmingly large volumes of data, the required high accuracy, the required fast processing times and the lack of a sufficient amount of training data. A Learning Classifier System (LCS) is an adaptive machine learning technique which combines reinforcement learning and evolutionary computing. It has been successfully applied in areas such as modelling, robotics and data mining. However, despite having the very important feature of adaptability, it has not been extensively applied to fraud detection, because it is susceptible to imbalanced data and to large volumes of data. The aim of this research is to tune and use LCS to detect electronic fraud. The improvements to the LCS algorithms will then be generalised and will be readily applicable to other domains.



Boolean SAT, the Respective Solvers and their Applications to CAD for VLSI Verification, Test and Debug

  • Andreas Veneris - University of Toronto
  • 11am Thursday 5th February
Abstract

Recent years have seen an increased use of Boolean satisfiability (SAT) in the design cycle for Very Large Scale Integration (VLSI) circuits as an engine to solve intractable problems. The use of SAT in the VLSI design cycle is strengthened by the amount of ongoing research into SAT solvers: any improvement to the state of the art in SAT solving immediately benefits all SAT-based solutions. Although useful in many stages of the design cycle, logic diagnosis has not previously been addressed within a satisfiability-based framework. Logic diagnosis arises in the digital VLSI cycle during logic debugging and during silicon debug. In this talk we will first present the necessary background and historical perspective on Boolean SAT engines in CAD for VLSI. We will then present the first Boolean SAT-based formulation for multiple fault/error diagnosis in combinational and sequential circuits. A number of heuristics will be outlined that improve performance for large designs. An extensive suite of experiments on large benchmark and industrial circuits confirms the robustness and practicality of SAT-based techniques, as they outperform conventional methods by orders of magnitude. These results suggest that satisfiability captures significant characteristics of the diagnosis problem. They also encourage novel research into satisfiability-based diagnosis approaches as a process complementary to that of design verification.

About the Presenter

Andreas Veneris received the Diploma in Computer Engineering and Informatics from the University of Patras in 1991, the M.S. degree in Computer Science from the University of Southern California, Los Angeles, in 1992, and the Ph.D. degree in Computer Science from the University of Illinois at Urbana-Champaign in 1998. He is currently an Associate Professor cross-appointed to the Department of Electrical and Computer Engineering and the Department of Computer Science at the University of Toronto. His research interests include algorithms and CAD tools for the debug, verification and test of digital systems and circuits. He is a co-recipient of a best paper award at ASP-DAC'01, a co-author of a book, and a member of the IEEE, ACM, AAAS, the Technical Chamber of Greece and The Planetary Society. He is also the President/CEO of a university spin-off company that commercializes the research of his group on automated debugging.



Query Optimisation for Sensor Networks

  • Chi Yang  - PhD proposal
  • 11am Friday 30th January
Abstract

Sensor networks enable scientists to monitor the spatial and temporal attributes of a landscape. Scientists and other users apply the database paradigm to manipulate wireless sensor networks (WSNs) and extract data through queries. Several WSN query systems have been developed, including TinyDB, TinyCubus, TinyLIME, Semantic Streams, Regiment and Kairos. However, most of them can be inefficient because they have limited support for using application domain knowledge and for partially accurate responses to queries. To address this problem, my PhD project plans to build a query system for WSNs based on TinyDB. The new system aims to gain more efficiency with the help of application domain knowledge, using techniques such as learning and probabilistic reasoning.



Semantic and Social Networking Technologies to Analyse the Online Media Coverage on the US Election 2008

  • Dr Arno Scharl - MODUL University Vienna
  • 11am Friday 23rd January
Abstract

The US Election 2008 Web Monitor acknowledges the influential role of online media in the US presidential elections and analyses publications from different stakeholder groups at weekly intervals. For this purpose, the system captures the Web sites of international media, the Fortune 1000 (the biggest US companies in terms of revenue), as well as 1000 popular blogs on political issues. For each presidential candidate, the system automatically extracts media attention, sentiment and associated keywords from a repository of 800,000 Web documents.

The project uses two Facebook applications to gather social information. The first application lets users vote for and track their preferred candidates. The second, called ‘Sentiment Quiz’, follows the tradition of games with a purpose by inviting Facebook users and their networks of online friends to evaluate whether quotes from an archive of election-related documents express positive or negative sentiment. Besides improving algorithms for automated sentiment detection, the system analyses information diffusion in social networks and leverages the wisdom of crowds to investigate individual perceptions of Web documents with political content.

Available at www.ecoresearch.net/election2008, the US Election 2008 Web Monitor has recently been awarded the First Prize in the category "Online Communities, Web 2.0 and Social Networks" of the Austrian National Award for Multimedia and e-Business.

About the Presenter

Professor Arno Scharl is the Vice President of MODUL University Vienna, where he also heads the Department of New Media Technology. Prior to his current appointment, he held professorships at the University of Western Australia and Graz University of Technology, was a Key Researcher at the Austrian Competence Center for Knowledge Management, and a Visiting Fellow at Curtin University of Technology and the University of California at Berkeley. Arno Scharl completed his doctoral research at the Vienna University of Economics and Business Administration. Additionally, he holds a PhD and MSc from the University of Vienna, Department of Sports Physiology. He edited a book on "The Geospatial Web" (www.geospatialweb.com) and founded the ECOresearch Network (www.ecoresearch.net), which also hosts the Media Watch on Climate Change (www.ecoresearch.net/climate). His current research interests focus on the integration of semantic and geospatial Web technology, Web mining and media monitoring, virtual communities and environmental online communication.



Restricted Natural Language and RDF for Preference Specification

  • Dr Johann Mitloehner - Vienna University
  • 11am Friday 16th January
About the Presenter

Dr Mitloehner is an Assistant Professor at the Institute for Information Business of the Vienna University of Economics and Business Administration. His research areas include:

  • Genetic Algorithms
  • Preference Modelling and Social Choice Aggregation Rules
  • Software Project Effort Estimation with Ordinally Scaled Input Data
  • Semantic Web Technologies using Restricted Natural Language


 
