Projects



Teaching assignment utilising metaheuristics

Allocating educators to diverse and rapidly evolving programmes of study, such as those within Computing, under increasingly tight budgetary constraints is a non-trivial task. The subject areas encompassed span information technology, information systems, software engineering and computer science, as well as core areas such as computer forensics and computer games development. Suitability and availability of expertise, coupled with an aspiration to limit disruption to existing teaching assignments, often result in first-fit solutions that are less than optimal. The system is also highly sensitive to small changes, which ripple out through assignments and make the problem difficult to solve. Ongoing work has produced a methodology for profiling modules and, by association, educator expertise, providing a basis for exploring a large number of potential teaching assignments using search algorithms. The prototype system limited profiles to what staff had previously delivered; the live system integrated data provided by all academic staff within Computing, enhancing the suitability profiles by indicating which elements of the curriculum staff would like to deliver and which they could deliver. Implemented live for the first time in preparation for the 2013 academic year, the process rapidly achieved a very good solution to this difficult problem, evidenced by the significantly lower incidence of change requests from colleagues. The solution minimised disruption to existing assignments, highlighted bottleneck areas and distributed 167 teaching units across 44 members of staff while minimising under- and over-assignment, despite the very large number of unassigned modules left by retiring colleagues. Further work includes more elaborate workload-balancing heuristics, the application of other algorithms, and expansion into other disciplines.
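
As an illustration of the search component, the sketch below implements a simple neighbourhood search over hypothetical suitability scores; the staff names, module names and cost terms are placeholders, not the project's actual profiling data.

```python
import random

# Hypothetical data: suitability[staff][module] in [0, 1], higher is better.
staff = ["A", "B", "C"]
modules = ["Networks", "Databases", "Graphics", "Forensics"]
suitability = {s: {m: random.random() for m in modules} for s in staff}
LOAD_TARGET = len(modules) / len(staff)  # ideal teaching units per person

def cost(assignment):
    """Penalise unsuitable matches and unbalanced workloads."""
    unsuit = sum(1.0 - suitability[s][m] for m, s in assignment.items())
    loads = {s: 0 for s in staff}
    for s in assignment.values():
        loads[s] += 1
    imbalance = sum(abs(l - LOAD_TARGET) for l in loads.values())
    return unsuit + imbalance

def local_search(iterations=1000):
    """Hill-climb over single-module reassignments (the simplest neighbourhood)."""
    assignment = {m: random.choice(staff) for m in modules}
    best = cost(assignment)
    for _ in range(iterations):
        m = random.choice(modules)
        old = assignment[m]
        assignment[m] = random.choice(staff)
        c = cost(assignment)
        if c <= best:
            best = c
        else:
            assignment[m] = old  # revert a worsening move
    return assignment, best

print(local_search())
```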


Ethics of Technology with Applications in Healthcare

Collecting and analysing data for commercial use has become general practice, supported by advances in available technology. Developers now face the question of whether their applications are ethically and morally acceptable. This project studies the possibilities of engineering systems with built-in ethical arguments, as well as developing further ideas for ethically approved systems. Within this project we study how mobile technology and wearables can be used to support patients with early-onset dementia.


Shaping the future of the intelligent home

The home of the future will use a variety of sensors to perceive and learn about the habits of the person or persons using each room. Partial information will be gathered and put into a larger context by communicating agents. These agents form a multi-agent system in which agents exchange information to achieve a desired goal by picking suitable plans (rules) and making them the current intention. Several intentions can be active concurrently in pursuit of a number of (sub)goals, and agents might also have to work together (form a coalition) to achieve their goals. Changes in the environment, such as resources becoming insufficient, can force an agent to drop an intention and pick an alternative plan. This project will look into shaping the future of the intelligent home.
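
The deliberation cycle described above can be sketched as follows; the Plan and Agent structures and the heating example are illustrative stand-ins for a full BDI-style implementation, not the project's code.

```python
class Plan:
    def __init__(self, name, guard, action):
        self.name = name
        self.guard = guard      # beliefs -> bool: is the plan still viable?
        self.action = action    # beliefs -> None: one step of the plan body

class Agent:
    def __init__(self, plans):
        self.beliefs = {}       # what the agent currently knows
        self.intentions = []    # (goal, plan) pairs it is committed to
        self.plans = plans      # goal -> list of candidate plans

    def perceive(self, percepts):
        self.beliefs.update(percepts)

    def adopt(self, goal):
        # Commit to the first applicable plan for the goal, if any.
        for plan in self.plans.get(goal, []):
            if plan.guard(self.beliefs):
                self.intentions.append((goal, plan))
                return True
        return False

    def step(self):
        # Advance each intention; drop it and replan if, for example, the
        # environment no longer provides the resources the plan needs.
        for goal, plan in list(self.intentions):
            if plan.guard(self.beliefs):
                plan.action(self.beliefs)
            else:
                self.intentions.remove((goal, plan))
                self.adopt(goal)

heat = Plan("heat_room",
            guard=lambda b: b.get("power", 0) >= 1,
            action=lambda b: print("heating; temp =", b.get("temp")))
agent = Agent({"warm_room": [heat]})
agent.perceive({"power": 2, "temp": 18})
agent.adopt("warm_room")
agent.step()
```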


Business rule generation for product data quality assurance

This EPSRC-funded project is a partnership with Newport-based company GXS PDQ Ltd. The company produces business rules for customers in the Retail Fast Moving Consumer Goods and Consumer Electronics sectors, and incorporates these rules into its Product Data Quality (PDQ) service for data quality checking, enabling its customers to identify quality failures in their data. However, generating business rules is time-consuming and must be repeated following changes in the business environment, the addition of new products or sales channels, or changes to existing products. The project will apply artificial intelligence (including semantic and statistical data mining techniques) to automate the generation of these rules.
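
A minimal sketch of the kind of statistical rule induction involved is shown below, assuming hypothetical product records and field names: a plausible numeric range is learned per category and used to flag outlying values. This illustrates the general approach, not GXS PDQ's actual rule format.

```python
import statistics

# Illustrative product records; categories, fields and values are invented.
products = [
    {"category": "beverage", "net_weight_g": 330},
    {"category": "beverage", "net_weight_g": 500},
    {"category": "beverage", "net_weight_g": 1000},
    {"category": "tv", "net_weight_g": 8200},
    {"category": "tv", "net_weight_g": 12500},
]

def induce_range_rules(records, attribute, tolerance=0.25):
    """Learn a per-category plausible range for a numeric attribute."""
    by_cat = {}
    for r in records:
        by_cat.setdefault(r["category"], []).append(r[attribute])
    rules = {}
    for cat, values in by_cat.items():
        lo, hi = min(values), max(values)
        margin = tolerance * (hi - lo or statistics.mean(values))
        rules[cat] = (lo - margin, hi + margin)
    return rules

rules = induce_range_rules(products, "net_weight_g")
candidate = {"category": "beverage", "net_weight_g": 50000}
lo, hi = rules[candidate["category"]]
print("quality failure" if not lo <= candidate["net_weight_g"] <= hi else "ok")
```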


Resource and location-based verification of multi-agent systems

The basic idea of rational agents as autonomous entities that perceive changes in their environment and act according to a set of rules or plans in pursuit of goals does not take resources into account. However, many actions that an agent would execute in order to achieve a goal can, in real life, only be carried out in the presence of certain resources. Without sufficient resources some actions are unavailable, leading to plan failure. The analysis of agents and multi-agent systems with resources is still in its infancy and has been tackled almost exclusively in a pragmatic and experimental way. This project takes the first steps in model checking resource-bounded systems: well-known computational models are combined with a notion of resource to enable a more systematic and rigorous specification and analysis of such systems. The work is carried out in collaboration with the Technical University of Clausthal in Germany.
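
The sketch below illustrates the basic idea on a toy transition system: a breadth-first reachability check in which actions consume resources, and any path prefix that exceeds the agent's budget is pruned. The states, actions and costs are invented for illustration.

```python
from collections import deque

# A toy transition system: state -> list of (action, next_state, cost).
# Costs model resource consumption, e.g. energy; values are illustrative.
transitions = {
    "s0": [("move", "s1", 2), ("wait", "s0", 0)],
    "s1": [("lift", "goal", 3), ("move", "s0", 2)],
    "goal": [],
}

def reachable_within_budget(start, goal, budget):
    """Breadth-first model check: can the agent reach `goal` without
    exceeding its resource budget on any path prefix?"""
    seen = set()
    queue = deque([(start, 0)])
    while queue:
        state, spent = queue.popleft()
        if state == goal:
            return True
        if (state, spent) in seen:
            continue
        seen.add((state, spent))
        for _action, nxt, cost in transitions[state]:
            if spent + cost <= budget:
                queue.append((nxt, spent + cost))
    return False

print(reachable_within_budget("s0", "goal", budget=5))  # True: move + lift
print(reachable_within_budget("s0", "goal", budget=4))  # False: too costly
```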


Computational analysis of processes in cell biology

This project applies probabilistic Petri nets as models for cell behaviour. Together with Swansea University's Institute of Life Sciences, initial computational models for processes in cell biology have been developed. These are more faithful in visualisation than previous automata models. Further research will enable computational simulations to be carried out to assist the planning of costly experiments.
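
A minimal sketch of a probabilistic Petri net simulation is given below, with invented places, transitions and rates; transitions fire with probability proportional to their propensity, in the style of Gillespie's algorithm, rather than following any model from this project.

```python
import random

# A toy net: a cell population in which cells "divide" or "die".
# Places hold token counts; rates are illustrative only.
places = {"alive": 10, "dead": 0}
transitions = [
    # (name, inputs, outputs, rate per enabled firing)
    ("divide", {"alive": 1}, {"alive": 2}, 0.04),
    ("die",    {"alive": 1}, {"dead": 1},  0.01),
]

def enabled(t):
    return all(places[p] >= n for p, n in t[1].items())

def propensity(t):
    return t[3] * min(places[p] // n for p, n in t[1].items())

def step():
    """Pick an enabled transition with probability proportional to its
    propensity, then fire it by moving tokens."""
    props = [propensity(t) if enabled(t) else 0.0 for t in transitions]
    total = sum(props)
    if total == 0:
        return False
    r = random.uniform(0, total)
    for t, p in zip(transitions, props):
        r -= p
        if r <= 0:
            for place, n in t[1].items():
                places[place] -= n
            for place, n in t[2].items():
                places[place] += n
            return True

for _ in range(200):
    if not step():
        break
print(places)
```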


Residential property price forecasting

There is no reliable forecasting service for residential values, with current house prices taken as the best indicator of future price movement. This approach has failed to predict periodic market crises or to produce estimates of long-term sustainable value (a recent European Directive could lead mortgage lenders towards the use of sustainable valuations in preference to the open market value). Work here, underway in collaboration with colleagues, applies sensitivity analysis to artificial neural networks trained on multivariate time-series data to forecast future trends within the housing market. Prior work in this area has been well received in both academic and media circles, and data now encompassing a sustained period of falling prices offers a further opportunity for research. Work on feature selection and data mining led to a predictive model for residential house price forecasting and has resulted in collaborative work in GIS, where the impact of flood plains on house prices will be studied.
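
The sketch below illustrates perturbation-based sensitivity analysis on a small neural network; the synthetic features stand in for housing time-series inputs, and scikit-learn's MLPRegressor substitutes for the project's trained networks.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for multivariate housing features (e.g. lagged price,
# interest rate, earnings); the data and weights are purely illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + 0.0 * X[:, 2] + rng.normal(0, 0.1, 500)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, y)

def sensitivity(model, X, delta=0.1):
    """Perturb each input in turn and measure the mean change in the
    prediction: a simple proxy for how much the network relies on it."""
    base = model.predict(X)
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] += delta
        scores.append(np.mean(np.abs(model.predict(Xp) - base)) / delta)
    return scores

print(sensitivity(model, X))  # the first feature should dominate
```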


The use of stereo vision in augmented reality

This work presents a method for computationally inexpensive automatic alignment of cameras using stereoscopic imagery separated at varying distances just below the interocular distance. Automatic stereoscopic alignment in real time is a non-trivial process that relies on calculating the best virtual alignment of camera lenses through image overlaying. This matters because retail 3D camera lenses are typically not calibrated well enough for accurate distance estimates: imprecision in lens calibration leads to problems with the required alignment of images and the consequent filtering of background objects. The algorithm developed here provides virtual calibration for non-calibrated cameras, allowing real-time filtering of images and identification of points of interest, and is capable of generating the best alignment setup at reasonable computational expense in natural environments with partial background occlusion.
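
One common way to estimate the relative offset between two camera images is phase correlation, sketched below in NumPy on synthetic data; this illustrates the general idea of virtual alignment through image comparison, not the specific algorithm developed here.

```python
import numpy as np

def phase_correlation_shift(left, right):
    """Estimate the translation between two images via phase correlation:
    the normalised cross-power spectrum peaks at the relative shift."""
    F1 = np.fft.fft2(left)
    F2 = np.fft.fft2(right)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12        # keep phase information only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map indices beyond the midpoint to negative shifts.
    if dy > left.shape[0] // 2: dy -= left.shape[0]
    if dx > left.shape[1] // 2: dx -= left.shape[1]
    return dy, dx

# Synthetic test: shift an image by (3, 5) and recover the offset.
rng = np.random.default_rng(1)
img = rng.random((64, 64))
shifted = np.roll(img, (3, 5), axis=(0, 1))
print(phase_correlation_shift(shifted, img))  # expect (3, 5)
```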


A Rule-Based System to support the provision of bespoke wheelchairs for the severely disabled

The purpose of this work is to determine whether an automated measurement tool can clinically classify wheelchair users with severe musculoskeletal deformities, replacing the current process, which relies upon clinical engineers with advanced knowledge and skills. Clients' body shapes were captured using the Cardiff Body Match (CBM) Rig developed by the Rehabilitation Engineering Unit (REU) at Rookwood Hospital in Cardiff. A bespoke feature extraction algorithm was developed that estimates the position of external landmarks on a client's pelvis so that useful measurements can be obtained. The outputs of the feature extraction algorithm were compared to CBM measurements where the positions of the client's pelvis landmarks were known. The results show that the extracted features facilitated classification, and qualitative analysis showed that the estimated positions of the landmark points were close enough to their actual positions to be useful to clinicians undertaking clinical assessments.
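
The sketch below shows the flavour of rule-based classification over extracted landmark measurements; the landmark names, thresholds and class labels are invented placeholders, not clinical values or the CBM system's actual rules.

```python
# Illustrative rules over hypothetical pelvic measurements (millimetres).
def classify_pelvis(landmarks):
    """Apply simple clinical-style rules to extracted landmark positions."""
    obliquity = abs(landmarks["left_asis_y"] - landmarks["right_asis_y"])
    width = abs(landmarks["left_asis_x"] - landmarks["right_asis_x"])
    if obliquity > 30:
        return "marked pelvic obliquity"
    if obliquity > 10:
        return "mild pelvic obliquity"
    if width < 200:
        return "narrow pelvis - check seat width"
    return "no significant deformity detected"

measurements = {"left_asis_x": -140, "right_asis_x": 120,
                "left_asis_y": 15, "right_asis_y": 0}
print(classify_pelvis(measurements))  # mild pelvic obliquity
```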


Diagnostic Research into Biomechanical Urinary Problems using Signal Processing

Urodynamics is a clinical test used to diagnose the pathophysiological reason behind the lower urinary tract symptoms with which a patient presents. The test is carried out by taking pressure measurements inside the bladder and rectum and observing how pressure changes during bladder filling and voiding. The data recorded during urodynamics usually take the form of a time series containing three pressure traces: the two measured pressures, in the bladder and rectum, and the difference between them, which is assumed to reflect bladder muscle activity. The flow rate of any fluid voided and the amount of fluid pumped into the bladder during the test are also recorded. Occasionally X-ray video is used to aid diagnosis, but in most cases the presence of certain pressure events and key values taken during the bladder voiding cycle are used to make a diagnosis. Here, research into the classification of problems derived from trace data is ongoing.
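
As a sketch of the kind of trace processing involved, the example below derives the subtracted (detrusor) pressure from two synthetic traces and flags sustained rises; the sampling rate, threshold and event shape are illustrative only.

```python
import numpy as np

# Synthetic traces (cmH2O, sampled at 10 Hz); values are illustrative.
rng = np.random.default_rng(2)
t = np.arange(0, 60, 0.1)
p_abd = 20 + rng.normal(0, 0.5, t.size)        # rectal (abdominal) pressure
p_ves = p_abd + rng.normal(0, 0.5, t.size)     # vesical (bladder) pressure
p_ves[300:400] += 25                           # a simulated detrusor contraction

# Detrusor pressure is conventionally the subtracted trace:
p_det = p_ves - p_abd

def detect_contractions(p_det, threshold=15.0, min_samples=20):
    """Flag sustained rises in detrusor pressure above a threshold."""
    above = p_det > threshold
    events, start = [], None
    for i, a in enumerate(above):
        if a and start is None:
            start = i
        elif not a and start is not None:
            if i - start >= min_samples:
                events.append((start, i))
            start = None
    return events

print(detect_contractions(p_det))  # roughly [(300, 400)]
```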


Developing Image Enhancement and Image Transmission Techniques for an Internet-Based Medical Image Processing System

The pointwise sampling algorithm has been used in adaptive contrast enhancement (ACE) for nearly three decades. To overcome its computational overhead, various stepwise sampling algorithms have been proposed, including the HOICE (Highly Overlapped Interpolation Contrast Enhancement) algorithm, which addresses the artefacts of other stepwise sampling algorithms, such as image distortion and blocking effects, while keeping their advantage in speed. However, HOICE's internal mechanism had not been compared with traditional pointwise algorithms, which may limit the scope of its future application. This work establishes a bridge between traditional pointwise sampling and HOICE: it first mathematically analyses the internal mechanism of the two sampling algorithms, and then quantitatively demonstrates the similarities between them. Moreover, the quantitative measurements suggest that HOICE has an advantage over the pointwise algorithm in reducing distortion along image edges. HOICE therefore combines the advantages of both pointwise and other stepwise algorithms, making it a reliable sampling algorithm for ACE. Work completed in collaboration with the Medical Physics Department at Heath Hospital, University of Wales, has resulted in a novel technique for increasing the contrast of medical images.
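
For reference, the sketch below implements a classic pointwise ACE baseline of the kind that stepwise schemes such as HOICE are designed to accelerate: each pixel is stretched against its local window statistics. The gain formula and parameters are one common variant, not the specific algorithms analysed in this work.

```python
import numpy as np

def pointwise_ace(image, window=15, gain=2.0, max_gain=5.0):
    """Pointwise ACE baseline: stretch each pixel against the mean and
    deviation of its local window. This is the O(N * window^2) scheme
    that stepwise methods approximate by interpolating between samples."""
    img = image.astype(np.float64)
    global_std = img.std()
    pad = window // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            block = padded[y:y + window, x:x + window]
            mean, std = block.mean(), block.std()
            g = min(gain * global_std / (std + 1e-6), max_gain)
            out[y, x] = mean + g * (img[y, x] - mean)
    return np.clip(out, 0, 255).astype(np.uint8)

# Demo on a synthetic low-contrast image:
img = (np.random.default_rng(3).random((64, 64)) * 60 + 100).astype(np.uint8)
print(img.std(), pointwise_ace(img).std())  # contrast should increase
```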


Predicting the geo-temporal variations of crime and disorder

In collaboration with local Crime and Disorder Partnerships (South Wales Police and South Wales Fire & Rescue Service), we have developed computer models for predicting and mapping crime levels that helped inform the Partnerships' Crime and Disorder Audits. The work received initial University RS and postdoctoral RA support before EPSRC/CASE funding was secured for two research students.

A system that intelligently interrogates a constantly updated database of crime incidence and provides accurate indicators of where and when crime is likely to be highest would be of great utility in real-time police resource allocation. A limiting factor, however, is that crime incidence counts are generally low once disaggregated by crime type, time and space, and are subject to randomness. Crime forecast error measures vary inversely with the incidence count used in estimating time-series forecast models: average crime counts of at least 25 to 35 per unit time period and geographic area are needed before forecast errors become acceptable.

This work details a forecasting framework for the short-term, tactical deployment of police resources, in which the objective is to identify areas where crime levels are high enough for accurate predictive models to be produced. Identified hot-spot regions are then used as the foundation for predictive models.
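
The sketch below illustrates the framework's two stages on synthetic data: incidents are aggregated onto a grid to find cells dense enough to model, and a weekly series for the densest cell is forecast with simple exponential smoothing as a stand-in for the project's models.

```python
import numpy as np

# Synthetic incident coordinates and weeks of occurrence; illustrative only.
rng = np.random.default_rng(4)
xy = rng.random((2000, 2)) * 10           # locations in a 10 x 10 km area
week = rng.integers(0, 52, 2000)          # week in which each incident occurred

# Stage 1: aggregate counts onto a 1 km grid; the densest cell stands in
# for a hot-spot meeting the 25-35 incidents-per-period guidance above.
counts = np.zeros((10, 10), dtype=int)
for x, y in xy:
    counts[int(x), int(y)] += 1
cx, cy = np.unravel_index(np.argmax(counts), counts.shape)

# Stage 2: build a weekly series for that cell and produce a one-step
# forecast with simple exponential smoothing.
in_cell = (xy[:, 0].astype(int) == cx) & (xy[:, 1].astype(int) == cy)
series = np.bincount(week[in_cell], minlength=52).astype(float)

def ses_forecast(series, alpha=0.3):
    level = series[0]
    for obs in series[1:]:
        level = alpha * obs + (1 - alpha) * level
    return level

print(f"cell ({cx},{cy}): ~{ses_forecast(series):.1f} incidents next week")
```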


Determining Geographical Causal Relationships through the Development of Spatial Cluster Detection and Feature Selection Techniques

In collaboration with the Welsh Consultant Haematologists Leukaemia Registry, this work led to a new algorithm for determining salient attributes within cause and effect relationships. Spatial datasets contain information relating to the locations of incidents of a disease or other phenomena, and appropriate analysis of such datasets can reveal information about the distribution of cases. Areas that contain a higher than expected incidence of the phenomena, given the background population, are of particular interest: such clusters of cases may be affected by external factors. By analysing the locations of potential influences, it may be possible to establish whether a cause and effect relationship is present within the dataset.
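
A simplified illustration of cluster detection is sketched below: each cell's observed case count is tested against a population-derived Poisson expectation, in the spirit of scan statistics. The grid, rates and threshold are invented, and this is not the Registry's algorithm.

```python
import numpy as np
from scipy.stats import poisson

# Illustrative grid of case counts and background population.
rng = np.random.default_rng(5)
population = rng.integers(500, 5000, size=(8, 8))
rate = 0.002                                  # baseline incidence per person
cases = rng.poisson(population * rate)
cases[3, 4] += 15                             # an artificial excess of cases

# Flag cells whose observed count is improbably high given the
# population-derived expectation.
expected = population * rate
p_values = poisson.sf(cases - 1, expected)    # P(X >= observed count)
clusters = np.argwhere(p_values < 0.001)
print(clusters)                               # should include [3 4]
```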


Container-ship stowage planning

A collaborative project with Maritime Computer and Technical Services and (the then) P&O Containers resulted in a paradigm for the automated planning of cargo placement on container ships. The methodology was derived by applying principles of combinatorial optimisation and, in particular, the Tabu Search metaheuristic. It progressively refines the placement of containers within the cargo-space of a container ship, using the Tabu Search concept of neighbourhoods, until each container is allocated to a specific stowage location. Heuristic rules built into objective functions for each stage enable the combinatorial tree to be explored intelligently, yielding good, if not optimal, solutions in reasonable processing time. This body of work is heavily cited and forms the basis for a resurgence of interest in this area.
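
A generic Tabu Search loop of the kind described is sketched below, with a toy ordering problem standing in for stowage; the neighbourhood, tenure and aspiration rule are standard textbook choices rather than the project's specific heuristics.

```python
def tabu_search(initial, neighbours, cost, tenure=3, iterations=200):
    """Generic Tabu Search: always move to the best non-tabu neighbour,
    even uphill, so the search can escape local optima. Recent moves are
    held on a tabu list for `tenure` iterations; the aspiration rule
    allows a tabu move if it improves on the best solution found."""
    current, best = initial, initial
    tabu = []  # recently made moves, oldest first
    for _ in range(iterations):
        candidates = [(cost(n), move, n)
                      for move, n in neighbours(current)
                      if move not in tabu or cost(n) < cost(best)]
        if not candidates:
            break
        c, move, nxt = min(candidates, key=lambda t: t[0])
        tabu.append(move)
        if len(tabu) > tenure:
            tabu.pop(0)
        current = nxt
        if c < cost(best):
            best = nxt
    return best

# Toy stowage-flavoured demo: order container weights to minimise adjacent
# weight differences (a stand-in for real stowage objectives).
weights = [9, 2, 7, 4, 8, 1, 6, 3]
cost_fn = lambda perm: sum(abs(perm[i] - perm[i + 1]) for i in range(len(perm) - 1))

def swaps(perm):
    """Neighbourhood: all adjacent transpositions, labelled by position."""
    for i in range(len(perm) - 1):
        p = list(perm)
        p[i], p[i + 1] = p[i + 1], p[i]
        yield (i, i + 1), tuple(p)

print(tabu_search(tuple(weights), swaps, cost_fn))
```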

More information

If you are interested in any of the above projects and want to find out more, please contact Dr Ian Wilson in the first instance.