Sparse grid approximations and fitting using generalised combination techniques. Sparse grid techniques provide an effective tool to deal with the
computational curse of dimensionality which is a constant challenge in
modelling complex data. The proposed research is aimed at the
development and analysis of algorithms for data fitting with sparse
grids using variants of the combination technique. The outcome of the
research will be a theory that provides insights into the applicability,
limitations and convergence properties of the proposed
algorithms. The outcomes will be widely applicable to the modelling of
large-scale and complex data as encountered in bioinformatics, physics
and experimental studies of complex systems.
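As an illustrative sketch only (not the proposal's actual algorithm), the classical two-dimensional combination technique approximates a function by combining interpolants on anisotropic full grids: f_n = sum over l1+l2=n of u_{l1,l2} minus sum over l1+l2=n-1 of u_{l1,l2}, where u_{l1,l2} is the bilinear interpolant on a (2^l1+1) x (2^l2+1) grid. The function names below are hypothetical.

```python
import numpy as np

def grid_interpolate(f, l1, l2, x, y):
    """Bilinear interpolant of f on a (2^l1+1) x (2^l2+1) grid over [0,1]^2,
    evaluated at the points (x, y)."""
    n1, n2 = 2 ** l1, 2 ** l2
    xs = np.linspace(0.0, 1.0, n1 + 1)
    ys = np.linspace(0.0, 1.0, n2 + 1)
    vals = f(xs[:, None], ys[None, :])          # sample f at the grid nodes
    i = np.minimum((x * n1).astype(int), n1 - 1)  # cell index in x
    j = np.minimum((y * n2).astype(int), n2 - 1)  # cell index in y
    tx, ty = x * n1 - i, y * n2 - j               # local coordinates in [0,1]
    return ((1 - tx) * (1 - ty) * vals[i, j]
            + tx * (1 - ty) * vals[i + 1, j]
            + (1 - tx) * ty * vals[i, j + 1]
            + tx * ty * vals[i + 1, j + 1])

def combination_technique(f, n, x, y):
    """Classical 2-d combination technique of level n:
    add interpolants on the diagonal l1+l2 = n, subtract those on l1+l2 = n-1."""
    approx = np.zeros_like(np.asarray(x, dtype=float))
    for l1 in range(n + 1):
        approx += grid_interpolate(f, l1, n - l1, x, y)
    for l1 in range(n):
        approx -= grid_interpolate(f, l1, n - 1 - l1, x, y)
    return approx
```

For a function that bilinear interpolation reproduces exactly, such as f(x, y) = x*y, every partial grid gives u = f, so the combination of (n+1) positive and n negative terms recovers f exactly; smoother functions are approximated at a cost far below that of the full grid of level n.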
Investigation and Development of Parallel Large Scale Record Linkage Techniques. Record linkage aims to match records that refer to the same entity (such as a customer or patient) across large (administrative) databases. The outcomes of the proposed research will improve current techniques in terms of efficiency, accuracy and the need for human intervention. Through experimental studies and stochastic modelling, the performance of traditional and new methods for data cleaning, standardisation and linkage will be assessed. The effect of statistical dependency between attribute values will be studied. New methods using clustering for blocking large datasets, and predictive models including interaction terms, will be implemented, analysed and evaluated on high-performance computers and office-based PC clusters.
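To illustrate the blocking idea mentioned above (a minimal sketch, not the proposed clustering-based method), records are first grouped by a cheap blocking key so that expensive pairwise comparisons run only within each block. The field names, key, similarity measure and threshold below are all hypothetical.

```python
from collections import defaultdict
from itertools import combinations

def blocking_key(record):
    # Hypothetical blocking key: first two letters of surname plus birth year.
    return (record["surname"][:2].lower(), record["year"])

def similarity(r1, r2):
    """Crude similarity: Jaccard index on character bigrams, averaged over fields."""
    def bigram_jaccard(s, t):
        a = {s[i:i + 2] for i in range(len(s) - 1)}
        b = {t[i:i + 2] for i in range(len(t) - 1)}
        return len(a & b) / len(a | b) if a | b else 1.0
    fields = ("surname", "given")
    return sum(bigram_jaccard(r1[f].lower(), r2[f].lower()) for f in fields) / len(fields)

def link_records(records, threshold=0.5):
    """Group records into blocks, then compare pairs only within each block.
    Returns index pairs whose similarity reaches the (illustrative) threshold."""
    blocks = defaultdict(list)
    for idx, rec in enumerate(records):
        blocks[blocking_key(rec)].append(idx)
    matches = []
    for ids in blocks.values():
        for a, b in combinations(ids, 2):
            if similarity(records[a], records[b]) >= threshold:
                matches.append((a, b))
    return matches
```

Blocking reduces the comparison space from quadratic in the number of records to roughly quadratic in the block size, which is what makes linkage of large administrative databases feasible; the design choice is the trade-off between smaller blocks (faster) and the risk of true matches landing in different blocks.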