Parallel and Distributed Machine Learning - Smart Data Analysis in the Multicore Era. In large data centres our research will lead to reduced energy consumption by using graphics cards, which have a much better computation-to-power ratio than traditional processors. On desktop computers, it will make machine learning practical by enabling efficient algorithms for spam filtering and content analysis. On networked systems it will lead to distributed inference, caching and collaborative filtering applications, which will both reduce the bandwidth required and make the internet safer for users. Finally, it will enable rapid deployment of sensor networks for monitoring and detection, such as for environmental monitoring and safeguarding Australia's borders.
Linkage Infrastructure, Equipment And Facilities - Grant ID: LE0346878
Funder
Australian Research Council
Funding Amount
$190,000.00
Summary
GeoWulf: An Inference Engine for Complex Earth Systems. The project is to build a `Beowulf' cluster as a platform for solving
complex data inference problems in the Earth sciences, and in
particular the fields of thermochronology, seismology, crustal and
mantle dynamics, and landform evolution. A Beowulf cluster is a
network-linked set of commonly available `off-the-shelf' PC computers
configured to give an unprecedented performance/cost ratio. Projects
using the Beowulf facility will combine state-of-the-art computational
techniques recently developed at ANU, and high-quality data sets
collected over the past decade, to address fundamental questions in
the Geosciences.
Frontiers in inference about risk. The project aims to develop new methods for robust risk evaluation and minimisation under various constraints and scenarios. Risk evaluation, estimation and prediction using past data is a central activity in diverse areas such as finance, insurance, superannuation and environmental regulation. The project aims to propose and solve robust risk optimisation problems under constraints in innovative ways, taking the time dynamics into account. Applications include risk management around natural catastrophes and the long-term asset investment of pension funds. The solutions and outcomes are expected to deliver optimal resource allocation proposals and better management of risk exposure in practice.
Optimisation for next generation machine learning. As more and more data are collected, it is important to build intelligent systems that can analyse these data efficiently. This project will design and analyse new algorithms that take advantage of emerging paradigms in hardware, such as multicore processors, graphics processing units (GPUs), and cluster computers, to achieve this goal.
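The style of data-parallel analysis these projects target can be sketched in a few lines. The example below is purely illustrative and not part of any of the funded projects: it splits a dataset into chunks and computes a simple per-chunk statistic on separate CPU cores using Python's standard `multiprocessing` pool, then combines the partial results.

```python
# A minimal, hypothetical sketch of multicore data-parallel analysis:
# partition the data, process each chunk on its own worker process,
# and reduce the partial results. The statistic (sum of squares) is
# chosen for simplicity, not taken from the projects above.
from multiprocessing import Pool

def partial_sum_of_squares(chunk):
    # Independent work done on each core for one chunk of the data.
    return sum(x * x for x in chunk)

def sum_of_squares(data, workers=4):
    # Split the data into roughly equal chunks, one per worker.
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # Map the chunks across worker processes and combine the results.
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum_of_squares, chunks))

if __name__ == "__main__":
    print(sum_of_squares(list(range(1000))))
```

Because each chunk is processed independently and only the small partial results are combined, the same map-then-reduce pattern transfers directly from multicore CPUs to GPUs and cluster computers.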