AI meets image analysis at the University of Adelaide
Researchers at the University of Adelaide are creating machines capable of undertaking complex tasks, assessing the outcomes and improving their performance accordingly.
That’s according to Professor Anton van den Hengel, Director of the Australian Institute for Machine Learning (AIML), who claimed the university’s technology can “compete with, and sometimes exceed, human capabilities in tasks like recognition, statistical analysis and classification”.
The breakthrough, according to Prof van den Hengel, has been the advent of ‘deep learning’ technology — a form of machine learning (itself a subset of artificial intelligence) based on the human brain’s neural networks. “That’s enabled machines to distil and interpret huge amounts of prior and incoming information, and particularly visual information,” he said.
Prof Ian Reid, a senior colleague of Prof van den Hengel's and Deputy Director of the Australian Centre for Robotic Vision, agrees. "Artificial neural networks, together with vast computing power and data volume, have enabled a step-change in the level of intelligence machine learning can achieve," he said.
Faster disease diagnosis
For example, the University of Adelaide recently collaborated on the creation of the world’s first AI microbiology screening technology for use in pathology laboratories. Developed in partnership with healthcare company LBT Innovations, the Automated Plate Assessment System (APAS) went into production in 2017 and is attracting international interest.
“APAS will enable doctors to order more tests, which will give them more information, sooner,” said Prof van den Hengel, who led the six-person APAS software development team. “It could even allow country or developing-world hospitals to run their own tests without having to ship samples to a central lab. That would save a huge amount of time, and potentially many lives.”
The system automates the traditionally time-consuming functions performed by microbiologists in screening culture plates after incubation. It takes high-quality images of the plates, then analyses and interprets any microbial growth, matches this against key patient data, presents a diagnosis and continually updates its own knowledge base.
Significantly, APAS also removes non-pathogenic plates from the workflow. “This is very important,” said LBT co-founder Lusia Guthrie, now Chair of Clever Culture Systems — the joint-venture company bringing APAS to market. “In routine microbiology testing, up to 70% of plates may be negative. Removing them automatically will give microbiologists more time to spend on complex decisions, enabling even greater accuracy and allowing more tests to be run.”
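The screening workflow described above, in which negative plates are automatically diverted away from microbiologists, can be sketched in a few lines. This is a purely illustrative toy, assuming hypothetical names and thresholds; it is not LBT's or the university's actual software.

```python
# Hypothetical sketch of an APAS-style plate-screening step.
# The class, field names and threshold below are illustrative only.
from dataclasses import dataclass

@dataclass
class PlateResult:
    plate_id: str
    growth_score: float   # 0.0 = no growth, 1.0 = heavy growth
    pathogenic: bool      # classifier's pathogen call

def screen_plates(plates, growth_threshold=0.1):
    """Split incubated plates into 'review' and 'negative' queues,
    mimicking the automatic removal of non-pathogenic plates."""
    review, negative = [], []
    for p in plates:
        if p.growth_score < growth_threshold or not p.pathogenic:
            negative.append(p)   # removed from the microbiologists' workflow
        else:
            review.append(p)     # flagged for expert assessment
    return review, negative

plates = [
    PlateResult("A1", 0.02, False),
    PlateResult("A2", 0.85, True),
    PlateResult("A3", 0.40, False),
]
review, negative = screen_plates(plates)
```

In this toy run only plate "A2" reaches the review queue; the other two are set aside automatically, which is the time saving Guthrie describes.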
LBT CEO Brent Barnes believes the system will ultimately mean faster recovery for millions, saying, "More, and more accurate, testing will see patients getting the right treatment earlier and spending less time in hospital."
Keen to build on the foundation laid with APAS, the university and LBT are now jointly developing three other related medical devices utilising the university’s AI image-analysis technology.
Accelerating crop farmers’ adaptation to climate change
Another application of the AI image-analysis technology lies in the agricultural sector. Prof van den Hengel is tailoring the technology to accurately estimate the yields of potential new cereal varieties after only very short periods of growth, enabling rapid identification of robust varieties able to thrive in harsh conditions.
“This novel approach promises to transform crop breeding,” Prof van den Hengel said.
“By using image analysis to understand plants’ shape and structure at all stages of growth, we’ll be able to identify and automatically measure attributes associated with high yields very early in test plants’ lifespans.”
The system uses multiple images taken from numerous angles to construct computerised 3D models of the plants for analysis. Once development is complete, the system will be incorporated into the university's Plant Accelerator facility, which provides important complementary capability.
“The Plant Accelerator’s fully robotic plant management system allows automatic and repeatable control of growing conditions for up to 2400 plants at a time, and enables automatic delivery of those plants to our imaging stations,” Prof van den Hengel said.
“That’s going to allow rapid, detailed and accurate estimations of vast numbers of crop varieties’ potential yields under all kinds of climate-change-related stresses, such as high salinity or drought. We’ve no doubt it will expedite the development of hardy, high-yield varieties and help improve global food security.”
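The idea of automatically measuring growth attributes from multi-view imagery can be illustrated with a deliberately simple sketch. The real system builds full 3D models; here, as a stand-in, a plant's silhouette area is averaged across camera views as a crude growth proxy. All names and the green-channel rule are assumptions for illustration, not the university's method.

```python
# Illustrative multi-view trait measurement: average a plant's
# silhouette area over several camera angles. The green-channel
# segmentation rule below is a toy assumption (images are RGB in 0..1).
import numpy as np

def silhouette_area(image, green_threshold=0.5):
    """Fraction of pixels classified as plant in one view."""
    mask = image[..., 1] > green_threshold
    return mask.mean()

def growth_score(views):
    """Average silhouette area over all camera angles."""
    return float(np.mean([silhouette_area(v) for v in views]))

# Two synthetic 4x4 RGB "views": one half-covered in plant, one empty.
plant_view = np.zeros((4, 4, 3)); plant_view[:2, :, 1] = 1.0
empty_view = np.zeros((4, 4, 3))
score = growth_score([plant_view, empty_view])  # 0.25
```

Measured repeatedly over a plant's early lifespan, a score like this could be correlated with final yield, which is the essence of the early-estimation approach described above.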
Industry partner Bayer Crop Science has signed on to help commercialise the technology for this application.
Emulating nature’s perfect pursuit
The technology is also enhancing autonomous-pursuit capabilities, with computer scientists, engineers and neuroscientists at the university adapting dragonflies’ neuronal processes into an algorithm that emulates the insect’s phenomenal visual tracking capability. Widely considered nature’s most effective predator, dragonflies are able to target, pursue and capture tiny flying prey in mid-air at speeds of up to 60 km/h — even if that target attempts to disappear within a seething swarm — with an impressive hit-rate of over 95%.
“Tested in various nature-mimicking virtual reality environments, our pursuit algorithm matches all other state-of-the-art pursuit algorithms’ accuracy, but achieves that while running up to 20 times faster,” Prof Ben Cazzolato said. “So it requires far less relative processing power.”
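The core of any pursuit controller can be sketched in a few lines. The update rule below (steer straight at the target's current position) is a gross simplification used only to show the shape of the problem; the dragonfly-inspired algorithm itself is far more sophisticated and is not reproduced here.

```python
# Toy pursuit loop: move a pursuer in fixed-length steps toward a
# stationary target. Purely illustrative; not the AIML algorithm.
import math

def pursue(pursuer, target, speed):
    """Advance the pursuer one step of length `speed` toward the target."""
    dx, dy = target[0] - pursuer[0], target[1] - pursuer[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:
        return target                       # capture
    return (pursuer[0] + speed * dx / dist,
            pursuer[1] + speed * dy / dist)

pos, target = (0.0, 0.0), (3.0, 4.0)
steps = 0
while pos != target:
    pos = pursue(pos, target, speed=1.0)
    steps += 1
```

Starting 5 units away at speed 1, capture takes 5 steps. The engineering challenge the dragonfly work addresses is doing this against fast, evasive targets in cluttered scenes at a fraction of the usual compute cost.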
Mechanical engineering researchers at the university have also incorporated the algorithm in an autonomous robot that, in testing, has effectively and efficiently pursued targets in unstructured environments.
The interdisciplinary project is being led by neuroscientist Dr Steven Wiederman, whose team first identified how the dragonfly is able to focus on a single moving target and shut out all else. He and his team are now collaborating with Prof Reid to develop neurobiology-inspired machine-learning drone-tracking systems.
“We’re excited to further define the principles underlying neuronal processing,” Dr Wiederman said. “Translating them into advanced artificial vision systems could result in some incredibly effective autonomous robotics and drones, as well as neuronal prosthetics and many more applications.”