Innovation Showcase: A day immersed in the future

Online
Thursday, 17 February 2022
9 am - 1:30 pm (AEDT)
Free

Our Innovation Showcase is a rare and exclusive opportunity to get up close with next-generation technologies developed across our faculty – raising the curtain on the future.

Believe in breakthroughs? It’s what we do.

During this immersive online exhibition, you’ll discover projects focused on agriculture, cybersecurity, digital health, human-centred software, energy and other global priority areas. You’ll also get to chat about their real-world applications with our researchers at the vanguard of IT.

Adding to the insights, we’re also hosting esteemed speakers from top-tier organisations, who will delve into how our research is benefiting their companies and their sectors more widely.

Interested in a partnership? Our Departments of Data Science and Artificial Intelligence, Software Systems and Cybersecurity, and Human-Centred Computing will offer one-to-one sessions to discuss collaborations in detail with you.

Encounter innovation

Whether it’s enhancing cybersecurity, championing sustainability, streamlining critical healthcare, building better cities or upskilling employees, our faculty is contributing to society through a variety of interdisciplinary projects.

Here are some of the projects that will be on display at the event – with more to be added!

Can virtual bees feed the world?
Computational and Collective Intelligence, Data Science and Artificial Intelligence

Through computer simulations, we're revealing how climate change may influence bee pollination and, ultimately, flower evolution. This project also explores how the management of environmentally and economically important plants can be modelled to drive reliable decision-making and contribute to food security.
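For a flavour of how such simulations can work, here is a minimal sketch (our illustration, not the project's actual model): warming shifts the flowering window and the bee-activity window by different amounts, and their overlap serves as a rough proxy for pollination success.

```python
import random

def pollination_success(warming_shift_days: float, trials: int = 10_000) -> float:
    """Fraction of simulated seasons where a bee visit coincides with flowering."""
    hits = 0
    for _ in range(trials):
        # Assumed rates: warming advances flowering faster than bee activity.
        flower_day = random.gauss(100 - 2.0 * warming_shift_days, 5)
        bee_day = random.gauss(100 - 0.5 * warming_shift_days, 7)
        if abs(flower_day - bee_day) < 3:  # visit within the receptive window
            hits += 1
    return hits / trials

for shift in (0, 2, 4):  # illustrative warming scenarios
    print(f"shift={shift} days -> success={pollination_success(shift):.2f}")
```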


City Data Futures
Emerging Technologies Research Lab, Human-Centred Computing

In partnership with the City of Melbourne, this initiative has produced new insights into how people experience and engage with city sensors and data. ETLab's research has delivered a new design and set of recommendations for engaging the public through ethical, transparent approaches to data that are embedded in local and city values, as well as a new City Data Futures methodology that is transferable and scalable to other urban sites and cities.


Bughunter
Software Engineering, Software Systems and Cybersecurity

A world-first, Bughunter is an AI-based technique that quickly identifies and explains defects in programs. By leveraging recent advances in explainable AI, it can accurately locate 61% of actual defective lines – 91% better than state-of-the-art approaches. The project will also help developers mitigate risks in future development.
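To illustrate the general idea of line-level defect localisation (a hedged toy, not Bughunter's actual method, which builds on explainable-AI techniques), one can train a file-level defect classifier and then rank each line by the defect-leaning weight of its tokens:

```python
# Toy line-level defect localisation: file-level classifier + token weights.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

files = ["int x = a / b ;", "print ( hello )"]   # toy "source files"
labels = [1, 0]                                  # 1 = known defective

vec = CountVectorizer(token_pattern=r"\S+")
clf = LogisticRegression().fit(vec.fit_transform(files), labels)
weight = dict(zip(vec.get_feature_names_out(), clf.coef_[0]))

def rank_lines(source: str):
    # Score each line by the total defect-leaning weight of its tokens,
    # so the most suspicious lines surface first.
    scores = [(sum(weight.get(t, 0.0) for t in line.lower().split()), line)
              for line in source.splitlines()]
    return sorted(scores, reverse=True)

print(rank_lines("int x = a / b ;\nprint ( hello )"))
```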


Multimodal autonomous AI
Vision and Language, Data Science and Artificial Intelligence

Our pioneering research in vision and language is driving AI that can perceive and understand like humans. Discover how we’re designing and training deep models, without any labels, to translate an image from one style (e.g. winter) to another (e.g. summer) without changing the image’s content.
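As a rough illustration of how translation can be learned without labels, here is a sketch in the spirit of cycle-consistent adversarial training (e.g. CycleGAN); the tiny networks and data below are placeholders, not our researchers' models.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

G = nn.Conv2d(3, 3, 3, padding=1)          # winter -> summer "generator"
F_net = nn.Conv2d(3, 3, 3, padding=1)      # summer -> winter "generator"
D_summer = nn.Conv2d(3, 1, 3, padding=1)   # real-vs-fake summer "discriminator"

winter = torch.rand(4, 3, 64, 64)          # an unlabelled batch of winter images

fake_summer = G(winter)
logits = D_summer(fake_summer)
# Adversarial loss: generated images should look like real summer images.
adv_loss = F.binary_cross_entropy_with_logits(logits, torch.ones_like(logits))
# Cycle-consistency loss: translating back must recover the input, which is
# what preserves image content without any paired labels.
cycle_loss = F.l1_loss(F_net(fake_summer), winter)
total_loss = adv_loss + 10.0 * cycle_loss  # 10.0 is a common cycle weight
total_loss.backward()
print(float(total_loss))
```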


MiniZinc
Optimisation, Data Science and Artificial Intelligence

MiniZinc is a discrete optimisation modelling language used widely around the world to solve important decision-making problems. Monash has used it to deliver €400 million in better spending for the UNHCR and automated plant designs for Woodside, and many other international corporations have leveraged it to tackle important decision-making questions.
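For a taste of what a MiniZinc model looks like in practice, here is a toy production-planning problem solved through MiniZinc's open-source Python bindings (pip install minizinc; requires the MiniZinc toolchain). The model and numbers are illustrative only, not from the projects above.

```python
from minizinc import Instance, Model, Solver

model = Model()
model.add_string(
    """
    % Choose how many units of each product to make within a budget.
    int: budget;
    array[1..3] of int: cost;
    array[1..3] of int: value;
    array[1..3] of var 0..100: make;
    constraint sum(i in 1..3)(cost[i] * make[i]) <= budget;
    solve maximize sum(i in 1..3)(value[i] * make[i]);
    """
)

solver = Solver.lookup("gecode")     # CP solver bundled with MiniZinc
instance = Instance(solver, model)
instance["budget"] = 50
instance["cost"] = [5, 8, 3]
instance["value"] = [10, 14, 4]

result = instance.solve()
print(result["make"], result.objective)
```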


PsiNet
Exertion Games Lab, Human-Centred Computing

All about ‘sharing brains’, PsiNet is a novel technology that promotes inter-brain synchrony. It uses brain-to-brain interfacing and AI to sense brain activity across a group of people and stimulate their brains in response, so that the group becomes more synchronised. The impact? Helping people feel more part of a whole while raising their awareness.
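One common way to quantify inter-brain synchrony is the phase-locking value (PLV) between two people's EEG signals; the sketch below illustrates that metric only and makes no claim about PsiNet's actual pipeline.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
    """PLV near 1 means the two signals' phases stay locked together."""
    phase_a = np.angle(hilbert(sig_a))
    phase_b = np.angle(hilbert(sig_b))
    return float(np.abs(np.mean(np.exp(1j * (phase_a - phase_b)))))

# Two noisy 10 Hz (alpha-band) signals, standing in for one channel per person.
t = np.linspace(0, 2, 512)
brain_a = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
brain_b = np.sin(2 * np.pi * 10 * t + 0.5) + 0.3 * np.random.randn(t.size)
print(phase_locking_value(brain_a, brain_b))
```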


Interactive Accessible Models
Inclusive Technologies Lab, Human-Centred Computing

We’ve combined 3D printing with low-cost electronics and conversational interfaces like Siri and Alexa to enhance access to information for people who are blind or have low vision. Tactile raised-line diagrams have traditionally been used; however, they can’t convey depth or height. Our models let people interact through touch and voice, with the model responding through voice, sound or haptic feedback.
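As a simple illustration of that touch-and-respond loop (the sensor-to-region mapping is hypothetical, and pyttsx3 is just one offline text-to-speech option, not the lab's implementation):

```python
import pyttsx3

# Hypothetical mapping from a touch-sensor id on the 3D model to a description.
REGIONS = {
    1: "The summit, two thousand metres above sea level.",
    2: "The river valley on the northern face.",
}

engine = pyttsx3.init()

def on_touch(sensor_id: int) -> None:
    # Speak the description of whichever region of the model was touched.
    description = REGIONS.get(sensor_id, "No information for this region.")
    engine.say(description)
    engine.runAndWait()

on_touch(1)  # in practice, triggered by a microcontroller or serial event
```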


FakeBuster: How to detect deep fakes
Human-Centered AI Lab, Human-Centred Computing

Achieving over 90% accuracy, FakeBuster helps organisers of online conferences and seminars detect whether a participant's video is being manipulated or spoofed. Commended internationally, this project will enhance critical security at a time when it's increasingly easy to manipulate media. FakeBuster's capabilities can also be used to counter fake news.
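At a high level, tools of this kind score individual video frames with a trained deepfake classifier and aggregate the scores; the sketch below uses a toy stand-in for the classifier, since the real model is not public here.

```python
import numpy as np

def fake_probability(frame: np.ndarray) -> float:
    """Placeholder for a trained deepfake classifier (our assumption)."""
    return float(frame.mean() > 0.5)  # toy stand-in, not a real detector

def is_spoofed(frames: list[np.ndarray], threshold: float = 0.7) -> bool:
    # Aggregate per-frame scores; a sustained high score flags manipulation.
    scores = [fake_probability(f) for f in frames]
    return float(np.mean(scores)) > threshold

frames = [np.random.rand(224, 224, 3) for _ in range(30)]  # ~1 second of video
print(is_spoofed(frames))
```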

More details will be released soon. Stay tuned! In the meantime, keep up with the latest and greatest from our faculty by connecting with us on Facebook, Twitter, YouTube and LinkedIn.


For industry and partnership enquiries

Associate Professor Aldeida Aleti

Associate Dean (Engagement and Impact)

For event enquiries

Sam Beadle

Senior Events Coordinator
