2019 Teams & Projects
Event photos can be found here.
The SeatFinder is an application designed to help university students find free seats in libraries during busy times such as exam periods. An ultrasonic sensor placed under each desk detects whether a seat is free or occupied, and this information is sent to a phone application that displays a layout of the library showing every seat. A seat coloured green is free; a red seat is occupied. An extra feature in our project is a reservation button, which can be used to reserve your seat for a set amount of time; during the reservation, the seat is highlighted orange. LED lights were also used to display the status of a seat. In simpler terms, this project is similar to a car park system, except for seats.
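The free/occupied/reserved decision described above could be sketched as a small function; the distance threshold and status names below are illustrative assumptions, not the team's actual values.

```python
# Hypothetical sketch of the seat-status logic: the ultrasonic sensor under
# the desk reports a distance, and the app colours the seat accordingly.
# The 50 cm threshold is an assumed, illustrative value.

OCCUPIED_THRESHOLD_CM = 50  # legs under the desk sit closer than this

def seat_status(distance_cm: float, reserved: bool = False) -> str:
    """Map a raw distance reading (and reservation flag) to a display colour."""
    if distance_cm < OCCUPIED_THRESHOLD_CM:
        return "red"     # occupied
    if reserved:
        return "orange"  # reserved for a set amount of time
    return "green"       # free

print(seat_status(30))           # someone at the desk
print(seat_status(120))          # empty seat
print(seat_status(120, True))    # empty but reserved
```

The same status string would drive both the app's floor-plan colours and the LED at the desk.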
Turtle
Emily Kuo - Akash Ramaswamy - Sarah Ross
Our project makes retail shopping a three-step process: arrive, shop and pay. Customers sign in to an individually personalised trolley, items are scanned as the customer shops, and trolley contents are automatically synced for streamlined payments.
Onboard we have a Raspberry Pi, LCD and RFID scanner facilitating the scanning and live transmission of products the customer has placed in the trolley. These items are tracked in real time on a cloud database. At checkout time, scanning a loyalty/ID card at the POS system retrieves the list of trolley items from the cloud and presents payment options. Our POS system consists of a laptop, an Arduino and an RFID scanner.
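The trolley-to-POS flow could be sketched as follows, with an in-memory dictionary standing in for the real cloud database; class and method names are illustrative assumptions.

```python
# Minimal sketch of the trolley -> cloud -> POS flow. A dict replaces the
# real cloud database; all names here are hypothetical.
from collections import defaultdict

class TrolleyCloud:
    def __init__(self):
        self._carts = defaultdict(list)  # card_id -> list of (item, price)

    def scan_item(self, card_id: str, item: str, price: float) -> None:
        """Called each time the trolley's RFID scanner reads a product tag."""
        self._carts[card_id].append((item, price))

    def checkout(self, card_id: str):
        """POS scans the loyalty/ID card and retrieves the synced cart."""
        items = self._carts.pop(card_id, [])
        total = sum(price for _, price in items)
        return items, total

cloud = TrolleyCloud()
cloud.scan_item("card42", "milk", 2.50)
cloud.scan_item("card42", "bread", 3.00)
items, total = cloud.checkout("card42")
print(total)  # 5.5
```

In the real system the trolley's WeMos/Pi would push each scan over the network instead of calling a local method, but the state kept per card is the same shape.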
Paradigm Disruptors
Shayne D'Lima - Dilshan Goonatillake - Joseph Nguyen - Mukesh Sudhakar
The project is a recycling chute that accepts recyclables from users after they tap on with their RFID card. The RFID card reader connected to the WeMos D1 prompts it to send an HTTP request to the Raspberry Pi Zero, which uses a camera to detect whether the inserted items are recyclable. The response from the Pi determines whether the WeMos actuates two servos to allow items into the chute.
Users receive visual feedback from the system through an LED light: it turns green when the user initially taps on with their card or when an item is accepted, and turns red when the item in the chute is not recyclable. After a minute the light turns off, signalling the next user to tap on with their RFID card. A frontend, built in React with Material UI components, lets users see their recycling credits and participate in challenges or promotions after each item is accepted.
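The LED and servo decisions above reduce to a small state function; this is a sketch under assumed names, not the team's firmware.

```python
# Hypothetical sketch of the chute's decision logic: given whether a user has
# tapped on and the Pi's classification result, choose the LED colour and
# whether the WeMos opens the servos.

def chute_response(tapped_on: bool, recyclable) -> dict:
    """recyclable is None while waiting for an item, else the Pi's verdict."""
    if not tapped_on:
        return {"led": "off", "open_chute": False}   # waiting for next user
    if recyclable is None:
        return {"led": "green", "open_chute": False} # tapped on, no item yet
    if recyclable:
        return {"led": "green", "open_chute": True}  # item accepted
    return {"led": "red", "open_chute": False}       # item rejected
```

On the real hardware the `open_chute` flag would translate into the two servo actuations, and the dict into GPIO writes for the LED.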
Guava
Joseph Hao - Steven Tran - Jacob Wisniewski - Eddie Yao
Through a mobile app, users type in their intended destination. Using the Google Maps API, the optimal route is calculated and then transmitted. The Shüüs connect to the phone via Wi-Fi (using the WeMos D1) and begin receiving directional information. The microcontroller responds by sending the appropriate signal to a MOSFET circuit, which in turn powers the motor of either the left foot or the right foot. An accelerometer and gyro module (MPU-6050) continuously collects pitch, yaw and roll data to calculate the number of steps taken, which is transmitted back to the mobile application and displayed to the user.
Aims to address navigation in unfamiliar areas for both sighted and vision-impaired people.
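The step-counting part of the MPU-6050 pipeline could be sketched as threshold-crossing detection on the acceleration magnitude; the threshold value and function name below are illustrative assumptions.

```python
# Hypothetical sketch of step counting from accelerometer data: each stride
# produces one spike above the resting ~1 g baseline, so we count upward
# crossings of an assumed threshold.

def count_steps(accel_magnitudes, threshold=1.2):
    """Count upward crossings of the g-force threshold (one per step)."""
    steps = 0
    above = False
    for a in accel_magnitudes:
        if a > threshold and not above:
            steps += 1       # rising edge: a new spike begins
            above = True
        elif a <= threshold:
            above = False    # spike ended, arm for the next one
    return steps

walk = [1.0, 1.5, 1.0, 0.9, 1.6, 1.1, 1.0, 1.4, 1.0]
print(count_steps(walk))  # 3
```

A production counter would also low-pass filter the signal and fuse in the gyro data, but the crossing count is the core idea.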
John Bui - Penn Chen - Zi Li Tan - Angus Trau - Kenny Ung
Only around 10% of visually impaired children worldwide are literate in Braille. In an attempt to solve this problem, we created a device known as the B.R.E.A.D.B.O.X.
This device supplements a child's learning by converting on-screen text into physical Braille, character by character, which the child can feel and learn at their own pace. B.R.E.A.D.B.O.X. also supports TTS (Text-to-Speech), which allows the child to hear the currently displayed character (word, letter, or punctuation). An Arduino driving six servos was used to create the Braille characters, receiving input from a user interface on the computer programmed in Java.
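The character-to-servo mapping could be sketched like this: each Braille cell has six dots (numbered 1-3 down the left column, 4-6 down the right), so a character becomes six raised/lowered servo states. The partial lookup table uses standard Braille; the function names are illustrative.

```python
# Hypothetical sketch of the Braille cell encoding the Arduino would act on.
# Standard Braille dot numbers for a-j; the full device would cover the whole
# alphabet plus punctuation.

BRAILLE_DOTS = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4},
    "j": {2, 4, 5},
}

def servo_states(char: str) -> list:
    """Return six booleans (True = dot raised) for dots 1..6 of one cell."""
    dots = BRAILLE_DOTS.get(char.lower(), set())
    return [dot in dots for dot in range(1, 7)]

print(servo_states("c"))  # dots 1 and 4 raised
```

Each boolean would map to one servo angle (raised vs flush), refreshed every time the Java UI sends the next character over serial.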
BAS²
Steven Hoang - Albin Joy - Steven Lay - Bryan Vardic Gunawan
The project consists of a Raspberry Pi and camera. The Raspberry Pi hosts a web application for your mobile phone which lists all the items currently in your pantry, and allows you to add more items or simply view your pantry. The computer vision system compares histograms of each item against a database of previously scanned objects, then continuously updates to tell the user which items are still held in their pantry.
Deals mainly with food wastage, aiming to minimise double-buying of products. The ultimate goal is that it could be further implemented as an inventory manager for other applications, such as medical facilities or armouries.
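The histogram comparison at the heart of the recognition step could be sketched in plain Python; bin count, value range and the intersection metric are illustrative assumptions (an OpenCV pipeline would typically use `cv2.calcHist` and `cv2.compareHist` instead).

```python
# Hypothetical sketch of histogram-based matching: bucket pixel intensities
# into bins, normalise, then score similarity by histogram intersection.

def normalised_histogram(values, bins=8, value_range=(0, 256)):
    """Bucket intensities into fixed bins; result sums to 1."""
    lo, hi = value_range
    width = (hi - lo) / bins
    counts = [0] * bins
    for v in values:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    total = len(values) or 1
    return [c / total for c in counts]

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1.0 means identical distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

item = normalised_histogram([10, 12, 200, 210, 205])
same = normalised_histogram([11, 13, 201, 209, 204])
print(histogram_intersection(item, same))  # close to 1.0
```

The pantry scanner would compute a histogram per camera frame and report an item as present whenever its intersection with a stored reference stays above a chosen cutoff.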
#include <stdlib.h>
Owen Brooks - Alex Chou - Geordie McClelland - An Nguyen - Josh Pearse
Our solution consists of a Raspberry Pi that uses the Pi camera to record the face of the driver. Computer vision with facial recognition identifies the eyes of the driver and, from there, calculates the blink duration to determine if the driver is fatigued. Moreover, to foolproof the system, a touch sensor and heartbeat sensor are used to ensure that the driver is indeed tired and not simply looking away from view. The system is constructed with both an Arduino and a Raspberry Pi, leaving more processing power on the Pi for computer vision while the Arduino controls the actuators and evaluates the data from all input sources. A simple buzzer alerts the driver when he/she is fatigued.
Aims to prevent road fatalities caused by drowsy drivers.
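The blink-duration check could be sketched on a per-frame eyes-closed signal; the frame rate, 0.5 s threshold and function names are illustrative assumptions.

```python
# Hypothetical sketch of the fatigue test: find the longest continuous run of
# eyes-closed frames and flag drowsiness if it exceeds a blink-length cutoff.

def longest_closure(eye_closed_frames, fps=30):
    """Longest continuous eyes-closed run, in seconds."""
    longest = run = 0
    for closed in eye_closed_frames:
        run = run + 1 if closed else 0
        longest = max(longest, run)
    return longest / fps

def is_fatigued(eye_closed_frames, fps=30, max_blink_s=0.5):
    """A normal blink lasts well under half a second; a longer closure
    would trigger the buzzer (after the touch/heartbeat cross-check)."""
    return longest_closure(eye_closed_frames, fps) > max_blink_s

frames = [False] * 10 + [True] * 20 + [False] * 5  # eyes closed ~0.67 s
print(is_fatigued(frames))  # True
```

In the real system the boolean stream would come from the facial-landmark eye detection on the Pi, with the Arduino fusing in the sensor readings before sounding the buzzer.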
Trihard
Jackson Hew - Zwe Lin Htet - Jun Wen Kwan - Shakeel Rafi - Joon You Tan
Non-member price: $25
Member price: $20
Non-member Early Bird: $20
Member Early Bird: $15
(Register before April 10 to get the Early Bird discount)
Limited spots available! Teams must have 4-6 members. Read the rules before purchasing to avoid complications!