Last Wednesday was the culmination of the second year group projects. We had an afternoon of demos followed by lightning talks from each group showing off their projects.
Here is a brief summary of the projects that the Queens’ students worked on:
Retail Category Manager: Tatsiana’s group were given the brief of making it easier for retailers to list their products in online marketplaces like Amazon. The idea was to auto-categorise a list of products based on keywords in the name and description. They adopted a neat Bayesian approach, which they trained on the products already categorised in the marketplace. This had the nice side effect that if a misclassification occurs it can be corrected manually, and the prior probabilities then updated to reflect the new knowledge.
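The core idea can be sketched as a naive Bayes classifier over the words in a product listing. This is an illustrative reconstruction, not the group's actual code; the class and category names are made up, and a manual correction is simply folded back in as an extra training example, which updates the priors.

```python
from collections import Counter, defaultdict
import math

class NaiveBayesCategoriser:
    """Sketch of a Bayesian product categoriser trained on
    already-categorised listings (hypothetical implementation)."""

    def __init__(self):
        self.category_counts = Counter()          # listings seen per category
        self.word_counts = defaultdict(Counter)   # word frequencies per category
        self.vocab = set()

    def train(self, text, category):
        self.category_counts[category] += 1
        for word in text.lower().split():
            self.word_counts[category][word] += 1
            self.vocab.add(word)

    def classify(self, text):
        words = text.lower().split()
        total = sum(self.category_counts.values())
        best, best_score = None, float("-inf")
        for cat, count in self.category_counts.items():
            score = math.log(count / total)  # log prior P(category)
            cat_total = sum(self.word_counts[cat].values())
            for w in words:
                # Laplace smoothing so unseen words don't zero the score
                score += math.log((self.word_counts[cat][w] + 1)
                                  / (cat_total + len(self.vocab)))
            if score > best_score:
                best, best_score = cat, score
        return best

    def correct(self, text, category):
        # A manual correction is just another training example:
        # it bumps the counts, so the priors reflect the new knowledge.
        self.train(text, category)
```

The appeal of this approach is exactly the one noted above: there is no separate "retraining" step, because correcting a misclassification is the same operation as training.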
Location Based Teaching: Radu and Sid’s project was to build a location tracking system which tracked smartphones using iBeacons. They then implemented a location-based callback system so that you could be notified of interesting events nearby in the building. The location system worked by matching fingerprints of visible beacons against a previously created fingerprint map. They managed to get quite accurate results (within 10 metres) in some parts of the building. However, in more open spaces the accuracy was worse, sometimes even putting you on the wrong floor. They built an Android app and a website which worked pretty well.
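Fingerprint matching of this kind is often done as a nearest-neighbour search in signal-strength space. The sketch below assumes a fingerprint map of mean RSSI readings per beacon at each surveyed spot; the location names, beacon IDs, and dBm values are all illustrative, not from the project.

```python
import math

# Hypothetical fingerprint map: location -> {beacon_id: mean RSSI in dBm}
FINGERPRINT_MAP = {
    "lobby":      {"beacon-1": -55, "beacon-2": -70, "beacon-3": -90},
    "cafe":       {"beacon-1": -80, "beacon-2": -60, "beacon-4": -65},
    "lab-floor2": {"beacon-3": -58, "beacon-4": -75, "beacon-5": -62},
}

MISSING = -100  # penalty RSSI for a beacon visible in one set but not the other

def distance(observed, reference):
    """Euclidean distance in RSSI space over the union of beacon IDs."""
    ids = set(observed) | set(reference)
    return math.sqrt(sum(
        (observed.get(b, MISSING) - reference.get(b, MISSING)) ** 2
        for b in ids))

def locate(observed):
    """Return the surveyed location whose fingerprint best matches
    the currently observed beacon readings."""
    return min(FINGERPRINT_MAP,
               key=lambda loc: distance(observed, FINGERPRINT_MAP[loc]))
```

The wrong-floor failures mentioned above are a known weakness of this scheme: in open spaces, beacons on adjacent floors can produce fingerprints that are closer in RSSI space than the correct one.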
Building the Matrix: Katie, Ben and Jamie got to play with two Oculus Rift headsets. They had to build a distributed multiplayer virtual reality game. A lot of work went into this, since they built their whole 3D engine from scratch in C++. The game was bumper cars. They managed to get the head-tracking latency really low, so when you moved your head the view updated very smoothly. However, the latency of the game controls was quite a bit higher – that’s my excuse for why I kept getting bumped off the track.
Micro friends video diary: Matt’s group worked on automatically summarising video. Their software automatically generates a short 3-second highlight clip of your video and shares it on a social network that they built. They used various image processing algorithms (such as face tracking) to work out which frames were the best to include. And they built an entire social networking website for people to share and follow other people’s video clips.
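One simple way to pick a highlight from per-frame "interest" scores (say, the number of faces the tracker found in each frame) is a sliding-window search for the most interesting contiguous stretch. This is a hedged sketch of that idea, not the group's actual algorithm.

```python
def best_highlight(scores, clip_len):
    """Return the start index of the contiguous clip_len-frame window
    with the highest total interest score, found by sliding a window
    across the per-frame scores in a single pass."""
    if clip_len > len(scores):
        raise ValueError("clip longer than video")
    window = sum(scores[:clip_len])
    best_sum, best_start = window, 0
    for i in range(clip_len, len(scores)):
        # Slide the window one frame: add the new frame, drop the oldest
        window += scores[i] - scores[i - clip_len]
        if window > best_sum:
            best_sum, best_start = window, i - clip_len + 1
    return best_start
```

For a 3-second clip at 30 fps, `clip_len` would be 90 frames; the scores themselves could come from any of the image processing signals mentioned above.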
Well done to everyone for successfully delivering your projects on time.