• Spring 2018 Final Project Summary

    Project Instructor: Yogananda Isukapalli, yoga@ucsb.edu



    Project Summary Table:

    GPS-Based & Tracking: Locate, ClassTracker, College Attendance App, SPOT Mobile Base Station, RydeBot, UCSB Tracker, Intuitive Grocery List, Campus View
    Messaging & Multiplayer Games: Dogpark, PictoBug, Shout, Chip Chat, Basilisk, Speed Sudoku, SET
    Augmented Reality: Pixify, Magic Defense, AR Emoji, Treasure Hunt
    Controls: Poor Man’s Music Glove, Hover Hand, Rolling Ball
    Optical Character & Image Recognition: Weceipt, Music Performer, Awareness++




    Poor Man’s Music Glove

    Team: Jashanvir S. Taggar & Celeste Bean

    Drawing inspiration from a number of efforts in the electronic music community to create “musical gloves,” we are building a much more readily available alternative using only a phone. Our app will produce a musical note dependent on the phone’s readings from its accelerometer and gyroscope, which control the note’s volume and pitch, respectively. Users will then be able to save their creations either locally or to a website and will have the ability to download other users’ songs. The project incorporates elements of sensor reading, peripheral output, saving data, and networking. As a stretch goal, we may find some way to incorporate the camera or geographical data so that only people in close proximity can see each other’s images.
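
    A minimal sketch of the sensor side, assuming one hypothetical mapping: accelerometer magnitude drives volume and gyroscope rotation rate drives pitch. The class name, gain constants, and frequency range below are illustrative, not our final tuning.

        import android.content.Context
        import android.hardware.Sensor
        import android.hardware.SensorEvent
        import android.hardware.SensorEventListener
        import android.hardware.SensorManager
        import kotlin.math.sqrt

        // Maps raw motion readings to a volume level and a pitch in Hz.
        class MotionToNote(context: Context) : SensorEventListener {
            private val sensorManager =
                context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

            var volume = 0f      // 0.0 .. 1.0, driven by the accelerometer
                private set
            var pitchHz = 440f   // driven by the gyroscope
                private set

            fun start() {
                sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
                    sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
                }
                sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)?.let {
                    sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
                }
            }

            fun stop() = sensorManager.unregisterListener(this)

            override fun onSensorChanged(event: SensorEvent) {
                val (x, y, z) = event.values
                when (event.sensor.type) {
                    // Louder the harder the phone is moved (magnitude above gravity).
                    Sensor.TYPE_ACCELEROMETER -> {
                        val magnitude = sqrt(x * x + y * y + z * z)
                        volume = ((magnitude - 9.81f) / 10f).coerceIn(0f, 1f)
                    }
                    // Faster rotation about the z-axis raises the pitch.
                    Sensor.TYPE_GYROSCOPE ->
                        pitchHz = (440f + z * 100f).coerceIn(110f, 1760f)
                }
            }

            override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
        }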




    Pixify

    Team: Octavio Lopez & Eduardo Olmos

    The user focuses the camera on an image target using the application. Animated creatures will appear on up to two image targets at once. The application then presents a combat menu where each user can attack the other user’s creature. Each creature can also be unlocked for viewing later using its respective image card.
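
    One way the image-target step could be wired up, assuming ARCore’s Augmented Images as the AR backend (the project may equally use another AR SDK). The creature-card database and the two-target limit below are illustrative.

        import android.graphics.Bitmap
        import com.google.ar.core.AugmentedImage
        import com.google.ar.core.AugmentedImageDatabase
        import com.google.ar.core.Config
        import com.google.ar.core.Frame
        import com.google.ar.core.Session
        import com.google.ar.core.TrackingState

        // Registers each creature card as an image target the session can recognize.
        fun registerCreatureCards(session: Session, cards: Map<String, Bitmap>) {
            val database = AugmentedImageDatabase(session)
            cards.forEach { (creatureName, cardBitmap) ->
                database.addImage(creatureName, cardBitmap)
            }
            session.configure(Config(session).apply { augmentedImageDatabase = database })
        }

        // Called once per frame: names of the cards currently being tracked,
        // capped at two so at most two creatures appear at once.
        fun trackedCreatures(frame: Frame): List<String> =
            frame.getUpdatedTrackables(AugmentedImage::class.java)
                .filter { it.trackingState == TrackingState.TRACKING }
                .map { it.name }
                .take(2)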




    Hover Hand

    Team: Colin Garrett & Steven Fields

    Hello, we are Hover Hand, a team dedicated to making a more intuitive control scheme for flying a drone. Using a glove outfitted with IMU sensors, we hope to build a model of the hand whose tracked movement can control a drone. Values from the glove are sent over BLE to an Android phone, which then leverages the DJI SDK to control a DJI drone via Wi-Fi.
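
    A rough sketch of the phone-side BLE link, assuming the glove streams IMU samples over a notify characteristic. The service/characteristic UUIDs and the three-float packet layout are placeholders for whatever the glove firmware actually advertises.

        import android.bluetooth.BluetoothGatt
        import android.bluetooth.BluetoothGattCallback
        import android.bluetooth.BluetoothGattCharacteristic
        import android.bluetooth.BluetoothGattDescriptor
        import android.bluetooth.BluetoothProfile
        import java.nio.ByteBuffer
        import java.nio.ByteOrder
        import java.util.UUID

        // Placeholder UUIDs -- replace with the glove's real service/characteristic.
        private val IMU_SERVICE_UUID = UUID.fromString("0000aaaa-0000-1000-8000-00805f9b34fb")
        private val IMU_CHAR_UUID = UUID.fromString("0000aaab-0000-1000-8000-00805f9b34fb")
        private val CCC_DESCRIPTOR_UUID = UUID.fromString("00002902-0000-1000-8000-00805f9b34fb")

        class GloveGattCallback(
            private val onImuSample: (FloatArray) -> Unit
        ) : BluetoothGattCallback() {

            override fun onConnectionStateChange(gatt: BluetoothGatt, status: Int, newState: Int) {
                if (newState == BluetoothProfile.STATE_CONNECTED) gatt.discoverServices()
            }

            override fun onServicesDiscovered(gatt: BluetoothGatt, status: Int) {
                val characteristic = gatt.getService(IMU_SERVICE_UUID)
                    ?.getCharacteristic(IMU_CHAR_UUID) ?: return
                // Ask for notifications so the glove can stream samples to the phone.
                gatt.setCharacteristicNotification(characteristic, true)
                characteristic.getDescriptor(CCC_DESCRIPTOR_UUID)?.let { descriptor ->
                    descriptor.value = BluetoothGattDescriptor.ENABLE_NOTIFICATION_VALUE
                    gatt.writeDescriptor(descriptor)
                }
            }

            override fun onCharacteristicChanged(
                gatt: BluetoothGatt,
                characteristic: BluetoothGattCharacteristic
            ) {
                // Assume the glove packs roll/pitch/yaw as three little-endian floats.
                val buffer = ByteBuffer.wrap(characteristic.value).order(ByteOrder.LITTLE_ENDIAN)
                onImuSample(floatArrayOf(buffer.float, buffer.float, buffer.float))
            }
        }

    A scanned glove device would then be opened with device.connectGatt(context, false, GloveGattCallback { sample -> /* update the hand model / DJI commands */ }).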




    Locate

    Team: Franklin Tang, Himangshu Chowdhury, & Ryan Lorica

    Step aside, Bluetooth trackers. It’s time for a real GPS locator. Our goal is to create an affordable mobile tracker that reports to your phone via our app, Locate. The app will be able to ping the Locator via cost-effective text messaging and receive a reply with the Locator’s GPS coordinates. With Google Maps integration, you will be able to see your device’s real location on demand.
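
    A minimal sketch of the ping/reply flow, assuming the Locator answers with a message of the form "LOC:<lat>,<lng>"; the tracker’s number and message format are placeholders, and the receiver would be registered for the SMS_RECEIVED broadcast in the manifest.

        import android.content.BroadcastReceiver
        import android.content.Context
        import android.content.Intent
        import android.net.Uri
        import android.provider.Telephony
        import android.telephony.SmsManager

        const val TRACKER_NUMBER = "+15555550123"   // placeholder SIM number of the Locator

        // Cost-effective text message asking the tracker for its position.
        fun pingLocator() {
            SmsManager.getDefault().sendTextMessage(TRACKER_NUMBER, null, "WHERE", null, null)
        }

        // Parses the reply and hands the coordinates to Google Maps on demand.
        class LocatorReplyReceiver : BroadcastReceiver() {
            override fun onReceive(context: Context, intent: Intent) {
                for (sms in Telephony.Sms.Intents.getMessagesFromIntent(intent)) {
                    val body = sms.messageBody ?: continue
                    if (!body.startsWith("LOC:")) continue
                    val (lat, lng) = body.removePrefix("LOC:").split(",")
                    val mapIntent = Intent(
                        Intent.ACTION_VIEW,
                        Uri.parse("geo:$lat,$lng?q=$lat,$lng(Locator)")
                    ).apply { flags = Intent.FLAG_ACTIVITY_NEW_TASK }
                    context.startActivity(mapIntent)
                }
            }
        }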




    ClassTracker

    Team: Saurabh Gupta & Brandon Pon

    The idea of this app is to simplify the process of finding your way around the UCSB campus. Right now, the few methods of navigating around campus include pulling out a physical map (not feasible, especially in today’s tech-y world), using UCSB’s interactive map (which no one really uses), or using Google Maps. The problem with Google Maps is that it doesn’t recognize specific buildings, and sometimes it will not properly recognize an address or location that a user inputs. This is where our app comes into play.




    Weceipt

    Team: Dennis Fong & Jacky Zheng

    We want the process of splitting bills to be as easy as three steps: snap, add, and drag. Nobody wants to be the one who pays the bill and is then responsible for splitting the receipt while calculating tax and tip. Weceipt is the solution to easily splitting a receipt in the snap of a picture.
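
    A small sketch of the split arithmetic behind the drag step, assuming tax and tip are shared in proportion to each person’s item subtotal; the names and rates below are only examples.

        // Each person pays their own items plus a proportional share of tax and tip.
        fun splitReceipt(
            itemsByPerson: Map<String, List<Double>>,   // person -> prices of items dragged to them
            taxRate: Double,                            // e.g. 0.0775
            tipRate: Double                             // e.g. 0.18 of the pre-tax subtotal
        ): Map<String, Double> =
            itemsByPerson.mapValues { (_, items) -> items.sum() * (1 + taxRate + tipRate) }

        fun main() {
            val shares = splitReceipt(
                mapOf("Dennis" to listOf(12.50, 3.00), "Jacky" to listOf(9.25)),
                taxRate = 0.0775,
                tipRate = 0.18
            )
            shares.forEach { (name, owed) -> println("$name owes \$${"%.2f".format(owed)}") }
        }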




    College Attendance App

    Team: Chet Koziol & Trevor Hecht

    What is our purpose? To eliminate the time wasted calling roll. How will it work? The teacher sets a password prior to class and writes it on the board. Once the students arrive, they enter this password to show that they are attending. After the time runs out for students to enter the password, the list of absent students is displayed for the teacher.

    What the user will see as a professor: Page 1 lists all the courses and times that they teach. Page 2 (once they select a class) shows a text edit and a button widget, where the teacher sets the attendance password. What the user will see as a student: Page 1 lists their classes, with options to add or remove a class. Page 2 (once they select a class) shows a text edit and a button widget, where they submit the attendance password.
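
    A minimal sketch of the password check, assuming submissions are accepted only while the teacher’s window is open; the types and field names are illustrative.

        // One open attendance window for a course.
        data class AttendanceSession(
            val courseId: String,
            val password: String,
            val closesAtMillis: Long
        )

        // Marks the student present only if the password matches before time runs out.
        fun markAttendance(
            session: AttendanceSession,
            studentId: String,
            submittedPassword: String,
            present: MutableSet<String>,
            nowMillis: Long = System.currentTimeMillis()
        ): Boolean {
            val accepted = nowMillis <= session.closesAtMillis &&
                submittedPassword == session.password
            if (accepted) present.add(studentId)
            return accepted
        }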




    SPOT Mobile Base Station

    Team: Bryan Lavin-Parmenter & Neil O’Bryan

    SPOT (Spatial Positioning on Terrain) is an experimental human communication interface designed for astronauts in conjunction with a human-computer interaction team at NASA. The goal of SPOT is to reduce an astronaut’s cognitive load while maximizing productivity during future surface exploration missions. SPOT accomplishes this goal by reducing the reliance on vocal communication for distribution of information between astronauts.




    Dogpark

    Team: Maga Kim & Raymond Yang

    “Tinder for Dogs”

    Allow dog owners to meet other dog owners in the area.

    Match with other dogs and chat with owners in groups to arrange park visits, runs, meetups etc.




    PictoBug

    Team: Amber Du & Anthony Chen

    PictoBug is an app that pays homage to the Nintendo DS program PictoChat, which let nearby users chat and draw with each other without the need for an Internet connection. By using Wi-Fi Direct, the app lets users create and join chat rooms where they can type or draw messages to each other, much as PictoChat did.
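
    A rough sketch of the Wi-Fi Direct handshake that makes Internet-free chatting possible, assuming the chosen peer’s address is already known from the peers-changed broadcast and the needed permissions are granted.

        import android.content.Context
        import android.net.wifi.p2p.WifiP2pConfig
        import android.net.wifi.p2p.WifiP2pManager

        // Discover nearby devices, then form a group with one of them; once the
        // group is up, ordinary sockets carry the typed and drawn messages.
        fun discoverAndConnect(context: Context, peerAddress: String) {
            val manager = context.getSystemService(Context.WIFI_P2P_SERVICE) as WifiP2pManager
            val channel = manager.initialize(context, context.mainLooper, null)

            manager.discoverPeers(channel, object : WifiP2pManager.ActionListener {
                override fun onSuccess() {
                    val config = WifiP2pConfig().apply { deviceAddress = peerAddress }
                    manager.connect(channel, config, object : WifiP2pManager.ActionListener {
                        override fun onSuccess() { /* group formed; open sockets to chat/draw */ }
                        override fun onFailure(reason: Int) { /* retry or surface an error */ }
                    })
                }
                override fun onFailure(reason: Int) { /* Wi-Fi Direct unavailable */ }
            })
        }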




    Shout

    Team: Jair R. Santiago Carranza, Min Jian Yang, & Brian Young

    The goal of this project is to create an app that helps people keep in touch with friends, family, and loved ones. This application aims to provide its users with simple person-to-person and group messaging. Additionally, it will feature the ability to talk to local users who also have the application installed.




    Chip Chat

    Team: Andrew Polk, Victoria Sneddon, & Matthew Speck

    Our Idea: An app that will allow a user to join a chat room with other users. Chat rooms will be entered by name.

    Using TCP sockets and a server: users join a chat room by entering its name, which automatically connects them to the correct server and port number. A third node/server hosts the chat rooms and acts as a bridge between the users in a room. A chat room stays open on the server only as long as at least one user is connected, saving processing power by not keeping rooms open indefinitely.
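
    A minimal client-side sketch, assuming a simple line-based protocol where the first line names the room to join; the host, port, and JOIN command are placeholders for whatever the bridge server actually expects.

        import java.io.BufferedReader
        import java.io.InputStreamReader
        import java.io.PrintWriter
        import java.net.Socket
        import kotlin.concurrent.thread

        // Connects to the bridge server, announces the room, then forwards every
        // incoming line to the UI. The server keeps the room alive while at least
        // one such connection remains open.
        fun joinChatRoom(host: String, port: Int, room: String, onMessage: (String) -> Unit) {
            thread {
                Socket(host, port).use { socket ->
                    val out = PrintWriter(socket.getOutputStream(), true)
                    val input = BufferedReader(InputStreamReader(socket.getInputStream()))

                    out.println("JOIN $room")

                    var line = input.readLine()
                    while (line != null) {
                        onMessage(line)
                        line = input.readLine()
                    }
                }
            }
        }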




    RydeBot

    Team: Jiaheng Tang & Yulin Ou

    We want to build a simple rideshare app called RydeBot. Users will be able to log in and either post as a rider or browse posts as a driver. Riders post their requests, which are shown to drivers. Drivers can view the riders’ locations and requested schedules, select the rider they would like to pick up, and chat with riders. Once a rider is selected, navigation starts on the driver’s phone.
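
    One possible way to start navigation once a driver picks a rider, assuming the pickup coordinates are already known: hand turn-by-turn guidance off to the Google Maps app via its documented navigation URI.

        import android.content.Context
        import android.content.Intent
        import android.net.Uri

        // Launches Google Maps navigation to the rider's pickup point.
        fun startNavigationToRider(context: Context, lat: Double, lng: Double) {
            val intent = Intent(Intent.ACTION_VIEW, Uri.parse("google.navigation:q=$lat,$lng"))
                .setPackage("com.google.android.apps.maps")   // prefer the Maps app if installed
            context.startActivity(intent)
        }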




    UCSB Tracker

    Team: Zhicheng Zhang, Yunxi Li, & Rongjian Li

    An interactive app providing location services for students on campus. Detailed locations are covered, including main buildings, trailer rooms, and parking lots. The app features a convenient user interface, including voice/text recognition for campus addresses. The locating service will be available through either GPS or cellular positioning. Furthermore, a posting section allows organizers to broadcast event information in real time, with potential for other functionality.
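
    A small sketch of the GPS-versus-cellular choice using Android’s LocationManager, assuming location permission has already been granted; the update interval and distance values are illustrative.

        import android.annotation.SuppressLint
        import android.content.Context
        import android.location.LocationListener
        import android.location.LocationManager

        // GPS_PROVIDER gives satellite fixes outdoors; NETWORK_PROVIDER falls back
        // to cell-tower / Wi-Fi positioning indoors or when GPS is unavailable.
        @SuppressLint("MissingPermission")
        fun requestCampusLocation(context: Context, useGps: Boolean, listener: LocationListener) {
            val manager = context.getSystemService(Context.LOCATION_SERVICE) as LocationManager
            val provider = if (useGps) LocationManager.GPS_PROVIDER else LocationManager.NETWORK_PROVIDER
            manager.requestLocationUpdates(provider, 5_000L, 10f, listener)   // every 5 s or 10 m
        }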




    Rolling Ball

    Team: Zhaoren Zeng, Zhaorui Zeng, & Wenbo Xu

    Our project aims to build an AR game with MCU control. The app recognizes a designated printed image on a surface and generates a ball on that surface, which the user controls by holding an external MCU and making gestures read through a triple-axis accelerometer and gyro sensor. The MCU sends its data to a portable Arduino via the cloud. The user will be able to play with the ball on the surface. The stretch goal for this project is to make the ball float in the air so the user can move it freely in 3D with the MCU; in that setting the MCU is far superior to the phone’s gyroscope for 3D-space control.
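
    A minimal sketch of turning one tilt sample from the MCU into a ball update, assuming tilt acts as acceleration on the virtual ball; the gain constant is illustrative.

        // 2D state of the ball on the tracked surface.
        data class BallState(var x: Float, var y: Float, var vx: Float = 0f, var vy: Float = 0f)

        // Integrates one accelerometer/gyro-derived tilt sample into the ball's motion.
        fun updateBall(ball: BallState, tiltX: Float, tiltY: Float, dtSeconds: Float) {
            val gain = 2.5f                  // how strongly tilt accelerates the ball
            ball.vx += tiltX * gain * dtSeconds
            ball.vy += tiltY * gain * dtSeconds
            ball.x += ball.vx * dtSeconds
            ball.y += ball.vy * dtSeconds
        }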




    Music Performer

    Team: Haowen Zhang & Haorui Jiang

    This project aims to create a new way to make music: using hand gestures. We will deliver an Android app that lets users perform music on virtual instruments with non-touch hand gestures. We will implement a hand-gesture recognition function to detect and recognize the user’s fingers and play the related musical sound. Users will be able to create a melody with their hands and record their work.
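
    A sketch of the playback half only, assuming the gesture recognizer reports a finger count and each count maps to a preloaded note sample; the resource list and the gesture-to-note mapping are placeholders.

        import android.content.Context
        import android.media.AudioAttributes
        import android.media.SoundPool

        // Plays a short note sample for each recognized gesture (e.g. 1-5 raised fingers).
        class GesturePlayer(context: Context, noteResIds: List<Int>) {
            private val soundPool = SoundPool.Builder()
                .setMaxStreams(4)
                .setAudioAttributes(
                    AudioAttributes.Builder().setUsage(AudioAttributes.USAGE_MEDIA).build()
                )
                .build()

            // Preload one sample per gesture; in practice wait for the
            // OnLoadCompleteListener before the first play.
            private val soundIds = noteResIds.map { soundPool.load(context, it, 1) }

            fun onGesture(fingerCount: Int) {
                val soundId = soundIds.getOrNull(fingerCount - 1) ?: return
                soundPool.play(soundId, 1f, 1f, 1, 0, 1f)
            }
        }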




    Basilisk

    Team: Gokul P. Nallasami & Barath K. Ramaswami

    Our primary objective is to design a Wi-Fi-based multiplayer snake game that makes use of a mixture of hardware and software APIs, with our primary focus on user experience. The game logic we intend to design is aimed at making the app easy to interact with while keeping overhead minimal.




    Speed Sudoku

    Team: Aravind Sudharsan & Vinu B. Sankara Lingam

    The primary goal is to create an offline multiplayer game that allows users to play Sudoku in single-player and multiplayer (two-player) modes. The app will let two users play while each simultaneously monitors the opponent’s progress as a percentage.
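
    A minimal sketch of the progress figure each player would see for the opponent, assuming it counts correctly filled cells among the originally empty ones.

        // Boards are 9x9 Ints, with 0 meaning "empty".
        fun progressPercent(
            puzzle: Array<IntArray>,    // starting board with the given clues
            current: Array<IntArray>,   // the player's board so far
            solution: Array<IntArray>   // the full solution
        ): Int {
            var blanks = 0
            var solved = 0
            for (r in 0 until 9) {
                for (c in 0 until 9) {
                    if (puzzle[r][c] == 0) {
                        blanks++
                        if (current[r][c] == solution[r][c]) solved++
                    }
                }
            }
            return if (blanks == 0) 100 else solved * 100 / blanks
        }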




    Intuitive Grocery List

    Team: Pooja V. Kadam, Venkat Raman, & Vishal Hosakere

    An app that takes a grocery list and the user’s location as input and suggests the nearest possible store where the groceries can be purchased.
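
    A minimal sketch of the nearest-store suggestion, assuming the candidate stores (from a places lookup or a bundled list) already carry coordinates and the grocery categories they stock; the Store type is illustrative.

        import android.location.Location

        data class Store(val name: String, val lat: Double, val lng: Double, val stocks: Set<String>)

        // Among stores that stock every item on the list, pick the closest one.
        fun nearestStoreFor(
            groceries: List<String>,
            userLat: Double,
            userLng: Double,
            candidates: List<Store>
        ): Store? =
            candidates
                .filter { store -> groceries.all { item -> item in store.stocks } }
                .minByOrNull { store ->
                    val result = FloatArray(1)
                    Location.distanceBetween(userLat, userLng, store.lat, store.lng, result)
                    result[0]
                }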




    Magic Defense

    Team: John Zhou & LiYuan Wang

    In general, this project is an AR tower defense game deployed on Android devices. The classic tower defense rules apply: players need to defend their bases by purchasing and deploying towers with different elemental magic. The idea for this project came from Element Defense, a custom map for Warcraft 3.




    SET

    Team: Brandon Tran & Danielle Robinson

    Our Android app will implement SET! with a primary focus on the user interface; SET!’s official iOS app is clunky, with outdated graphic design. We aim to implement both single-player and multiplayer modes. For multiplayer, we will use Google’s Turn-Based Multiplayer API, as well as Wi-Fi Direct for communication between two devices. Stretch goals include a high-score leaderboard for single player and multiplayer gameplay for more than two players.




    Campus View

    Team: James Yang & Ashlynn Cardoso

    Give visitors a pocket guide to UCSB’s major buildings and landmarks. Use their location to suggest what to view next, and offer directions to nearby buildings.




    Awareness++

    Team: Jose Acuna Moscoso & Chris Park

    This project is meant to be a proof of concept for increasing one’s spatial awareness using computer vision. The goal is to allow a user to query the app with voice commands and have the app interpret a live video feed to learn about the user’s environment and then respond to the query. We will use the Alexa Voice Service (AVS) API for communication with the user and the Google Cloud Vision API to gather information about the user’s environment in real time.
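
    A rough sketch of one Cloud Vision REST call (LABEL_DETECTION) on a captured frame, whose labels the app could then match against the spoken query. API-key handling, JSON parsing, and error handling are left out; visionApiKey is a placeholder, and the call must run off the main thread.

        import android.util.Base64
        import java.net.HttpURLConnection
        import java.net.URL

        // Sends one JPEG frame to the Cloud Vision images:annotate endpoint and
        // returns the raw JSON, which contains labelAnnotations (description/score).
        fun requestLabels(jpegBytes: ByteArray, visionApiKey: String): String {
            val body = """
                {"requests":[{
                  "image":{"content":"${Base64.encodeToString(jpegBytes, Base64.NO_WRAP)}"},
                  "features":[{"type":"LABEL_DETECTION","maxResults":10}]
                }]}
            """.trimIndent()

            val url = URL("https://vision.googleapis.com/v1/images:annotate?key=$visionApiKey")
            val connection = (url.openConnection() as HttpURLConnection).apply {
                requestMethod = "POST"
                doOutput = true
                setRequestProperty("Content-Type", "application/json")
            }
            connection.outputStream.use { it.write(body.toByteArray()) }
            return connection.inputStream.bufferedReader().use { it.readText() }
        }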




    AR Emoji

    Team: Andrew Gonzalez & Susan Wu

    AR Emoji is an augmented reality (AR) mobile phone app that uses your phone’s camera to display emojis of real-life objects. It works by using computer vision to detect and recognize objects captured by the camera. By creating a mapping between real-life objects and their respective emojis, the app can create or reference the correct 3D emoji model of the detected object. By overlaying this virtual object on the real one, the user sees a 3D emoji that appears to sit in front of them. The purpose of this app is to provide a fun and entertaining visualization of the real world with its objects replaced by virtual ones.
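
    A small sketch of the mapping step, assuming the vision model reports a text label and each label points to a bundled 3D emoji asset; the label names and asset file names are illustrative.

        // Lookup table from recognized object labels to 3D emoji model assets.
        val emojiModels = mapOf(
            "apple" to "emoji_apple.sfb",
            "cup" to "emoji_cup.sfb",
            "dog" to "emoji_dog.sfb"
        )

        // Returns the emoji model to overlay on a detected object, or null if the
        // object has no emoji counterpart yet.
        fun emojiModelFor(detectedLabel: String): String? =
            emojiModels[detectedLabel.lowercase()]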




    Treasure Hunt

    Team: Even Skari, Rahul Vishwakarma, & Satish Kumar

    Create a treasure hunt game in augmented reality:

    • Use ARCore to visualize treasure creation and discovery
    • Use location-based services for navigation (a proximity sketch follows below)
    • In-game notification services for information sharing


    Seamless integration:

    • ARCore
    • Cloud-based notifications
    • Location services
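
    A minimal sketch of the location-based proximity check referenced above, assuming a treasure is revealed in AR only when the player comes within a chosen radius of where it was placed; the 15-meter radius is illustrative.

        import android.location.Location

        // True once the player is close enough for the AR layer to reveal the treasure.
        fun isTreasureDiscoverable(player: Location, treasureLat: Double, treasureLng: Double): Boolean {
            val distance = FloatArray(1)
            Location.distanceBetween(
                player.latitude, player.longitude,
                treasureLat, treasureLng,
                distance
            )
            return distance[0] <= 15f
        }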