HandLED is a 3D RGB LED matrix that can be used to visualize and interact with 3D
mathematical graphs, 3D models, and more. A mobile application will send the function or
model to be displayed on the cube via Bluetooth. The user's hand movements, used to interact with the image on
the cube, will be captured by a Bluetooth-connected smart glove and/or a radar.
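One way to picture the display pipeline: a mathematical surface z = f(x, y) can be mapped onto the cube by lighting, in each (x, y) column, the voxel nearest the function value. The sketch below is purely illustrative; the function and grid parameters are assumptions, not the actual HandLED firmware interface.

```python
# Hypothetical sketch: map z = f(x, y) onto an N x N x N LED cube.
# All names and parameters here are illustrative assumptions.

def voxelize(f, n=8, lo=-1.0, hi=1.0):
    """Return a set of (x, y, z) voxel indices approximating z = f(x, y)."""
    lit = set()
    step = (hi - lo) / (n - 1)
    for xi in range(n):
        for yi in range(n):
            x = lo + xi * step
            y = lo + yi * step
            z = f(x, y)
            # Map the function value back into a voxel index; skip values
            # that fall outside the cube.
            zi = round((z - lo) / step)
            if 0 <= zi < n:
                lit.add((xi, yi, zi))
    return lit

# Example: a paraboloid dips to the bottom layer at the cube's center.
voxels = voxelize(lambda x, y: x * x + y * y - 1.0)
```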
Vehicles have penetrated virtually every industry and made themselves essential to many services that are commonplace today. However, most vehicles are expensive investments that also require care and upkeep throughout ownership, including routine maintenance and costly sudden repairs. Modern vehicles warn the driver of issues that need attention on the dashboard or through a phone app, but a more scalable solution would be extremely useful for a corporation or business that owns an entire fleet of vehicles. DataDriven is an end-to-end solution that collects live data from vehicles, uploads it to the cloud, and makes it available on a dashboard for analysis. Data from onboard sensors can be used to maintain the health of vehicles through preventative and predictive maintenance. This system empowers its users to make informed, high-impact decisions that minimize costs and maximize efficiency.
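A minimal sketch of the kind of preventative-maintenance rule such a dashboard might apply: flag a vehicle when a sensor's rolling average drifts past a threshold. The function name, window size, and threshold are illustrative assumptions, not the DataDriven implementation.

```python
# Illustrative sketch (not the actual DataDriven logic): flag a vehicle for
# preventative maintenance when a sensor's rolling average exceeds a threshold.

from collections import deque

def maintenance_alerts(readings, window=3, threshold=100.0):
    """Return the indices at which the rolling average exceeds the threshold."""
    buf = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        buf.append(value)
        if len(buf) == window and sum(buf) / window > threshold:
            alerts.append(i)
    return alerts

# Example: coolant temperature creeping upward eventually triggers an alert.
temps = [90, 92, 95, 99, 104, 110]
alerts = maintenance_alerts(temps, window=3, threshold=100.0)
```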
The UCSB Oakley Evolution Lab seeks to investigate the evolutionary history of ostracods, in particular their bioluminescent courtship signaling patterns, which typically occur at night and at depths with little external light. We will upgrade the first WALL-E project, which captured and saved stereo-vision footage used to create 3D reconstructions of the bioluminescence patterns. Our small·e consists of three detachable subsystems: a camera system, a light-intensity system, and a DNA collection (pump) system. The camera system consists of two low-light-sensitive cameras. The light-intensity system consists of an SiPM (Silicon Photomultiplier). A microprocessor collects and synchronizes the data from those two systems. The pump system consists of several filters used to collect samples of the ostracods' DNA. The system collects DNA samples whenever a large amount of light is detected, or at a set interval. Solenoid valves isolate and select a filter for DNA collection, and a flow-rate sensor verifies the amount of water sampled.
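The two sampling triggers described above (a bright bioluminescent event, or a periodic fallback) can be sketched as a single predicate. Function names, the light threshold, and the interval are hypothetical placeholders, not the small·e firmware.

```python
# Hypothetical sketch of the DNA-sampling trigger: sample when the SiPM
# reports a burst of light OR when the fixed interval has elapsed.
# All names and thresholds are illustrative assumptions.

def should_sample(light_level, now, last_sample_time,
                  light_threshold=0.8, interval_s=3600.0):
    """Return True when a new DNA sample should be collected."""
    burst = light_level >= light_threshold               # bright bioluminescent event
    scheduled = (now - last_sample_time) >= interval_s   # periodic fallback
    return burst or scheduled

burst = should_sample(light_level=0.95, now=10.0, last_sample_time=0.0)
quiet = should_sample(light_level=0.05, now=10.0, last_sample_time=0.0)
due = should_sample(light_level=0.05, now=4000.0, last_sample_time=0.0)
```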
Information is invaluable in a combat environment. To address this need, our team proposes Overwatch Drones. A drone squad consists of two or more drones that gather information, spot potential risks, and locate the user to guard the target's safety. Our project aims to provide functionality such as automatically following the target, moving to preset locations to check the surroundings, and locating the target while adjusting the drones' positions automatically. We also want to provide different ways to control the drones, including autonomous flight, manual control through Android phones, and gesture control. Our first milestone aims to acquire all desired drone parts, build a prototype, and realize basic functionality.
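The auto-follow behavior could be modeled, in its simplest form, as a proportional controller stepping each drone toward an offset position around the target. This is a conceptual sketch under assumed names and gains, not the actual flight-control code, which would run through the drone's autopilot.

```python
# Illustrative sketch of auto-follow: a proportional controller that steps a
# drone toward a fixed offset around the target. Names/gains are assumptions.

def follow_step(drone_pos, target_pos, offset, gain=0.5):
    """Move one step toward (target + offset); returns the new position."""
    desired = tuple(t + o for t, o in zip(target_pos, offset))
    return tuple(p + gain * (d - p) for p, d in zip(drone_pos, desired))

# Example: a drone converges to a point 2 units behind the target.
pos = (0.0, 0.0)
for _ in range(20):
    pos = follow_step(pos, target_pos=(10.0, 5.0), offset=(0.0, 2.0))
```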
Defect Detect will perform anomaly detection on products as they come down a conveyor belt. We will do this by feeding images from a camera with an overhead view of the belt into a CNN running on the MAX78000 ultra-low-power AI microcontroller. Defective items will be removed by a powerful vacuum, which will extract them and place them in a separate bin. We will also use the camera images to perform object counting, with the count shown on our touchscreen display. The display will also allow the user to control the speed of the conveyor belt as needed. Our performance will be measured by how many of the defects we successfully identify and remove from the belt.
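The object-counting step can be illustrated with a classic flood-fill over a thresholded frame: each 4-connected blob of foreground pixels counts as one item. This is a didactic sketch on a toy binary grid, not the on-device pipeline (which would run alongside the CNN on the MAX78000).

```python
# Sketch of object counting, assuming the camera frame has already been
# thresholded into a binary grid (1 = product pixel, 0 = belt).

def count_objects(grid):
    """Count 4-connected blobs of 1s in a binary grid."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                count += 1
                stack = [(r, c)]          # iterative flood fill
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and grid[y][x] == 1 and not seen[y][x]):
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return count

# Example frame with three separate items on the belt.
frame = [
    [1, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 1, 0, 1],
]
n = count_objects(frame)
```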
Current drone sensor arrangements are often varied and complex, with drones requiring many different accelerometers, gyroscopes, and pressure sensors. Our sensor stick aims to consolidate those sensors, along with a microcontroller, into a single peripheral device that plugs into a drone running the Robot Operating System (ROS) and uses the data from its sensor array to produce a state estimate.
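One lightweight way such a device could fuse gyroscope and accelerometer data into an attitude estimate is a complementary filter, a common alternative to a full Kalman filter. The sketch below is an assumption about the approach, not the sensor stick's actual estimator; all names and coefficients are illustrative.

```python
# Hedged sketch: a complementary filter blending integrated gyro rate
# (good short-term) with the accelerometer's gravity angle (good long-term).
# Parameters are illustrative, not the sensor stick's implementation.

import math

def complementary_pitch(pitch, gyro_rate, ax, az, dt, alpha=0.98):
    """Return the updated pitch estimate (radians)."""
    gyro_pitch = pitch + gyro_rate * dt    # short-term: integrate gyro rate
    accel_pitch = math.atan2(ax, az)       # long-term: gravity direction
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Stationary drone: the gyro reads zero and gravity points straight down,
# so an initially wrong pitch estimate decays toward zero.
pitch = 0.5
for _ in range(200):
    pitch = complementary_pitch(pitch, gyro_rate=0.0, ax=0.0, az=1.0, dt=0.01)
```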
Viewpointe ties into Alcon's Ngenuity visualization system, collecting stereo images from a microscope used by surgeons during cataract operations. Using an FPGA, the two image inputs will be processed into a display format suitable for a 3D monitor. The user will be able to choose between visual formats, including side-by-side, top-bottom, and traditional intersampled mosaic. Both HDMI and DisplayPort inputs will be supported. All processing will happen on the camera itself, without requiring an external computer. By bypassing Ngenuity's host computer, we hope to decrease latency and end-user costs.
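The side-by-side and top-bottom formats amount to simple frame packing of the two stereo inputs. The toy sketch below shows the two layouts on nested-list "frames"; the real design would perform this on video streams in FPGA fabric, so this is purely conceptual.

```python
# Conceptual sketch of two stereo 3D frame-packing formats. Frames are
# nested lists of pixels here; the actual design operates in hardware.

def side_by_side(left, right):
    """Concatenate matching rows: an H x W pair becomes one H x 2W frame."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]

def top_bottom(left, right):
    """Stack the frames vertically: an H x W pair becomes one 2H x W frame."""
    return left + right

left = [[1, 2], [3, 4]]
right = [[5, 6], [7, 8]]
sbs = side_by_side(left, right)
tb = top_bottom(left, right)
```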