Research Videos

EPIC research lab overview

This video is a brief five-minute overview of the research projects currently underway in the EPIC research lab.

Indoor localization and navigation with smartphones

These videos show how smartphones can be used for indoor navigation and localization. The first video shows a walkthrough demo of the RamLoc app developed in the EPIC lab, which uses machine learning techniques, WiFi fingerprinting, and inertial sensing to track a user in indoor environments. The second video shows a LiDAR-equipped robotic rover developed in our lab that traverses indoor paths while multiple mobile devices collect wireless signal fingerprint data (a practice called wardriving) to assist with real-time indoor navigation.
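
As a rough illustration of the fingerprinting idea, here is a minimal sketch (not the RamLoc implementation; the reference locations, RSSI values, and access-point count below are made up):

    import numpy as np

    # Offline phase: a fingerprint database mapping known indoor (x, y) locations
    # (in meters) to the WiFi signal strengths (RSSI, in dBm) observed from three
    # nearby access points. Values here are illustrative only.
    fingerprints = {
        (0.0, 0.0): [-45, -70, -80],
        (5.0, 0.0): [-60, -55, -75],
        (5.0, 5.0): [-75, -50, -60],
        (0.0, 5.0): [-65, -72, -52],
    }

    def locate(observed_rssi, k=3):
        """Online phase: estimate position as the centroid of the k reference
        points whose stored fingerprints are closest (in signal space) to the
        currently observed RSSI vector."""
        obs = np.array(observed_rssi, dtype=float)
        ranked = sorted(
            fingerprints.items(),
            key=lambda item: np.linalg.norm(np.array(item[1], dtype=float) - obs),
        )
        nearest = np.array([loc for loc, _ in ranked[:k]])
        return nearest.mean(axis=0)

    print(locate([-58, -56, -74]))  # rough (x, y) estimate

In practice, an estimate like this would be fused with inertial (dead-reckoning) data to smooth the user's track between WiFi scans, which is the role the inertial sensing plays here.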

Videos courtesy: Saideep Tiku (PhD student)

Datacenter heat dissipation simulations

These videos show how heat dissipates across racks and servers in a small datacenter with multiple aisles of computing racks and air cooling via a CRAC (computer room air conditioner). The cool air is circulated below the floor and released through floor vents. Hot air rises and is captured at the top of the room for recirculation to the CRAC. These computational fluid dynamics (CFD) based simulations were part of our research on minimizing the combined cooling and computing power consumption of datacenters.
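
As a back-of-the-envelope illustration of why this matters, here is a minimal sketch using one widely cited empirical CRAC efficiency model (the model and all numbers are for illustration only, not our CFD setup):

    def crac_cop(supply_temp_c):
        """Coefficient of performance of a chilled-water CRAC unit as a function of
        its cold-air supply temperature (deg C). Empirical fit reported by Moore et
        al. (2005) for an HP datacenter CRAC; used here purely for illustration."""
        return 0.0068 * supply_temp_c ** 2 + 0.0008 * supply_temp_c + 0.458

    def cooling_power_kw(it_power_kw, supply_temp_c):
        """Power the CRAC must draw to remove it_power_kw of server heat."""
        return it_power_kw / crac_cop(supply_temp_c)

    # Raising the supply temperature cuts cooling power for the same IT load, but
    # only if server inlet temperatures stay safe despite hot-air recirculation,
    # which is exactly what the CFD simulations predict.
    for t_c in (15, 20, 25):
        print(t_c, "deg C supply ->", round(cooling_power_kw(100.0, t_c), 1), "kW cooling")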

Videos courtesy: Mark Oxley (PhD student)

Advanced driver assistance systems (ADAS) for autonomous vehicles

These videos show how our vision-based ADAS system works in real time to detect vehicles, pedestrians, and lanes. Various machine learning and image processing algorithms are at play here to enable fast yet accurate predictions. The algorithms are implemented on an embedded hardware board. We have also explored using stereo cameras for depth estimation. The video on the bottom left shows an integration of RADAR for depth sensing with vision-based object detection. The video on the bottom right shows convolutional neural network (CNN) based object detection in real time. Our ongoing work is integrating LiDAR and RADAR data with vision-based data for more robust ADAS predictions and enhanced automotive safety.
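
As a small example of the kind of classical image-processing step used for lane detection, here is an illustrative OpenCV sketch (not our deployed pipeline; the file names, region of interest, and thresholds are placeholders):

    import cv2
    import numpy as np

    # Illustrative lane-marking detection on a single dashcam frame.
    frame = cv2.imread("road_frame.jpg")          # placeholder file name
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)   # suppress noise before edge detection
    edges = cv2.Canny(blurred, 50, 150)           # Canny edge map

    # Keep only a trapezoidal region of interest in front of the vehicle.
    h, w = edges.shape
    roi = np.zeros_like(edges)
    polygon = np.array([[(0, h), (w, h), (int(0.55 * w), int(0.6 * h)),
                         (int(0.45 * w), int(0.6 * h))]], dtype=np.int32)
    cv2.fillPoly(roi, polygon, 255)
    edges = cv2.bitwise_and(edges, roi)

    # Probabilistic Hough transform: extract straight line segments (lane markings).
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=40, maxLineGap=20)
    for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
        cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.imwrite("lanes_overlay.jpg", frame)

The CNN-based detector in the second video replaces hand-tuned steps like these with learned features, generally trading extra computation for much better robustness.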

Videos courtesy: Jordan Tunnell (MS student), ADAS undergraduate senior design team

Silicon photonics microring resonator simulation

These videos show the coupling of light from a waveguide to a microring resonator (MR) device. This coupling of light in MRs is the reason why these devices are used as modulators (to convert electrical bitstream data into optical data) and as filters/detectors that receive optical signals and drop them onto photodetectors to recover electrical signals. The first video (on the left) shows how an MR can extract (filter) signals from one waveguide and insert (drop) the signal into another waveguide. The second video (on the right) shows how light that has been coupled by MRs along a waveguide can subsequently couple between MRs that are placed close to each other. This coupling of light between MRs is undesirable and is referred to as crosstalk. To prevent such crosstalk, MRs need to be separated from each other by a technology-dictated minimum distance.
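
For context, the wavelengths at which this waveguide-to-MR coupling occurs follow the standard microring resonance condition (textbook relations, not measurements of the devices shown here):

    \lambda_m = \frac{2 \pi R \, n_{\mathrm{eff}}}{m}, \qquad
    \mathrm{FSR} \approx \frac{\lambda^2}{n_g \cdot 2 \pi R}

where R is the ring radius, n_eff is the effective refractive index of the guided mode, m is a positive integer (the resonance order), n_g is the group index, and FSR is the free spectral range between adjacent resonances. The strength of the evanescent coupling between neighboring MRs falls off rapidly with the gap between them, which is why enforcing a minimum separation suppresses the crosstalk described above.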

Video courtesy: Ishan Thakkar (PhD student)

Brain controlled smart home and wheelchairs

These videos show brain (thought) controlled electronics. The first video shows a brain-controlled wheelchair, where the user can think about moving forward or stopping and have the wheelchair respond to those thoughts. The second video shows a use case for a brain-controlled smart home: a virtual reality home was designed, and various electronic devices in it were controlled by thought. The video also shows how an actual smart home device (a smart light) can be controlled by thought. All experiments used the Emotiv brain-computer interface headset and post-processing of EEG data with algorithmic techniques before pairing with connected electronic devices (Raspberry Pis, Arduinos).
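
As a simplified illustration of the kind of EEG post-processing involved, here is a minimal sketch (not the teams' actual pipeline; the sampling rate, frequency band, and threshold are placeholders):

    import numpy as np

    FS = 128  # sampling rate in Hz (placeholder, roughly what Emotiv headsets provide)

    def band_power(eeg_window, low_hz, high_hz, fs=FS):
        """Average spectral power of one EEG channel within a frequency band,
        computed from a short window of samples via the FFT."""
        spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
        freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
        mask = (freqs >= low_hz) & (freqs <= high_hz)
        return spectrum[mask].mean()

    def classify_intent(eeg_window, threshold=5.0):
        """Toy two-state classifier: treat elevated beta-band (13-30 Hz) power as a
        'move' command, otherwise 'stop'. A real system trains models over many
        channels and features rather than using a single threshold."""
        return "move" if band_power(eeg_window, 13, 30) > threshold else "stop"

    # The resulting command would then be forwarded to the wheelchair motors or the
    # smart-home controller (e.g., over a serial link to an Arduino).
    window = np.random.randn(2 * FS)  # two seconds of synthetic data for illustration
    print(classify_intent(window))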

Videos courtesy: brain-controlled wheelchair and brain-controlled smart home undergraduate senior design teams

Augmented Reality (AR) and Virtual Reality (VR) games for therapy

These videos show augmented reality and virtual reality based games that we developed to rehabilitate upper-limb movement in survivors of stroke, cerebral palsy, and traumatic brain injury. The first video shows our GATOR system, which uses a Leap Motion controller and a web-connected suite of games to enable low-cost, in-home therapy. The second video is an older project that used AR markers to track user movement. These games and the low-cost infrastructure reduce the costs associated with occupational therapy.

Videos courtesy: GATOR and AR games undergraduate senior design teams

Inverted pendulum balancing wheelchair

This video shows an inverted pendulum balancing wheelchair (TUX2) that my student designed from scratch. The video also shows the smaller prototype (TUX) that he started out with, before he attempted a full-size 'Segway-like' design that could hold a human without tipping over. Various real-time control algorithms are at play here, capturing data from inertial sensors and rapidly adjusting motor behavior in response to changes in user input and orientation.
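
As a rough sketch of the control idea, here is an illustrative balance loop (not Jonathan's actual firmware; the sensor stubs, gains, and loop rate are placeholders):

    import time

    # Placeholder sensor/actuator stubs: on the real wheelchair these would read the
    # IMU (gyroscope + accelerometer) and drive the wheel motor controllers.
    def read_gyro_pitch_rate():  return 0.0   # pitch rate in deg/s
    def read_accel_pitch():      return 0.0   # pitch in deg, from the gravity vector
    def set_motor_command(u):    pass         # normalized torque/throttle command

    KP, KI, KD = 25.0, 1.5, 0.8    # PID gains (placeholder values for illustration)
    ALPHA = 0.98                   # complementary-filter weight on the gyro estimate
    DT = 0.005                     # 200 Hz control loop

    pitch, integral, prev_error = 0.0, 0.0, 0.0
    while True:
        # Complementary filter: integrate the gyro for short-term accuracy and
        # correct long-term drift with the accelerometer's gravity-based pitch.
        pitch = ALPHA * (pitch + read_gyro_pitch_rate() * DT) + (1 - ALPHA) * read_accel_pitch()

        # PID on the pitch error (setpoint = 0 deg, i.e., upright); user input would
        # shift this setpoint slightly to command forward or backward motion.
        error = -pitch
        integral += error * DT
        derivative = (error - prev_error) / DT
        prev_error = error
        set_motor_command(KP * error + KI * integral + KD * derivative)
        time.sleep(DT)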

Video courtesy: Jonathan Cox (senior design student)