2021 Senior Projects Conference

Computer Science

Room 216 Session: Join us on Zoom.

1:00 p.m.

Bulldog Maps

Team Name: The Legends

Team Members: Vivian Carr, Conan Howard, Stephanie Niemiec, Jonathan Ruppel, Logan Simmons

Advisor: Dr. Kevin Cherry

Bulldog Maps is an Android application designed for students and visitors new to Louisiana Tech University’s campus. The application provides its users with a marker indicating the user’s current location, map markers showing the locations of selected buildings on Louisiana Tech’s campus, and an optional AR view that displays information about each building.
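For illustration only, the core of such a map feature is a nearest-building lookup. The sketch below is in TypeScript rather than the team’s actual Android code, and the building type, function names, and coordinates are hypothetical:

```ts
// Hypothetical building record; coordinates are illustrative, not surveyed.
interface Building {
  name: string;
  lat: number; // degrees
  lng: number; // degrees
}

// Haversine great-circle distance in meters between two coordinates.
function distanceMeters(aLat: number, aLng: number, bLat: number, bLng: number): number {
  const R = 6371000; // mean Earth radius in meters
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(bLat - aLat);
  const dLng = toRad(bLng - aLng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(aLat)) * Math.cos(toRad(bLat)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Return the campus building closest to the user's current position,
// e.g. to decide which AR overlay to show.
function nearestBuilding(userLat: number, userLng: number, buildings: Building[]): Building {
  return buildings.reduce((best, b) =>
    distanceMeters(userLat, userLng, b.lat, b.lng) <
    distanceMeters(userLat, userLng, best.lat, best.lng)
      ? b
      : best
  );
}
```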

1:30 p.m.

EduBot

Team Name: Team Caplet

Team Members: Landon Jackson, Cody Johnson, Prasil Mainali, Eric Pitts, Anuj Shrestha, Anthony Toussaint

Advisor: Dr. Kevin Cherry

EduBot is a Discord server management bot designed specifically for teachers and professors who set up Discord servers for their classes. Our bot abstracts away tedious aspects of server administration such as grouping students and creating channels for group assignments, assigning roles at scale, deleting messages in bulk, and removing groups of members at the end of a quarter or semester. In addition to server administration features, EduBot supports content moderation functionality such as an automated profanity filter, voice and text chat muting, spam protection, and server locking to prevent misuse of the server by unruly members. EduBot also provides helpful features such as user polls, scheduled notifications, assignment reminders, and breakout rooms, helping teachers use their server as an effective and convenient means of communicating with their classes without many of the hassles of server administration.
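As a sketch of what a grouping command might do internally (the bot’s real command names and data structures are not shown here; everything below is hypothetical), the core logic is a shuffle-and-partition over the class roster:

```ts
// Output of a hypothetical "!group <size>" command: one text channel per group.
interface Group {
  channelName: string; // name for the group's text channel, e.g. "group-1"
  memberIds: string[]; // Discord user IDs assigned to this group
}

// Shuffle the roster (Fisher-Yates, unbiased), then partition it into
// groups of at most `size` members.
function makeGroups(studentIds: string[], size: number): Group[] {
  const shuffled = [...studentIds];
  for (let i = shuffled.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
  }
  const groups: Group[] = [];
  for (let i = 0; i < shuffled.length; i += size) {
    groups.push({
      channelName: `group-${groups.length + 1}`,
      memberIds: shuffled.slice(i, i + size),
    });
  }
  return groups;
}
```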

2:00 p.m.

Kill the Phish

Team Name: Gods of Rods

Team Members: Tetchi Assamoi, Landry Baudouin, Frankie Cook, Matthew Karloski, Emily Robinson

Advisor: Dr. Kevin Cherry

Our team tackles the all-too-common threat of malicious links encountered while surfing the internet or checking email. Our goal is to give people an easier way to identify links as dangerous or safe by using a browser plugin, which makes threat detection more accessible to the general public. When active, the plugin shows a small, resizable popup whenever the user hovers over a link in the browser; the popup uses text and imagery to indicate whether the link is dangerous or safe. Furthermore, if a dangerous link is clicked, the browser is redirected to a final warning page that asks whether the user truly wants to continue to the dangerous link. Our product was developed using common web development tools and languages such as HTML, JavaScript, and JSON, along with Google’s Safe Browsing API, which checks URLs against the Google Safe Browsing server and retrieves their status. Our final project gives people an easy, in-browser way to identify links as safe or dangerous!
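As an illustration of the Safe Browsing check described above, a plugin’s background script might POST each hovered URL to the v4 Lookup API. This TypeScript sketch is not the team’s code; the client identifier and the exact set of threat types are assumptions:

```ts
// Ask the Safe Browsing v4 Lookup API whether a URL is flagged.
// `apiKey` is a placeholder for a real Google API key.
async function isUrlDangerous(url: string, apiKey: string): Promise<boolean> {
  const endpoint = `https://safebrowsing.googleapis.com/v4/threatMatches:find?key=${apiKey}`;
  const body = {
    client: { clientId: "kill-the-phish", clientVersion: "1.0" }, // illustrative
    threatInfo: {
      threatTypes: ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
      platformTypes: ["ANY_PLATFORM"],
      threatEntryTypes: ["URL"],
      threatEntries: [{ url }],
    },
  };
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  const data = await res.json();
  // The Lookup API returns a "matches" array only when a threat is found.
  return Array.isArray(data.matches) && data.matches.length > 0;
}
```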

2:30 p.m.

Pocket Chef

Team Name: Pocket Chef Inc.

Team Members: Ricardo Aranaga, Jacob Bordelon, Kyle Rousselle, Linh Nguyen, Jhamon Phillips

Advisor: Dr. Kevin Cherry

Pocket Chef is an Android application for organizing groceries and simplifying cooking. The application maintains an inventory of the user’s groceries and generates recipes based on the current inventory. Groceries are added dynamically through image recognition or manual entry. The app also notifies the user of groceries near expiration, suggests commonly used items for the grocery list, and allows users to enter their own recipes. The goal of this project is to simplify cooking, keep track of stored products, and suggest recipes from ingredients already at home.
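For illustration, the recipe-suggestion step reduces to filtering recipes against the current inventory. The TypeScript sketch below is not the team’s Android code, and the record shapes are hypothetical:

```ts
interface Recipe {
  name: string;
  ingredients: string[]; // ingredient names, normalized to lowercase
}

// Suggest recipes whose ingredients are all present in the inventory.
function cookableRecipes(inventory: Set<string>, recipes: Recipe[]): Recipe[] {
  return recipes.filter((r) => r.ingredients.every((i) => inventory.has(i)));
}

// Flag items whose expiration date falls within the next `days` days,
// e.g. to drive the expiration notifications.
function nearExpiration(items: { name: string; expires: Date }[], days: number): string[] {
  const cutoff = Date.now() + days * 24 * 60 * 60 * 1000;
  return items.filter((it) => it.expires.getTime() <= cutoff).map((it) => it.name);
}
```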

3:00 p.m.

Visual Workout Planner

Team Name: Coding Gains

Team Members: Chad Bealer, Caitlin Burke, Jeremy Choyce, Tram Doan, Andrew Maurice

Advisor: Dr. Kevin Cherry

The Visual Workout Planner is a React-based website designed to make it easier to find information on exercises for each muscle group. The primary features include SVG models of the human body with selectable muscles, as well as a FaunaDB-hosted database that stores information on the exercises. Selecting a muscle pulls all exercises pertaining to that muscle from the database and displays the information in organized boxes for the user.
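A minimal sketch of the select-and-fetch flow, assuming a React function component; the FaunaDB query is hidden behind a hypothetical `/api/exercises` endpoint, and the muscle name and SVG path are illustrative:

```tsx
import { useState } from "react";

interface Exercise { name: string; description: string; }

// Placeholder for the FaunaDB lookup; the real index and collection names are unknown.
async function fetchExercises(muscle: string): Promise<Exercise[]> {
  const res = await fetch(`/api/exercises?muscle=${encodeURIComponent(muscle)}`);
  return res.json();
}

// SVG body model where each path is a selectable muscle; clicking loads
// that muscle's exercises and renders them in boxes below the model.
export function MuscleMap() {
  const [exercises, setExercises] = useState<Exercise[]>([]);
  const select = async (muscle: string) => setExercises(await fetchExercises(muscle));
  return (
    <div>
      <svg viewBox="0 0 100 200">
        {/* One illustrative region; the real model has a path per muscle. */}
        <path d="M30 40 L70 40 L70 80 L30 80 Z" onClick={() => select("pectorals")} />
      </svg>
      {exercises.map((e) => (
        <div key={e.name} className="exercise-box">
          <h3>{e.name}</h3>
          <p>{e.description}</p>
        </div>
      ))}
    </div>
  );
}
```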

Room 218 Session: Join us on Zoom.

1:00 p.m.

Anchor

Team Name: The Dream Team

Team Members: Sydney Anderson, Beverly Coronel, Marco Flores, Lewis Johnson, Reginald Thomas, Promise Ward

Sponsor: Dr. Pradeep Chowriappa

Advisor: Dr. Mike O’Neal

This web application displays news articles alongside analytic data to help audiences recognize bias across a variety of media outlets. Using data mined from each article’s contents, the objectivity, sentiment, and biases identified in the article are displayed through sentence highlighting and other indicators. Based on this data, users can see whether an article is biased in some way and then decide how much to rely on or trust its information.
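As a sketch of the sentence-highlighting step (the team’s mining model is not shown; the score range and thresholds below are assumptions), scored sentences can be wrapped in styled spans:

```ts
// A sentence with a mined sentiment/bias score, assumed to lie in [-1, 1].
interface ScoredSentence { text: string; sentiment: number; }

// Wrap each sentence in a span whose class drives the highlight color.
// Real code should HTML-escape `s.text`; the thresholds are illustrative.
function toHighlightedHtml(sentences: ScoredSentence[]): string {
  return sentences
    .map((s) => {
      const cls =
        s.sentiment < -0.25 ? "negative" : s.sentiment > 0.25 ? "positive" : "neutral";
      return `<span class="${cls}">${s.text}</span>`;
    })
    .join(" ");
}
```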

1:30 p.m.

ShopBot

Team Name: Red Stick Tech

Team Members: Matthew Alvidrez, Behram Dossabhoy, Andrew Hall, Austin Harvey, Brian McKay, Damion Owens

Sponsor: Dr. Lorraine “Lori” Jacques

Advisor: Dr. Mike O’Neal

ShopBot is a robot that does your shopping for you. Whether you need an item purchased once or on a recurring schedule, ShopBot will order the items on your shopping list for you when they are in stock. Because it connects with major retailers, there is almost nothing you cannot buy.
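A sketch of the recurring check under stated assumptions: `checkStock` and `placeOrder` stand in for the retailer integrations, and the list-item shape is hypothetical:

```ts
// One entry on the shopping list.
interface ListItem {
  sku: string;
  nextOrderAt: Date | null;    // null once a one-time purchase is complete
  intervalDays: number | null; // null for one-time buys
}

// For each item that is due, order it if it is in stock, then schedule
// the next order for recurring items.
async function runScheduledOrders(
  items: ListItem[],
  checkStock: (sku: string) => Promise<boolean>,
  placeOrder: (sku: string) => Promise<void>,
): Promise<void> {
  const now = new Date();
  for (const item of items) {
    if (item.nextOrderAt === null || item.nextOrderAt > now) continue; // done or not due
    if (!(await checkStock(item.sku))) continue; // out of stock: retry on a later run
    await placeOrder(item.sku);
    item.nextOrderAt =
      item.intervalDays === null
        ? null // one-time purchase: never reorder
        : new Date(now.getTime() + item.intervalDays * 86_400_000);
  }
}
```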

2:00 p.m.

LIDAR Glasses with Wireless Haptic Feedback

Team Name: Robert and the Davenports

Team Members: Kaleb Crysel, Robert Davenport, Alexander Floyd, David Love, James Love, Jonas Kety

Sponsor: Cody Fontenot

Advisor: Dr. Mike O’Neal

The project mounts LIDAR sensors onto glasses and relays distance information to vibration motors in wristbands that provide haptic feedback. It is intended to help the visually impaired detect objects within 6 feet and to convey how close an object is through vibration. The user can turn on the LIDAR sensors on the glasses and immediately begin receiving feedback through two wireless wristbands, each with attached haptic motors. The closer the user gets to an object, the more intense the feedback.
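For illustration, the distance-to-vibration mapping can be as simple as a linear ramp. This sketch is not the device firmware; the 6 ft range comes from the description above, and the linear scaling is an assumption:

```ts
const MAX_RANGE_CM = 183; // ~6 feet, the detection range described above

// Map a LIDAR distance reading to a motor duty cycle in [0, 1]:
// no feedback beyond the range, ramping up linearly as objects get closer.
function vibrationIntensity(distanceCm: number): number {
  if (distanceCm >= MAX_RANGE_CM) return 0;
  const closeness = 1 - distanceCm / MAX_RANGE_CM;
  return Math.min(1, Math.max(0, closeness));
}
```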

2:30 p.m.

The Apple Watch Health Monitoring App

Team Name: The Smart Apples

Team Members: Joseph Ham, Deanna Kaufman, David Milam, Jerome Reed, Blake Till

Sponsor: Dr. Mike O’Neal

Advisor: Dr. Mike O’Neal

The Apple Watch Health Monitoring App utilizes data from a user’s iPhone and/or Apple Watch to monitor the user’s activity. If the data suggest that the user is incapacitated or deceased, the application generates and sends an alert to each contact on a user-created contact list. The system is split into two applications: (1) an iPhone application and (2) an Apple Watch application. The Apple Watch application monitors the user’s vitals via a blood oxygen sensor and the user’s activity via an accelerometer. The iPhone application monitors activity via its own accelerometer and uses the data collected from the Apple Watch to determine whether the user is incapacitated. If the user shows a lack of activity and does not respond to a prompt, an SOS alert is sent to the contact list. The problem the app addresses is that individuals, particularly elderly individuals who live alone, may suffer a fatal emergency and not be found for days or weeks; this application could spare their loved ones added heartache by notifying them without delay.
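The real apps are written for iOS and watchOS; the TypeScript sketch below only illustrates the alerting decision, and the thresholds (idle window, 90% blood oxygen) are hypothetical:

```ts
// One monitoring sample combining watch vitals and accelerometer activity.
interface VitalsSample {
  timestamp: Date;
  bloodOxygenPct: number;
  movementDetected: boolean;
}

// Prompt the user when there has been no movement for `idleMinutes`
// or blood oxygen has dropped below a threshold.
function shouldPromptUser(samples: VitalsSample[], idleMinutes: number): boolean {
  const cutoff = Date.now() - idleMinutes * 60_000;
  const recent = samples.filter((s) => s.timestamp.getTime() >= cutoff);
  const anyMovement = recent.some((s) => s.movementDetected);
  const lowOxygen = recent.some((s) => s.bloodOxygenPct < 90);
  return !anyMovement || lowOxygen;
}

// If the prompt goes unanswered, send the SOS to every saved contact.
function sendSos(contacts: string[], notify: (contact: string) => void): void {
  contacts.forEach(notify);
}
```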

3:00 p.m.

Object Finder Assistant

Team Name: Wii Excel

Team Members: Branson Hanzo, Nathan Hegab, Clark Foster, William Francis, Devon Knudsen, Josh Romero

Sponsor: Dr. Pradeep Chowriappa

Advisor: Dr. Mike O’Neal

The Object Finder Assistant is an iOS- and watchOS-based application that utilizes convolutional neural networks, cloud computing (AWS), secure networking, and voice recognition to help users locate lost items in their homes. The user can ask the Object Finder Assistant, using text or voice commands, to locate an item such as keys or a remote control, and the application will return the last known location as the room name and the time the item was last seen there. This information is delivered to the user via audio and text, together with a picture of the room with a box drawn around the item.
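A sketch of the "last known location" lookup the assistant might run over sightings logged by the detector; the record shape and names are hypothetical, not the team’s schema:

```ts
// One detection event logged by the object-recognition pipeline.
interface Sighting {
  item: string;     // e.g. "keys"
  room: string;     // e.g. "kitchen"
  seenAt: Date;
  imageUrl: string; // room photo with a box drawn around the item
}

// Return the most recent sighting of the requested item, if any.
function lastSeen(sightings: Sighting[], item: string): Sighting | undefined {
  return sightings
    .filter((s) => s.item === item)
    .sort((a, b) => b.seenAt.getTime() - a.seenAt.getTime())[0];
}
```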