

#IoT #Arduino #C++ #3D-Printing





### Project Overview

Our project, "Unlocking Big Data & Analytics for Billions of Visually Impaired Individuals," introduces an AI-powered Accessibility Assistant designed to bridge the gap between complex digital information and visually impaired users. Our initiative focuses on transforming the way visually impaired individuals interact with and benefit from digital tools, enhancing their ability to access, understand, and utilize data-driven websites and applications.


### Problem Statement

Approximately 2.2 billion people worldwide face significant challenges when navigating digital environments due to visual impairments. Traditional tools often fail to address the complexities of navigating graphics, tables, charts, and dynamic content, leading to a fragmented user experience. Users frequently miss crucial data and context, struggle with the linear presentation of information, and spend excessive time locating relevant content. This not only hinders their ability to use digital tools effectively but also impacts their overall digital literacy and independence.


### Our Solution: Enlight

Enlight is our innovative solution tailored specifically for visually impaired users, incorporating advanced AI and machine learning technologies to interpret and present digital content in an accessible manner. Enlight provides a contextual summary of web pages, offers interactive question-and-answer features, and supports customizable accessibility options such as text resizing and pausing content. Our technology captures screenshots of web pages, analyzes them with a multimodal pipeline that combines a Large Language Model (GPT-4) with VisualBERT and the Google Vision API, and then delivers insightful, conversational presentations of the content, including detailed explanations of text, graphs, and images that make complex information comprehensible and navigable.


### Technology Behind Enlight

The technology driving Enlight involves a three-step process: Capture, Analyze, and Interact. Initially, Enlight captures the visual layout of web pages through screenshots. Subsequent analysis by our multimodal LLM allows for the extraction and interpretation of text and graphical data, transforming it into an auditory format that screen readers can easily communicate. This not only ensures that visually impaired users can understand the content but also interact with it through a conversational AI that responds to user inputs and questions, facilitating a more dynamic and engaging user experience.
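The Capture, Analyze, and Interact stages can be sketched as a simple pipeline. The sketch below (in C++, the language used elsewhere on this page) uses placeholder functions whose names and return values are purely illustrative, not Enlight's actual API: in the real system, capture would screenshot the active tab, analyze would invoke the multimodal models, and answer would drive the conversational layer and screen reader.

```cpp
#include <string>

// Illustrative stand-ins for Enlight's three stages. Each function returns
// a tagged string so the flow of data through the pipeline is visible.
std::string capture(const std::string& url) {
    return "screenshot:" + url;            // placeholder for screen capture
}

std::string analyze(const std::string& shot) {
    return "summary:" + shot;              // placeholder for the multimodal LLM call
}

std::string answer(const std::string& summary, const std::string& question) {
    return summary + " -> " + question;    // placeholder for conversational Q&A
}
```

A page would flow through the stages in order, e.g. `answer(analyze(capture(url)), question)`, with the final string handed to the screen reader.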


### Impact and User Testimonials

The impact of Enlight on the visually impaired community has been profound. Users like Sakshi Shrivastava, a visually impaired entrepreneur from Florida, have experienced significant improvements in navigating and understanding complex websites. "Enlight has transformed how I access digital tools, making data interpretation and website navigation seamless and intuitive," says Sakshi. Such testimonials underscore the practical benefits and enhanced independence that our solution offers to visually impaired individuals.


### Privacy and Security

We prioritize user privacy and data security in every aspect of Enlight's operation. Upon setup, users are asked for explicit consent to capture and analyze webpage content. All data, from screenshots to AI-generated insights, are securely processed and promptly deleted after each session to ensure confidentiality and compliance with the GDPR. Our commitment to privacy extends to capturing only the content of the active tab, preventing any unauthorized access to sensitive information.


### Design Principles for Accessibility

Our approach to design adheres to the EAST framework—Easy, Attractive, Social, and Timely—ensuring that Enlight is not only functional but also user-friendly for all visually impaired users. We emphasize simple, intuitive navigation and engaging, customizable interfaces that respect user preferences and accessibility needs. By incorporating high-contrast visuals, large text options, and voice-based interaction capabilities, we make sure that our technology is not just accessible but also enjoyable and empowering for users.



### Scorbot VII Controller: Deliverables

1. Case Design

  • Compact, enclosed case for all controller components

  • 50% reduction in size and weight (final volume under 996 cu. in. and final weight under 21 lbs)

  • Ability to support a 10 lb static load

2. Minimum Power Requirements

  • Provide at least 12 volts at 2 amps per motor

3. Command Execution

  • Software developed in C++

  • Process encoder signals from each motor

  • Allow command-line input

4. Functionality

  • Maintain the full range of motion outlined in the Scorbot VII user manual

  • Operate the arm with a maximum payload of 4.4 lbs

  • Manipulate the robot by rotating each motor individually
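The command-execution deliverables above can be sketched in C++ as a small read-eval loop that parses a command-line instruction, selects a motor, and steps it until its encoder reaches the commanded count. The `Motor` struct, the `move <index> <count>` command format, and the motor count are hypothetical stand-ins for illustration; the actual controller reads quadrature encoder pulses from the Scorbot's motors.

```cpp
#include <iostream>
#include <sstream>
#include <string>

// Hypothetical motor model: the real controller tracks position by
// counting quadrature encoder pulses from each Scorbot motor.
struct Motor {
    long encoderCount = 0;                     // simulated encoder position
    void step(int direction) { encoderCount += direction; }
};

// Rotate one motor until its encoder reaches the target count.
void rotateTo(Motor& m, long target) {
    while (m.encoderCount != target)
        m.step(target > m.encoderCount ? 1 : -1);
}

// Parse a command such as "move 2 250": motor index 2, target count 250.
// Returns false on a malformed command or an out-of-range motor index.
bool execute(Motor motors[], int numMotors, const std::string& line) {
    std::istringstream in(line);
    std::string cmd;
    int idx;
    long target;
    if (!(in >> cmd >> idx >> target) || cmd != "move") return false;
    if (idx < 0 || idx >= numMotors) return false;
    rotateTo(motors[idx], target);
    return true;
}

// Command-line loop: one command per line, acknowledged with "ok".
void repl(Motor motors[], int numMotors, std::istream& in, std::ostream& out) {
    std::string line;
    while (std::getline(in, line))
        out << (execute(motors, numMotors, line) ? "ok" : "bad command") << "\n";
}
```

Driving `repl` from `std::cin`/`std::cout` gives the command-line interface described above, with each motor manipulated individually by index.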

The aforementioned project was completed as the final project for MECH 447: Mechanical Design II at the University of Nebraska-Lincoln. This group project was accomplished over 15 weeks during the Fall 2019 term; the other members of the team were Drew Jerred and Connor Kaeding. Throughout the semester, the design team held weekly meetings with the project sponsor, Dr. Carl Nelson, to report progress and adjust the output based on his specific needs. By the end of this period, all of the deliverables listed above had been met: a compact case with a volume of 154 cubic inches and an overall weight of 0.89 pounds, capable of supporting a 10-pound external load, was 3D printed; sufficient power was provided to operate the motors; and the command-execution and functionality deliverables were also achieved. The prototype cost $287.84, of which $67.00 went to a custom-designed PCB.


The full report, detailing the approach to each of the above tasks and the justification for the decisions made at each phase, can be found below.
