CSCI342_FinalProject

Angelina Sanchez





12/10/2025 CSCI342 - Designing Your Own OS


The operating system is designed to run on a wearable hardware format, specifically smart glasses. The device acts as a projection platform tailored for architects and engineers. The physical hardware is lightweight and features high-resolution projectors and multiple sensors, including depth sensors in the camera. The depth sensors are crucial because the glasses' main goal is to project digital information onto the real world. The primary goal of the operating system is to provide a seamless Augmented Reality (AR) experience that brings historical architectural data directly into the user's view.

The OS prioritizes efficiency and is characterized by a small footprint, which lets it run well on the low-power chips used in wearables and conserves battery life. Although the glasses are designed to be lightweight yet fast, the focus of the OS itself is on being extremely lightweight and built for one specialized job: quickly recognizing buildings and showing historical information, rather than handling arbitrary tasks. This specialization is analogous to how phone operating systems are optimized for mobile use rather than desktop use.

The UI format is a blend of command-line and human-integrated/bionic components, primarily using gestures. The user does not interact with a traditional screen because the UI is projected into their vision. The visual component is a minimalist Heads-Up Display (HUD) that floats in the user's peripheral vision and shows essential information like battery level and active tools. User interaction includes gesture-based commands to manipulate projected objects, such as grab, scale, or swipe through building versions, as well as voice commands. For more precise tasks, a small command line appears when the user focuses on an object, allowing options to be selected through eye focus or a head movement such as a tilt.
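As a concrete illustration of this interaction model, the sketch below (in C) shows one way the input layer might route gesture, voice, and eye-focus events to actions. The event names and dispatch function are hypothetical and only illustrative; the design above specifies the interactions themselves, not this code.

    /* Hypothetical input routing for the projected UI. */
    #include <stdio.h>

    typedef enum {
        EVT_GESTURE_GRAB,
        EVT_GESTURE_SCALE,
        EVT_GESTURE_SWIPE,   /* swipe through building versions */
        EVT_VOICE_COMMAND,
        EVT_EYE_FOCUS,       /* opens the small command line on an object */
        EVT_HEAD_TILT        /* confirms the focused option */
    } input_event_t;

    static void dispatch(input_event_t evt)
    {
        switch (evt) {
        case EVT_GESTURE_GRAB:  puts("grab projected object");  break;
        case EVT_GESTURE_SCALE: puts("scale projected object"); break;
        case EVT_GESTURE_SWIPE: puts("next building version");  break;
        case EVT_VOICE_COMMAND: puts("parse voice command");    break;
        case EVT_EYE_FOCUS:     puts("show mini command line"); break;
        case EVT_HEAD_TILT:     puts("select focused option");  break;
        }
    }

    int main(void)
    {
        dispatch(EVT_EYE_FOCUS);   /* user looks at an object... */
        dispatch(EVT_HEAD_TILT);   /* ...then tilts to select    */
        return 0;
    }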


The OS is designed as a hybrid system. At its core, it is locked down to keep things secure and stable, since it deals with sensitive engineering and historical data and runs on lightweight hardware. This avoids vulnerabilities, fragmentation, and performance issues, and the core is also optimized for speed and efficiency, especially for tasks like recognizing buildings and overlaying data in real time. Locking down the core ensures that outside code cannot interfere with the critical AR projection and sensor pipeline, which is key to a smooth user experience. That said, the system is not completely closed off. To make it useful for architects and engineers, it comes with an SDK and a clear API. These give developers room to connect the OS with tools like CAD software or project management platforms, build new gesture or voice commands, and create custom data layers or visualization tools, such as overlays showing material details or stress-testing results on top of the AR projections.
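To illustrate those extension points, here is a hedged sketch of what a slice of the SDK's public C API could look like. Every name in it (the ar_sdk.h header, ar_register_overlay, ar_register_command, ar_export_model, and the callback types) is a hypothetical placeholder; the design above only states that an SDK and API exist for overlays, custom gestures and voice commands, and integration with external tools.

    /* ar_sdk.h: hypothetical sketch of the developer-facing API. */
    #ifndef AR_SDK_H
    #define AR_SDK_H

    /* Called each frame with the recognized building's ID so a plug-in can
     * draw its own data layer (material details, stress-test results, ...). */
    typedef void (*ar_overlay_render_fn)(const char *building_id, void *user_data);

    /* Called when a registered custom gesture or voice command is detected. */
    typedef void (*ar_command_fn)(void *user_data);

    /* Add a named overlay layer on top of the AR projection. */
    int ar_register_overlay(const char *layer_name,
                            ar_overlay_render_fn render, void *user_data);

    /* Bind a new gesture or voice command to a callback. */
    int ar_register_command(const char *command_name,
                            ar_command_fn handler, void *user_data);

    /* Export cached model or blueprint data for an external tool such as CAD. */
    int ar_export_model(const char *building_id, const char *dest_path);

    #endif /* AR_SDK_H */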

The OS runs on a multi-core CPU setup that balances speed and efficiency. It uses an ARM-based chip with two types of cores: a couple of high-performance cores that tackle heavy, time-sensitive jobs like processing sensor data, rendering 3D models, and running the AI for building recognition, and a larger set of efficiency cores that handle lighter background tasks such as updating the heads-up display, keeping the network connected, and running the lightweight OS itself. This mix keeps the system fast where it matters while saving battery life.
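A minimal sketch of how background work could be kept on the efficiency cluster, assuming a Linux-like kernel with POSIX threads. The core numbering (CPUs 0-1 as performance cores, CPUs 2-5 as efficiency cores) is an assumption made purely for illustration.

    #define _GNU_SOURCE
    #include <pthread.h>
    #include <sched.h>
    #include <stdio.h>

    static void *cloud_sync_task(void *arg)
    {
        (void)arg;
        /* low-priority background work: sync project data, poll battery, ... */
        return NULL;
    }

    int main(void)
    {
        pthread_t tid;
        cpu_set_t efficiency_cores;

        CPU_ZERO(&efficiency_cores);
        for (int cpu = 2; cpu <= 5; cpu++)          /* assumed efficiency cluster */
            CPU_SET(cpu, &efficiency_cores);

        pthread_create(&tid, NULL, cloud_sync_task, NULL);

        /* Keep background work off the performance cores (0-1), which are
         * reserved for the sensor/render/AI pipeline. */
        if (pthread_setaffinity_np(tid, sizeof(efficiency_cores),
                                   &efficiency_cores) != 0)
            fprintf(stderr, "pthread_setaffinity_np failed\n");

        pthread_join(tid, NULL);
        return 0;
    }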

To keep everything running smoothly, the OS uses a priority-based scheduling system. Tasks tied directly to the AR projection loop, such as sensor input, rendering, and output, always get top priority. User interactions such as gestures or voice commands sit in the middle, while background jobs like syncing data or managing the battery run at the lowest priority, usually on the efficiency cores. For the most critical AR tasks, the scheduler adds a real-time guarantee so those jobs always finish on time, which keeps the display responsive and fluid.
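This three-tier layout could be expressed with POSIX real-time scheduling, as in the sketch below. It assumes a Linux/POSIX-style kernel and the privileges needed for SCHED_FIFO; the specific priority numbers are illustrative, not part of the design.

    #include <pthread.h>
    #include <sched.h>
    #include <stddef.h>

    static void *ar_pipeline(void *arg) { (void)arg; return NULL; }  /* sensors -> AI -> render */
    static void *interaction(void *arg) { (void)arg; return NULL; }  /* gestures, voice, HUD    */
    static void *background(void *arg)  { (void)arg; return NULL; }  /* cloud sync, battery     */

    /* Start a thread under SCHED_FIFO at a given real-time priority. */
    static int start_rt_thread(pthread_t *tid, int priority, void *(*fn)(void *))
    {
        pthread_attr_t attr;
        struct sched_param param = { .sched_priority = priority };
        int rc;

        pthread_attr_init(&attr);
        pthread_attr_setinheritsched(&attr, PTHREAD_EXPLICIT_SCHED);
        pthread_attr_setschedpolicy(&attr, SCHED_FIFO);     /* real-time policy    */
        pthread_attr_setschedparam(&attr, &param);
        rc = pthread_create(tid, &attr, fn, NULL);          /* needs RT privileges */
        pthread_attr_destroy(&attr);
        return rc;
    }

    int main(void)
    {
        pthread_t ar, ui, bg;

        if (start_rt_thread(&ar, 80, ar_pipeline) != 0 ||   /* top priority    */
            start_rt_thread(&ui, 40, interaction) != 0)     /* middle priority */
            return 1;
        pthread_create(&bg, NULL, background, NULL);        /* default, lowest */

        pthread_join(ar, NULL);
        pthread_join(ui, NULL);
        pthread_join(bg, NULL);
        return 0;
    }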

The OS uses mutexes and condition variables to keep everything running smoothly without threads stepping on each other's toes. Mutexes act like locks, making sure only one thread at a time can update shared data like sensor readings or gesture states, so nothing gets corrupted or misread. Condition variables work alongside these locks to help threads coordinate: for example, when the sensor thread has fresh data, it signals the rendering thread to jump in right away. This setup avoids the wasted effort of constant polling, keeps the system efficient, and saves battery power on the lightweight hardware.
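A short sketch of that sensor-to-renderer handoff using a POSIX mutex and condition variable; the variable and function names are illustrative.

    #include <pthread.h>
    #include <stdbool.h>

    static pthread_mutex_t frame_lock  = PTHREAD_MUTEX_INITIALIZER;
    static pthread_cond_t  frame_ready = PTHREAD_COND_INITIALIZER;
    static bool            new_frame   = false;
    static int             depth_frame[640 * 480];      /* shared sensor data */

    /* Producer: the depth-sensor thread publishes a frame and wakes the renderer. */
    void sensor_publish(const int *frame, int n)
    {
        pthread_mutex_lock(&frame_lock);
        for (int i = 0; i < n; i++)
            depth_frame[i] = frame[i];
        new_frame = true;
        pthread_cond_signal(&frame_ready);               /* wake the render thread */
        pthread_mutex_unlock(&frame_lock);
    }

    /* Consumer: the render thread sleeps until data arrives, then copies it out. */
    void render_wait_for_frame(int *out, int n)
    {
        pthread_mutex_lock(&frame_lock);
        while (!new_frame)                               /* guards against spurious wakeups */
            pthread_cond_wait(&frame_ready, &frame_lock);
        for (int i = 0; i < n; i++)
            out[i] = depth_frame[i];
        new_frame = false;
        pthread_mutex_unlock(&frame_lock);
    }

The while loop around pthread_cond_wait is the standard pattern: the renderer rechecks the condition after every wakeup instead of spinning in a busy loop, which is exactly the battery saving described above.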

The OS needs fast and reliable storage to do its job well. It will use flash-based memory such as UFS or eMMC, which is quick, energy-efficient, and durable enough for a lightweight wearable device. Even though the OS itself is small, it must handle large, complex files: cached architectural data, detailed 3D models, blueprints, and user-created scans of construction sites. Since RAID setups do not make sense for a single device, the system instead relies on automatic cloud backups. Whenever there is a network connection, it syncs valuable project data to a secure server, so nothing important is lost for good if the glasses are lost or damaged.
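One pass of that opportunistic backup might look like the sketch below. network_available() and upload_project() are hypothetical stubs standing in for whatever connectivity check and secure upload mechanism the final system uses, and the path is illustrative.

    #include <stdbool.h>
    #include <stdio.h>

    static bool network_available(void) { return true; }    /* stub connectivity check */
    static int  upload_project(const char *dir)
    {
        printf("syncing %s to secure server\n", dir);        /* stub secure upload */
        return 0;
    }

    /* Called periodically from the low-priority background task. */
    static void maybe_backup(void)
    {
        if (network_available())
            upload_project("/projects/active");
    }

    int main(void)
    {
        maybe_backup();
        return 0;
    }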

The file system is tightly managed to keep things secure while still being useful. Users cannot touch the core system files or anything that could break stability. Instead, they work within a clean, project-based layout: folders for active projects and 3D models, local scans captured by the glasses, and personal settings such as gesture profiles. Behind the scenes, strict security rules make sure applications can only access user data and never touch critical system files, so even if something goes wrong, the core OS stays safe and stable.
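The sketch below shows the kind of path check the file-system layer could apply so that applications stay inside the user's project area. The directory names and the helper function are assumptions made for illustration; the design only specifies a project-based layout with protected system files.

    #include <stdbool.h>
    #include <stdio.h>
    #include <string.h>

    /* App-visible areas of the assumed layout. */
    static const char *allowed_prefixes[] = {
        "/projects/",   /* active projects and 3D models       */
        "/scans/",      /* local scans captured by the glasses */
        "/settings/",   /* gesture profiles and preferences    */
    };

    static bool is_path_allowed(const char *path)
    {
        for (size_t i = 0; i < sizeof(allowed_prefixes) / sizeof(allowed_prefixes[0]); i++)
            if (strncmp(path, allowed_prefixes[i], strlen(allowed_prefixes[i])) == 0)
                return true;
        return false;   /* everything else, including system files, is off limits */
    }

    int main(void)
    {
        printf("%d\n", is_path_allowed("/projects/bridge_a/model.obj"));  /* 1 */
        printf("%d\n", is_path_allowed("/system/kernel.img"));            /* 0 */
        return 0;
    }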

Since the glasses hold sensitive data and are easy to carry around, strong security is a must. They use built-in sensors for hands-free biometric login, such as a facial or retinal scan, so the user is automatically authenticated when putting them on. All biometric information and encryption keys are stored in secure hardware that is separate from the main OS, to block software-based theft. If the biometric login fails or the system reboots, users must enter a strong password or PIN through voice commands or the projected interface. In addition, all files are protected with full-disk encryption, keeping valuable designs and blueprints safe even if the device is lost or stolen.
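That unlock flow reduces to a small decision function, sketched below. biometric_scan_ok() and prompt_for_pin() are hypothetical placeholders; the biometric templates and keys themselves would stay inside the secure hardware element, not in OS code.

    #include <stdbool.h>
    #include <stdio.h>

    static bool biometric_scan_ok(void) { return false; }  /* stub: facial/retinal scan   */
    static bool prompt_for_pin(void)    { return true;  }  /* stub: voice or projected UI */

    /* Returns true when the wearer may be granted access. */
    static bool authenticate(bool after_reboot)
    {
        if (!after_reboot && biometric_scan_ok())
            return true;             /* hands-free login when the glasses are put on   */
        return prompt_for_pin();     /* fallback after a failed scan or a fresh reboot */
    }

    int main(void)
    {
        printf("unlocked: %d\n", authenticate(false));
        return 0;
    }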

In summary, the OS uses a scheduling system that makes sure the most important jobs always get done first, especially the real-time AR tasks. Pulling in sensor data, running the AI to recognize buildings, and rendering 3D overlays get top priority so the display stays smooth. User interactions like gestures, voice commands, and UI updates sit in the middle, responding quickly but stepping aside if the AR pipeline needs the CPU. Background jobs such as syncing to the cloud or checking system health run at the lowest priority on the efficiency cores, since they are not time-sensitive. This setup keeps the AR experience fast and seamless while also saving battery life on the lightweight hardware.


