Apache Storm is a popular open-source distributed computing platform for real-time big-data processing. However, the existing task scheduling algorithms for Apache Storm do not adequately account for the heterogeneity and dynamics of node computing resources and task demands, leading to high processing latency and suboptimal performance. In this thesis, we propose an innovative machine learning-based task scheduling scheme tailored for Apache Storm. The scheme leverages machine learning models to predict task performance and assigns each task to the computation node with the lowest predicted processing latency. In our design, each node operates a machine learning-based monitoring mechanism. When the master node schedules a new task, it queries the computation nodes, collects their available resources and latency predictions, and makes the optimal assignment decision. We evaluated three machine learning models: Long Short-Term Memory (LSTM), Convolutional Neural Networks (CNN), and Deep Belief Networks (DBN). Our experiments showed that LSTM achieved the most accurate latency predictions. The evaluation results demonstrate that Apache Storm with the proposed LSTM-based scheduling scheme significantly reduces task processing delay and improves resource utilization, compared to the existing algorithms.
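The scheduling decision described above can be sketched in a few lines: the master collects each node's reported status and assigns the task to the node with the lowest predicted latency. The `NodeStatus` fields and `schedule_task` helper below are illustrative assumptions, not the thesis's actual implementation; in the real system the predicted latency would come from each node's local LSTM model rather than a static value.

```python
from dataclasses import dataclass

@dataclass
class NodeStatus:
    """Status a computation node reports back to the master (hypothetical schema)."""
    node_id: str
    cpu_free: float              # fraction of CPU currently available
    mem_free: float              # fraction of memory currently available
    predicted_latency_ms: float  # latency predicted by the node's local model

def schedule_task(nodes):
    """Assign the new task to the node with the lowest predicted processing latency."""
    return min(nodes, key=lambda n: n.predicted_latency_ms).node_id

# Example: three workers report their status; worker-2 predicts the lowest latency.
nodes = [
    NodeStatus("worker-1", cpu_free=0.4, mem_free=0.5, predicted_latency_ms=12.3),
    NodeStatus("worker-2", cpu_free=0.7, mem_free=0.6, predicted_latency_ms=8.1),
    NodeStatus("worker-3", cpu_free=0.2, mem_free=0.3, predicted_latency_ms=20.9),
]
print(schedule_task(nodes))  # worker-2
```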
KEYWORDS: Augmented reality, Distributed computing, 3D modeling, Sensors, Data modeling, Data processing, Computer architecture, Autoregressive models, Object detection, Computing systems
Cooperative Augmented Reality (AR) can provide real-time, immersive, and context-aware situational awareness while enhancing mobile sensing capabilities and benefiting various applications. Distributed edge computing has emerged as an essential paradigm to facilitate cooperative AR. We designed and implemented a distributed system to enable fast, reliable, and scalable cooperative AR. In this paper, we present a novel approach and architecture that integrates advanced sensing, communications, and processing techniques to create such a cooperative AR system and demonstrate its capability with HoloLens and edge servers connected over a wireless network. Our research addresses the challenges of implementing a distributed cooperative AR system capable of capturing data from a multitude of sensors on HoloLens, performing fusion and accurate object recognition, and seamlessly projecting the reconstructed 3D model into the wearer’s field of view. The paper delves into the intricate architecture of the proposed cooperative AR system, detailing its distributed sensing and edge computing components, and the Apache Storm-integrated platform. The implementation encompasses data collection, aggregation, analysis, object recognition, and rendering of 3D models on the HoloLens, all in real-time. The proposed system enhances the AR experience while showcasing the vast potential of distributed edge computing. Our findings illustrate the feasibility and advantages of merging distributed cooperative sensing and edge computing to offer dynamic, immersive AR experiences, paving the way for new applications.
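The processing pipeline outlined above (data collection, aggregation, object recognition, and 3D-model rendering) can be sketched as a chain of stages. Every function and data shape below is a hypothetical stand-in for illustration only; the actual system distributes these stages across HoloLens devices and edge servers via Apache Storm.

```python
def collect_frames(sensor_feeds):
    # Aggregate raw (frame_id, label) records from every sensor feed
    # (stand-in for distributed data collection on HoloLens devices).
    return [frame for feed in sensor_feeds for frame in feed]

def recognize_objects(frames):
    # Stand-in for edge-side sensor fusion and object recognition:
    # deduplicate detected labels across frames.
    return sorted({label for _, label in frames})

def render_models(labels):
    # Stand-in for projecting reconstructed 3D models into the wearer's view.
    return [f"3d-model:{label}" for label in labels]

# Example: two devices contribute feeds; overlapping detections are fused.
feeds = [
    [("frame-a", "chair")],
    [("frame-b", "table"), ("frame-c", "chair")],
]
models = render_models(recognize_objects(collect_frames(feeds)))
print(models)  # ['3d-model:chair', '3d-model:table']
```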