paper_authors: Dun Yuan, Ekram Hossain, Di Wu, Xue Liu, Gregory Dudek
for: Improve the user experience of 3D holographic communication and enhance interaction between people in virtual spaces.
methods: A job scheduling algorithm that leverages Mobile Edge Computing (MEC) servers to minimize the total latency of 3D holographic communication.
results: Compared with baseline methods, the proposed algorithm achieves a significant latency reduction, and its applicability is demonstrated in an AR scenario.
Abstract
3D holographic communication has the potential to revolutionize the way people interact with each other in virtual spaces, offering immersive and realistic experiences. However, demands for high data rates, extremely low latency, and high computations to enable this technology pose a significant challenge. To address this challenge, we propose a novel job scheduling algorithm that leverages Mobile Edge Computing (MEC) servers in order to minimize the total latency in 3D holographic communication. One of the motivations for this work is to prevent the uncanny valley effect, which can occur when the latency hinders the seamless and real-time rendering of holographic content, leading to a less convincing and less engaging user experience. Our proposed algorithm dynamically allocates computation tasks to MEC servers, considering the network conditions, computational capabilities of the servers, and the requirements of the 3D holographic communication application. We conduct extensive experiments to evaluate the performance of our algorithm in terms of latency reduction, and the results demonstrate that our approach significantly outperforms other baseline methods. Furthermore, we present a practical scenario involving Augmented Reality (AR), which not only illustrates the applicability of our algorithm but also highlights the importance of minimizing latency in achieving high-quality holographic views. By efficiently distributing the computation workload among MEC servers and reducing the overall latency, our proposed algorithm enhances the user experience in 3D holographic communications and paves the way for the widespread adoption of this technology in various applications, such as telemedicine, remote collaboration, and entertainment.
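To make the scheduling problem concrete, here is a minimal greedy baseline, a sketch rather than the paper's proposed algorithm: each rendering task is assigned to the MEC server that currently yields the smallest completion latency, modeled as uplink transmission delay plus queueing and compute delay. The server rates, CPU speeds, and task sizes below are hypothetical.

```python
# Greedy latency-minimizing task allocation across MEC servers.
# Illustrative baseline only, not the algorithm proposed in the paper.

def schedule(tasks, servers):
    """Assign each task (cpu_cycles, bits) to the server minimizing its
    finish time. servers: dicts with 'rate_bps' (uplink rate) and 'cps'
    (CPU cycles per second). Returns (assignment, total latency)."""
    busy_until = [0.0] * len(servers)   # when each server becomes free
    assignment, total = [], 0.0
    for cycles, bits in tasks:
        # Latency on server j = transmission + waiting + computation.
        latencies = [
            bits / s["rate_bps"] + busy_until[j] + cycles / s["cps"]
            for j, s in enumerate(servers)
        ]
        j = min(range(len(servers)), key=latencies.__getitem__)
        busy_until[j] += cycles / servers[j]["cps"]
        assignment.append(j)
        total += latencies[j]
    return assignment, total

# Hypothetical setup: two MEC servers, three holographic rendering tasks.
servers = [{"rate_bps": 1e8, "cps": 5e9}, {"rate_bps": 5e7, "cps": 1e10}]
tasks = [(2e9, 1e6), (4e9, 2e6), (1e9, 5e5)]
assign, total = schedule(tasks, servers)
```

A real scheduler would also weigh network conditions and per-application deadlines, as the abstract notes; this sketch only captures the latency trade-off between a faster link and a faster CPU.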
Autoregressive Coefficients based Intelligent Protection of Transmission Lines Connected to Type-3 Wind Farms
methods: An adaptive fuzzy inference system (AFIS) detects faults based on autoregressive (AR) coefficients of the 3-phase currents, selected using the minimum redundancy maximum relevance (MRMR) algorithm. Deep learning networks supervise fault detection and the subsequent localization and classification of faults.
results: The proposed scheme achieves high detection accuracy and speed across different system conditions and configurations, and adapts to varying fault types, locations, inception times, wind speeds, and transformer connections.
Abstract
Protective relays can mal-operate for transmission lines connected to doubly fed induction generator (DFIG) based large capacity wind farms (WFs). The performance of distance relays protecting such lines is investigated and a statistical model based intelligent protection of the area between the grid and the WF is proposed in this article. The suggested method employs an adaptive fuzzy inference system to detect faults based on autoregressive (AR) coefficients of the 3-phase currents selected using the minimum redundancy maximum relevance algorithm. Deep learning networks are used to supervise the detection of faults, their subsequent localization, and classification. The effectiveness of the scheme is evaluated on IEEE 9-bus and IEEE 39-bus systems with varying fault resistances, fault inception times, locations, fault types, wind speeds, and transformer connections. Further, the impact of factors like the presence of type-4 WFs, double circuit lines, WF capacity, grid strength, FACTS devices, reclosing on permanent faults, power swings, fault during power swings, voltage instability, load encroachment, high impedance faults, evolving and cross-country faults, close-in and remote-end faults, CT saturation, sampling rate, data window size, synchronization error, noise, and semi-supervised learning are considered while validating the proposed scheme. The results show the efficacy of the suggested method in dealing with various system conditions and configurations while protecting the transmission lines that are connected to WFs.
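As a sketch of the feature-extraction step, AR coefficients of a phase-current window can be estimated by ordinary least squares on lagged samples. The model order, sampling rate, and test signal below are illustrative assumptions; the paper's exact preprocessing is not reproduced here.

```python
import numpy as np

def ar_coefficients(x, order):
    """Estimate AR(order) coefficients of a 1-D signal by least squares:
    x[n] ~ a1*x[n-1] + ... + ap*x[n-p]."""
    x = np.asarray(x, dtype=float)
    # Lagged regressor matrix: row for sample n holds x[n-1] ... x[n-p].
    X = np.column_stack(
        [x[order - k - 1 : len(x) - k - 1] for k in range(order)]
    )
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# Hypothetical 50 Hz phase current sampled at 1 kHz for 0.2 s.
t = np.arange(0, 0.2, 1e-3)
current = np.sin(2 * np.pi * 50 * t)
a = ar_coefficients(current, order=2)
```

A pure sinusoid satisfies the exact recurrence x[n] = 2·cos(w)·x[n-1] - x[n-2], so the fit recovers a ≈ (2·cos(0.1π), -1); fault transients change these coefficients, which is what makes them useful features for the fuzzy inference system.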
Impact of Artificial Intelligence on Electrical and Electronics Engineering Productivity in the Construction Industry
results: The findings indicate that AI can substantially improve the efficiency and productivity of building design and construction, while reducing energy consumption and improving construction-site safety.
Abstract
Artificial intelligence (AI) can revolutionize the construction industry, particularly electrical and electronics engineering. By automating recurring tasks, AI can increase productivity and efficiency in construction. For instance, AI can analyze building designs, identify potential problems, and generate solutions, reducing the time and effort required for manual analysis. AI can also be used to optimize energy consumption in buildings, a critical concern in the construction industry. By applying machine learning algorithms to analyze energy usage patterns, AI can identify areas where power can be saved and offer recommendations for improvements. This can result in significant cost savings and reduced carbon emissions. Moreover, AI can be used to improve the safety of construction sites. By analyzing data from sensors and cameras, AI can detect potential hazards and alert workers to take appropriate action. This can help prevent injuries and accidents on construction sites, lowering the risk to workers and enhancing overall safety in the industry. The impact of AI on electrical and electronics engineering productivity in the construction industry is substantial. AI can transform how we design, build, and operate buildings by automating routine tasks, optimizing energy consumption, and enhancing safety. However, it is essential to ensure that AI is used ethically and responsibly and that its benefits are shared fairly across the industry.
Digital Twin-Empowered Smart Attack Detection System for 6G Edge of Things Networks
results: Achieves effective, robust, and adaptive attack detection, safeguarding the security of 6G EoT networks.
Abstract
As global Internet of Things (IoT) devices connectivity surges, a significant portion gravitates towards the Edge of Things (EoT) network. This shift prompts businesses to deploy infrastructure closer to end-users, enhancing accessibility. However, the growing EoT network expands the attack surface, necessitating robust and proactive security measures. Traditional solutions fall short against dynamic EoT threats, highlighting the need for proactive and intelligent systems. We introduce a digital twin-empowered smart attack detection system for 6G EoT networks. Leveraging digital twin and edge computing, it monitors and simulates physical assets in real time, enhancing security. An online learning module in the proposed system optimizes the network performance. Our system excels in proactive threat detection, ensuring 6G EoT network security. The performance evaluations demonstrate its effectiveness, robustness, and adaptability using real datasets.
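A minimal sketch of the online-learning idea in the abstract: a logistic-regression detector updated one traffic batch at a time, so it adapts to drifting threats without retraining from scratch. The synthetic feature generator, dimensions, and learning rate are assumptions for illustration; the paper's actual model and features are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

def traffic_batch(n=200):
    """Synthetic stand-in for EoT traffic features: benign samples centered
    at -1.5 per feature, attack samples at +1.5. Real features would come
    from the digital twin's monitored assets (assumed for illustration)."""
    y = rng.integers(0, 2, n)
    X = rng.normal(0.0, 1.0, (n, 4)) + (3.0 * y[:, None] - 1.5)
    return X, y

class OnlineDetector:
    """Logistic-regression attack detector updated incrementally per batch."""

    def __init__(self, dim, lr=0.1):
        self.w = np.zeros(dim)
        self.b = 0.0
        self.lr = lr

    def partial_fit(self, X, y):
        # One SGD step on the logistic loss for this batch.
        p = 1.0 / (1.0 + np.exp(-(X @ self.w + self.b)))
        grad = p - y
        self.w -= self.lr * X.T @ grad / len(y)
        self.b -= self.lr * grad.mean()

    def predict(self, X):
        return (X @ self.w + self.b > 0).astype(int)

det = OnlineDetector(dim=4)
for _ in range(100):            # streaming batches, never a full retrain
    X, y = traffic_batch()
    det.partial_fit(X, y)

X_test, y_test = traffic_batch(2000)
accuracy = (det.predict(X_test) == y_test).mean()
```

The incremental update is the point: the detector's state after each batch is all that is carried forward, matching the real-time monitoring loop the abstract describes.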
Human Respiration Detection Under Interference: Challenges and Solutions
paper_authors: Kehan Wu, Renqi Chen, Haiyu Wang, Guang Wu
for: Commodity WiFi devices can detect human respiration rate, but this work targets accurate respiration detection under the human motion interference encountered in daily life.
methods: A passive sensing and communication system designed specifically for respiration detection, operating in the 60.48 GHz band, which detects respiration despite human motion interference; a neural network trained on the collected data performs the detection.
results: Experiments show that the system maintains over 90% respiration detection accuracy under interference, provided the sensing duration is adequate. Finally, an analytical model is derived to count the respiratory rate within 10 seconds.
Abstract
Recent research has highlighted the detection of human respiration rate using commodity WiFi devices. Nevertheless, these devices encounter challenges in accurately discerning human respiration amidst the prevailing human motion interference encountered in daily life. To tackle this predicament, this paper introduces a passive sensing and communication system designed specifically for respiration detection in the presence of robust human motion interference. Operating within the 60.48GHz band, the proposed system aims to detect human respiration even when confronted with substantial human motion interference within close proximity. Subsequently, a neural network is trained on the collected data to enable human respiration detection. The experimental results demonstrate a consistently high accuracy rate of over 90\% for human respiration detection under interference, given an adequate sensing duration. Finally, an empirical model is derived analytically to achieve the respiratory rate counting in 10 seconds.
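The rate-counting step can be sketched as a spectral estimate: given a 10 s window of the sensed chest-displacement signal, the respiratory rate corresponds to the dominant frequency within a plausible breathing band. The sampling rate, breathing band, and test signal are hypothetical; the paper derives its own analytical model.

```python
import numpy as np

def respiration_rate_bpm(signal, fs, band=(0.1, 0.7)):
    """Estimate breaths per minute from a sensed displacement window.
    fs: sampling rate in Hz. band: assumed breathing band in Hz
    (6-42 breaths/min)."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                       # remove DC before the FFT
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    peak = freqs[in_band][np.argmax(spectrum[in_band])]
    return 60.0 * peak

# Hypothetical 10 s capture at 20 Hz: 0.3 Hz breathing (18 breaths/min)
# plus noise standing in for residual motion interference.
fs = 20.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 0.3 * t) + 0.3 * rng.normal(size=t.size)
bpm = respiration_rate_bpm(x, fs)
```

A 10 s window gives 0.1 Hz frequency resolution, i.e. 6 breaths/min granularity, which is why the sensing duration matters for accuracy, as the abstract notes.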
A Comprehensive Indoor Environment Dataset from Single-family Houses in the US
results: The study yields a large dataset of indoor environmental factors that can be used to improve models of building energy consumption, occupant behavior, predictive maintenance, and other related areas.
Abstract
The paper describes a dataset comprising indoor environmental factors such as temperature, humidity, air quality, and noise levels. The data was collected from 10 sensing devices installed in various locations within three single-family houses in Virginia, USA. The objective of the data collection was to study the indoor environmental conditions of the houses over time. The data were collected at a frequency of one record per minute for a year, totaling over 2.5 million records. The paper provides actual floor plans with sensor placements to aid researchers and practitioners in creating reliable building performance models. The techniques used to collect and verify the data are also explained in the paper. The resulting dataset can be employed to enhance models for building energy consumption, occupant behavior, predictive maintenance, and other relevant purposes.
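A sketch of how such minute-level records might be consumed downstream: aggregate per-device minute readings into hourly means, the kind of down-sampling a building-performance model might start from. The column names and sample rows are assumptions for illustration, not the dataset's actual schema.

```python
import io
import csv
from datetime import datetime
from collections import defaultdict

# Hypothetical rows in the dataset's spirit: one record per device per minute.
raw = io.StringIO(
    "timestamp,device,temperature_c,humidity_pct\n"
    "2022-01-01 00:00:00,dev01,20.5,41\n"
    "2022-01-01 00:01:00,dev01,20.7,42\n"
    "2022-01-01 01:00:00,dev01,21.1,40\n"
    "2022-01-01 01:01:00,dev01,21.3,39\n"
)

# Bucket minute-level temperature readings into (device, hour) groups.
hourly = defaultdict(list)
for row in csv.DictReader(raw):
    ts = datetime.fromisoformat(row["timestamp"])
    hour = ts.replace(minute=0, second=0)
    hourly[(row["device"], hour)].append(float(row["temperature_c"]))

# Hourly mean temperature per device.
hourly_mean = {key: sum(vals) / len(vals) for key, vals in hourly.items()}
```

At the dataset's full scale (one record per minute per device for a year), the same grouping pass reduces millions of rows to a few thousand hourly aggregates per device.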
Integrated Communication, Sensing, and Computation Framework for 6G Networks
results: The proposed ICSAC framework improves the reliability and latency of communication, the accuracy and timeliness of sensing information acquisition, and the privacy and security of computation for IMN applications; evaluation results for the key enabling technologies demonstrate the framework's feasibility.
Abstract
In the sixth generation (6G) era, intelligent machine network (IMN) applications, such as intelligent transportation, require collaborative machines with communication, sensing, and computation (CSC) capabilities. This article proposes an integrated communication, sensing, and computation (ICSAC) framework for 6G to achieve the reciprocity among CSC functions to enhance the reliability and latency of communication, accuracy and timeliness of sensing information acquisition, and privacy and security of computing to realize the IMN applications. Specifically, the sensing and communication functions can merge into unified platforms using the same transmit signals, and the acquired real-time sensing information can be exploited as prior information for intelligent algorithms to enhance the performance of communication networks. This is called the computing-empowered integrated sensing and communications (ISAC) reciprocity. Such reciprocity can further improve the performance of distributed computation with the assistance of networked sensing capability, which is named the sensing-empowered integrated communications and computation (ICAC) reciprocity. The above ISAC and ICAC reciprocities can enhance each other iteratively and finally lead to the ICSAC reciprocity. To achieve these reciprocities, we explore the potential enabling technologies for the ICSAC framework. Finally, we present the evaluation results of crucial enabling technologies to show the feasibility of the ICSAC framework.
results: The study finds that using multiple time-varying sampling threshold sequences can significantly improve the performance of the matrix completion algorithm. Three variants of the OB-SVT algorithm are also proposed, one of which uses randomized sketched data to reduce the dimension of the operational space and accelerate convergence.
Abstract
We explore the impact of coarse quantization on matrix completion in the extreme scenario of dithered one-bit sensing, where the matrix entries are compared with time-varying threshold levels. In particular, instead of observing a subset of high-resolution entries of a low-rank matrix, we have access to a small number of one-bit samples, generated as a result of these comparisons. In order to recover the low-rank matrix using its coarsely quantized known entries, we begin by transforming the problem of one-bit matrix completion (one-bit MC) with time-varying thresholds into a nuclear norm minimization problem. The one-bit sampled information is represented as linear inequality feasibility constraints. We then develop the popular singular value thresholding (SVT) algorithm to accommodate these inequality constraints, resulting in the creation of the One-Bit SVT (OB-SVT). Our findings demonstrate that incorporating multiple time-varying sampling threshold sequences in one-bit MC can significantly improve the performance of the matrix completion algorithm. In pursuit of achieving this objective, we utilize diverse thresholding schemes, namely uniform, Gaussian, and discrete thresholds. To accelerate the convergence of our proposed algorithm, we introduce three variants of the OB-SVT algorithm. Among these variants is the randomized sketched OB-SVT, which departs from using the entire information at each iteration, opting instead to utilize sketched data. This approach effectively reduces the dimension of the operational space and accelerates the convergence. We perform numerical evaluations comparing our proposed algorithm with the maximum likelihood estimation method previously employed for one-bit MC, and demonstrate that our approach can achieve a better recovery performance.
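As a sketch of the SVT machinery the abstract builds on, the following alternates a singular value soft-thresholding step (the proximal operator of the nuclear norm) with a projection enforcing the one-bit sign constraints on the observed entries. This is an illustrative alternating scheme under assumed matrix sizes and Gaussian dither thresholds, not the paper's OB-SVT or its exact nuclear norm formulation.

```python
import numpy as np

def svt_shrink(M, tau):
    """Singular value thresholding: soft-threshold singular values by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def one_bit_project(X, mask, signs, thresholds):
    """Enforce consistency with the one-bit samples: sign = +1 means the
    true entry exceeded its (time-varying) threshold, so each observed
    entry must lie on the threshold's correct side."""
    X = X.copy()
    violated = signs * (X[mask] - thresholds) < 0
    X[mask] = np.where(violated, thresholds, X[mask])
    return X

rng = np.random.default_rng(0)
# Rank-2 ground truth and dithered one-bit observations of half its entries.
L = rng.normal(size=(30, 2)) @ rng.normal(size=(2, 30))
mask = rng.random(L.shape) < 0.5
thresholds = rng.normal(size=mask.sum())      # time-varying Gaussian dither
signs = np.sign(L[mask] - thresholds)

# Alternate nuclear norm shrinkage with the one-bit feasibility projection.
X = np.zeros_like(L)
for _ in range(100):
    X = one_bit_project(svt_shrink(X, tau=0.5), mask, signs, thresholds)
```

The sketched variant mentioned in the abstract would replace the full SVD here with a sketch of the iterate to shrink the operational dimension; the feasibility projection is what encodes the linear inequality constraints described in the text.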