ID | 原文 | 译文
25605 | 同时,节点大部分时间处于睡眠状态,仅在少部分时间内苏醒工作,造成数据备份的通信延迟过大。 | At the same time, each node sleeps most of the time and wakes up to work only for a small fraction of it; this sleep/wake cycle causes excessive communication delay for data backup.
25606 | 提出一种快速的低能耗数据保存机制。 | A fast data storage mechanism with low energy consumption is proposed.
25607 | 首先,源节点基于连续时间序列对感知数据进行分段线性拟合压缩; | First, each source node performs piecewise linear fitting compression on its sensed data based on continuous time series.
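The piecewise linear fitting compression described in entry 25607 can be sketched as a greedy segmentation: extend each segment as long as every interior point stays within an error bound of the line through the segment endpoints. This is a minimal illustrative sketch under that assumption, not the paper's actual algorithm; the function name and error bound `eps` are made up for illustration.

```python
def plf_compress(ts, values, eps=0.5):
    """Greedy piecewise linear fitting of a time series.

    Extends each segment until some interior point deviates from the
    line through the segment endpoints by more than eps, then starts a
    new segment. Returns (t_start, v_start, t_end, v_end) tuples, so a
    long series is compressed into a few endpoint pairs.
    """
    segments = []
    i, n = 0, len(ts)
    while i < n - 1:
        j = i + 1
        while j + 1 < n:
            # Candidate line through (ts[i], values[i]) and the next point.
            t0, v0, t1, v1 = ts[i], values[i], ts[j + 1], values[j + 1]
            slope = (v1 - v0) / (t1 - t0)
            within = all(abs(v0 + slope * (ts[k] - t0) - values[k]) <= eps
                         for k in range(i + 1, j + 1))
            if not within:
                break
            j += 1
        segments.append((ts[i], values[i], ts[j], values[j]))
        i = j
    return segments
```

For example, a series that is linear except for one jump compresses into three segments, one per linear run.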
25608 | 接着,节点根据预估故障概率和存储空间大小,计算出合理的压缩数据备份数量。 | Then, each node calculates a reasonable number of compressed data backups based on its estimated failure probability and the size of its storage space.
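One plausible way to derive a backup count from an estimated failure probability, as entry 25608 describes, is to pick the smallest replica count whose total-loss probability falls below a target. The sketch below assumes independent node failures and a caller-chosen loss target; both the independence assumption and the parameter names are illustrative, not taken from the paper.

```python
import math

def backup_count(p_fail, target_loss=1e-3, max_copies=None):
    """Smallest number of replicas k with p_fail**k <= target_loss.

    Assumes backup nodes fail independently (an assumption of this
    sketch). max_copies optionally caps k by the storage space the
    node can spare for backups.
    """
    k = math.ceil(math.log(target_loss) / math.log(p_fail))
    if max_copies is not None:
        k = min(k, max_copies)
    return max(k, 1)
```

With a 20% per-node failure estimate and a 1% acceptable loss probability, three copies suffice (0.2**3 = 0.008 < 0.01); a storage cap simply truncates that count.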
25609 | 在此基础上,设计一种动态自适应传输协议。 | On this basis, a dynamic adaptive transmission protocol is designed.
25610 | 实验仿真表明,与已有存储算法比较,该机制具有更低的传输能耗和通信延迟。 | Simulation experiments show that, compared with existing storage algorithms, this mechanism achieves lower transmission energy consumption and lower communication delay.
25611 | 多标记学习用于处理一个示例同时与多个类别标记相关的问题。 | Multi-label learning deals with problems where each instance is associated with multiple class labels simultaneously.
25612 | 在多标记学习中,标记相关性能够显著提升学习算法的性能。 | In multi-label learning, label correlations can significantly improve the performance of learning algorithms.
25613 | 大多数现有的多标记学习算法在利用标记的相关性时,要么只使用被所有示例所共享的全局标记相关性,要么就使用局部标记相关性,它们认为不同簇中的示例应该存在不同的标记相关性。 | Most existing multi-label learning algorithms exploit either global label correlations shared by all instances, or local label correlations, assuming that instances in different clusters should exhibit different label correlations.
25614 | 本文中,我们提出了一种同时利用全局和局部标记相关性的多标记学习算法,从而为学习进程提供更全面的标记信息。 | In this paper, we propose a multi-label learning algorithm that exploits global and local label correlations simultaneously, providing more comprehensive label information for the learning process.
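The distinction entries 25613 and 25614 draw between global and local label correlations can be illustrated with a minimal sketch: estimate one correlation table from the whole label matrix, and separate tables per cluster of instances. The co-occurrence measure P(l_j | l_i) and all function names here are illustrative assumptions, not the proposed method.

```python
from collections import defaultdict

def label_cooccurrence(Y):
    """Pairwise conditional co-occurrence P(label j | label i) from a
    binary label matrix Y (rows = instances, columns = labels)."""
    n_labels = len(Y[0])
    counts = [0] * n_labels
    co = [[0] * n_labels for _ in range(n_labels)]
    for row in Y:
        for i in range(n_labels):
            if row[i]:
                counts[i] += 1
                for j in range(n_labels):
                    if row[j]:
                        co[i][j] += 1
    return [[co[i][j] / counts[i] if counts[i] else 0.0
             for j in range(n_labels)] for i in range(n_labels)]

def global_and_local(Y, clusters):
    """Global correlations from all instances, plus local correlations
    computed separately within each cluster of instances."""
    glob = label_cooccurrence(Y)
    groups = defaultdict(list)
    for row, c in zip(Y, clusters):
        groups[c].append(row)
    local = {c: label_cooccurrence(rows) for c, rows in groups.items()}
    return glob, local
```

On a toy label matrix, a pair of labels that always co-occur globally may only weakly co-occur inside one cluster, which is exactly the information a purely global model would miss.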