ID 原文 (Source) 译文 (Translation)
39126 空频分组码(SFBC)可以为MIMO系统带来发射分集增益,而单载波频分多址(SC-FDMA)系统的发送信号具有低峰均比(PAPR),但直接传输正交SFBC(OSFBC)会出现PAPR过大的问题。 Space-Frequency Block Code (SFBC) can provide transmit diversity gain for MIMO systems, and the transmitted signals of Single-Carrier Frequency Division Multiple Access (SC-FDMA) systems have a low Peak-to-Average Power Ratio (PAPR); however, directly transmitting Orthogonal SFBC (OSFBC) signals over SC-FDMA leads to an excessively high PAPR.
39127 为此,本文以四发射天线为例针对OSFBC提出了一种基于DFT扩频的低PAPR传输方案, To solve this problem, a low-PAPR transmission scheme for OSFBC based on DFT spreading is proposed, taking four transmit antennas as an example.
39128 通过设置多个并行DFT单元并将得到的多组频域数据等间隔交叉排列映射到子载波上,根据频域循环移位和内插的性质使时域信号等效为多条单载波信号的叠加,PAPR与同点数SC-FDMA信号大致相当。 By setting up multiple parallel DFT units and mapping the resulting groups of frequency-domain data onto the subcarriers in an equally spaced, interleaved arrangement, the scheme avoids rearranging the symbol order within a single DFT unit; the corresponding transmitter and receiver are given and are easy to implement. According to the frequency-domain cyclic-shift and interpolation properties, the generated time-domain signal is equivalent to the superposition of multiple single-carrier signals, so the overall PAPR is roughly the same as that of an SC-FDMA signal with the same number of points.
39129 同时,本文方案未带来误码率性能的下降,能够取得的分集增益与正交空时分组码(OSTBC)相同。 Furthermore, the proposed scheme does not degrade the BER performance and achieves the same diversity gain as Orthogonal Space-Time Block Coding (OSTBC).
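The abstract above (IDs 39126-39129) describes interleaving the outputs of several parallel DFT units onto equally spaced subcarriers so that the time-domain signal behaves like a sum of single-carrier signals. The sketch below is a minimal, hypothetical illustration of that mapping and of measuring PAPR, not the paper's actual four-antenna OSFBC transmitter; the parameters M, P, N and the QPSK mapping are illustrative assumptions.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(0)
M, P, N = 64, 4, 512          # symbols per DFT unit, parallel DFT units, IFFT size (assumed values)

# QPSK symbols for each of the P parallel DFT units
qpsk = (rng.choice([-1, 1], (P, M)) + 1j * rng.choice([-1, 1], (P, M))) / np.sqrt(2)

# DFT-spread each block, then interleave the P frequency-domain blocks onto
# P*M subcarriers with equal spacing (block p occupies bins p, p+P, p+2P, ...)
spread = np.fft.fft(qpsk, axis=1) / np.sqrt(M)
freq = np.zeros(N, dtype=complex)
for p in range(P):
    freq[np.arange(M) * P + p] = spread[p]

x = np.fft.ifft(freq) * np.sqrt(N)        # time-domain transmit signal
print(f"PAPR of interleaved multi-DFT-spread signal: {papr_db(x):.2f} dB")

# Reference: a plain single-DFT SC-FDMA signal with the same number of data points
ref = np.zeros(N, dtype=complex)
ref[: P * M] = np.fft.fft(qpsk.reshape(-1)) / np.sqrt(P * M)
print(f"PAPR of single-DFT SC-FDMA reference signal: {papr_db(np.fft.ifft(ref) * np.sqrt(N)):.2f} dB")
```

Under these assumptions, the equally spaced interleaving keeps each DFT unit's output on its own comb of subcarriers, which is what makes the composite signal behave like a superposition of single-carrier signals rather than an arbitrary OFDM-like sum.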
39130 目前,人脸美丽预测存在数据样本少、评价指标不明确和人脸外观变化大等问题。 At present, facial beauty prediction suffers from problems such as scarce data samples, unclear evaluation criteria, and large variations in facial appearance.
39131 多任务迁移学习能有效利用相关任务和源域任务额外的有用信息,知识蒸馏可将教师模型的部分知识蒸馏到学生模型,降低模型复杂性和大小。 Multi-task transfer learning can effectively exploit additional useful information from related tasks and source-domain tasks, while knowledge distillation can transfer part of the knowledge of a teacher model into a student model, reducing model complexity and size.
39132 本文将多任务迁移学习与知识蒸馏相结合,用于人脸美丽预测,以大规模亚洲人脸美丽数据库(Large Scale Asia Facial Beauty Database, LSAFBD)中人脸美丽预测为主任务,以SCUT-FBP5500数据库中性别识别为辅任务。 In this paper, multi-task transfer learning and knowledge distillation are combined for facial beauty prediction, with facial beauty prediction on the Large Scale Asia Facial Beauty Database (LSAFBD) as the main task and gender recognition on the SCUT-FBP5500 database as the auxiliary task.
39133 首先,构建多输入多任务的人脸美丽教师模型和学生模型; Firstly, a multi-input, multi-task facial beauty teacher model and a student model are constructed.
39134 其次,训练多任务教师模型并计算其软目标; Secondly, the multi-task teacher model is trained and its soft targets are computed.
39135 最后,结合多任务教师模型的软目标和学生模型的软、硬目标进行知识蒸馏。 Finally, knowledge distillation is carried out by combining the soft targets of the multi-task teacher model with the soft and hard targets of the student model.
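The abstract above (IDs 39130-39135) only outlines the distillation step, so the sketch below shows the standard single-task form of the idea: a temperature-softened teacher soft-target term combined with a hard-label cross-entropy term. The temperature T, weight alpha, class count, and use of PyTorch are illustrative assumptions, not the paper's actual multi-task, multi-input setup.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of a soft-target (teacher) term and a hard-target (label) term."""
    # Soft targets: teacher and student distributions softened by temperature T.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)                      # T^2 keeps the soft-term gradients on the same scale
    # Hard targets: ordinary cross-entropy with the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage: 8 samples, 5 classes (all values are placeholders).
student_logits = torch.randn(8, 5, requires_grad=True)
teacher_logits = torch.randn(8, 5)
labels = torch.randint(0, 5, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
print(float(loss))
```

In a multi-task variant along the lines described, one such loss would be formed per task (beauty prediction and gender recognition) and the per-task losses summed or weighted; the abstract does not specify those weights.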