Journal of Software, 2016, 27(11): 2843-2854

Implicit User Consumption Intent Recognition in Social Media
FU Bo,LIU Ting
(Research Center for Social Computing and Information Retrieval, School of Computer Science and Technology, Harbin Institute of Technology, Harbin 150001, China)
Received: November 20, 2014    Revised: March 11, 2015
中文摘要 (Chinese abstract): Unlike existing research on explicit consumption intent recognition, this paper proposes a method for automatically recognizing users' implicit consumption intent in social media. The method treats implicit consumption intent recognition as a multi-label classification problem and combines multiple features based on users' following behavior, intent-following behavior, intent-retweeting behavior, and profile information. Because implicit consumption intent recognition is difficult to evaluate, a large amount of cross-social-media user linkage information was extracted automatically; with this approach, more than 120,000 user linkage pairs were obtained. Experimental results on this automatically constructed evaluation set show that the adopted multi-label classification method is effective for recognizing users' implicit consumption intent, and that each of the features used helps improve recognition performance.
Abstract: Unlike previous work on explicit consumption intent recognition, this paper presents a method that uses user behavior analysis to automatically recognize implicit consumption intent. Specifically, the proposed method recasts implicit consumption intent recognition as a multi-label classification problem, combining multiple features based on users' following behavior, intent-following behavior, intent-retweeting behavior, and user profiles. Because implicit consumption intent is difficult to evaluate directly, the paper also proposes a method for automatically extracting a large number of user linkages across social media; with this method, more than 120,000 user linkage pairs are extracted. Experimental results on this automatically constructed evaluation set show that the multi-label classification-based method is effective for implicit consumption intent recognition, and that all of the exploited features help improve recognition performance.
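The abstract frames the task as multi-label classification over behavior-based user features. As a rough illustration of that formulation only (not the paper's actual features, labels, or implementation), the following sketch uses scikit-learn's one-vs-rest wrapper with toy feature vectors and invented intent categories.

```python
# Minimal sketch of a multi-label intent classifier, assuming scikit-learn.
# Feature columns and intent labels below are illustrative placeholders,
# not the feature set or label inventory used in the paper.
from sklearn.multiclass import OneVsRestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import MultiLabelBinarizer

# Each user is represented by a numeric feature vector loosely standing in for
# following behavior, intent-following behavior, intent-retweeting behavior,
# and a profile field.
X = [
    [12, 3, 1, 0],   # user 1
    [45, 0, 0, 1],   # user 2
    [ 7, 9, 4, 1],   # user 3
]

# Multi-label targets: a user may hold several implicit consumption intents at once.
y_raw = [
    ["electronics"],
    ["travel", "food"],
    ["electronics", "travel"],
]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(y_raw)  # binary indicator matrix, one column per intent label

# One-vs-rest trains one binary classifier per label, a common multi-label baseline.
clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
clf.fit(X, Y)

# Predicted intent labels for a new user's feature vector.
print(mlb.inverse_transform(clf.predict([[10, 5, 2, 1]])))
```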
Foundation items:National Natural Science Foundation of China (61133012, 61202277); National Program on Key Basic Research Project of China (973) (2014CB340503)
Reference text:

FU Bo, LIU Ting. Implicit User Consumption Intent Recognition in Social Media. Journal of Software, 2016, 27(11): 2843-2854.