User-oriented Fairness in Recommendation. Abstract: As a highly data-driven application, a recommender system is easily affected by data bias, which produces unfair results for different groups of users and is also an important cause of degraded model performance. It is therefore important to define and address the unfairness problem in the recommendation setting. The paper tackles unfairness from the user side, first dividing users into groups according to their activity level...
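A minimal sketch of what a user-side group-fairness measure of this kind could look like: split users into an "active" head and an "inactive" tail by interaction count, then report the gap in average recommendation quality between the two groups. The 5% split and the function names are assumptions for illustration, not the paper's exact protocol.

```python
import numpy as np

def user_group_fairness(per_user_metric, activity, top_frac=0.05):
    """Gap in average recommendation quality between active and inactive users.

    per_user_metric: dict user -> quality score (e.g. NDCG@10)
    activity: dict user -> number of historical interactions
    top_frac: fraction of most-active users treated as the advantaged group
              (an assumed convention; the paper may use a different cutoff)
    """
    users = sorted(activity, key=activity.get, reverse=True)
    k = max(1, int(len(users) * top_frac))
    active, inactive = users[:k], users[k:]
    # A value of 0 means both groups are served equally well.
    return abs(np.mean([per_user_metric[u] for u in active]) -
               np.mean([per_user_metric[u] for u in inactive]))
```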
recommendation system. We use LLMs to perform data augmentation for the inactive user group, increasing the number of interactions available for that group. In the following sections, we introduce our research motivation and explain what problems the "user-side group fairness problem" causes...
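One way such augmentation could be wired up is sketched below: for each user below an activity threshold, ask an LLM to propose plausible items given the user's history, and append the accepted suggestions as synthetic interactions. The threshold, prompt format, and `llm_suggest_items` callback are all hypothetical placeholders, since the snippet does not specify them.

```python
from collections import defaultdict

INACTIVE_THRESHOLD = 5  # assumed cutoff for "inactive"; not from the paper

def augment_inactive_users(interactions, item_catalog, llm_suggest_items, n_new=3):
    """interactions: list of (user_id, item_id) pairs.
    llm_suggest_items: hypothetical callback wrapping any chat-completion API;
    it takes a prompt string and returns a list of item ids."""
    by_user = defaultdict(list)
    for u, i in interactions:
        by_user[u].append(i)
    augmented = list(interactions)
    for u, items in by_user.items():
        if len(items) < INACTIVE_THRESHOLD:
            prompt = (f"A user interacted with items {items}. "
                      f"Suggest {n_new} similar items from: {item_catalog[:50]}")
            for new_item in llm_suggest_items(prompt):
                # Keep only catalog items not already in the user's history.
                if new_item in item_catalog and new_item not in items:
                    augmented.append((u, new_item))  # synthetic interaction
    return augmented
```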
Point-of-interest (POI) recommendation faces challenges from data sparsity and cold-start problems, particularly for personalized recommendation. First, most users visit only a small number of POIs, which results in severe data sparsity and leaves most conventional methods ...
Second, we carry out experiments showing that gender obfuscation affects the fairness and diversity of recommender system results. In sum, our work establishes that a simple, transparent approach to gender obfuscation can protect user privacy while at the same time improving recommendation results ...
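The snippet does not spell out the obfuscation mechanism, so the following is only an illustrative guess at what a "simple, transparent" scheme might look like: extend a user's profile with a small fraction of items indicative of the other gender group, so that gender is harder to infer from the profile. The 10% fraction and the source of candidate items are assumptions.

```python
import random

def obfuscate_profile(user_items, opposite_group_top_items, frac=0.1, seed=0):
    """Illustrative sketch only, not the paper's exact procedure.
    Adds ~frac * len(user_items) items popular with the other gender group."""
    rng = random.Random(seed)
    n_add = max(1, int(len(user_items) * frac))
    candidates = [i for i in opposite_group_top_items if i not in user_items]
    return list(user_items) + rng.sample(candidates, min(n_add, len(candidates)))
```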
the degree to which the system meets the information needs of all its users in an equal sense. In this work, we propose a probabilistic framework based on generalized cross entropy (GCE) to measure the fairness of a given recommendation model. The framework comes with a suite of advantages: ...
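A minimal sketch of a GCE-based fairness score between a target "fair" distribution and the observed benefit distribution over user groups, following the standard form GCE = 1/(β(1−β)) [Σⱼ p_f(aⱼ)^β p(aⱼ)^{1−β} − 1]; the epsilon smoothing and the default β are implementation assumptions.

```python
import numpy as np

def gce(p_fair, p_observed, beta=0.5, eps=1e-12):
    """Generalized cross entropy between a target fair distribution p_fair
    and the observed benefit distribution p_observed over user groups.
    GCE == 0 iff the two distributions coincide (perfect fairness)."""
    pf = np.asarray(p_fair, dtype=float) + eps
    po = np.asarray(p_observed, dtype=float) + eps
    pf, po = pf / pf.sum(), po / po.sum()  # renormalize after smoothing
    return (np.sum(pf**beta * po**(1 - beta)) - 1.0) / (beta * (1.0 - beta))

# Example: uniform target vs. a skewed observed benefit across two groups.
print(gce([0.5, 0.5], [0.8, 0.2]))  # nonzero -> unfair
print(gce([0.5, 0.5], [0.5, 0.5]))  # ~0 -> fair
```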
3. Fairness and Responsibility, 4. Creativity, 5. Richness. Seek Creativity: brainstorming for inspiration, innovative ideas, etc. Example prompts: "Design three slogans for a fresh-produce supermarket"; "I am planning an economics research topic on changes in consumer behavior in the post-pandemic era; give me a few concrete ideas" ...
NOTE: This plugin is supported in Volcano 1.6.5 and later versions.

Arguments:
  overcommit-factor: inflation factor, which defaults to 1.2.

Example:
  - plugins:
      - name: overcommit
        arguments:
          overcommit-factor: 2.0

drf: the Dominant Resource Fairness (DRF) scheduling algorithm, which schedules jobs based on...
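For intuition, DRF tracks each user's "dominant share" (the largest fraction of any single resource that user holds) and repeatedly grants a task to the user with the smallest dominant share. Below is a minimal Python sketch of one allocation step under assumed dict-based bookkeeping; it is not Volcano's implementation.

```python
def dominant_share(allocated, capacity):
    """Dominant share = max over resources of allocated/capacity."""
    return max(allocated[r] / capacity[r] for r in capacity)

def drf_pick_next(users_alloc, demands, capacity):
    """One DRF step: grant one task to the user with the smallest dominant
    share whose per-task demand still fits in the remaining capacity."""
    used = {r: sum(a[r] for a in users_alloc.values()) for r in capacity}
    for user in sorted(users_alloc,
                       key=lambda u: dominant_share(users_alloc[u], capacity)):
        d = demands[user]
        if all(used[r] + d[r] <= capacity[r] for r in capacity):
            for r in capacity:
                users_alloc[user][r] += d[r]
            return user
    return None  # no task fits

# Classic DRF example: 9 CPUs / 18 GB, CPU-heavy vs. memory-heavy users.
capacity = {"cpu": 9, "mem": 18}
demands = {"A": {"cpu": 1, "mem": 4}, "B": {"cpu": 3, "mem": 1}}
alloc = {"A": {"cpu": 0, "mem": 0}, "B": {"cpu": 0, "mem": 0}}
while drf_pick_next(alloc, demands, capacity):
    pass
print(alloc)  # dominant shares equalize across users
```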
- Personalization in augmented reality
- Group modeling and collaborative team formation
- Ethical issues of personalization and human-centered AI systems: privacy, fairness, accountability, transparency
- Creativity in user modeling, adaptation and personalization
- Sustainability-aware methods and Sustainable Development Goals...
However, in any group recommendation, some individuals might be unhappy with the recommended items. For example, the Fairness Strategy, a social choice-based aggregation strategy (Masthoff 2004), might recommend an item that one or more group members dislike but will recommend other items that ...
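A minimal sketch of a Fairness-Strategy-style aggregator in this spirit: members take turns, and on each turn the next recommended item is the current member's highest-rated remaining item, so everyone's favorite appears somewhere in the sequence even if others dislike it. Turn order and tie-breaking are assumptions here; Masthoff (2004) describes several variants.

```python
def fairness_strategy(ratings, items, k):
    """Round-robin sketch of a social choice-based Fairness Strategy.

    ratings: dict member -> dict item -> score (a score for every item)
    items:   iterable of candidate items
    k:       length of the recommended sequence
    """
    remaining = set(items)
    members = list(ratings)
    sequence, turn = [], 0
    while remaining and len(sequence) < k:
        member = members[turn % len(members)]          # whose turn it is
        best = max(remaining, key=lambda i: ratings[member][i])
        sequence.append(best)                          # their top remaining item
        remaining.remove(best)
        turn += 1
    return sequence

# Example: each member gets one of their favorites in the 2-item sequence,
# even though neither item is both members' top choice.
ratings = {"alice": {"x": 5, "y": 1, "z": 3}, "bob": {"x": 1, "y": 5, "z": 3}}
print(fairness_strategy(ratings, ["x", "y", "z"], k=2))  # ['x', 'y']
```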