Venue
- ACM Transactions on Intelligent Systems and Technology
Date
- 2022
FedCTR: Federated Native Ad CTR Prediction with Cross Platform User Behavior Data
Chuhan Wu*
Fangzhao Wu*
Yongfeng Huang*
Xing Xie*
* External authors
Abstract
Native ads are a popular type of online advertisement whose form resembles the native content displayed on websites. Native ad CTR prediction is useful for improving user experience and platform revenue. However, it is challenging due to the lack of explicit user intent, and users' behaviors on the platform with native ads may not be sufficient to infer their interest in ads. Fortunately, user behaviors exist on many online platforms and can provide complementary information for user interest mining. Thus, leveraging multi-platform user behaviors is useful for native ad CTR prediction. However, user behaviors are highly privacy-sensitive, and the behavior data on different platforms cannot be directly aggregated due to user privacy concerns and data protection regulations like GDPR. Existing CTR prediction methods usually require centralized storage of user behavior data for user modeling and cannot be directly applied to the CTR prediction task with multi-platform user behaviors. In this paper, we propose a federated native ad CTR prediction method named FedCTR, which can learn user interest representations from their behaviors on multiple platforms in a privacy-preserving way. On each platform, a local user model learns user embeddings from the local user behaviors on that platform. The local user embeddings from different platforms are uploaded to a server for aggregation, and the aggregated user embeddings are sent to the ad platform for CTR prediction. In addition, we apply local differential privacy (LDP) and differential privacy (DP) techniques to the local and aggregated user embeddings, respectively, for better privacy protection. Moreover, we propose a federated framework for model training with distributed models and user behaviors. Extensive experiments on a real-world dataset show that FedCTR can effectively leverage multi-platform user behaviors for native ad CTR prediction in a privacy-preserving manner.
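To make the data flow described in the abstract concrete, the sketch below traces one prediction through the pipeline: each platform encodes its local behavior log into a user embedding, perturbs it with LDP noise before uploading, the server aggregates the noisy embeddings and adds DP noise, and the ad platform combines the result with an ad embedding for CTR prediction. This is a minimal illustration only; the encoder, the Laplace noise mechanism, the mean aggregation, and the logistic scoring head are placeholder assumptions, not the architectures used in the paper.

```python
# Minimal sketch of the FedCTR inference flow described in the abstract.
# Encoder, Laplace noise, mean aggregation, and logistic head are
# illustrative assumptions, not the paper's exact design.
import numpy as np

EMB_DIM = 64
rng = np.random.default_rng(0)


def local_user_embedding(behavior_features: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Platform-local user model: map local behaviors to a user embedding."""
    return np.tanh(behavior_features @ weights)


def ldp_perturb(embedding: np.ndarray, scale: float) -> np.ndarray:
    """Apply local differential privacy noise before uploading to the server."""
    return embedding + rng.laplace(0.0, scale, size=embedding.shape)


def aggregate(platform_embeddings: list, dp_scale: float) -> np.ndarray:
    """Server side: aggregate per-platform embeddings, then add DP noise."""
    aggregated = np.mean(platform_embeddings, axis=0)
    return aggregated + rng.laplace(0.0, dp_scale, size=aggregated.shape)


def predict_ctr(user_embedding: np.ndarray, ad_embedding: np.ndarray, w: np.ndarray, b: float) -> float:
    """Ad platform: score the (user, ad) pair with a simple logistic head."""
    x = np.concatenate([user_embedding, ad_embedding])
    return float(1.0 / (1.0 + np.exp(-(x @ w + b))))


# One user with behaviors on three platforms (random stand-ins for real logs).
behavior_logs = [rng.normal(size=(10,)) for _ in range(3)]
local_weights = [rng.normal(scale=0.1, size=(10, EMB_DIM)) for _ in range(3)]

noisy_local = [ldp_perturb(local_user_embedding(f, w), scale=0.1)
               for f, w in zip(behavior_logs, local_weights)]
user_emb = aggregate(noisy_local, dp_scale=0.05)

ad_emb = rng.normal(size=(EMB_DIM,))
ctr_head_w = rng.normal(scale=0.1, size=(2 * EMB_DIM,))
print(f"predicted CTR: {predict_ctr(user_emb, ad_emb, ctr_head_w, 0.0):.3f}")
```

In this sketch, raw behavior logs never leave their platform; only noised embeddings are exchanged, which mirrors the privacy argument made in the abstract.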