A Single Model to Bring Clarity

Modeling Uncertainty in Recommender Systems: A Unified Approach

When it comes to building effective models for recommender systems, handling uncertainty is key. In a recent series of posts, we explored the different types of uncertainty that can impact your model and discussed various methods for addressing them. Now, in this joint post with Inbar Naor, we’re excited to share how we at Taboola have implemented a neural network that estimates both the probability of an item being relevant to the user and the uncertainty of this prediction.

The neural network we’ve designed consists of several modules, each serving a specific purpose in the model. The item module takes the features of an item, such as its title and thumbnail, and outputs a dense representation that contains important information about the item. The context module considers the context in which the item is being shown and generates a dense representation of that context. The fusion module combines the representations of the item and context to capture their interaction, similar to collaborative filtering. Finally, the estimation module predicts the click-through rate (CTR) of the item and also estimates uncertainty in this prediction.
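The four modules above can be sketched as a minimal NumPy forward pass. This is purely illustrative: the layer sizes, weights, and the choice of concatenation for the fusion step are assumptions for the sketch, not Taboola's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    # A single fully connected layer with ReLU activation.
    return np.maximum(w @ x + b, 0.0)

# Hypothetical feature/representation sizes; the post does not specify them.
D_ITEM, D_CTX, D_REP = 8, 5, 4

# Random weights stand in for trained parameters.
W_item, b_item = rng.normal(size=(D_REP, D_ITEM)), np.zeros(D_REP)
W_ctx, b_ctx = rng.normal(size=(D_REP, D_CTX)), np.zeros(D_REP)
W_est, b_est = rng.normal(size=(2, 2 * D_REP)), np.zeros(2)

def forward(item_features, context_features):
    item_rep = dense(item_features, W_item, b_item)   # item module
    ctx_rep = dense(context_features, W_ctx, b_ctx)   # context module
    fused = np.concatenate([item_rep, ctx_rep])       # fusion module
    logit, log_var = W_est @ fused + b_est            # estimation module
    ctr = 1.0 / (1.0 + np.exp(-logit))                # predicted CTR
    sigma = np.exp(0.5 * log_var)                     # predictive std dev
    return ctr, sigma

ctr, sigma = forward(rng.normal(size=D_ITEM), rng.normal(size=D_CTX))
```

The key structural point is that the estimation module has two output heads: one for the CTR itself and one for the (log-)variance of that prediction.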

But how does our model handle uncertainty? We’ll walk you through the three types of uncertainty – data uncertainty, model uncertainty, and measurement uncertainty – and show you how each is addressed in our model.

Data uncertainty is handled by explicitly estimating the noise inherent in the data. By adding an output node for the data noise and letting gradients propagate through it, the model learns to associate different levels of data uncertainty with different inputs. We can also estimate a mixture of Gaussians instead of a single Gaussian, which lets the model capture more complex, multi-modal data distributions and increases its capacity.
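Concretely, "letting the gradients propagate" through a learned noise node amounts to training with a Gaussian negative log-likelihood in which the variance is itself a network output. The sketch below shows that loss for a single Gaussian and for a mixture of Gaussians; it is a generic formulation under these assumptions, not the exact loss from our paper.

```python
import numpy as np

def gaussian_nll(y, mu, log_var):
    # Negative log-likelihood of y under N(mu, exp(log_var)). The network
    # predicts log_var so the variance stays positive and gradients can
    # flow through the noise estimate.
    return 0.5 * (np.log(2 * np.pi) + log_var + (y - mu) ** 2 / np.exp(log_var))

def mixture_nll(y, weights, mus, log_vars):
    # NLL under a mixture of Gaussians (weights must sum to 1), which
    # lets the model fit multi-modal data distributions.
    var = np.exp(np.asarray(log_vars))
    densities = np.exp(-0.5 * (y - np.asarray(mus)) ** 2 / var) / np.sqrt(2 * np.pi * var)
    return -np.log(np.sum(np.asarray(weights) * densities))
```

With a single component of weight 1, the mixture loss reduces to the plain Gaussian loss, so the mixture is a strict generalization.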

Measurement uncertainty, on the other hand, is related to noisy measurements in the data. By incorporating the measurement noise into the likelihood equation, we can separate data uncertainty from measurement uncertainty and use more data in the training process. This approach not only improves the model’s understanding of the data but also allows for greater flexibility in handling noisy features or labels.
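One simple way to "incorporate the measurement noise into the likelihood equation" is to treat the observation as the true outcome plus independent measurement noise, so the two variances add. The function below is a sketch under that assumption; the known per-example measurement variance is supplied alongside the learned data noise.

```python
import numpy as np

def nll_with_measurement_noise(y, mu, log_var_data, var_measurement):
    # If the observed label is the true value plus independent measurement
    # noise, the likelihood variance is the learned data variance plus the
    # known measurement variance. Setting var_measurement = 0 recovers the
    # plain data-uncertainty loss.
    var_total = np.exp(log_var_data) + var_measurement
    return 0.5 * (np.log(2 * np.pi * var_total) + (y - mu) ** 2 / var_total)
```

Because only `log_var_data` is learned, the model can no longer "blame" measurement noise on the data distribution, which is what separates the two uncertainty types during training.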

Model uncertainty can be addressed by using techniques like dropout at inference time to understand what the model doesn’t know due to lack of data. By testing the model’s certainty over unique titles and sparse regions of the embedding space, we can see how uncertainty changes with exposure to different types of data. Encouraging exploration of these sparse regions can help reduce uncertainty and improve the model’s performance over time.
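Dropout at inference time (often called Monte Carlo dropout) can be sketched as follows: keep dropout active, run several stochastic forward passes, and read model uncertainty off the spread of the predictions. The input, weights, and dropout rate below are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_pass(x, w, p_drop=0.5):
    # Dropout stays ON at inference: randomly zero units, rescale the rest.
    mask = rng.random(x.shape) > p_drop
    return float(w @ (x * mask)) / (1.0 - p_drop)

def mc_dropout_predict(x, w, n_samples=200):
    # The mean over stochastic passes is the prediction; their standard
    # deviation approximates model uncertainty, which is larger in sparse
    # regions of the input space the model has rarely seen.
    preds = np.array([stochastic_pass(x, w) for _ in range(n_samples)])
    return preds.mean(), preds.std()

x = rng.normal(size=16)   # hypothetical input embedding
w = rng.normal(size=16)   # hypothetical trained weights
mean, std = mc_dropout_predict(x, w)
```

A high `std` for unique titles or rarely seen embedding regions is the signal that exploration there would be informative.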

In conclusion, by modeling all three types of uncertainty in a unified way, our neural network at Taboola has shown promising results in improving recommendation accuracy and robustness. We hope this post has sparked some ideas on how you can leverage uncertainty in your own applications and training processes. Stay tuned for more insights and updates on our research in recommender systems!

This post is part of a series related to a paper we are presenting at a workshop at this year’s KDD conference on deep density networks and uncertainty in recommender systems. Check out the previous posts in the series for more in-depth discussions on handling uncertainty in models.
