A Single Model to Bring Clarity

Modeling Uncertainty in Recommender Systems: A Unified Approach

When it comes to building effective models for recommender systems, handling uncertainty is key. In a recent series of posts, we explored the different types of uncertainty that can impact your model and discussed various methods for addressing them. Now, in this joint post with Inbar Naor, we’re excited to share how we at Taboola have implemented a neural network that estimates both the probability that an item is relevant to the user and the uncertainty of this prediction.

The neural network we’ve designed consists of several modules, each serving a specific purpose in the model. The item module takes the features of an item, such as its title and thumbnail, and outputs a dense representation that contains important information about the item. The context module considers the context in which the item is being shown and generates a dense representation of that context. The fusion module combines the representations of the item and context to capture their interaction, similar to collaborative filtering. Finally, the estimation module predicts the click-through rate (CTR) of the item and also estimates uncertainty in this prediction.
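The module structure described above can be sketched as follows. This is a toy illustration of the data flow, not Taboola's actual network: the fixed-seed linear maps stand in for learned layers, and all function names and dimensions here are hypothetical.

```python
import math
import random

def dense(x, out_dim, seed):
    """Toy linear layer with fixed pseudo-random weights,
    a stand-in for a learned dense layer."""
    rng = random.Random(seed)
    w = [[rng.uniform(-1, 1) for _ in x] for _ in range(out_dim)]
    return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

def item_module(item_features):
    # Dense representation of the item (title, thumbnail, etc.)
    return dense(item_features, 4, seed=1)

def context_module(context_features):
    # Dense representation of the context the item is shown in
    return dense(context_features, 4, seed=2)

def fusion_module(item_repr, ctx_repr):
    # Element-wise product captures the item-context interaction,
    # loosely analogous to collaborative filtering.
    return [a * b for a, b in zip(item_repr, ctx_repr)]

def estimation_module(fused):
    # Two heads: predicted CTR and the uncertainty of that prediction.
    logits = dense(fused, 2, seed=3)
    ctr = 1 / (1 + math.exp(-logits[0]))      # sigmoid -> CTR in (0, 1)
    sigma = math.log1p(math.exp(logits[1]))   # softplus -> positive uncertainty
    return ctr, sigma

item = [0.2, 0.7, 0.1]
ctx = [0.5, 0.3, 0.9]
ctr, sigma = estimation_module(fusion_module(item_module(item), context_module(ctx)))
```

The key design point is that the estimation module outputs two quantities, so the same forward pass yields both a prediction and how much to trust it.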

But how does our model handle uncertainty? We’ll walk you through the three types of uncertainty – data uncertainty, model uncertainty, and measurement uncertainty – and show you how each is addressed in our model.

Data uncertainty is handled by explicitly estimating the noise inherent in the data. By introducing a node to output the data noise and allowing the gradients to propagate, our model can associate different levels of data uncertainty with different inputs. Additionally, we can estimate a mixture of Gaussians to capture more complex data distributions and improve the model’s capacity.

Measurement uncertainty, on the other hand, is related to noisy measurements in the data. By incorporating the measurement noise into the likelihood equation, we can separate data uncertainty from measurement uncertainty and use more data in the training process. This approach not only improves the model’s understanding of the data but also allows for greater flexibility in handling noisy features or labels.
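One way to realize this separation, sketched under our own assumptions rather than taken from the paper: if the measurement noise is independent of the data noise, the two variances add in the likelihood, and an empirical CTR computed from few impressions carries a larger, known measurement variance than one computed from many.

```python
import math

def nll_with_measurement_noise(y_observed, mu, sigma_data, sigma_meas):
    """Likelihood of an observed label modeled as signal + data noise
    + independent measurement noise: the two variances simply add,
    so measurement noise does not contaminate the learned data noise."""
    var = sigma_data**2 + sigma_meas**2
    return 0.5 * math.log(2 * math.pi * var) + (y_observed - mu)**2 / (2 * var)

def ctr_measurement_sigma(ctr, impressions):
    """Binomial standard error of an empirical CTR: labels estimated
    from few impressions are noisier measurements."""
    return math.sqrt(ctr * (1 - ctr) / impressions)

sigma_few = ctr_measurement_sigma(0.1, 100)     # noisy label, down-weighted
sigma_many = ctr_measurement_sigma(0.1, 10000)  # reliable label
```

Because the measurement variance is supplied rather than learned, noisy examples are automatically down-weighted instead of being discarded, which is what allows more data to be used in training.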

Model uncertainty can be addressed by using techniques like dropout at inference time (Monte Carlo dropout) to understand what the model doesn’t know due to lack of data. By testing the model’s certainty over unique titles and sparse regions of the embedding space, we can see how uncertainty changes with exposure to different types of data. Encouraging exploration of these sparse regions can help reduce uncertainty and improve the model’s performance over time.
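Monte Carlo dropout can be sketched as follows: keep dropout active at inference, run the forward pass many times, and read model uncertainty off the spread of the predictions. The toy one-layer network here is purely illustrative.

```python
import math
import random

def mc_dropout_predict(forward, x, n_samples=500, seed=0):
    """Run a stochastic forward pass n_samples times with dropout left on;
    the sample mean is the prediction, the sample spread is the model
    uncertainty."""
    rng = random.Random(seed)
    preds = [forward(x, rng) for _ in range(n_samples)]
    mean = sum(preds) / n_samples
    var = sum((p - mean) ** 2 for p in preds) / n_samples
    return mean, math.sqrt(var)

def toy_forward(x, rng, p_drop=0.5):
    """Toy one-layer net: each weight is dropped with probability p_drop
    and survivors are rescaled by 1/(1 - p_drop) (inverted dropout)."""
    weights = [0.4, -0.2, 0.7]
    kept = [w * (0.0 if rng.random() < p_drop else 1.0 / (1 - p_drop))
            for w in weights]
    return sum(w * xi for w, xi in zip(kept, x))

mean, std = mc_dropout_predict(toy_forward, [1.0, 2.0, 3.0])
```

Inputs from sparse regions of the embedding space produce predictions that disagree more across dropout samples, which is exactly the signal used to drive exploration.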

In conclusion, by modeling all three types of uncertainty in a unified way, our neural network at Taboola has shown promising results in improving recommendation accuracy and robustness. We hope this post has sparked some ideas on how you can leverage uncertainty in your own applications and training processes. Stay tuned for more insights and updates on our research in recommender systems!

This post is part of a series related to a paper we are presenting at a workshop at this year’s KDD conference, covering deep density networks and uncertainty in recommender systems. Check out the previous posts in the series for more in-depth discussions on handling uncertainty in models.
