TechBytes with Gil Chamiel, Director of Data Science and Algorithm Engineering at Taboola

Gil Chamiel
Director of Data Science and Algorithm Engineering at Taboola

In an era of personalization and web customization, marketers are increasingly relying on programmatic and AI technologies. Predictive technology and omnichannel experience help businesses increase user engagement, monetize traffic and deliver relevant content throughout the customer journey. We spoke to Gil Chamiel, Director of Data Science and Algorithm Engineering at Taboola, to understand how the predictive recommendation engine for content discovery works and how AI helps to understand audience behavior analytics.

MTS: Tell us about your role at Taboola and how you arrived at this position?
Gil Chamiel: I am Director of Data Science and Algorithm Engineering at Taboola. This means that I am in charge of our algorithmic development efforts and research. Before I came to Taboola, I completed my PhD at the University of New South Wales in Sydney, Australia where I focused on AI research for web personalization.

I joined Taboola at the end of 2010 when it was still a very small company of around 20 people worldwide – today we are nearly 800. I started as an algorithm engineer as part of a team that consisted of software developers and a few algorithm engineers. As we grew, we decided to dedicate more resources towards algorithmic efforts and established an algorithm engineering team that I then led. The team grew and became a group consisting of several teams of data scientists/machine learning researchers and software engineers. In the past 18 months or so, we have been heavily investing in our Deep Learning knowledge and capabilities. This included studying new modeling techniques as well as the engineering of production-level scalable deep learning pipelines. The result is a strategic shift in Taboola’s R&D. We graduated most of our algorithmic stacks to be based on deep learning and adopted it as the default technique.

MTS: How does Taboola’s predictive recommendation engine work?
Gil: The challenge is matching content, products, apps, videos, etc. (out of hundreds of thousands of possible recommendations at any given time) to users in specific contexts in a way that increases both short- and long-term engagement and revenue. We look at over 100 factors, including previous engagement with Taboola recommendations as well as the user's browsing behavior outside of Taboola.

We then build predictive models that estimate the probability that users will engage with a certain item in the future. For example, imagine predicting, at the moment an article is written, how likely certain people are to like it before they even know it exists – it's a complex problem, and an exciting one for us on the algorithm team. Due to the ever-changing nature of the internet and of our marketplace, we needed to invent a unique exploration/exploitation paradigm: most of the time we show users recommendations that we are fairly certain will work well for them, but sometimes we explore new recommendations that we have reason to believe may also work well for them. Also, since we never come across the exact same scenario twice, we must use sophisticated algorithms, such as Deep Neural Networks, to power the predictive models with generalization capabilities.
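Taboola's actual exploration/exploitation paradigm is proprietary; the general idea can be sketched with the classic epsilon-greedy scheme, where the candidate items, scores, and the `epsilon` value below are all illustrative assumptions:

```python
import random

def choose_recommendation(candidates, predicted_ctr, epsilon=0.1, rng=random):
    """Usually show the item with the best predicted engagement (exploit),
    but with probability epsilon try a different candidate (explore)."""
    best = max(candidates, key=lambda item: predicted_ctr[item])
    if rng.random() < epsilon:
        # Explore: sample an alternative we have some reason to believe
        # may also work, to keep learning about the changing marketplace.
        alternatives = [c for c in candidates if c != best]
        if alternatives:
            return rng.choice(alternatives)
    return best

items = ["article-a", "article-b", "article-c"]
scores = {"article-a": 0.02, "article-b": 0.05, "article-c": 0.01}
print(choose_recommendation(items, scores, epsilon=0.0))  # article-b
```

With `epsilon=0` the choice is purely exploitative; raising it trades short-term engagement for information about new items.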

Just to share the magnitude of scale: we serve 14 billion recommendations a day to a billion people a month. Each decision is made out of tens of millions of items (content, videos, etc.) across tens of thousands of publishers/marketers – and it needs to take less than a second.

MTS: How is AI-based personalization technology transforming the omnichannel user experience?
Gil: AI brings together a variety of technologies from different fields. For instance, it allows computers to understand language much better than before. Since users consume a lot of their content by reading text, this significantly helps us improve the user experience: we can understand, better than before, what sort of language people engage with and in which situations. This means understanding not only the semantics of the text but also other properties that might influence people's engagement with it, such as style and grammar.

Another example comes from the field of Computer Vision: people engage with images during the discovery process and AI helps us understand how people interact with those images, thereby improving the user experience.

MTS: How do you see data analytics algorithms moving from descriptive towards prescriptive intelligence? How are MarTech platforms adopting Embedded Analytics over Business Intelligence?
Gil: As mentioned previously, the internet and the Taboola marketplace constantly change. This means that simple descriptive methods, with limited generalization capabilities, will fail to perform well enough in understanding the subtleties of new web content. Deep Learning algorithms are geared specifically for that type of task: to capture the essence of new cases that they have not seen previously.

In addition, we are dealing with a very high-dimension optimization problem. Simple statistical approaches may work in simpler problems, for instance, if we merely had to match advertisers to publishers. But here, we have what researchers call the “curse of dimensionality”, where the optimization problem has a vast number of dimensions (for instance, individual user clicks; raw text; images), with potentially high-order relations between them, which makes Deep Learning techniques the only way to deal with this task successfully.
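One reason deep learning copes with this dimensionality is that high-cardinality inputs (item IDs, words) are mapped to small dense embedding vectors rather than vast one-hot vectors. A minimal illustration, with the vocabulary size and embedding dimension as made-up assumptions:

```python
import random

VOCAB_SIZE = 10_000_000  # e.g., tens of millions of item IDs
EMBED_DIM = 8            # small dense dimension (illustrative)

rng = random.Random(42)
embedding_table = {}     # lazily populated: item id -> dense vector

def embed(item_id):
    """Return the dense vector for an item. A one-hot encoding would need
    a VOCAB_SIZE-length vector per item; the embedding is EMBED_DIM long
    regardless, and a network can learn relations between these vectors."""
    if item_id not in embedding_table:
        embedding_table[item_id] = [rng.gauss(0, 0.1) for _ in range(EMBED_DIM)]
    return embedding_table[item_id]

v = embed(123_456)
print(len(v))  # 8
```

In a real model the table entries are trained parameters; here they are random placeholders just to show the shape of the idea.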

MTS: How should businesses build and adopt the ‘right’ strategy for AI-based content marketing and customer experience?
Gil: Start simple and use what's already out there. Ten years ago, Taboola didn't use deep learning: we used simpler models that required less engineering complexity to get a good head start. When getting into deep learning, make sure you use the great tools and data already out there. As for tools, there's an abundance of open source technologies to kick off your project for many of the problems you will come across, before you need to bring in talent experienced in Deep Learning. One of the nice things about working with deep neural networks is the concept of transfer learning: you can take a solution (a neural network) that was built over a very large dataset and plug it into your own solution in order to get a head start (especially if you don't have a lot of data or a lot of computational power).

Obviously, plugging in a network that was built to optimize a problem different from yours is not going to just work seamlessly. But you might want to take parts of that network and then continue to optimize it yourself. It's a good way to get a head start.
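The "take parts of the network, then continue optimizing" idea can be sketched in miniature: freeze a pretrained body and train only a new head on your own small dataset. Here the pretrained body is reduced to a fixed feature extractor, and the data and learning rate are illustrative assumptions:

```python
# Stand-in for the frozen lower layers of a network trained elsewhere
# on a large dataset: a fixed feature extractor that is never updated.
def pretrained_features(x):
    return [x, x * x]

# The new "head": a small linear layer trained on our own, smaller data
# via plain stochastic gradient descent on squared error.
def train_head(data, lr=0.05, epochs=500):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            f = pretrained_features(x)
            pred = sum(wi * fi for wi, fi in zip(w, f)) + b
            err = pred - y
            w = [wi - lr * err * fi for wi, fi in zip(w, f)]
            b -= lr * err
    return w, b

def predict(x, w, b):
    f = pretrained_features(x)
    return sum(wi * fi for wi, fi in zip(w, f)) + b

# Fit y = x^2 from only four examples by reusing the frozen features.
data = [(0.0, 0.0), (1.0, 1.0), (2.0, 4.0), (-1.0, 1.0)]
w, b = train_head(data)
```

With real networks the same pattern applies: keep the pretrained layers' weights fixed (or fine-tune them slowly) and fit only the task-specific layers, which needs far less data and compute.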

Finally, build your AI pipeline in a simple manner that is geared towards experimentation. There are lots of knobs to twist in every algorithmic solution, and it usually takes quite a few experiments to find a good one. At Taboola, we constantly experiment (offline and online) with new solutions, and it brings good value to the business. Just as importantly, it allows every data scientist on the team to come up with new ideas and experiment on real data, which means each individual's influence is bounded pretty much only by how much influence they would like to have.
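One common building block of such an offline experimentation loop is replaying logged traffic to compare candidate policies before trying them online. This is a deliberately crude sketch (the log format and policies are invented for illustration, and a real counterfactual evaluation must account for the logging policy):

```python
def replay_ctr(choose, log):
    """Estimate a candidate policy's CTR from logged impressions by counting
    only impressions where the candidate would have shown the same item
    that was actually shown (a simple replay-style offline estimate)."""
    matches, clicks = 0, 0
    for candidates, shown, clicked in log:
        if choose(candidates) == shown:
            matches += 1
            clicks += clicked
    return clicks / matches if matches else 0.0

# Logged data: (candidate set, item actually shown, whether it was clicked).
log = [
    (["a", "b"], "a", 1),
    (["a", "b"], "b", 0),
    (["a", "b"], "a", 1),
]
always_a = lambda candidates: "a"
always_b = lambda candidates: "b"
print(replay_ctr(always_a, log), replay_ctr(always_b, log))  # 1.0 0.0
```

Cheap offline estimates like this let many variants be screened per day, with only the promising ones promoted to live A/B tests.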

MTS: How do discovery platforms enable marketers to leverage their assets for retargeting and republishing without losing credibility?
Gil: One of our tech innovations is to automatically find the delicate balance between recommending relevant content and mitigating user fatigue (the effect you get when users are exposed to the same content repeatedly and develop an aversion or even blindness to it). At Taboola, we let the data speak for itself and model the effect of user fatigue using deep learning tools. This must not be forgotten when retargeting: even though you have a positive signal (a reason to believe the user would like to engage with some recommended item), you must account for fatigue. This is particularly important given that the feedback we get from users is implicit – they never tell us what they really want to do on a specific page (perhaps they are not sure about it themselves). There might be a tension between the positive signals you have about the user's past behavior and the number of times they saw a piece of content. A good algorithm will find the balance in a way that is effective for the platform and won't frustrate the user.
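Taboola learns the fatigue effect from data with deep learning; the shape of the trade-off can be illustrated with a hand-written exponential discount, where the decay rate is purely an assumption rather than anything learned:

```python
import math

def fatigue_adjusted_score(base_score, times_seen, decay=0.5):
    """Discount an item's predicted engagement by how many times the user
    has already seen it. The exponential form and decay rate are
    illustrative; in practice the discount itself would be learned."""
    return base_score * math.exp(-decay * times_seen)

# A strong positive signal loses out once the user has seen the item often.
fresh = fatigue_adjusted_score(0.9, times_seen=0)  # 0.9
tired = fatigue_adjusted_score(0.9, times_seen=5)  # ~0.07
```

Ranking by the adjusted score lets a weaker but unseen item overtake a strong but over-exposed one, which is the balance described above.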

MTS: Ad fraud and bot traffic are major disruptions to marketing and advertising objectives. How are AI-powered recommendation platforms preparing to combat these challenges?
Gil: Taboola employs numerous strategies to combat fraud and bot traffic. We use predictive models to detect bot traffic by inferring whether the user response is plausible. For instance, we compare the response to personalized and optimized recommendations against the user response when recommending content at random. Humans respond differently to personalized recommendations vs. random ones, while bots respond the same.
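That comparison can be distilled into a toy check: if a user's click-through rate is no better on personalized recommendations than on random ones, the traffic looks bot-like. The lift threshold here is an invented illustration, not Taboola's actual rule:

```python
def likely_bot(ctr_personalized, ctr_random, min_lift=1.5):
    """Flag traffic whose response to personalized recommendations is not
    meaningfully better than its response to random ones. Humans show a
    clear lift; bots tend to respond the same either way."""
    if ctr_random == 0:
        return False  # no baseline to compare against
    return ctr_personalized / ctr_random < min_lift

print(likely_bot(0.06, 0.02))  # False: human-like 3x lift
print(likely_bot(0.02, 0.02))  # True: identical response either way
```

A production system would feed many such signals into a predictive model rather than rely on a single ratio.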

MTS: Thanks for chatting with us, Gil.
Stay tuned for more insights on marketing technologies. To participate in our Tech Bytes program, email us at news@martechseries.com
