NZME

New Zealand publisher enhances data maturity & audience targeting

NZME increased engagement and optimised registration and subscriber flows by developing a data-led subscriber lifecycle programme and personalised storycards on the NZ Herald.

The Challenge

In order to build effective predictive models based on user behaviour, NZME needed a pipeline of high quality and timely user event data. With millions of user events generated every day, wrangling this data in an efficient and scalable way presented a challenge. Additionally, building machine learning pipelines requires specialised expertise, and can be complex and costly.

“The project had a substantial impact on NZME's ability to customise subscriber marketing communications based on user engagement, subscription propensity, and churn likelihood. The project's impact is expected to grow further as the subscriber marketing team develops new campaigns and strategies leveraging these capabilities.”
Andy Wylie
NZME Head of Data and Analytics
13% increase in net premium subscriber acquisition vs forecast over trial period
6x higher click-through rate (CTR) for personalised storycards compared to the manually curated version they replaced

The Results

To overcome these challenges, in partnership with Google, NZME utilised Google Analytics 4 user data streamed into Google Cloud's BigQuery to identify the user events most likely to indicate user interests and behaviours. These events were then used to train machine learning (ML) models that could predict which other content users might be interested in, as well as their propensity to subscribe or churn.
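
To illustrate the approach, the sketch below shows how a subscription-propensity model of this kind can be trained and scored with BigQuery ML from Python. The project ID, dataset, table, and column names (for example `nzme_analytics.user_features`) are illustrative assumptions, not NZME's actual schema.

```python
# Minimal sketch: train and score a subscription-propensity model with BigQuery ML.
# All dataset/table/column names and the project ID are assumptions for illustration.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # assumed project ID

# Train a logistic regression classifier inside BigQuery; no data leaves the warehouse.
client.query("""
CREATE OR REPLACE MODEL `nzme_analytics.subscribe_propensity`
OPTIONS (model_type = 'LOGISTIC_REG', input_label_cols = ['subscribed']) AS
SELECT
  articles_read_7d,            -- features engineered from GA4 event data
  avg_session_duration,
  premium_article_views,
  subscribed                   -- label derived from subscription events
FROM `nzme_analytics.user_features`
""").result()

# Score current users and read back their propensity to subscribe.
rows = client.query("""
SELECT
  user_pseudo_id,
  predicted_subscribed_probs[OFFSET(0)].prob AS propensity  -- class ordering is illustrative
FROM ML.PREDICT(MODEL `nzme_analytics.subscribe_propensity`,
                TABLE `nzme_analytics.user_features_current`)
""").result()

for row in rows:
    print(row.user_pseudo_id, row.propensity)
```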

To enable low-touch, efficient machine learning pipelines, BigQuery ML (BQML) models were utilised. BQML was chosen over other approaches because it enables data teams to run sophisticated ML algorithms from within BigQuery, without the complexity and overhead of moving data into separate ML infrastructure. The machine learning pipeline was orchestrated using Cloud Composer, a managed Google Cloud service based on Apache Airflow. Recommendation model output was then cached in Redis, a high-speed in-memory data store, to ensure minimal latency when returning content recommendations to the NZ Herald.
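
The sketch below shows the shape such a Cloud Composer (Airflow) DAG could take: retrain a BQML recommendation model in BigQuery, then push fresh per-user recommendations into Redis for low-latency lookup. The DAG ID, schedule, table names, Redis endpoint, and key format are all assumptions for illustration, not NZME's actual pipeline.

```python
# Sketch of a daily Cloud Composer (Airflow 2) DAG: retrain the BQML model,
# then cache the latest per-user recommendations in Redis.
from datetime import datetime

import redis
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from google.cloud import bigquery


def cache_recommendations(**_):
    """Copy model output from a BigQuery table into Redis for fast lookups."""
    bq = bigquery.Client()
    rows = bq.query(
        "SELECT user_pseudo_id, recommended_article_ids "
        "FROM `nzme_analytics.latest_recommendations`"  # assumed output table
    ).result()
    cache = redis.Redis(host="10.0.0.5", port=6379)  # assumed Memorystore endpoint
    for row in rows:
        # One key per user; the storycard service reads this at render time.
        cache.set(f"recs:{row.user_pseudo_id}",
                  ",".join(row.recommended_article_ids), ex=86400)


with DAG(
    dag_id="recommendation_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    retrain_model = BigQueryInsertJobOperator(
        task_id="retrain_bqml_model",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE MODEL `nzme_analytics.content_recs`
                    OPTIONS (model_type = 'matrix_factorization',
                             user_col = 'user_pseudo_id',
                             item_col = 'article_id',
                             rating_col = 'engagement_score',
                             feedback_type = 'implicit')
                    AS SELECT user_pseudo_id, article_id, engagement_score
                    FROM `nzme_analytics.article_engagement`
                """,
                "useLegacySql": False,
            }
        },
    )

    refresh_cache = PythonOperator(
        task_id="cache_recommendations",
        python_callable=cache_recommendations,
    )

    retrain_model >> refresh_cache
```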

The propensity-to-subscribe and propensity-to-churn models were utilised by the NZ Herald subscriber marketing team as part of the data-led subscriber lifecycle programme to deliver more relevant communications and experiences. Additionally, a content recommendation engine was developed to serve personalised content to a ‘storycard’ on the NZ Herald homepage.
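
At serving time, the storycard lookup can be as simple as a cache read keyed by user, with an editorially curated fallback for users who have no personalised output yet. The endpoint, key format, and fallback list below mirror the caching sketch above and are assumptions, not NZME's implementation.

```python
import redis

cache = redis.Redis(host="10.0.0.5", port=6379, decode_responses=True)  # assumed endpoint

CURATED_FALLBACK = ["top-story-1", "top-story-2", "top-story-3"]  # hypothetical curated list


def storycard_articles(user_pseudo_id: str) -> list[str]:
    """Return ranked article IDs for the user's storycard, falling back to curation."""
    cached = cache.get(f"recs:{user_pseudo_id}")
    if cached is None:
        return CURATED_FALLBACK      # new or unrecognised user
    return cached.split(",")         # personalised, pre-ranked article IDs
```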

These personalised ‘storycards’ improved the relevance of the homepage content served to each user: non-subscribers were served free articles, while premium subscribers were served a mix of premium and free articles, with content preferences based on reading history.
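
A minimal sketch of that eligibility rule, assuming each recommended article carries a premium flag (the field name is hypothetical):

```python
def filter_for_audience(ranked_articles: list[dict], is_premium_subscriber: bool) -> list[dict]:
    """Apply the storycard audience rule: free-only for non-subscribers, mixed for subscribers."""
    if is_premium_subscriber:
        return ranked_articles  # premium and free articles, already ranked by preference
    return [a for a in ranked_articles if not a["is_premium"]]  # free articles only
```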

New Zealand Herald homepage showing multiple news stories with personalised storycards outlined in purple