Decision Tree is a feature in C-clone that allows the user to select a specific audience based on the Gain and the Volume of Unique Users of that audience, in order to target specific users

How does it work?

1- Select a Cclone audience segment, then click on one point of the graph (Volume of Unique Users and Gain)

...

2- The Decision Tree is displayed with the different segments and volumes corresponding to the previously selected point

...

3- The audience can be saved

...

Illustration:

Cclone main segments label = test

Cclone newly created label = test_4_UU_4.39_G_13.25

  • «4» is the second point of the abacus, starting from the left

  • « UU 4.39 » means 4.39% of the Negative segment

  • « G » is the expected performance


Tree illustration

The graph shows the output of the algorithm, which is a classification of the internet users according to their affinity with the parameter of the positive segment (the variable to explain)

  • The vertical axis (expected gain) represents the performance index: the cumulative % of the positive target within the segments divided by the % of the target within the negative segment, which is defined as the reference population

  • The horizontal axis (% in database, recency 1 day) represents the % of Weborama internet users defined as the negative segment

Each blue dot along the green scoring curve represents a strong classifier family


These families are cumulated along the curve, so the advertiser can select his audience by checking the points along the curve, trading off between the % of internet users targeted and the expected performance index (a small sketch after the points below illustrates this trade-off)

  • The dot with the best-performing internet users is on the far left of the graph, but with the smallest volume of users

  • The dot with the largest volume of internet users is on the far right of the graph, but with a weaker performance index
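As a rough illustration of how the points of the curve are built, here is a minimal sketch that only applies the definitions above to hypothetical classifier families; the figures and variable names are illustrative, not taken from the product:

```python
# Minimal sketch of how the points of the scoring curve can be derived.
# Hypothetical families, ordered from the most to the least performing;
# each one contributes a share of the positive target and of the negative
# (reference) segment.
families = [
    {"positive_pct": 5.0, "negative_pct": 1.0},  # hypothetical family 1
    {"positive_pct": 4.0, "negative_pct": 2.0},  # hypothetical family 2
    {"positive_pct": 3.0, "negative_pct": 4.0},  # hypothetical family 3
]

cum_pos = cum_neg = 0.0
for i, fam in enumerate(families, start=1):
    cum_pos += fam["positive_pct"]
    cum_neg += fam["negative_pct"]
    gain = cum_pos / cum_neg   # performance index (vertical axis)
    volume = cum_neg           # % of the negative segment covered (horizontal axis)
    print(f"point {i}: UU={volume:.2f}%  G={gain:.2f}")
```

With these hypothetical figures the leftmost point has the highest index (5.00) but the smallest volume (1.00%), while the rightmost point has the largest volume (7.00%) but a weaker index (1.71), which is exactly the trade-off described above.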

...

Once the advertiser has clicked on the dot corresponding to the audience he wants to target, he will get the corresponding segmentation tree at the bottom of the interface

2- The Decision Tree is displayed with the different segments and volumes corresponding to the previously selected point

...


This tree enables the user to identify and visualize the key predictors (the most explanatory variables) explaining a performance (conversion…)
These explanatory variables are the result of the Chi-square (Khi2) test, which measures the dependency of each variable on the variable to explain
The red nodes are those that constitute the selected audience
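As a minimal sketch of the dependency test mentioned above, here is a standard Chi-square test of independence applied to a hypothetical contingency table; the figures and names are assumptions, not product data:

```python
# Chi-square (Khi2) independence test between a candidate explanatory variable
# and the variable to explain, on a hypothetical contingency table.
from scipy.stats import chi2_contingency

# Rows: users inside / outside the candidate segment
# Columns: positive target (e.g. converted) / rest of the population
contingency = [
    [120,  880],   # inside the candidate segment
    [310, 8690],   # outside the candidate segment
]

chi2, p_value, dof, expected = chi2_contingency(contingency)
print(f"chi2={chi2:.1f}, p-value={p_value:.3g}")
# The higher the chi2 (the lower the p-value), the stronger the dependency,
# so the variable is a better candidate to split a node of the tree.
```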

Illustration:
The 3 segments in red represent 9.83 % (6.02+1.52+2.29) of the Negative segment and 17.6 % (12.60+2.98+1.98) of the Positive Segment
The number of Unique Users is 1768 (1445+239+84)
They have the highest performance indexes vs the other segments (2.94, 2.06 and 1.09)
Explanation

Negative segment = Sum of “Database” % points in red (0.70+1.80+0.83+1.06 = 4.39)

Positive Segment = Sum of “% Target” points in red (1.09+2.06+1.06+1.08 = 5.29)

Volume of Unique Users = Sum of “Volume Target” points in red (140+59+73+6 = 278)

Performance index = top figure in red (32.33; 7.23; 17.31; 1.40)
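The same arithmetic can be written as a short sketch; the node values below are simply the red-node figures listed in the explanation above, and the field names are illustrative:

```python
# Audience figures obtained by summing the values of the red (selected) nodes.
red_nodes = [
    {"database_pct": 0.70, "target_pct": 1.09, "target_volume": 140},
    {"database_pct": 1.80, "target_pct": 2.06, "target_volume":  59},
    {"database_pct": 0.83, "target_pct": 1.06, "target_volume":  73},
    {"database_pct": 1.06, "target_pct": 1.08, "target_volume":   6},
]

negative_segment = sum(n["database_pct"] for n in red_nodes)   # 4.39 ("Database" %)
positive_segment = sum(n["target_pct"] for n in red_nodes)     # 5.29 ("% Target")
unique_users     = sum(n["target_volume"] for n in red_nodes)  # 278 ("Volume Target")

print(round(negative_segment, 2), round(positive_segment, 2), unique_users)
```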



For the first segment, the strongest explanatory variable is that the users most probably have a psychotherapy (>= 0.5 index)

...

How to create audiences from the decision tree?

...

Once the audience is selected, save this segment by clicking on the

...

 Tip: audience label

3- The audience can be saved (and will be available in Audience monitoring)

...


Keep the label of the audience that is created automatically; the pattern of the label is standardized (a small parsing sketch at the end of this section illustrates it)

Illustration:
Test C-clones_2

Cclone main segments label = test

Cclone newly created label = test_4_UU_4.39_G_13.25

  • «4» is the second point of the abacus, starting from the left

  • « UU 4.39 » means 4.39% of the Negative segment

  • « G » is the expected performance

...

  • « G » = Gain
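Since the label pattern is standardized, it can be parsed back into its components. Here is a minimal sketch, assuming the pattern <main segment>_<point>_UU_<% of negative segment>_G_<gain>; the regex and field names are assumptions based on the example above:

```python
# Parse the auto-generated audience label back into its components.
import re

LABEL_PATTERN = re.compile(
    r"^(?P<segment>.+)_(?P<point>\d+)_UU_(?P<uu>[\d.]+)_G_(?P<gain>[\d.]+)$"
)

match = LABEL_PATTERN.match("test_4_UU_4.39_G_13.25")
if match:
    print(match.group("segment"))  # test  -> main segment label
    print(match.group("point"))    # 4     -> point of the abacus
    print(match.group("uu"))       # 4.39  -> % of the Negative segment
    print(match.group("gain"))     # 13.25 -> expected performance (Gain)
```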