Interpretation Rules
The interpretation rules make it possible to compute the individual importance of classifier variables using Shapley values.
ClassifierInterpreter
Builds a ClassifierInterpreter structure from a Classifier structure.
The resulting ClassifierInterpreter structure contains all the necessary information to derive interpretation indicators for each classifier variable and target value.
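As an illustration, a minimal sketch of how the interpreter could be declared in a modeling dictionary; the classifier variable SNBclass and the interpreter variable InterpreterClass are hypothetical names, and the rest of the trained-model dictionary is omitted:

	// Hypothetical declaration: builds the interpreter from the trained classifier SNBclass
	Unused Structure(ClassifierInterpreter) InterpreterClass = ClassifierInterpreter(SNBclass) ;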
ContributionAt
Numerical ContributionAt(Structure(ClassifierInterpreter),
Categorical targetValue, Categorical classifierVariableName)
Returns the Shapley value for a given target value and classifier variable.
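Continuing the hypothetical example above (the target value "more" and the variable name "age" are assumptions):

	// Hypothetical usage: Shapley value of variable "age" for target value "more"
	Numerical ContributionAge = ContributionAt(InterpreterClass, "more", "age") ;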
ContributionVariableAt
Categorical ContributionVariableAt(Structure(ClassifierInterpreter),
Categorical targetValue, Numerical rank)
Returns the name of the variable at the specified importance rank (starting at 1) for a target value, based on variables ordered by decreasing Shapley values.
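Continuing the hypothetical example (target value "more" assumed):

	// Hypothetical usage: name of the most important variable for target value "more"
	Categorical FirstVariable = ContributionVariableAt(InterpreterClass, "more", 1) ;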
ContributionPartAt
Categorical ContributionPartAt(Structure(ClassifierInterpreter),
Categorical targetValue, Numerical rank)
Returns the label of the part (interval or value group) of the variable at the specified importance rank (starting at 1) for a target value, based on variables ordered by decreasing Shapley values.
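Continuing the hypothetical example:

	// Hypothetical usage: part label of the most important variable for target value "more"
	Categorical FirstVariablePart = ContributionPartAt(InterpreterClass, "more", 1) ;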
ContributionValueAt
Numerical ContributionValueAt(Structure(ClassifierInterpreter),
Categorical targetValue, Numerical rank)
Returns the Shapley value at the specified importance rank (starting at 1) for a target value, based on variables ordered by decreasing Shapley values.
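Continuing the hypothetical example:

	// Hypothetical usage: highest Shapley value for target value "more"
	Numerical FirstContribution = ContributionValueAt(InterpreterClass, "more", 1) ;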