Fig. 2

SHAP plot of feature importance. This figure displays a SHAP (SHapley Additive exPlanations) summary plot, which illustrates the importance of the various factors in the machine learning model's predictions. Each dot represents the SHAP value of a particular feature for a single observation. The color of each dot encodes the feature's value (e.g., high or low). Features are ranked by importance, with the most important at the top. The plot thus shows how each feature contributes to the model's output.
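The per-observation values plotted in the figure are Shapley values from cooperative game theory: each feature's average marginal contribution to the prediction, taken over all orderings in which features can be added. As a minimal illustration of what one dot in the plot represents, the sketch below computes exact Shapley values for a hypothetical toy model (the model, instance, and baseline are illustrative assumptions, not the paper's model); real applications use the `shap` library, which approximates these values efficiently.

```python
from itertools import combinations
from math import factorial

def model(x):
    # Hypothetical toy model with an interaction term: f(x) = 2*x0 + x1*x2
    return 2 * x[0] + x[1] * x[2]

def exact_shap_values(f, x, baseline):
    """Exact Shapley values for one observation: feature i's weighted
    average marginal contribution over all coalitions, where features
    absent from a coalition are set to their baseline value."""
    n = len(x)

    def v(S):
        # Evaluate the model with features in S taken from x, others from baseline
        z = [x[j] if j in S else baseline[j] for j in range(n)]
        return f(z)

    phis = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        phi = 0.0
        for k in range(n):
            for S in combinations(others, k):
                # Shapley weight: |S|! (n - |S| - 1)! / n!
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi += w * (v(set(S) | {i}) - v(set(S)))
        phis.append(phi)
    return phis

x, baseline = [1, 2, 3], [0, 0, 0]
phis = exact_shap_values(model, x, baseline)
# Efficiency property: the SHAP values sum to f(x) - f(baseline)
```

In a summary plot like Fig. 2, each such per-observation value becomes one dot in its feature's row, and the importance ranking on the y-axis corresponds to the mean absolute SHAP value of each feature across all observations.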