
What Can Forest Workers Do Differently? Defining Countermeasures for Forest Accidents Through Counterfactuals of Causal Models

The use of artificial intelligence (AI) in the investigation of occupational accidents in forestry represents a significant departure from conventional statistical approaches: it enables a holistic analysis that considers multiple variables simultaneously rather than isolating individual factors. Machine learning (ML) algorithms such as decision trees (DT), random forests (RF), neural networks (NN) and tabular prior-data fitted networks (TabPFN) prove particularly efficient, in both time and resources, at predicting fatalities and the duration of sickness absence following forestry accidents. However, the explanatory power of these models is limited to extensive, complicated rule sets and enumerations of critical factors ranked by relative importance. This limitation compromises their usefulness as comprehensive guidance for the daily decision-making of forestry workers.
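As a minimal sketch of such a tabular prediction baseline, the snippet below trains a random forest on synthetic accident records to predict whether an accident is fatal; the feature names, data, and coefficients are illustrative assumptions, not the data set used in this work.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import balanced_accuracy_score

rng = np.random.default_rng(0)
n = 2000
# Hypothetical accident features; the real data set and column names differ.
X = pd.DataFrame({
    "worker_experience_years": rng.integers(0, 40, n),
    "chainsaw_used": rng.integers(0, 2, n),
    "slope_degrees": rng.uniform(0, 45, n),
    "alone_at_site": rng.integers(0, 2, n),
})
# Synthetic label: risk grows with slope and working alone, shrinks with experience.
logits = (0.05 * X["slope_degrees"] + 1.0 * X["alone_at_site"]
          - 0.04 * X["worker_experience_years"])
y = (logits + rng.normal(0, 1, n) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("balanced accuracy:", balanced_accuracy_score(y_te, clf.predict(X_te)))

# Feature importances yield exactly the kind of ranked list of critical factors
# that the abstract argues offers only limited day-to-day guidance.
print(dict(zip(X.columns, clf.feature_importances_.round(3))))
```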

In contrast, structural causal models (SCM) encode the dependencies between the key variables that contribute to an accident, incorporating human expertise (prior knowledge) and contextual understanding derived from occupational accident data sets. AI guides the computation of the functional relationships expressed by these causal dependencies, making it possible to answer fundamental questions about the relative strength of each contributing factor through evidential reasoning. The ability to answer counterfactual questions, reasoning about alternative scenarios that differ only slightly from the original scenario that led to a serious accident, provides forest workers with concrete suggestions on what they could do differently to avoid a fatal accident or reduce its severity. This solution is by definition more interpretable and falls within the realm of Actionable Explainable AI (AxAI) with the human (domain expert) in the loop (HITL), since the insights distilled from the counterfactual explanations define the countermeasures for accident prevention.
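A minimal sketch of the abduction-action-prediction steps behind such a counterfactual query on a toy linear SCM follows; the variables, coefficients, and linear structural equations are illustrative assumptions, not the fitted model described in this work.

```python
import numpy as np

# Toy SCM: slope -> hazard, alone -> hazard, hazard -> severity (linear + noise).
coef = {"hazard": {"slope": 0.04, "alone": 0.8}, "severity": {"hazard": 1.2}}

def simulate(slope, alone, u_hazard, u_severity):
    hazard = coef["hazard"]["slope"] * slope + coef["hazard"]["alone"] * alone + u_hazard
    severity = coef["severity"]["hazard"] * hazard + u_severity
    return hazard, severity

# Observed (factual) accident: steep slope, worker alone, high severity.
obs = {"slope": 35.0, "alone": 1.0, "hazard": 2.6, "severity": 3.5}

# 1) Abduction: invert the structural equations to recover the noise terms
#    consistent with the observed accident.
u_hazard = obs["hazard"] - (coef["hazard"]["slope"] * obs["slope"]
                            + coef["hazard"]["alone"] * obs["alone"])
u_severity = obs["severity"] - coef["severity"]["hazard"] * obs["hazard"]

# 2) Action: intervene on what the worker could have done differently,
#    e.g. not working alone at the site.
slope_cf, alone_cf = obs["slope"], 0.0

# 3) Prediction: propagate the same noise through the modified model.
hazard_cf, severity_cf = simulate(slope_cf, alone_cf, u_hazard, u_severity)
print(f"factual severity:        {obs['severity']:.2f}")
print(f"counterfactual severity: {severity_cf:.2f}  (had the worker not been alone)")
```

The gap between the factual and counterfactual severity is what gets distilled into a concrete countermeasure, here the hypothetical suggestion not to work alone at the site.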

Anna Saranti
University of Natural Resources and Life Sciences (BOKU), Vienna
Austria

Ferdinand Hönigsberger
University of Natural Resources and Life Sciences (BOKU), Vienna
Austria

Seyedehanahid Naghibzadehjalali
University of Natural Resources and Life Sciences (BOKU), Vienna
Austria

Karl Stampfer
University of Natural Resources and Life Sciences (BOKU), Vienna
Austria

Andreas Holzinger
University of Natural Resources and Life Sciences (BOKU), Vienna
Austria