When Code is a Component: Software as a Safety Component Under Machinery Regulation (EU) 2023/1230
For decades, safety engineers and managers have viewed "safety components" as tangible, physical items such as light curtains, interlock switches, safety relays, and emergency stop buttons. However, the evolution of the machinery sector has fundamentally changed how safety is engineered, with software and digital means playing an increasingly dominant role in machinery design.
To address this digital transformation, the new Machinery Regulation (EU) 2023/1230 introduces a groundbreaking regulatory shift: code can now be legally classified as a safety component.
Here is everything safety professionals need to know about the updated definition of a "safety component," how standalone software is now strictly regulated, and what its inclusion in Annex II means for your compliance workflows.
The Updated Definition: Beyond Physical Hardware
Under the old Machinery Directive 2006/42/EC, the regulatory framework struggled to classify purely digital safety solutions. The new Regulation modernizes this by explicitly expanding the definition of a "safety component" to cover digital components, including software, alongside physical devices.
According to the Regulation, a "safety component" is now defined as a physical or digital component, including software, that is designed or intended to fulfil a safety function. To meet this definition, the failure or malfunction of the component must endanger the safety of persons, and the component must not be necessary for the machinery to function (or normal components could be substituted for it).
Furthermore, a "safety function" is clearly defined as a function that serves to fulfil a protective measure designed to eliminate or reduce a risk; if this function fails, it could result in an increase of that risk.
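To make the definition concrete, a hypothetical safety function might look like the following minimal sketch. All names here (`OverspeedMonitor`, `command_safe_stop`, the speed limit) are illustrative assumptions, not drawn from the Regulation or any real product: a routine that monitors a measured value and commands a safe stop when a limit is exceeded. If this code fails, the protective measure it implements fails with it, increasing the risk it was designed to reduce.

```python
# Illustrative sketch of a software safety function: an overspeed
# monitor that commands a safe stop. All names and limits are
# hypothetical, for explanation only.

from dataclasses import dataclass, field
from typing import List

@dataclass
class OverspeedMonitor:
    """Monitors measured speed against a safe limit. This is a safety
    function in the Regulation's sense: its failure would remove a
    protective measure and increase risk."""
    speed_limit: float                      # maximum safe speed (units arbitrary)
    stop_log: List[str] = field(default_factory=list)

    def check(self, measured_speed: float) -> bool:
        """Return True if a safe stop was commanded."""
        if measured_speed > self.speed_limit:
            self.command_safe_stop(measured_speed)
            return True
        return False

    def command_safe_stop(self, speed: float) -> None:
        # In a real system this would drive a safe output (e.g. safe
        # torque off), not merely record a message.
        self.stop_log.append(
            f"SAFE STOP: speed {speed} exceeded limit {self.speed_limit}"
        )

monitor = OverspeedMonitor(speed_limit=100.0)
monitor.check(80.0)    # within limit: no action taken
monitor.check(120.0)   # over limit: safe stop commanded and logged
```

If software like this is sold on its own to integrators or end-users rather than shipped embedded in a machine, it falls squarely under the new definition, as the next section explains.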
The Trigger: "Independently Placed on the Market"
A critical distinction for safety managers and software developers to understand is the concept of being placed independently on the market.
If software is simply embedded into a machine by the original equipment manufacturer (OEM) as part of the overall machine control system, the machine as a whole undergoes conformity assessment. However, to account for the increasing use of software as a distinct safety solution, software that performs a safety function and is placed independently on the market is now legally considered a safety component in its own right.
This means that if a company develops and sells a software package strictly designed to monitor safety parameters, execute safe stops, or act as a programmable safety logic controller, and sells that software as a standalone product to end-users or integrators, that software must comply with the Regulation just like a physical piece of safety hardware. The developer of that software assumes the legal obligations of a manufacturer, which includes drawing up technical documentation, issuing an EU Declaration of Conformity, and ensuring it bears the CE marking.
Annex II: The Official Inclusion of Software
To remove any ambiguity, the Regulation provides an indicative list of safety components in Annex II. Reflecting the new digital reality, the European Union has explicitly added digital components to this list.
Safety professionals reviewing Annex II will now find the following explicitly listed:
- Item 18: Software ensuring safety functions.
- Item 19: Safety components with fully or partially self-evolving behaviour using machine learning approaches ensuring safety functions.
If you are procuring or integrating safety software from a third-party vendor, you must ensure that the software provider has treated their code as an Annex II safety component and provided the necessary compliance documentation.
Traditional Code vs. Machine Learning (Annex I, Part A)
While all independently placed safety software is now regulated, the Regulation sets different conformity assessment rules depending on the complexity and nature of the code.
For traditional, static software that is incapable of learning or evolving—meaning it is programmed only to execute specific, predetermined automated functions—the standard conformity assessment procedures apply.
However, the Regulation recognizes that systems with self-evolving behaviour (Artificial Intelligence and Machine Learning) possess characteristics like data dependency, opacity, and autonomy that can considerably increase the probability and severity of harm.
Because of this heightened risk, safety components (including software) with fully or partially self-evolving behaviour using machine learning approaches ensuring safety functions are classified as high-risk items and are listed in Annex I, Part A. For any software falling under this category, the manufacturer cannot self-certify; the conformity assessment must be carried out by an independent third-party Notified Body.
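The regulatory distinction can be illustrated with a deliberately simplified contrast (all class names, thresholds, and the toy "learning" rule are hypothetical assumptions, not anything specified by the Regulation): a static safety check whose behaviour is fixed at design time versus a self-evolving one whose decision threshold drifts as it ingests operating data. The latter kind of runtime adaptation is what pushes a component toward Annex I, Part A.

```python
# Hypothetical contrast between static and self-evolving safety logic.
# Everything here is illustrative; the Regulation prescribes no code.

class StaticGuardCheck:
    """Static logic: the trip threshold is fixed at design time and can
    only change through a new software release. Standard conformity
    assessment procedures apply."""
    THRESHOLD = 100.0

    def trips(self, value: float) -> bool:
        return value > self.THRESHOLD

class SelfEvolvingGuardCheck:
    """Self-evolving logic: the trip threshold adapts to observed data
    at runtime. Behaviour of this kind is what Annex I, Part A targets,
    requiring third-party (Notified Body) assessment."""
    def __init__(self, initial_threshold: float = 100.0):
        self.threshold = initial_threshold

    def observe(self, value: float) -> None:
        # Toy adaptation rule: nudge the threshold toward recent readings.
        self.threshold = 0.9 * self.threshold + 0.1 * value

    def trips(self, value: float) -> bool:
        return value > self.threshold

static = StaticGuardCheck()
evolving = SelfEvolvingGuardCheck()
for reading in [90.0, 95.0, 92.0]:
    evolving.observe(reading)

# The static check behaves identically before and after operation; the
# self-evolving check's threshold has drifted with the observed data,
# so the two can now disagree on the same input.
```

The point of the contrast is auditability: the static check can be fully verified before release, while the self-evolving check's behaviour depends on data it has not yet seen, which is why the Regulation treats it as high-risk.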
What This Means for Safety Engineers
The classification of software as a safety component bridges the gap between mechanical engineering, electrical engineering, and IT. For safety engineers and managers, this means:
- Vetting Digital Suppliers: When purchasing standalone safety software, you must treat the software vendor as a machinery safety manufacturer. You must demand an EU Declaration of Conformity and ensure the software is CE marked.
- Updating Risk Assessments: Your iterative risk assessments must now account for the failure of digital safety components just as rigorously as the mechanical failure of a physical guard.
- Strict AI Compliance: If you are exploring AI-driven safety software that learns and adapts to its environment, you must ensure it has passed stringent third-party conformity assessments under Annex I, Part A.
By explicitly regulating software, Regulation (EU) 2023/1230 ensures that the digital brains protecting workers on the factory floor are held to the exact same rigorous safety standards as the physical barriers of the past.
