Artificial intelligence - AI system logging (ISO/IEC DIS 24970:2025)

This document describes common capabilities, requirements and a supporting information model for logging of events in AI systems.
This document is designed to be used with a risk management system.

Künstliche Intelligenz - KI‑System-Protokollierung (ISO/IEC DIS 24970:2025)

Intelligence artificielle - Journalisation des systèmes d'IA (ISO/IEC DIS 24970:2025)

Umetna inteligenca - Logiranje delovanja sistema UI (ISO/IEC DIS 24970:2025)

General Information

Status: Not Published
Public Enquiry End Date: 09-Feb-2026
Technical Committee:
Current Stage: 4020 - Public Enquiry (PE) (Adopted Project)
Start Date: 01-Dec-2025
Due Date: 20-Apr-2026
Draft: oSIST prEN ISO/IEC 24970:2026
English language, 26 pages

Standards Content (Sample)


SLOVENIAN STANDARD
01-January-2026
Umetna inteligenca - Logiranje delovanja sistema UI (ISO/IEC DIS 24970:2025)
Artificial intelligence - AI system logging (ISO/IEC DIS 24970:2025)
Künstliche Intelligenz - KI‑System-Protokollierung (ISO/IEC DIS 24970:2025)
Intelligence artificielle - Journalisation des systèmes d'IA (ISO/IEC DIS 24970:2025)
This Slovenian standard is identical to: prEN ISO/IEC 24970
ICS: 35.240.01 - Application of information technology in general
2003-01. Slovenian Institute for Standardization. Reproduction in whole or in part of this standard is not permitted.

DRAFT International Standard
ISO/IEC DIS 24970
ISO/IEC JTC 1/SC 42
Secretariat: ANSI
Artificial intelligence — AI system logging
Intelligence artificielle — Journalisation des systèmes d'IA
ICS: 35.240.01
Voting begins on: 2025-11-18
Voting terminates on: 2026-02-10
ISO/CEN PARALLEL PROCESSING

THIS DOCUMENT IS A DRAFT CIRCULATED FOR COMMENTS AND APPROVAL. IT IS THEREFORE SUBJECT TO CHANGE AND MAY NOT BE REFERRED TO AS AN INTERNATIONAL STANDARD UNTIL PUBLISHED AS SUCH.
This document is circulated as received from the committee secretariat.
IN ADDITION TO THEIR EVALUATION AS BEING ACCEPTABLE FOR INDUSTRIAL, TECHNOLOGICAL, COMMERCIAL AND USER PURPOSES, DRAFT INTERNATIONAL STANDARDS MAY ON OCCASION HAVE TO BE CONSIDERED IN THE LIGHT OF THEIR POTENTIAL TO BECOME STANDARDS TO WHICH REFERENCE MAY BE MADE IN NATIONAL REGULATIONS.
RECIPIENTS OF THIS DRAFT ARE INVITED TO SUBMIT, WITH THEIR COMMENTS, NOTIFICATION OF ANY RELEVANT PATENT RIGHTS OF WHICH THEY ARE AWARE AND TO PROVIDE SUPPORTING DOCUMENTATION.

Reference number: ISO/IEC DIS 24970:2025(en)
© ISO/IEC 2025
© ISO/IEC 2025
All rights reserved. Unless otherwise specified, or required in the context of its implementation, no part of this publication may be reproduced or utilized otherwise in any form or by any means, electronic or mechanical, including photocopying, or posting on the internet or an intranet, without prior written permission. Permission can be requested from either ISO at the address below or ISO's member body in the country of the requester.
ISO copyright office
CP 401 • Ch. de Blandonnet 8
CH-1214 Vernier, Geneva
Phone: +41 22 749 01 11
Email: copyright@iso.org
Website: www.iso.org
Published in Switzerland
Contents

Introduction
1 Scope
2 Normative references
3 Terms and definitions
4 Abbreviated terms
5 Logging and use of logs
5.1 AI system logs
5.2 Logging components
5.3 Logging in context
5.4 Log entries
5.5 AI system logging
5.6 General requirements
5.7 General
5.7.1 Security and privacy
5.7.2 Recording of events
5.8 Technical documentation
6 Design of the logging system
6.1 General
6.2 Traceability
6.3 Additional functions
6.4 Anomaly monitoring of the logging component
7 Triggers for logging
7.1 General
7.2 Triggers from operation
7.2.1 Software errors
7.2.2 Outlier input
7.2.3 Potential attack
7.2.4 User request
7.2.5 User request outcome
7.2.6 User information transmission
7.3 Triggers from automated monitoring
7.3.1 Adversarial attack
7.3.2 Unwanted bias detection
7.3.3 Out-of-domain inputs
7.3.4 Model drift
7.3.5 Logging for machine learning model development for auditability purposes
7.4 Triggers from human oversight
8 Information to log
8.1 Required information
8.2 Recommended information
9 Storing and access to logs
9.1 General
9.2 Requirements for third party access
9.3 Access for AI users
9.4 Access for AI providers
Annex A (informative) Information model
Bibliography
Foreword
ISO (the International Organization for Standardization) is a worldwide federation of national standards bodies (ISO member bodies). The work of preparing International Standards is normally carried out through ISO technical committees. Each member body interested in a subject for which a technical committee has been established has the right to be represented on that committee. International organizations, governmental and non-governmental, in liaison with ISO, also take part in the work. ISO collaborates closely with the International Electrotechnical Commission (IEC) on all matters of electrotechnical standardization.
The procedures used to develop this document and those intended for its further maintenance are described in the ISO/IEC Directives, Part 1. In particular, the different approval criteria needed for the different types of ISO documents should be noted. This document was drafted in accordance with the editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).
ISO draws attention to the possibility that the implementation of this document may involve the use of (a) patent(s). ISO takes no position concerning the evidence, validity or applicability of any claimed patent rights in respect thereof. As of the date of publication of this document, ISO had not received notice of a patent which may be required to implement this document. However, implementers are cautioned that this may not represent the latest information, which may be obtained from the patent database available at www.iso.org/patents. ISO shall not be held responsible for identifying any or all such patent rights.
Any trade name used in this document is information given for the convenience of users and does not constitute an endorsement.
For an explanation of the voluntary nature of standards, the meaning of ISO specific terms and expressions related to conformity assessment, as well as information about ISO's adherence to the World Trade Organization (WTO) principles in the Technical Barriers to Trade (TBT), see www.iso.org/iso/foreword.html.
This document was prepared by Technical Committee ISO/IEC JTC 1, Information technology, Subcommittee SC 42, Artificial intelligence.
Any feedback or questions on this document should be directed to the user's national standards body. A complete listing of these bodies can be found at www.iso.org/members.html.
Introduction
AI systems can log a wide range of events during operation. While some aspects of logging can be defined in advance — based on the system's known purpose and operating context — real-world needs often emerge only after deployment. In practice, developers and stakeholders cannot fully know what is important to log until the system is running. Additionally, the relevance of logged events is not fixed. Relevance can shift over time due to changing user needs, operational conditions or even interactions with other AI systems. However, it is often unclear whether AI systems can adapt their logging practices themselves, or whether human intervention is required to reconfigure logging as contexts evolve. This raises important considerations for auditability, transparency and long-term system oversight.
AI system logs have the potential to support both operational tasks, such as system monitoring or troubleshooting, and higher-level management functions, provided the logs are accessible, well-structured and aligned with the specific needs of both technical and non-technical stakeholders. When logs are properly collected, maintained and analyzed, they can inform activities such as strategic planning, real-time monitoring, decision-making and process improvement. However, the effectiveness of this depends on the quality of the logged data, the organization's analytical capabilities, and a clear understanding of how logged events relate to broader goals or risks. Leveraging logs to align operational and management functions can contribute to cost reduction, risk mitigation or compliance, if logs are reliably mapped to relevant regulatory or organizational requirements, and if teams have the appropriate tools and processes in place to interpret and act on the data. The benefits are contingent on consistent log quality, appropriate access controls and the capacity to contextualize and validate logged information.
When logs are systematically collected, structured and analyzed, they can provide valuable insights that help organizations and AI system operators better understand and respond to unexpected events or changing conditions in dynamic environments. The timeliness required for effective response depends on the specific use case. Some situations demand real-time or near-real-time access to logs, while others may allow for delayed analysis. In all cases, the value of logs also hinges on their interpretability and the existence of established processes or tools to act on the information appropriately. Logs can also support the iterative development and refinement of AI systems by highlighting issues, performance bottlenecks or emerging patterns not evident during initial training or deployment. To contribute meaningfully to continuous improvement, whether through retraining, optimization or other adjustments, logs can be contextually rich, validated, securely stored and reviewed under responsible governance processes that account for privacy, fairness and accountability. Where continuous learning is feasible, safeguards are necessary to prevent harmful drift, unintended behaviours or the incorporation of biased or erroneous data.
Logging functions present their own complexity and costs and can reduce the performance and increase the footprint of the AI system, so there are considerations on the effectiveness and efficiency of logging to answer oversight needs.
This document defines AI system logging. Clause 5 outlines logging and the use of logs, and provides general requirements for logging. Clause 6 provides requirements and guidance for design, including traceability. Clause 7 details the triggers for logging in three categories: operation, automated monitoring and human oversight. Clause 8 contains requirements and guidance for information to include in the logs and Clause 9 contains requirements for storing and access to logs. Annex A gives an example of an information model and data structure that can be used for AI system logging.
Artificial intelligence — AI system logging

1 Scope
This document describes common capabilities and provides requirements for logging of events in AI systems.

2 Normative references
There are no normative references in this document.

3 Terms and definitions
For the purposes of this document, the following terms and definitions apply.
ISO and IEC maintain terminology databases for use in standardization at the following addresses:
— ISO Online browsing platform: available at https://www.iso.org/obp
— IEC Electropedia: available at https://www.electropedia.org/

3.1
log
management object, organized by log entries (3.2), used to store information on AI system events

3.2
log entry
basic unit in a log (3.1)

3.3
logging
act or process of making or recording a log (3.1)

3.4
model
physical, mathematical or otherwise logical representation of a system, entity, phenomenon, process or data
[SOURCE: ISO/IEC 22989:2022, 3.1.3]

3.5
audit
systematic, independent and documented process for obtaining objective evidence and evaluating it objectively to determine the extent to which the audit criteria are fulfilled
Note 1 to entry: Internal audits, sometimes called first party audits, are conducted by, or on behalf of, the organization itself.
Note 2 to entry: External audits include those generally called second and third party audits. Second party audits are conducted by parties having an interest in the organization, such as customers, or by other individuals on their behalf. Third party audits are conducted by independent auditing organizations, such as those providing certification/registration of conformity or governmental agencies.
[SOURCE: ISO 19011:2018, 3.1]

3.6
auditability
capability of collecting and making available necessary evidential information related to the operation and use of an AI system, for the purpose of conducting an audit (3.5)
[SOURCE: ISO/IEC 22123-1:2023, 3.13.11, modified to replace cloud service with AI system]
3.7
error
discrepancy between a computed, observed or measured value or condition and the true, specified or theoretically correct value or condition
Note 1 to entry: An error within a system can be caused by failure of one or more of its components, or by the activation of a systematic fault.
[SOURCE: IEC 60050-192:2015, 192-03-02]

3.8
monitoring
ongoing observation and assessment of an AI system's behaviour, outputs and context by automated tools or human operators, to detect relevant events from expected operation
Note 1 to entry: Situations that warrant monitoring can include failure, malfunction, cyberattack, out of the intended domain of use, out of the operational design domain and abnormal usage.

3.9
logging component
component of an AI system, or linked with an AI system, that enables logging (3.3) capabilities
Note 1 to entry: A logging component can contain one or more subcomponents for generating log entries (3.2) based on different purposes.
Note 2 to entry: A logging component can forward information to other systems.

3.10
log user
organization (3.13) or entity that accesses, reviews or analyzes logs (3.1) produced for an AI system

3.11
de-identification process
process of removing the association between a set of identifying attributes and the data principal (3.12)
Note 1 to entry: De-identification includes the process of altering data, via modifying or removing data, so that individuals or entities cannot be identified, directly or indirectly.
[SOURCE: ISO/IEC 20889:2018, 3.6, modified to add the note]

3.12
data principal
entity to which data relates
Note 1 to entry: The term "data principal" is broader than "PII principal" (or "data subject" as used elsewhere), and is able to denote any entity such as a person, an organization, a device, or a software application.
[SOURCE: ISO/IEC 20889:2018, 3.4]

3.13
organization
person or group of people that has its own functions with responsibilities, authorities and relationships to achieve its objectives
Note 1 to entry: The concept of organization includes, but is not limited to, sole-trader, company, corporation, firm, enterprise, authority, partnership, charity or institution or part or combination thereof, whether incorporated or not, public or private.
Note 2 to entry: If the organization is part of a larger entity, the term "organization" refers only to the part of the larger entity that is within the scope of the AI management system.
[SOURCE: ISO/IEC 42001:2023, 3.1]
3.14
AI user
organization (3.13) or entity that uses AI products or services

3.15
stakeholder
interested party
any individual, group, or organization (3.13) that can affect, be affected by, or perceive itself to be affected by a decision or activity
[SOURCE: ISO/IEC 22989:2022, 3.5.13, modified to add admitted term]

3.16
AI developer
organization (3.13) or entity that is involved in the development of AI services and products

3.17
memory capacity
maximum number of items that can be held in a given logging component (3.9) memory; usually measured in bytes
[SOURCE: ISO/IEC/IEEE 24765:2017, 3.2411, modified to replace computer with logging component and to remove the note]

3.18
storage capacity
maximum number of items that can be held in a given storage device; usually measured in bytes
[SOURCE: ISO/IEC/IEEE 24765:2017, 3.3994, modified to remove the note and the reference to words]

3.19
software error
erroneous result produced by the use of a software product
[SOURCE: ISO 14224:2016, 3.87, modified to remove example and notes]

3.20
control (verb)
in engineering, the monitoring (3.8) of system output to compare with expected output and taking corrective action when the actual output does not match the expected output
[SOURCE: ISO/IEC/IEEE 24765:2017, 3.846.1]

3.21
control (noun)
action of controlling (3.20)

3.22
controller
authorized human or another external agent that performs a control (3.21)
Note 1 to entry: A controller interacts with the control points (3.23) of an AI system.
[SOURCE: ISO/IEC TS 8200:2024, 3.6]

3.23
control point
part of the interface of a system where control (3.21) can be applied
Note 1 to entry: A control point can be a function, physical facility (such as a switch) or a signal receiving subsystem.
[SOURCE: ISO/IEC TS 8200:2024, 3.16, modified to make control singular]

3.25
disengagement of control
control disengagement
process where a controller (3.22) releases a set of control points (3.23)
[SOURCE: ISO/IEC TS 8200:2024, 3.7]

3.26
engagement of control
control engagement
process where a controller (3.22) takes over a set of control points (3.23)
Note 1 to entry: Besides taking over a set of control points, an engagement of control can also include a confirmation about the transfer of control to a controller.
[SOURCE: ISO/IEC TS 8200:2024, 3.8]

3.27
transfer of control
control transfer
process of the change of the controller (3.22) that performs a control (3.21) over a system
Note 1 to entry: Transfer of control does not entail application of a control, but it is a handover of control points of the system interface between agents.
Note 2 to entry: Engagement of control and disengagement of control are two fundamental complementary parts of control transfer.
[SOURCE: ISO/IEC TS 8200:2024, 3.19]

3.28
governance scheme
set of rules, officially adopted and followed, that defines how a system is managed and controlled
Note 1 to entry: A governance scheme can be defined in a regulation, standard, guideline, convention or social norm.

3.29
AI provider
organization (3.13) or entity that provides products or services that use one or more AI systems

4 Abbreviated terms
AI artificial intelligence
ML machine learning
5 Logging and use of logs

5.1 AI system logs
An AI system log represents information related to the operation, behaviour, inputs, outputs or context of an AI system, recorded to support current or future retrieval, analysis, oversight or decision review.
AI system logs can consist of structured, semi-structured, or unstructured data, and can originate from the AI system under consideration, its internal components, interacting AI systems, users or external observers, whether human or automated, operating outside the system boundary. AI system logs can be generated continuously, periodically or in response to specific conditions, and can serve a range of purposes including system monitoring, debugging, auditing, compliance, human oversight, iterative system improvement or broader accountability.
AI system logs can include:
— time-stamped events (i.e. recorded occurrences linked to a specific moment in time, such as when a model generates a prediction, an error occurs or a user interaction takes place);
— status snapshots (i.e. point-in-time captures of system conditions, such as memory usage, model state or active components);
— sensor or input data (i.e. information received by the AI system from external sources, such as camera images, user inputs, location data or telemetry from connected devices);
— outputs (i.e. results produced by the AI system, such as classifications, recommendations, predictions or generated content);
— decisions (i.e. discrete choices or actions taken by the AI system, either autonomously or through human-in-the-loop mechanisms, such as approving a transaction or triggering an alert);
— error messages (i.e. alerts or diagnostic records indicating failures, exceptions or issues in the system's operation);
— environmental context (i.e. external conditions that can influence system behaviour, such as network status, sensor readings, user load or surrounding events);
— annotations (i.e. supplementary notes or metadata added manually or automatically, which can describe system behaviour, flag anomalies or provide interpretive context).
AI system logs can be:
— stored persistently (i.e. logs are saved to durable storage (such as databases or filesystems) for long-term retention, inspection or regulatory compliance);
— processed in real time (i.e. logs being analyzed immediately or near-instantaneously as they are generated, typically to support live monitoring, alerting or adaptive system behaviour);
— managed under data minimization or privacy constraints (i.e. log content or retention can be limited to avoid collecting unnecessary personal data, ensure user consent, comply with legal frameworks or reduce risk of harm);
— machine-readable (i.e. logs are formatted for automated processing by software systems, often using standardized structures like JSON, XML, or protocol buffers);
— human-interpretable (i.e. logs are presented or organized in a way that allows humans, such as developers, auditors or analysts, to understand their content without requiring complex tooling or transformation).
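As an editorial illustration of the distinction between machine-readable and human-interpretable logs, the following minimal sketch (Python; the field names and component name are hypothetical assumptions, not defined by this document) serializes one time-stamped event as JSON for automated processing and as a single line for human review.

import json
from datetime import datetime, timezone

# Hypothetical time-stamped event produced by an AI system component.
event = {
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "source": "recommender-model-v3",          # component that generated the entry
    "event_type": "inference",                  # e.g. inference, error, user_override
    "output": {"label": "approve", "score": 0.87},
}

# Machine-readable form: structured JSON suitable for automated processing.
machine_readable = json.dumps(event)

# Human-interpretable form: a single line a developer or auditor can scan.
human_readable = (
    f"{event['timestamp']} [{event['source']}] "
    f"{event['event_type']}: {event['output']}"
)

print(machine_readable)
print(human_readable)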
5.2 Logging components
A logging component is a functional part of an AI system, or an external system interacting with it, that supports the generation, capture, formatting, storage or management of log data.
A logging component can consist of one or more subcomponents responsible for tasks such as detecting events, recording log entries, applying data policies (such as filtering or redaction) or ensuring secure and reliable handling of log information.
Logging components can be:
— internal to the AI system, i.e. integrated into model-serving infrastructure or runtime environments;
— external systems or services, i.e. coming from systems such as observability platforms, audit modules or compliance loggers.
Logging components can operate independently or in coordination with other system parts, and can vary in complexity from a simple event logger to a distributed, multi-service logging pipeline.
A logging component does not assume a fixed structure, automation level, or deployment location. It can be implemented in software, hardware, or hybrid configurations, and designed to meet different operational, analytical, or regulatory objectives.
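A minimal sketch of a logging component with subcomponents for event detection, data policy (redaction) and recording is shown below (Python; the class, method and field names are illustrative assumptions, not requirements of this document).

import json
from datetime import datetime, timezone

class LoggingComponent:
    """Illustrative logging component: detects relevant events, applies a
    data policy (redaction) and records log entries to a sink."""

    def __init__(self, sink, redact_fields=("user_id",)):
        self.sink = sink                    # e.g. a file, database or external service
        self.redact_fields = redact_fields

    def is_relevant(self, event):
        # Event detection subcomponent: keep only events selected by risk or purpose.
        return event.get("event_type") in {"error", "inference", "user_override"}

    def apply_policy(self, event):
        # Data policy subcomponent: filtering/redaction before recording.
        return {k: ("<redacted>" if k in self.redact_fields else v)
                for k, v in event.items()}

    def record(self, event):
        if not self.is_relevant(event):
            return
        entry = self.apply_policy(event)
        entry["logged_at"] = datetime.now(timezone.utc).isoformat()
        self.sink.append(json.dumps(entry))  # recording subcomponent

# Usage with an in-memory sink standing in for durable storage.
sink = []
logger = LoggingComponent(sink)
logger.record({"event_type": "inference", "user_id": "u-42", "output": "approve"})
print(sink[0])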
5.3 Logging in context
Management of an AI system in operation can be naturally integrated with the operation itself, as management and operation share the fundamental goal of navigating uncertainty to achieve some purposes. This involves capitalizing on opportunities and mitigating risks through planning, monitoring, decision-making and learning, etc., which typically utilize AI technologies to adapt to and survive potentially dynamic and complex environments.
Figure 1 — General architecture of an AI system focusing on utilization of data including logs
Figure 1 shows an AI system's functional architecture from the perspective of data transfers and logging. The human in this context is anyone directly interacting with the AI system, for example an organization (AI product or service provider) providing services to other entities by the AI system, an administrator of the AI system or an end user of it.
Components of the AI system, including monitoring systems, can utilize logs to better fulfil the intended purpose of the AI system, including risk mitigation. A log user can collect and analyse possibly de-identified logs from multiple AI systems to create and improve AI systems.
NOTE The logging components and the storage used for logging are not necessarily directly part of the AI system.
See also ISO/IEC 22989:2022, Figure 5 for further information on the functional architecture of an AI system.
The logging component logs behaviours of the AI system, as discussed in Clauses 6 and 7.
Monitoring systems can utilize the logs to help the AI user assess potential benefits and harms of AI systems' activities.
Logs can be used to assess the continuous fulfilment by the AI system of various requirements (e.g. accuracy, robustness, security, privacy, safety and data quality). This assessment can inform the selection of actions.
Organizations can collect and analyse logs from similar AI systems or similar components of AI systems on the market to support the creation, maintenance, and continuous improvement of the data, AI systems and components they provide, as illustrated in Figure 1.
5.4 Log entries
AI system log entries are discrete, identifiable units of information within a log that capture a specific event, condition, state, input, output, decision or contextual detail related to the functioning or environment of an AI system.
Log entries are typically composed of a combination of metadata, such as timestamps, source identifiers and severity levels, and content-specific data relevant to the purpose of the log. Protection of confidential information should be taken into account.
AI system log entries can vary in structure and content depending on the type of information being recorded and the intended use of the log. Some entries can be highly structured (e.g. a JSON object), others semi-structured (e.g. textual annotations) or free-form (e.g. screenshots).
Components of a log entry can include:
— timestamp, the date and time at which the logged event or condition occurred;
— source identifier, a label or address indicating which component, system, user or external observer generated the entry;
— event or message code, a categorization or classification of the type of event.
EXAMPLE 1 The event is an error event, inference event or user override event.
— payload or data content, the core data being recorded, such as input features, output values, error traces or contextual metadata;
— severity or priority indicator, a label indicating the importance or criticality of the event, useful for filtering or alerting.
AI system log entries can be:
— generated automatically by system components or instrumentation;
— manually created by users, operators or auditors;
EXAMPLE 2 Annotations or overrides can be manually created and then become log entries.
— derived from external systems, such as monitoring tools or interacting AI components.
To be useful for downstream analysis, log entries should be recorded in a way that ensures traceability, interpretability and data integrity over time.
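To make the listed components concrete, a hypothetical structured log entry could combine metadata and content-specific data as in the following sketch (Python; all field names and values are assumptions for illustration only, not a data structure defined by this document).

import json
import uuid
from datetime import datetime, timezone

# Hypothetical structured log entry combining the components listed above.
log_entry = {
    "entry_id": str(uuid.uuid4()),                         # unique id supporting traceability
    "timestamp": datetime.now(timezone.utc).isoformat(),   # when the event occurred
    "source": "fraud-model/scoring-service",               # source identifier
    "event_code": "INFERENCE",                             # event or message code
    "severity": "INFO",                                    # severity or priority indicator
    "payload": {                                           # content-specific data
        "input_features": {"amount": 120.5, "country": "CH"},
        "output": {"decision": "approve", "score": 0.91},
    },
}

print(json.dumps(log_entry, indent=2))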
5.5 AI system logging
AI system logging is the process of generating, capturing, recording and managing information related to the operation, behaviour, decisions or context of an AI system, for the purpose of creating one or more logs.
Logging can be performed automatically by system components, manually by users or operators, or through hybrid methods, and can occur during design, testing, deployment or post-deployment operation.
Logging can involve the collection of data from a variety of sources, including:
— system components (such as model execution, middleware or infrastructure);
— user interactions (such as input submissions or user overrides);
— external observers (such as monitoring tools or regulatory systems);
— interacting AI systems (such as decision handoffs or multi-model coordination).
Logging activities can be continuous, event-driven, scheduled or conditional.
EXAMPLE 1 Collection of telemetry data is a continuous logging activity.
EXAMPLE 2 Recording of error occurrences is an event-driven logging activity.
EXAMPLE 3 Periodic health checks are scheduled logging activities.
EXAMPLE 4 Events triggered by threshold violations or policy rules are conditional logging activities.
Logging can include one or more of the following sub-processes:
— instrumentation, the implementation of tools, code or mechanisms to monitor, extract and capture data from software or hardware components during execution;
— serialization, converting data structures or objects into a standardized format (such as JSON, XML or binary) for logging, storage or transmission;
— storage and retention, saving logs to appropriate storage systems with defined retention policies;
— de-identification processes, according to applicable legal or ethical standards;
— validation and integrity checking, ensuring that log data is accurate, complete and has not been tampered with.
Logging should be guided by clearly defined objectives, such as performance monitoring, safety validation, auditability, transparency, compliance, user redress or support for system improvement.
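The serialization and integrity-checking sub-processes can be illustrated with the following sketch (Python), which serializes each entry to JSON and chains SHA-256 digests so that later tampering with stored entries can be detected; the hash-chain approach is an illustrative assumption, not a requirement of this document.

import hashlib
import json

def append_with_integrity(log, entry):
    """Serialize an entry and chain it to the previous record's digest so
    that modification or removal of earlier entries can be detected."""
    previous_digest = log[-1]["digest"] if log else ""
    serialized = json.dumps(entry, sort_keys=True)            # serialization
    digest = hashlib.sha256((previous_digest + serialized).encode()).hexdigest()
    log.append({"entry": serialized, "digest": digest})       # storage and retention

def verify(log):
    """Validation and integrity checking: recompute the digest chain."""
    previous_digest = ""
    for record in log:
        expected = hashlib.sha256(
            (previous_digest + record["entry"]).encode()).hexdigest()
        if expected != record["digest"]:
            return False
        previous_digest = record["digest"]
    return True

log = []
append_with_integrity(log, {"event": "inference", "score": 0.91})
append_with_integrity(log, {"event": "error", "code": "E42"})
print(verify(log))   # True; becomes False if any stored entry is altered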
5.6 General requirements

5.7 General
AI system logging shall provide the capabilities described in this subclause.
Logging functions shall enable traceability between multiple events and log entries if necessary to manage risk, relevant to the intended purpose and technically feasible given the inputs and outputs, see 6.2.

5.7.1 Security and privacy
The organization shall:
a) identify the security and privacy requirements for integrity and confidentiality protection;
b) protect information taking into account the different purposes of logging for different stakeholders.
EXAMPLE An AI developer who implements the system, an AI tester who tests it and an AI service provider who monitors the service during operation have different purposes.
Security and privacy requirements include those based on applicable privacy and security regulatory requirements.
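One way to reflect the different purposes of different stakeholders is to expose stakeholder-specific views of the same log entry, as in the sketch below (Python; the roles, fields and allowed-field sets are hypothetical assumptions used only to illustrate confidentiality protection, not a mechanism prescribed by this document).

# Hypothetical per-purpose views of a log entry: each stakeholder only
# receives the fields needed for their purpose.
VIEWS = {
    "developer": {"timestamp", "event_code", "payload", "stack_trace"},
    "tester": {"timestamp", "event_code", "payload"},
    "service_provider": {"timestamp", "event_code", "severity"},
}

def view_for(entry, role):
    allowed = VIEWS[role]
    return {k: v for k, v in entry.items() if k in allowed}

entry = {
    "timestamp": "2025-11-18T10:15:00Z",
    "event_code": "ERROR",
    "severity": "HIGH",
    "payload": {"input_id": "req-7"},
    "stack_trace": "<trace omitted>",
}
print(view_for(entry, "service_provider"))   # severity visible, payload and trace withheld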
5.7.2 Recording of events
Logging functions shall enable the recording of events relevant for:
1) identifying situations that can result in the AI system presenting a risk according to the risk management process;
2) facilitating the monitoring of AI systems as a product or service, proportional to their risks, to enable the collection, documentation and analysis of performance data from initial development to the end of the retirement stage.
5.8 Technical documentation
The technical documentation for the AI system shall, as applicable:
a) explain and justify the specific criteria for determining relevant events;
b) explain and justify the specific criteria for logging relevant events;
c) specify any interaction with human controllers;
d) specify any interaction with automated monitoring;
e) recommend a frequency and scope of monitoring for relevant events;
f) recommend a frequency and scope of logging relevant events;
g) explain and justify the accuracy and precision of timestamps, where used;
h) explain and justify resource constraints (e.g. memory capacity, storage capacity, processing power);
i) explain and justify constraints related to privacy;
j) refer to related legal requirements related to data protection, system accountability, traceability and transparency;
k) include appropriate information security considerations and data retention policies;
l) include a specification of failure handling (e.g. the AI system's reaction in case of log memory overload);
m) include interfaces with other systems;
n) contain a specification of the log data structures used.
6 Design of the logging system

6.1 General
Risk is the primary driver for the monitoring and controlling of AI systems that logging enables. Therefore, risk shall be considered when
a) determining which events are to be detected;
b) determining which events are relevant;
c) determining which relevant events are to be logged.
Examples of risk management standards that can be applied are ISO/IEC 23894 [7] and prEN 18228:— [8].
Events shall be logged in relation to inputs or outputs and when caused or observed by the controllers or components of the AI system. Relevant events to be logged shall be selected based on risk, including determining the most effective and efficient way to manage the risk.
Inputs or outputs relevant to event detection shall be logged at a frequency that is technically feasible and allows risk to be managed in the context of the intended purpose.
EXAMPLE Events from streaming inputs or outputs can be logged at different frequencies based on the time-resolution of the input, or can be logged at a frequency that is appropriate for monitoring a situation, for example, at a higher frequency during a cyberattack.
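The EXAMPLE above can be illustrated by a small sketch (Python) in which the logging frequency for a streaming input is raised while a suspected cyberattack is being monitored; the sampling rates and the attack indicator are hypothetical assumptions, not values prescribed by this document.

# Hypothetical risk-driven logging frequency for a streaming input:
# log every N-th input normally, and every input while an attack is suspected.
NORMAL_SAMPLING = 100      # log 1 out of every 100 inputs
ELEVATED_SAMPLING = 1      # log every input during a suspected cyberattack

def should_log(input_index, attack_suspected):
    sampling = ELEVATED_SAMPLING if attack_suspected else NORMAL_SAMPLING
    return input_index % sampling == 0

# During normal operation only a fraction of inputs is logged; once the
# monitoring system raises an attack indicator, every input is logged.
print([i for i in range(10) if should_log(i, attack_suspected=False)])
print([i for i in range(10) if should_log(i, attack_suspected=True)])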
Logging functions shall be designed and configured to generate logs accurately representing such events.
Sources of information to be logged can include:
— communication between end users and the AI system;
— communication between the AI system and its components;
— acquisition and utilization of stored or external data.

6.2 Traceability
Log entries about events should be timestamped. The timestamp shall record the time of the event to an accuracy and precision as appropriate for the type of the event and its role with respect to the intended purpose of the AI system. Where technically feasible, the order of log entries should correspond to the order of the events logged.
Timestamps should be formatted according to ISO 8601-1 [6]. If the time zone is not included within the timestamp, a mechanism to determine the time zone w
...
