Hogan Lovells – Asia-Pacific Data, Privacy and Cybersecurity Guide 2025
At the same time, the AI Guidelines also allow
organisations a margin of discretion. They state
that where an organisation evaluates that it is
necessary to omit any of the above information
due to commercial sensitivity or intellectual
property protection, the organisation may limit
critical details, or provide a general explanation
of the information cited above. However, the
organisation must justify and document
internally the reasons for its decision. Thus,
in the deployment of AI systems, as in their
development, the AI Guidelines demonstrate
the PDPC’s innovation-friendly approach in the
regulation of personal data for AI use.
Accountability obligation in
AI deployment
The AI Guidelines also emphasise the
importance of the PDPA’s accountability
obligation – an organisation’s duty to take and
demonstrate responsibility for the personal
data in its possession or control. In the
deployment of AI systems involving personal
data, this obligation entails the development of
written policies to ensure the appropriate use of
such data: a use consistent with purposes that
individuals have consented to, or with another
legitimate purpose.
The AI Guidelines also recommend that
an organisation’s policies should contain
measures to ensure the proper use of personal
data. Examples include measures to ensure
that AI systems provide fair and reasonable
recommendations, such as recommendations
free from bias; technical safeguards to protect
personal data, such as pseudonymisation
or data minimisation; and – for
higher-impact cases – information on how
adequate accountability mechanisms and
human oversight have been implemented.
Procurement of AI systems
The AI Guidelines state that where service
providers process personal data on their
customers’ behalf in order to help develop or
deploy AI systems, such service providers may
occupy the role of data intermediaries. In that
capacity, they must comply with the PDPA
obligations that apply to data intermediaries:
firstly, to implement strict measures to protect
personal data from unauthorised access or use;
secondly, to retain personal data only insofar
as necessary to fulfil a legal or business need
or the purpose for which it was collected;
and thirdly, to report data breaches to the
organisation on whose behalf they are
processing the personal data.
The AI Guidelines also recommend that such
service providers support their customers
in complying with their own notification,
consent, and accountability
obligations. In particular, service providers
should understand the information that their
customers are likely to need, in order to comply
with these obligations. Service providers
should also design systems that can extract
such information.
At the same time, the AI Guidelines emphasise
that the primary responsibility for ensuring
AI systems comply with these obligations
rests with the organisation itself. In making
this point, the AI Guidelines underscore the
importance that the PDPC places on holding
organisations accountable to their PDPA
obligations. Organisations cannot avoid or
reduce their obligations by engaging third
parties to develop or deploy their AI systems.
Children’s Data Guidelines
Following a public consultation in 2023, the
PDPC issued in March 2024 its Advisory
Guidelines on the PDPA for Children’s
Personal Data in the Digital Environment
(CD Guidelines). The CD Guidelines apply
to organisations whose online products or
services are likely to be accessed by children.
This scope is broader than products or services
designed for and aimed specifically at children:
it covers those that children access
in practice. The CD Guidelines clarify that
consent from parents or legal guardians is