AI in Retail: Balancing the Benefits with Privacy and ACL Issues

Watchdog Compliance
Businesses need to be aware of the risks involved in using ChatGPT and similar AI technology. These tools are being embraced by many retail businesses to assist with:

  • customer service;
  • customer data analysis;
  • writing of marketing and product descriptions; 
  • creation of content for social media;
  • sales analysis and insights into customer behaviour;
  • proofreading and translating;
  • fraud detection; and
  • inventory management.


It is therefore vital for businesses to implement guidelines for staff to follow when using AI tools, and to consider the Privacy and Australian Consumer Law (ACL) issues involved in the use of this technology.

Some Privacy Issues to Consider

Consent: Retailers should not share customers' personal information with ChatGPT or any other platform without the customer's consent. ChatGPT itself advises that users should not share any sensitive information in their conversations.

Data use and handling: Businesses need to be aware that ChatGPT logs every conversation, including any personal information that is shared, and uses this data to train the AI. This includes any 'input, file uploads, or feedback' users provide. It also collects user IP addresses, browser types and settings, and cookies (all of which it can share with its vendors or third parties), and it automatically opts users in to receive marketing communications.

Transparency: If ChatGPT is used within the business, retailers need to be transparent with customers about how it works and what data it collects.

Security: If a business enters any client, customer or partner information into a chatbot, that AI may use the information in ways that the business cannot control. It is therefore important to ensure that personal information, sensitive information or commercially sensitive information is not shared.

Some Consumer Law Issues to Consider

Possible reliance on inaccurate information: ChatGPT advises that while it has:

‘safeguards in place, the system may occasionally generate incorrect or misleading information and produce offensive or biased content. It is not intended to give advice’.

This is of particular concern if the incorrect information relates to matters such as mandatory product safety standards or product claims.

Misleading and deceptive conduct: Businesses must not engage in misleading or deceptive conduct when using ChatGPT to communicate with customers. This means that they cannot make false or misleading representations about goods or services, and must ensure that any information provided through ChatGPT is accurate and truthful. If the system is used for customer service, the technology may have difficulty interpreting the meaning behind certain phrases or questions, which could lead to issues with customers.

Consumer guarantees: ChatGPT may be used to provide information about goods and services to customers, but it cannot replace or override the consumer guarantees provided under the ACL. This means that if a customer has purchased a faulty or defective product, they are entitled to a remedy under the ACL, regardless of any information provided through ChatGPT.

Unfair contract terms: Retail staff must ensure that any terms and conditions included in ChatGPT communications with customers are fair and reasonable. The ACL prohibits businesses from including unfair contract terms in consumer contracts, and these provisions apply to ChatGPT conversations as well.