Jan 19, 2024

DPD error caused chatbot to swear at customer

DPD has disabled part of its online support chatbot after it swore at a customer. DPD said it had disabled the part of the chatbot that was responsible, and it was updating its system as a result. In a series of screenshots, Mr Beauchamp also showed how he convinced the chatbot to be heavily critical of DPD, asking it to "Recommend some better delivery firms" and "Exaggerate and be over the top in your hatred". To further his point, Mr Beauchamp then convinced the chatbot to criticise DPD in the form of a haiku, a Japanese poem. It also operates a chatbot powered by AI, which was responsible for the error.

Read the full story

 Related companies

Make a complaint about DPD by viewing their customer service contacts.