Telling an AI chatbot your deepest secrets or revealing your political views would be “extremely unwise”, warns Oxford Don

  • Mike Wooldridge warns against sharing your political views with ChatGPT

Complaining about your boss or expressing political views on ChatGPT is “extremely unwise”, according to an Oxford don.

Mike Wooldridge said the AI tool shouldn't be viewed as a trustworthy confidant because it could get you in trouble.

Anything you tell the “chatbot” will help train future versions, he added, and the technology will only tell you what you want to hear.

The AI professor is delivering the Royal Institution's Christmas Lectures this year and will delve into the “truth” about the subject.

He said humans are programmed to seek consciousness – but we “attribute it far, far too often.”

He compared the idea of finding personalities in chatbots to seeing faces in the clouds and said of AI: “It has no empathy. It has no compassion.”

“That's absolutely not what the technology does, and crucially, it has never experienced anything. The technology is essentially designed to tell you what you want to hear – that's literally all it does.”

Treating it as something else was particularly risky because “one should assume that anything you type into ChatGPT will be fed directly into future versions of ChatGPT.”

He said it would be “extremely unwise to have personal conversations or complain about your relationship with your boss or express your political opinions.”

Professor Wooldridge added that because of the way AI models work, it was also almost impossible to get your data back once it was in the system. Earlier this year, the company behind ChatGPT, OpenAI, had to fix a bug that allowed users to view parts of other users' chat history.

The company promises to keep users' chats for only 30 days and not to use them to train the chatbot.

The lectures will be broadcast on BBC Four and iPlayer on December 26th, 27th and 28th at 8pm.