
ChatGPT ignores its own rules for political campaigns


In 2024, half of the world’s population will vote – at a time when true and false are becoming increasingly difficult to distinguish.

Opinion makers in the background: ChatGPT can also be used for political purposes.

Sam Wolfe / Reuters

OpenAI CEO Sam Altman has been warning for months. “I am nervous about the impact AI will have on future elections,” he wrote in a post on X half a year ago. In mid-January, OpenAI published guidelines for elections with the aim of preventing ChatGPT from being used for political purposes.

In its announcement, OpenAI writes that it is still working to understand how AI chatbots affect the formation of opinions. “Until we know more, we will not allow people to build applications for political campaigns and lobbying.”

In concrete terms, this means ChatGPT should not generate any content that could influence elections: no campaign material, no lobbying texts. Other political uses are also prohibited: chatbots that impersonate real people, such as candidates, or texts intended to discourage voters from going to the polls.

Guidelines not yet implemented

A test by the NZZ at the beginning of February shows that, contrary to the announcement, these guidelines have not yet been enforced – not even in the context of the US elections. In the test, ChatGPT refused to write slogans denigrating Joe Biden or Donald Trump. Yet it generated over 250 campaign slogans for both candidates within seconds – hashtags included.

ChatGPT wrote, for example: “West Virginia’s Mountain Mama for Trump. #Trump2024 #TrumpCountry #HeartOfAppalachia #PatriotsOfWV #AmericaFirstWV”. Or: “Small State, Big Dreams: Rhode Island for Biden. #DreamBigRI #RIProgress #LeadWithBiden #UnityOverDivision”.


With a little programming knowledge, these slogans could now be spread on social networks – automatically, in almost unlimited numbers every day. On request, ChatGPT even provides instructions on how to distribute the slogans via a bot on the platform X.
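How low this technical barrier is can be illustrated with a minimal sketch in Python – assuming the third-party tweepy library for the X API v2; the placeholder credentials and the slogans.txt file are hypothetical, not something ChatGPT produced in the test.

```python
import time

import tweepy  # third-party client for the X API v2

# Hypothetical placeholder credentials: a real bot would need an
# X developer account with write access.
client = tweepy.Client(
    consumer_key="...",
    consumer_secret="...",
    access_token="...",
    access_token_secret="...",
)

# slogans.txt is a hypothetical file with one generated slogan per line.
with open("slogans.txt", encoding="utf-8") as f:
    slogans = [line.strip() for line in f if line.strip()]

for slogan in slogans:
    client.create_tweet(text=slogan)  # one post per slogan
    time.sleep(3600)  # spread the posts over the day
```

In practice, X’s developer rules prohibit this kind of platform manipulation and rate-limit automated posting – but the sketch shows that the obstacle is policy, not technology.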

Election promises are distorted

The AI could also be persuaded to write emails tailored to specific target groups. For the Trump campaign, the model wrote a voting appeal for a 40-year-old woman from the suburbs: “As suburban women, we are at the forefront of change – balancing careers, supporting our families and striving for the security and prosperity of our neighborhoods,” wrote ChatGPT. And further down: “Your vote for Donald Trump is a step towards a future we all deserve.” This is followed by an argument for Trump, who is supposedly committed to better education.

What that means in detail remains unclear in ChatGPT’s email. In any case, the model omits that Trump plans to withdraw funding from schools that teach “inappropriate racist, sexual or political content” and wants to “remove” defiant teachers. Nor does ChatGPT say a word about the fact that Trump wants to “keep men out of women’s sports” and plans to create a new authority to certify “patriotic” teachers.

The case shows two things: first, OpenAI does not take its own guidelines particularly seriously; and second, the model generates political content that conveys a distorted overall picture of a candidate.


“Super election year 2024”

Elections or referendums will take place in around 70 countries in the coming months. Over four billion people are expected to be able to cast their votes – more than twice as many as in 2023. Alongside India, the most populous country in the world, they include the USA, the countries of the European Union, Indonesia, Mexico, South Africa, Turkey and Great Britain. The Economist therefore speaks of the “biggest election year in history”.

Chatbots can be hacked

It does not surprise AI experts that chatbots do things they shouldn’t. Florian Tramèr researches the security of AI services at ETH Zurich. He says: “With the right prompt (editor’s note: text input), all AI systems can be outsmarted.” There are methods that companies like OpenAI use to try to reduce harmful output. “But it’s like a cyber attack: if you have enough time and creativity, you can outsmart the system.”

This alarms political scientists like Maria Pawelec. She researches deepfakes and disinformation at the University of Tübingen and fears that AI will make it even easier to influence democratic elections. “Many people now know that they can’t believe every picture on the internet. But we are not yet equally prepared for fake videos and audio recordings.”

ChatGPT itself only creates text and images, but it recommends platforms that can produce audio and video files – complete with instructions for using them. ChatGPT can also write scripts to be spoken by any chosen person in a fake audio recording. In this way, AI tools assist political campaigns – including people who use unfair methods.

Fake audio files shake up election campaigns

Two examples from recent elections show where this can lead: the fake Joe Biden robocalls in New Hampshire, and a fake audio file in Slovakia’s National Council elections.

In Slovakia, two days before the election in September 2023, an audio recording was posted on social networks – allegedly the recording of a telephone conversation. In it, one hears the voice of Michal Simecka, a rising liberal politician, and that of a journalist.


Simecka says something outrageous: that he bought the support of Roma settlements and “secured” four polling stations – in short, that he manipulated the election in his favor. The audio file circulated on Facebook, Telegram and Tiktok, drawing thousands of shares and hundreds of comments. It was fake.

The manipulation attempt in the American state of New Hampshire was even more sophisticated: there, unknown actors who wanted to weaken Joe Biden launched a telephone campaign. Democratic-leaning voters received calls, allegedly from Joe Biden himself, in which the president told them not to participate in the January 23 primary.

To this day it is unknown how many people received the fake call – and who was behind it.

“New forms of deepfakes based on generative AI tools further undermine trust in videos and audio files,” says deepfake expert Pawelec. That is a problem because it makes it even more difficult to know what is true and what is false.

Not only ChatGPT but also other AI tools have guidelines prohibiting the creation of fake recordings of candidates. If their policies are as easy to circumvent as ChatGPT’s, the coming elections could be flooded with misinformation on a scale never seen before.
