Hackers misuse ChatGPT to write malware

Finance Mentor

According to recent research, hackers are creating Telegram bots with the help of ChatGPT to write new malware and steal user data. At present, if asked directly, the chatbot will refuse to write malware code or draft a phishing e-mail impersonating a bank, in line with the content restrictions enforced on its user interface. However, hackers are finding ways around these restrictions.

They do this by using the OpenAI API to bypass the chatbot's barriers, according to discussions on underground fraud forums. Check Point Research noted, “This is done mostly by creating Telegram bots that use the API. These bots are advertised in hacking forums.” Earlier, the researchers had observed the chatbot being used to improve the code of an infostealer malware strain first created in 2019. Even though such code is not difficult or complicated to develop, ChatGPT has helped hackers refine it.

The current version of the API, which is used by external apps, has minimal anti-abuse measures in place. This makes it easy for fraudsters to bypass the limitations and barriers ChatGPT imposes on its user interface. Under the business model advertised on the forums, cyber-criminals can avail themselves of the chatbot's services for up to 20 queries for free, and are then charged $5.50 (about ₹455.05) for every 100 queries.

Telegram itself is a vehicle for financial fraud, yet its CEO cries foul about WhatsApp's security and privacy, which is funny. Telegram is a hub for fraud, so no surprise with this ...
