WormGPT: The AI that threatens security
Interest in WormGPT, also known as ChatGPT's "evil twin", is growing steadily. This AI-based tool was designed with a single purpose: to make cybercriminals' work easier. WormGPT represents a serious threat in the digital world, as it can complete tasks and processes with malicious intent. Created for profit by unscrupulous developers, WormGPT has become accessible through certain forums on the dark web. This phenomenon exposes a dark side of artificial intelligence, highlighting the ethical and security challenges that characterize the digital landscape of 2023 and beyond.
The Origins of WormGPT and the Threat It Brings
A recent report by security researchers at SlashNext shed light on WormGPT. The tool is described as a "black hat" version (that is, one built for malicious hacking) of the now well-known service offered by OpenAI, which has brought artificial intelligence into many fields. WormGPT is based on the GPT-J language model, released in 2021, and was trained specifically to process prompts without limits on code length or formatting. It was subsequently trained on methods, processes and purposes commonly associated with illegal activity.
Mode of Use and Associated Threats
WormGPT lends itself to various malicious activities, in particular email phishing. The aim is to deceive victims with persuasive messages and multimedia content, tricking them into downloading infected files or visiting fake websites. Like ChatGPT, WormGPT supports natural conversation, speeding up and simplifying the creation of texts and content, and it even offers tips on how to scam less computer-savvy victims.
A Paid Offer and a Potential Increase in Illicit Activities
The WormGPT service is available for a fee: 60 euros per month or 550 euros for a full year. While the price may discourage some prospective users, there is a risk that it will be perceived as a profitable investment for scaling up illegal activity online. The emergence of WormGPT raises important ethical and security questions and highlights the need for measures to counter the malicious use of this technology.
In conclusion, WormGPT is a clear example of how technology can be turned to illicit purposes. Its existence underscores the need for vigilance in addressing the challenges that artificial intelligence poses in the digital landscape.