ChatGPT has undeniably changed the game, but it has also brought security concerns. Another worry is now surfacing as a ChatGPT rival is being sold on the dark web. Researchers warn that a ChatGPT-style artificial intelligence tool "with no ethical boundaries or constraints" gives hackers a way to launch attacks on an unprecedented scale.
A ChatGPT rival is being sold on the dark web
Cybersecurity firm SlashNext has observed a generative artificial intelligence tool called WormGPT being marketed on cybercrime forums on the dark web. The firm described it as a "sophisticated AI model" capable of generating human-like text that can be used in hacking campaigns.
"This tool presents itself as a blackhat alternative to GPT models, designed specifically for malicious activities," the company explained in a blog post. AI is evolving rapidly, but that same progress is creating security concerns and problems worldwide.
Leading AI tools like OpenAI's ChatGPT and Google's Bard have built-in protections that prevent people from misusing the technology. In contrast, WormGPT is allegedly designed to facilitate criminal activity. What kinds of problems could arise is a frightening prospect for everyone, yet there is no turning back from the development of AI.