6 scary things ChatGPT has already been used for


ChatGPT is already causing problems.

OpenAI's text generator isn't perfect at establishing exact facts or at fun writing, but it can create reasonably proper text for just about any prompt in no time at all. That's awesome. And even with a bunch of built-in filters, it can also be dangerous.

Perhaps unsurprisingly, people have already found less-than-ideal uses for the tool. Anything that can create content out of whole cloth can probably create something dangerous. It was only a matter of time.

We’ve rounded up six of the scariest — or at least most questionable — uses people have already found for the app. And remember, this is all before the application has gone fully mainstream. It is still in its infancy.

1. Creating malware

ChatGPT creating malware is scary. Not because malware is anything new, but because ChatGPT can do it endlessly. AI never sleeps. Information Security Journal wrote that "cybersecurity researchers were able to create a polymorphic program that is highly elusive and difficult to detect." Now, to be clear, I am not a cybersecurity expert. But essentially, researchers can use the app to generate code for malware, and then use the app to build variations on that code, making it harder to detect or stop.

"In other words, we can change the output on a whim, making it unique each time," researchers told Information Security Journal.

2. Cheating at school

Well, this one is less scary, but it's certainly more predictable. A tool that can generate text about anything is basically perfect for a kid trying to cheat at school. And kids love to cheat at school. Teachers have already said they've caught students in the act, while school districts across the country have blocked the app. It's hard to see this trend slowing down, though it's possible that AI will simply become another tool kids are allowed to use while learning.

3. Sending spam on dating apps

Well, maybe spam isn't the perfect word, but people are using ChatGPT to chat with matches on Tinder. People are letting AI take over the conversation. While it's not necessarily scary, it's pretty upsetting to think you could be chatting with an app instead of a potential partner. Dating has turned into growth hacking.

4. Taking writing jobs

This is just one example but, uh, should I be worried about my job?

5. Phishing and fraud

It's hard to prove this has actually happened yet, but it stands to reason that ChatGPT, or similar AI writing tools, would be ideal for phishing. Phishing messages are often easy to spot because of their broken language, but with ChatGPT that can all be fixed. Experts have said this is a very obvious use case for the app.

To see if this was actually plausible, we at Mashable asked ChatGPT to fix the shaky English in a real scam email. Not only did it produce a clean rewrite in seconds, but in one output it went ahead and blackmailed the fictional reader, without being asked to.

[Image: ChatGPT's cleaned-up scam email. So friendly! Credit: Mashable]

6. Fooling recruiters with job applications

Everyone knows how hard applying for jobs can be. It's a seemingly endless, often demoralizing process that, hopefully, ends with a good job. But if you're applying for jobs yourself, you might be missing out by not using AI. A consulting firm reportedly found that job applications written by ChatGPT outperformed 80 percent of humans. It's easy to see how ChatGPT could hit all the keywords that catch recruiters' eyes or, more likely, slip past the filters set by HR software.




