But not before harvesting a bunch of personal info.
geekwithsoul
I set up my new account on a new instance a bit ago, but I want to stick it out to the end on this one!
I think the difference is that sealioning is a pattern of behavior, rather than just occasionally asking for a source. It describes a lack of intent to engage in good-faith discussion; instead, it's just a method of trolling.
If it's malware, it - by definition - is going to need to run a privileged executable. That's the "ware" in "malware". The LLM is just explaining the specific method they're attempting to use - which again should be obvious both from the nature of the actions it's requesting from the user and from the specific text it's asking to be run. It explicitly says it doesn't know anything about the executable being run, so it really isn't offering anything particularly useful or actionable - just wasting resources.
Making a scene?! Oh no! Have I shattered the fragile Lemmy decorum with my boorish behavior? How dreadful!
Listen, if you want to believe an LLM has anything useful to say about the malware you're presented with on dodgy sites, go for it.
And I'll be free to think you're a prime example for why we should start requiring a "drivers license" to get on a computer. To each their own.
"Chill out you"
Fucking priceless. The LLM didn't explain anything beyond what was obvious from just looking at it. It was trying to get you to run a privileged executable. The LLM doesn't have a clue what the executable does, and even admits that. So why bother asking it?
Let's take the tech out of it. You're at a restaurant and you're given a beverage in a glass, but you can see the glass is dirty with food residue. Do you have to consult an LLM to know not to drink out of it? Does it matter what sort of food residue it is? Of course not.
I swear people's critical thinking skills are non-existent or in complete atrophy these days. The only thing of potential interest is the executable itself and if you're posting this question, I'm not sure any explanation or details would mean anything to you.
Okay but pretty much any malware is going to follow those same steps - they're what makes it malware. The LLM doesn't "prove" anything - it's not examining the executable, it's not setting up a VM and doing deep packet analysis to see how the malware operates. It's just parroting back the fact this is malware with details seeded from the prompt. This is like yelling into a canyon and "proving" someone is stuck in the canyon and yelling because you heard an echo.
No one should be using an LLM as a security backstop. It's only going to catch the things that have already been seen before, and the minute a bad actor introduces something the least bit novel in the attack, the LLM is confidently going to say it isn't malware because it hasn't seen it before. A simple web search would have turned up essentially the same information and used only a small fraction of the resources.
Or you can just know that if some rando site is asking you to run cmd and powershell as some sort of authentication scheme, you're about to get your shit fucked up. The specifics literally don't matter, this is behavior no legit site would request you to do.
You needed an LLM to figure out this was malware?! Sweet jesus, we're well and truly fucked.
Or you could just do the thing yourself with less effort and less wasted resources. The true Turing test is knowing when something doesn't need fucking AI to do.
This article is a few years old, but Brazil and Vietnam have very high rates of software piracy: https://kommandotech.com/statistics/software-piracy-statistics/
Brazil was fifth and Vietnam was ninth. And unlike many of the other countries on the list, they lack the robust network-operations infrastructure of established corporations or governments.
May just be a coincidence, but if I were a nation-state or other bad actor looking to build a ready-made botnet, flooding those markets with cracked but compromised copies of software (especially Windows) seems like an easy way to accomplish that.
I think Robbie the Wormbot's batteries are running low.