If the company gave a noob unlimited access and can't restore their data from backups, it's really their fault, not the employee's.
We had a management course at university where this was one of the main things they highlighted:
The manager's faults are the manager's fault.
The employees' faults are the manager's fault. Without exception.
And if you think about it, that's completely true. If an employee does something stupid, it's most of the time because they a) had the opportunity to do it and b) weren't taught well enough. If the employee keeps making the same mistake, the manager is at fault for letting them keep doing a job where they can make it; that employee obviously isn't fit for the position.
And people wonder why the manager is paid more.
Well yes, but they wonder that when the manager isn't taking responsibility and ensuring mistakes don't happen. A good manager is worth their weight in gold, but thanks to the Peter Principle, most of them just end up there without being qualified or even wanting the job!
When's the last time you tested a backup restore, and how long did it take?
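Honest question, because almost nobody can answer it. If you want it answered automatically, cron a restore drill. Here's a minimal sketch assuming Postgres, with a dump at /backups/latest.dump, a scratch database called restore_test, and a table called users (every one of those names is made up for illustration):

```python
#!/usr/bin/env python3
"""Nightly restore drill: load the latest dump into a scratch DB and time it.

Sketch only; all names are assumptions: a custom-format Postgres dump at
/backups/latest.dump, a throwaway database called restore_test, and a
hypothetical critical table named users. Swap in your own.
"""
import subprocess
import time

DUMP = "/backups/latest.dump"  # assumed dump location
SCRATCH_DB = "restore_test"    # throwaway DB, recreated on every run


def run(cmd: list[str]) -> None:
    # Raise immediately if any step fails; a silent half-restore is
    # worse than a loud failure.
    subprocess.run(cmd, check=True)


start = time.monotonic()

# Start from nothing, the way a real disaster would.
run(["dropdb", "--if-exists", SCRATCH_DB])
run(["createdb", SCRATCH_DB])

# --no-owner keeps the restore from failing on roles that only exist in prod.
run(["pg_restore", "--no-owner", "--dbname", SCRATCH_DB, DUMP])

elapsed = time.monotonic() - start

# A restore that "succeeds" with zero rows is still a dead backup, so
# sanity-check at least one table you actually care about.
result = subprocess.run(
    ["psql", "-t", "-A", "-d", SCRATCH_DB, "-c", "SELECT count(*) FROM users;"],
    check=True, capture_output=True, text=True,
)
print(f"Restore took {elapsed:.0f}s; users has {result.stdout.strip()} rows")
```

The commands matter less than the habit: run it on a schedule and alert on failure, so the first time you learn the backups are garbage isn't the day the noob gets unlimited access.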
Wasn't there some saying about how, if you're in a server room, the calmer the "Oops," the worse the problem?
"Ooopppsss... 💤", both containers of the UPS flow battery ruptured at the same time and flooded the whole server room... call me tomorrow for the planning meeting when things stop burning and firefighters have had a chance to enter the building.
If there isn't then there should be.
Had a colleague do this to the local AD server years ago.
Thankfully they pulled the plug before the changes could fully propagate through the network, but it still took 3 days to recover the data and restore the AD server.
That's on the company for not having a proper disaster recovery plan in place.
Our DR test was literally the CIO wiping a critical server or DB, and we had to have it back up in under an hour.
To be fair to the company, it was a Friday afternoon when said person ran the script.
Yikes. At least it was only 3 days and not weeks or months of cleanup trying to rebuild shit!
You might like this little video, then. Well, it's 10 minutes long, but still. It's a story detailing a dev who deleted their entire production database. Real story that actually happened. If you went through something similar, you'll definitely relate a little.
That's not an oopsie daisy, that's the whole oopsie bouquet.
F*cking GitLab moment
You're allowed to say "fucking" on the internet
This is funny, cute, and too relatable.
internally screaming