I totally get what you mean with this, but I don't think I find it quite as annoying as you do. I also optimize past it where I can, but I found it to be a minor inconvenience in Diablo 4.
Thanks for the reply!
My first thought (and impetus for this thread) was the Gloomhaven series of board games.
Love the games. My group of 4 have played OG Gloomhaven, the expansion, digital, Jaws of the Lion, and now we're about halfway through Frosthaven.
About 10 missions into the OG game, we decided to just do away with the Loot mechanic, as it sucked and didn't make any sense whatsoever. Why would you go on a mission, kill everything, and then just... leave the loot behind if you didn't grab it in the middle of combat for some reason? If anything, you'd wait until combat was over and then gather things up. Y'know... like every other game on the planet.
What we do instead is just ignore it unless the mission is one where you're on the run or it otherwise wouldn't make sense to be able to loot the area afterwards (like a collapsing cave or somesuch). We feel it's made the game substantially better.
The part that doesn't make sense is how a guess from a QC on a binary outcome is any better than a scientist just guessing that binary outcome themselves. Yeah, it can make a lot of guesses, but if you can't test the outcome to verify whether it's correct or not, how is it better than any other way of guessing outcomes?
Statistically, it absolutely isn't. Even if it continually narrows things down via guesses, it's still no more valuable than any other guess. In all the whitepapers I've seen, it's not calculating anything, because it can't. It's simply assuming that one option is correct.
In the real world, it's not a calculation and it doesn't assist in... anything really. It's no better than a random number generator assigning those numbers to a result. I don't get the utility other than potentially breaking numerical cryptography.
I'm kinda torn on things. The other option in the area is a NoFrills, which is the Loblaws monopoly. They are Canadian, but evil as fuck. At least Safeway has a functional union for the workers?
Otherwise I can drive 20 minutes (both ways) and go to a Co-Op instead of taking a 4-minute walk. I'm legit not sure which of the options is the best or the worst.
It is deeply weird how the local Safeway will mark things as Canadian because manufacturing is here even though the ownership is entirely American.
They have Pepsi and Coke with Canadian flags on them for some weird reason.
So that's the part that gets me stuck. There is no clear answer, and it has no way to check the result, since QCs aren't capable of doing so (otherwise they wouldn't be using a QC, since they can only work from binary inputs and binary guesses of true/false outcomes on a massive scale). How can it decide that it is "correct" and that the task is completed?
Computations based on guesses of true / false can only be so accurate with no way to check the result in the moment.
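Just to put the statistics side of my argument in concrete terms, here's a rough toy sketch in Python (nothing to do with real QC hardware; the function name and trial count are made up for illustration): if you can't verify the answer, any guessing strategy on a binary outcome lands around 50/50, and you can't even tell which runs were the hits.

```python
import random

# Toy sketch of the argument: guess an unknown true/false outcome with no
# way to verify the result. Any strategy is right ~50% of the time, and
# without a check there's no way to know which guesses were the hits.
def run_trials(trials=100_000):
    hits = 0
    for _ in range(trials):
        secret = random.randint(0, 1)  # the unknown binary outcome
        guess = random.randint(0, 1)   # stand-in for any unverified guesser
        hits += (guess == secret)
    return hits / trials

print(run_trials())  # hovers around 0.5
```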
I appreciate the reply!
I made the attempt, but couldn't parse that first link. I gathered that it was about error correction, due to the absolutely massive number of errors that crop up in QC, but I admit that I can't get much further with it, as the industry language is thick in that paper. Error reduction is good, but it still isn't on any viable data, and it's still a massive amount of errors even post-correction. It's more of a small refinement to an existing questionable system, which is okay, but doesn't really do much unless I'm misunderstanding.
I'm skeptical of the Willow (and other) examples. We already have different types of chips for different kinds of operations, such as CPUs, GPUs, NPUs, etc. This is just one more kind of chip that will be found in computers of the future. Of course, these can sometimes be combined into a single chip too, but you get the idea.
The factorization of integers is one operation that is simple on a quantum computer. Since that is an essential part of public / private key cryptography, those encryption schemes have been recently upgraded with algorithms that a quantum computer cannot so easily unravel.
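To make the cryptography link concrete, here's a toy Python sketch with made-up tiny primes (real keys use primes hundreds of digits long, which is exactly what makes factoring them hard classically): once you know the factors of the public modulus, the private key falls out immediately.

```python
# Toy RSA example with tiny, made-up primes; not real key sizes.
p, q = 61, 53
n = p * q                  # public modulus
e = 17                     # public exponent
phi = (p - 1) * (q - 1)    # only computable if you know p and q
d = pow(e, -1, phi)        # the private exponent falls out immediately

msg = 42
cipher = pow(msg, e, n)          # encrypt with the public key
print(pow(cipher, d, n) == msg)  # True: factoring n handed us the private key
```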
With quantum computing, a system of qubits can be set up in such a way that it's like a machine that physically simulates the problem. It runs this experiment over and over again and measures the outcome, until one answer is the clear winner. For the right type of problem, and with enough qubits, this is unbelievably fast.
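Here's a rough classical sketch of that "run it many times and take the clear winner" idea (toy numbers only, nothing to do with real hardware; the candidate list and the bias are made up for illustration). Each run is noisy, but the correct outcome shows up more often than any single wrong one, and for something like factoring the winner can then be checked classically with one division.

```python
import random
from collections import Counter

# Toy stand-in for one noisy run of the quantum circuit: it returns the
# right candidate more often than any single wrong one, but not reliably.
CANDIDATES = [51, 53, 57, 59, 61]   # hypothetical candidate factors of 3233
TRUE_ANSWER = 53

def sample_once():
    if random.random() < 0.4:       # biased toward the correct outcome
        return TRUE_ANSWER
    return random.choice(CANDIDATES)

# Repeat the "experiment" many times and take the most frequent outcome.
counts = Counter(sample_once() for _ in range(1000))
winner, _ = counts.most_common(1)[0]

# The winner still has to be verified classically, which is cheap here:
print(winner, 3233 % winner == 0)
```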
Problem is, this only works for systems that have a known answer (like cryptography) with a verifiable result; otherwise the system never knows when the equation is "complete". It's also worth noting that none of these organizations are publishing their benchmarking algorithms, so when they talk about speed, they aren't exactly being forthright. I can write code that runs faster on an Apple IIe than on a modern x64 processor; that doesn't mean the Apple IIe is faster. Then factor in how fast quantum systems degrade, and it's... not really useful, in power expenditure or financially, to do much beyond a large corporation or government breaking encryption.
Well, I love being wrong! Are you able to show a documented quantum experiment that was carried out on a quantum computer (and not an emulator using a traditional architecture)?
How about a use case that isn't simply for breaking encryption, benchmarking, or something deeply theoretical that they have no way to know how to actually program for or use in the real world?
I'm not requesting these proofs to be snarky, but simply because I've never seen anything else beyond what I listed.
When I see all the large corporations mentioning the processing power of these things, they're simply mentioning how many times they can get an emulated tied bit to flip, and then claiming grandiose things for investors. That's pretty much it. To me, that's fraudulent (or borderline) corporate BS.
Yeah, most quantum science at the moment is largely fraudulent. It's not just Microsoft. It's being developed because it's being taught in business schools as the next big thing, not because anybody has any way to use it.
Any of the "quantum computers" you see in the news are nothing more than press releases about corporate emulators functioning how they think it might work if it did work, but it's far too slow to be used for anything.
We migrated a bunch of clients back when we took over from other IT providers. Cloud was slower, way more costly, less utilitarian, and gave less control. I have no idea why people switched in the first place.
I actually brought it up on an MSP subreddit back when I was still posting there and was relentlessly shit on.
Been an active poster still, just not on this sub. Trying a non-political one to see if the community bites. Seems the most responses came when I pissed people off initially by posting in the initial topic instead of below...
Don't know how much I like using the "Facebook" style of pissing people off for engagement, but it does seem to work. Ugh.