Literally burning the planet with power demand from data centers but not even knowing what it could possibly be good for?
That's eco-terrorism for lack of a better word.
Fuck you.
How can you lose social permission that you never had in the first place?
There's a lag between asking for forgiveness and being told to stop.
You already don't have social permission to do what you are doing, and that hasn't stopped you. The world is bigger than the 10 people around your board's table.
Yeah, but what he means is that nobody is cutting power lines or driving trucks through walls yet.
"Microsoft thinks it has social permission to burn the planet for profit" is all I'm hearing.
Well, they at least have investor permission... and investors are the only people they care about anyway.
I hope all parties responsible for this garbage, including Microsoft, will pay a huge price in the end. Fuck all these morons.
Stop shilling for these corporate assholes or you will own nothing and will be forced to be happy.
The oligarch class is again showing why we need to upset their cart.

Social permission? I don't remember us having a vote or anything on this bullshit.
As far as I can tell there hasn't been any tangible reward in terms of pay increase, promotion or external recruitment from using the cognitive amplifier.
Just make Copilot its own program that can be uninstalled, remove it from everywhere else in the OS, and let it be. People who want it will use it, people who don't want it won't. Nobody would be pissed at Microsoft over AI if that's what they had done from the start.
No, it will be attached to every application, as well as the Start menu, Settings, Notepad, Paint, Regedit, Calculator and every other piece of Windows, you AI-hating swine.
We attached it to the clock in case you need it to get the time wrong.
You never had it to begin with. Goddamn leeches.
"bend the productivity curve" is such a beautiful way to say that they are running out of ideas on how to sell that damn thing.
It basically went from:
... to "bend the productivity curve". It's no longer about how it will "radically increase productivity"; no, it's a lot more subtle than that, to the point that it can actually bend that curve down. What a shit show.
Delusional. They created a solution to a problem that doesn't exist in order to usurp power from citizens and concentrate it in a minority.
This is the opposite of the information revolution. This is information capture. It will be sold back to the people it was taken from while being distorted by special interests.
The AI industry needs to encourage job seekers to pick up AI skills (undefined), in the same way people master Excel to make themselves more employable.
Has anyone in the last 15 years willingly learned Excel? It seems like one of those things you have to learn on the job as your boomer managers insist on using it.
I did and it's awesome. People like to shit on Excel, but there is a reason why every business on earth runs on Excel. It's a great tool and if you really learn it, you can do great things with it.
Funny thing about "AI skills" that I've noticed so far is that they are actually just skills in the thing you're trying to get AI to help with. If you're good at that, you can often (though not always) get an effective result. Mostly because you can talk about it at a deeper level and catch mistakes the AI makes.
If you have no idea about the thing, it might look competent to you, but you just won't be catching the mistakes.
In that context, I would call them thought amplifiers, and pretty effective ones at the whole "talking about something can help debug the problem, even if the other person doesn't contribute anything of value, because you have to look at the problem differently to explain it, and that different perspective might make the solution more visible" thing, while also being able to contribute some valuable pieces.
Excel depends on the usage. Way too many people want to use it for what it's bad at, but technically can do, instead of using it for what it's good at.
I'm fairly decent at using Excel, and have automated some database dependent tasks for my coworkers through it, which saves us a lot of time doing menial tasks no one actually wants to do.
Isn't there plenty of research that it's the opposite of a cognitive amplifier, that people get cognitively lazy using AI?
AI isn't at all reliable.
Worse, its failures are spread uniformly across the seriousness of their consequences, i.e. it's just as likely to make small mistakes with minuscule consequences as major mistakes with deadly ones, which is worse than even the most junior of professionals.
(This is why, for example, an LLM can advise a person with suicidal ideas to kill themselves)
Then on top of this, it simply will not learn: if it makes a major, deadly mistake today and you try to correct it, it's just as likely to make a major, deadly mistake tomorrow as it would be if you hadn't corrected it at all. Even if you have access to adjust the model itself, correcting one kind of mistake just moves the problem around; it's like trying to stop the tide on a beach with a sand wall, where the only way to succeed is to have a sand wall for the whole beach, by which point it's in practice not a beach anymore.
You can compensate for this by having human oversight of the AI, but at that point you're back to paying humans for the work being done. So instead of just the cost of a human doing the work, you have the cost of the AI doing the work plus the cost of a human checking the AI's work. The human has to check the entirety of that work, since problems can pop up anywhere and take any form, and, worse, unlike a human the AI's work is not consistent, so its errors are unpredictable. On top of that, the AI will never improve, and it will never pick up the kinds of improvements that humans doing the same work discover over time to make later work, or other parts of the work, easier (i.e. how increased experience means you learn to do little things that make your work, and even the work of others, easier).
This seriously limits the use of AI to things where the consequences of failure can never be very bad (and if you also include businesses, "not very bad" includes things like "not significantly damaging client relations", which is much broader than merely "not being life-threatening"; this is why, for example, lawyers using AI to produce legal documents are getting into trouble when the AI cites made-up precedents). That leaves mostly entertainment, plus situations where the AI alerts humans to a potential finding in a massive dataset: if the AI fails to spot something, that's alright, and if the AI spots something that isn't there, subsequent human validation can dismiss it as a false positive (for example, face recognition in video streams for general surveillance, where humans watching those streams are just as likely or more likely to miss it and an AI alert just results in a human checking it, or scientific research where one tries to find unknown relations in massive datasets).
So AI is a nice new technological tool in a big toolbox, not a technological and business revolution justifying the stock market valuations around it, the investment money sunk into it, or the huge amount of resources (such as electricity) it uses.
Specifically for Microsoft, there doesn't really seem to be any area where MS's core business value for customers gains from adding AI, in which case this "AI everywhere" strategy is an incredibly shit business choice that just burns money and damages brand value.
Takeaway:
I have a nagging feeling the general public doesn't hate Microsoft as much as computer nerds do, so overall their image is probably somewhere between muddled and not that bad.
AI can absolutely be useful. But it's been wildly oversold, and the actual beneficial use cases are not nearly as profitable as the marketing around it claims.
Eh, didn't you pay attention in Econ 101? If you generate more supply than demand, that's a you problem. The free market will take care of it.
I will try to have a balanced take here:
The positives:
The negatives:
Overall, I wish the AI bubble would burst already
"menial tasks that are important such as unit test coverage"
This is one of the cases where AI is worse. LLMs will generate the tests based on how the code works and not how it is supposed to work. Granted, lots of mediocre engineers also use the "freeze the results" method for meaningless test coverage, but at least human beings have the ability to reflect on what the hell they are doing at some point.
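To make that concrete, here's a toy sketch in Python (the function and tests are hypothetical, purely illustrative): a test generated from the implementation just freezes whatever the code currently returns, bug included, while a test written from the spec asserts what the code is supposed to return.

```python
# A hypothetical function with an off-by-one bug: it is *supposed* to return
# the sum of the first n positive integers, but it misses the last one.
def sum_first_n(n: int) -> int:
    return sum(range(n))  # bug: should be sum(range(1, n + 1))


# "Freeze the results" style test, the kind an LLM tends to produce by reading
# the implementation: it just locks in whatever the code currently returns.
def test_sum_first_n_characterization():
    assert sum_first_n(5) == 10  # passes, but only because it enshrines the bug


# Spec-based test, written from what the function is *supposed* to do:
def test_sum_first_n_spec():
    assert sum_first_n(5) == 15  # 1+2+3+4+5; fails, correctly exposing the bug
```

Both count toward coverage; only the second one can actually tell you the code is wrong.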
Honestly, this is the most reasonable take I have heard from tech bros on AI so far... Use it for something useful and stop using it for garbage!
AI has a million great uses that could make so many things so much easier, but instead we are building AI to undress women on Twitter.
It's an admission that it isn't doing anything useful.
Did they ever have social permission in the first place?
So you admit it. You admit AI isn't useful.
He could set an example by replacing himself with AI
Can't keep what you never had, you corrupt piece of shit.
To be honest, I did try a couple of AIs. But all I got were solutions that would never work on the stated hardware: code full of errors that, even when fixed, never functioned as requested. On any non-technical question it always agrees and hardly (not at all, actually) challenges any input you give it. So yeah, I'm done with it and waiting for the bubble to burst.
Isn't it alleged that China goes for specific use cases and not general intelligence?
Maybe that's the way to go, and not the gamble that the US and Western companies are taking.
Well, you already lost that, or rather never actually had it. You pushed a broken and incomplete product; it's on you to find a use for it, not us...
We need an American Zelenskyy who would save us from the oligarchs.
Like a president but good?
It would be more useful to replace CEOs with AI. Or maybe even my dog.