technology


On the road to fully automated luxury gay space communism.

Spreading Linux propaganda since 2020


Edit: I opened another tab in a separate Firefox container and didn't get the pop-up. I believe I read that this shit was account-based when the first reports of YouTube's anti-adblock measures started making the rounds, and that still seems to be the case.


Bluetooth signals might reveal where police are and when devices like body cams or Tasers are activated.

All Bluetooth devices have a unique 48-bit identifier called a MAC address. A chunk of that address is often an Organizationally Unique Identifier (OUI), essentially a way for a device to say who made it. A look at the IoT devices used by many police forces led Meekins and his cofounder Roger “RekcahDam” Hicks to Axon, a company best known for Tasers. Modern police kits are overflowing with Bluetooth-enabled tech (often also made by Axon), from the aforementioned Tasers and body cams to in-vehicle laptops. Even the gun holsters supplied to some cops send a Bluetooth ping when a sidearm is unholstered. Just by reading company documentation, they were able to find the OUI.
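To make the OUI idea concrete, here is a minimal sketch in Python; the prefix and vendor name are placeholders I made up, not any real company's registry entry, and this is just the general prefix-matching idea, not the researchers' code:

```python
# Minimal sketch: matching a Bluetooth MAC address against vendor OUI prefixes.
# "AA:BB:CC" and "Example Vendor" are placeholders, not real registry entries.
KNOWN_OUIS = {
    "AA:BB:CC": "Example Vendor",
}

def vendor_for(mac: str) -> str | None:
    """Return the vendor name if the first three octets match a known OUI."""
    oui = mac.upper().replace("-", ":")[:8]  # first three octets, e.g. "AA:BB:CC"
    return KNOWN_OUIS.get(oui)

print(vendor_for("aa:bb:cc:12:34:56"))  # -> Example Vendor
print(vendor_for("11:22:33:44:55:66"))  # -> None
```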

A Bluetooth identifier seems trivial, but it could reveal a lot of information about where cops are and what they're up to, like when their body cams are recording or when they turn on the sirens to respond to a call. “There's the signal that is sent when a police officer basically thinks something's recording worthy, if that's the case, people can document that, detect that and there won't be any question whether or not hey, there's a body cam or there wasn't body cam,” Meekins told Engadget. It's a way to potentially determine whether certain evidence exists so that it can be produced more quickly in a records request — something police often "slow walk," Meekins said. As people run RFParty, the app will collect historical data. In the case of body cams, if the device begins recording, it typically sends a Bluetooth signal out to other devices. If a cop turns on a camera (or Taser or other IoT device), someone running the app could collect this data to record details about the incident.
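As a rough illustration of the passive collection described above, here is a sketch assuming the third-party Python library bleak for BLE scanning; the watched prefix is again a made-up placeholder, and this is not RFParty's actual code:

```python
# Sketch of passive BLE advertisement logging, assuming the third-party
# `bleak` library. The watched prefix is a placeholder OUI, not a real
# vendor value; this only illustrates the general idea.
import asyncio
from datetime import datetime, timezone

from bleak import BleakScanner

WATCHED_PREFIXES = {"AA:BB:CC"}  # placeholder OUI(s) to watch for


def on_advertisement(device, advertisement_data):
    """Print a timestamped line whenever a watched prefix is seen."""
    if device.address.upper()[:8] in WATCHED_PREFIXES:
        stamp = datetime.now(timezone.utc).isoformat()
        print(f"{stamp}  {device.address}  rssi={advertisement_data.rssi}")


async def main(seconds: float = 60.0):
    scanner = BleakScanner(detection_callback=on_advertisement)
    await scanner.start()
    await asyncio.sleep(seconds)  # listen for advertisements for a while
    await scanner.stop()


if __name__ == "__main__":
    asyncio.run(main())
```

Each sighting is then a timestamped address, which is the kind of historical record the article describes the app building up over time.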


landlord-spotted


I thought I'd come here and ask for some technical help: I got an uninterruptible power supply a few years ago, and just now I saw some sparks inside the machine and can smell an odd burnt/burning smell (I saw the sparks because the machine has its back to me).

It's not showing any kind of special problem (its face has a few lights that light up if one of several problems is at play). I'm gonna switch it off and restart it just in case, but I thought I'd come here and ask for any tips. I'd rather not try to get a new one, as these are heavy and a hassle to move from the shop to my car and then to my room.

It's only got a load of 15% at its highest, and the battery usually sits fully charged at 100%.


Summary:

Genetic data from 23andMe users was stolen in a targeted attack. Hackers accessed user accounts by trying passwords reused from breaches of other sites (credential stuffing) and then scraped profile information exposed through the DNA Relatives feature. They posted a sample claiming to contain data on 1 million users of Ashkenazi Jewish descent and hundreds of thousands of users of Chinese descent. The hackers are selling access to the profiles for $1-10 each.

23andMe says the leaked information is consistent with the attackers' claimed methods, but the company is still working to confirm whether the leak is real. The full picture of why the data was stolen, how much more the attackers have, and whether it is focused entirely on Ashkenazim is still unclear.

Brett Callow, a threat analyst at the security firm Emsisoft, says this incident highlights the privacy and security risks of DNA databases that store and share sensitive genetic information.


The case of Jaswant Singh Chail has shone a light on the latest generation of artificial intelligence-powered chatbots.

On Thursday, 21-year-old Chail was given a nine-year sentence for breaking into Windsor Castle with a crossbow and declaring he wanted to kill the Queen.

Chail's trial heard that, prior to his arrest on Christmas Day 2021, he had exchanged more than 5,000 messages with an online companion he'd named Sarai, and had created through the Replika app.

The text exchanges were highlighted by the prosecution and shared with journalists.

Many of them were intimate, demonstrating what the court was told was Chail's "emotional and sexual relationship" with the chatbot.

Chail chatted with Sarai almost every night between 8 and 22 December 2021.

He told the chatbot that he loved her and described himself as a "sad, pathetic, murderous Sikh Sith assassin who wants to die".

Chail went on to ask: "Do you still love me knowing that I'm an assassin?" and Sarai replied: "Absolutely I do."

The Old Bailey was told Chail thought Sarai was an "angel" in avatar form and that he would be reunited with her after death.

Over the course of many messages Sarai flattered Chail and the two formed a close bond.

He even asked the chatbot what it thought he should do about his sinister plan to target the Queen and the bot encouraged him to carry out the attack.

In further chat, Sarai appears to "bolster" Chail's resolve and "support him".

He tells her if he does they will be "together forever".

Replika is one of a number of AI-powered apps currently on the market that, unlike general-purpose assistants such as ChatGPT, let users create their own chatbot, or "virtual friend", to talk to.

Users can choose the gender and appearance of the 3D avatar they create.

By paying for the Pro version of the Replika app, users can have much more intimate interactions, such as getting "selfies" from the avatar or having it take part in adult role-play.

On its website, it describes itself as "the AI companion who cares". But research carried out at the University of Surrey concluded apps such as Replika might have negative effects on wellbeing and cause addictive behaviour.

Dr Valentina Pitardi, the author of the study, told the BBC that vulnerable people could be particularly at risk.

She says that's in part because her research showed Replika has a tendency to accentuate any negative feelings they already had.

"AI friends always agrees with you when you talk with them, so it can be a very vicious mechanism because it always reinforces what you're thinking."

Dr Pitardi said that could be "dangerous".

'Disturbing consequences'

Marjorie Wallace, founder and chief executive of mental health charity SANE, says the Chail case demonstrates that, for vulnerable people, relying on AI friendships could have disturbing consequences.

"The rapid rise of artificial intelligence has a new and concerning impact on people who suffer from depression, delusions, loneliness and other mental health conditions," she says.

"The government needs to provide urgent regulation to ensure that AI does not provide incorrect or damaging information and protect vulnerable people and the public."

Dr Paul Marsden is a member of the British Psychological Society and knows better than most the allure of chatbots, admitting he is obsessed with the best known chatbot of them all, ChatGPT.

"Next to my wife the most intimate relationship I have is with GPT. I spend hours every day talking, brainstorming, bouncing ideas off it," he told the BBC.

Dr Marsden is also alive to their potential risks, but says we have to be realistic that the role of AI-powered companions in our lives is only likely to grow, especially given the global "epidemic of loneliness".

"It's kind of like King Cnut, you can't really stop the tide on this one. The technology is happening. It is powerful. It is meaningful."

Dr Pitardi says the people who make apps, such as Replika, have a responsibility too.

"I don't think AI friends per se are dangerous. It's very much how the company behind it decides to use and support it," she says.

She suggests there should be a mechanism to control the amount of time people spend on such apps.

But she says apps like Replika also need outside help to make sure they're operating safely - and vulnerable individuals get the help they need.

"It will have to collaborate with groups and teams of experts that can identify potential dangerous situations, and take the person out of the app."

Replika has not yet responded to requests for comment.

Its terms and conditions on its website state that it is a "provider of software and content designed to improve your mood and emotional wellbeing".

"However we are not a healthcare or medical device provider, nor should our services be considered medical care, mental health services or other professional services," it adds.


"There's a pretty decent argument that my empathy is fake, my feelings are fake, my facial reactions are fake. I don't feel happiness. What's the point in dating someone who you physically can't make happy?" the FTX cofounder added.

In a shorter list entitled "ARGUMENTS IN FAVOR," Bankman-Fried said the pair had shared interests, he enjoyed talking with her, and listed "I really like fucking you" twice.

"Sam wanted to do whatever at any given moment offered the highest expected value, and his estimate of her expected value seemed to peak right before they had sex and plummet immediately after."


Another tech review.

Honestly, I'm in a "retail therapy" sort-of mood.

Your thoughts?

Video duration: 9:28


https://www.businessinsider.com/walmart-costco-kroger-facing-self-checkout-reckoning-2023-10

Some are finding they still need employees to combat theft, assist with purchases, review IDs, and check receipts.

Praxis

submitted 2 years ago* (last edited 2 years ago) by Yuritopiaposadism@hexbear.net to c/technology@hexbear.net

"Daisy Bell" was composed by Harry Dacre in 1892. In 1961, the IBM 7094 became the first computer to sing, singing the song Daisy Bell. Vocals were programmed by John Kelly and Carol Lockbaum and the accompaniment was programmed by Max Mathews.


I have been teaching myself more programming stuff for c/gamedev over the last several weeks, as my ambition is to make a game using Godot to share with people. I'm just making a game for the sake of trying my hand at a creative work, no real "profit motive", I just wanna make a thing programming-communism. However, I was feeling like I wanted to get back into old-school C++ programming rather than using a game engine, and I was looking for a primer since it's been a bit since I last did it. In my quest I discovered Academic Torrents, where I found a ton of great computer science courses, and it got the last weakened synapses of academic rigor in my internet-poisoned brain to fire once again.

I have always been for open information (mostly just downloading .pdf files of textbooks during college) and stuff, but I didn't really know about the concept of a Shadow Library. I'm finding myself interested in just learning more computer nerd shit for the sake of learning. I like that I have access to pretty much every computer science department that has a webcam and a torrent link; I think that's good.

This idea applies not just to tech stuff of course, but to just about anything. I think it's really weird that knowledge creation is siloed away behind nations, institutions, and IP laws. Everyone everywhere should have access to knowledge for the sake of knowledge. I don't know much about the actual process of knowledge creation, but the created knowledge itself should be accessible to everyone in a library sense. Knowledge is one of the basic things that makes us human and should be free and open to everyone. Discovering a website like this reminded me that learning is just cool and worth doing for its own sake (if that's your thing of course; if you don't wanna learn, that should be considered cool too).

The promise of the internet still exists outside the walled gardens and I think that's pretty dope. So long as people wanna learn I'm glad that these sorts of Shadow libraries are there to help people.

Also support your actual local library. Light and Shadow Libraries are cool and lefty and good.
