this post was submitted on 03 Aug 2025
430 points (93.2% liked)

Technology

[–] Red_October@lemmy.world 1 points 2 minutes ago

Okay but are any AI chatbots really open source? Isn't half the headache with LLMs the fact that there comes a point where it's basically impossible for even the authors to decode the tangled madness of their machine learning?

[–] brucethemoose@lemmy.world 36 points 17 hours ago* (last edited 15 hours ago) (1 children)

First of all...

Why does an email service need a chatbot, even for business? Is it an enhanced search over your emails or something? Like, what does it do that any old chatbot wouldn't?

EDIT: Apparently nothing. It's just a generic Open Web UI frontend with Proton branding, a no-logs (but not E2E) promise, and kinda old 12B-32B class models, possibly finetuned on Proton documentation (or maybe just a branded system prompt). But they don't use any kind of RAG as far as I can tell.

There are about a bajillion of these, and one could host the same thing inside docker in like 10 minutes.

...On the other hand, it has no access to email I think?
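(For reference, a minimal sketch of the "host the same thing inside docker" setup the comment above alludes to, written with the Docker SDK for Python. The image names and ports are the publicly documented defaults for Ollama and Open Web UI, but treat the wiring details as assumptions, not a verified recipe.)

```python
# Sketch: spin up Ollama (local model server) plus Open Web UI (chat frontend)
# with the Docker SDK for Python -- roughly equivalent to two `docker run` commands.
# Assumes Docker is running locally and `pip install docker` has been done.
import docker

client = docker.from_env()

# Ollama exposes an HTTP API for local models on port 11434.
ollama = client.containers.run(
    "ollama/ollama",
    name="ollama",
    ports={"11434/tcp": 11434},
    volumes={"ollama": {"bind": "/root/.ollama", "mode": "rw"}},
    detach=True,
)

# Open Web UI is the ChatGPT-style frontend; point it at the Ollama API.
webui = client.containers.run(
    "ghcr.io/open-webui/open-webui:main",
    name="open-webui",
    ports={"8080/tcp": 3000},  # then browse to http://localhost:3000
    environment={"OLLAMA_BASE_URL": "http://host.docker.internal:11434"},
    extra_hosts={"host.docker.internal": "host-gateway"},
    detach=True,
)

print(ollama.status, webui.status)
```

You would still have to pull a model into the Ollama container (something in the 12B-32B class mentioned above) before the UI has anything to answer with.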

[–] WhyJiffie@sh.itjust.works 4 points 15 hours ago (4 children)

Why does an email service need a chatbot, even for business?

they haven't been just an email service for quite some time now

There are about a bajillion of these, and one could host the same thing inside docker in like 10 minutes.

sure, with a thousand or two dollars' worth of equipment and the computer knowledge to go with it. anyone could do it, really. but even if not, why don't they just rawdog deepseek? I don't get it either

...On the other hand, it has no access to email I think?

that's right. you can upload files though, or select some from your proton drive, and can do web search.

[–] brucethemoose@lemmy.world 5 points 14 hours ago* (last edited 14 hours ago)

sure, with a thousand or two dollars' worth of equipment and the computer knowledge to go with it. anyone could do it, really. but even if not, why don't they just rawdog deepseek? I don't get it either

What I mean is there are about 1000 different places to get 32B class models via Open Web UI with privacy guarantees.

With mail, vpn, (and some of their other services?) they have a great software stack and cross integration to differentiate them, but this is literally a carbon copy of any Open Web UI service… There is nothing different other than the color scheme and system prompt.

I’m not trying to sound condescending, but it really feels like a cloned “me too,” with the only value being the Proton brand and customer trust.

[–] brucethemoose@lemmy.world 12 points 17 hours ago* (last edited 17 hours ago)

OK, so I just checked the page:

https://lumo.proton.me/guest

Looks like a generic Open Web UI instance, much like Qwen's: https://openwebui.com/

Based on this support page, they are using open models and possibly finetuning them:

https://proton.me/support/lumo-privacy

The models we’re using currently are Nemo, OpenHands 32B, OLMO 2 32B, and Mistral Small 3

But this information is hard to find, and they aren't particularly smart models, even for 32B-class ones.

Still... the author is incorrect; they do specify how long requests are kept:

When you chat with Lumo, your questions are sent to our servers using TLS encryption. After Lumo processes your query and generates a response, the data is erased. The only record of the conversation is on your device if you’re using a Free or Plus plan. If you’re using Lumo as a Guest, your conversation is erased at the end of each session. Our no-logs policy ensures we keep no logs of what you ask, or what Lumo replies. Your chats can’t be seen, shared, or used to profile you.

But it also mentions that, as is a necessity now, they are decrypted on the GPU servers for processing. Theoretically they could hack the input/output layers and the tokenizer into a pseudo E2E encryption scheme, but I haven't heard of anyone doing this yet... And it would probably be incompatible with their serving framework (likely vllm) without some crack CUDA and Rust engineers (as you'd need to scramble the text and tokenize/detokenize it uniquely for scrambled LLM outer layers for each request).
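(To make the "scramble the text and tokenize/detokenize it uniquely" idea above a bit more concrete, here is a toy sketch. It is purely illustrative, not the commenter's design and not a real security scheme: the client permutes token IDs with a per-request secret, and the server would need its embedding matrix and output head permuted to match before it could run the model at all.)

```python
# Toy illustration of a per-request token-ID permutation ("scrambled tokenizer").
# NOT real E2E: the server still computes on meaningful token sequences, it just
# can't map the IDs back to text without the client's permutation.
import numpy as np

VOCAB_SIZE = 32_000  # assumption: a typical vocab size for a Mistral-class model

def make_permutation(request_key: int) -> np.ndarray:
    """Derive a vocabulary permutation from a per-request secret."""
    return np.random.default_rng(request_key).permutation(VOCAB_SIZE)

def scramble(token_ids: list[int], perm: np.ndarray) -> list[int]:
    """Client side: map real token IDs to scrambled IDs before upload."""
    return [int(perm[t]) for t in token_ids]

def unscramble(token_ids: list[int], perm: np.ndarray) -> list[int]:
    """Client side: map the model's scrambled output IDs back to real IDs."""
    inverse = np.argsort(perm)
    return [int(inverse[t]) for t in token_ids]

# The server would need its embedding and unembedding rows re-ordered with the
# same permutation for every request -- exactly the kind of per-request surgery
# a stock serving stack like vLLM is not built for.
perm = make_permutation(request_key=0xC0FFEE)
prompt_ids = [1, 4587, 2252, 28705]  # hypothetical tokenizer output
assert unscramble(scramble(prompt_ids, perm), perm) == prompt_ids
```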

They are right about one thing: Proton all but advertises Lumo as E2E, when that is a lie. Per its usual protocol, Open Web UI sends the chat history for that particular chat to the server with each request, where it is decrypted and tokenized. If the GPU server were to be hacked, those chats could absolutely be logged and intercepted.
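(For anyone wondering what "sends the chat history ... with each request" looks like in practice: Open Web UI speaks an OpenAI-style chat API, so every new turn re-uploads the whole conversation inside the TLS tunnel. A rough sketch; the endpoint path, model name, and token are assumptions about a generic Open Web UI deployment, not anything confirmed about Lumo.)

```python
# Sketch of the OpenAI-style request an Open Web UI frontend makes on every turn:
# the full message history travels to the server each time, so whatever terminates
# TLS there sees the whole conversation in the clear.
import requests

BASE_URL = "https://chat.example.com"  # hypothetical instance
API_KEY = "sk-placeholder"             # hypothetical token

history = [
    {"role": "user", "content": "Summarize my notes on GDPR."},
    {"role": "assistant", "content": "Sure, here is a short summary..."},
    {"role": "user", "content": "Now draft an email about it."},  # the new turn
]

resp = requests.post(
    f"{BASE_URL}/api/chat/completions",  # assumption: OpenAI-compatible route
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"model": "mistral-small", "messages": history},
    timeout=60,
)
history.append(resp.json()["choices"][0]["message"])
```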

[–] digger@lemmy.ca 177 points 1 day ago (7 children)

How much longer until the AI bubble pops? I'm tired of this.

[–] kepix@lemmy.world 2 points 6 hours ago

as long as certain jobs and tasks can be done more easily, and searches can be done faster, it's gonna stay. not a fad like NFTs. the bubble here is the energy and water consumption part.

[–] cley_faye@lemmy.world 16 points 20 hours ago

We're still in the "IT'S GETTING BILLIONS IN INVESTMENTS" part. Can't wait for this to run out too.

[–] wewbull@feddit.uk 29 points 22 hours ago (6 children)

It's when the coffers of Microsoft, Amazon, Meta and the investment banks dry up. All of them are losing billions every month, but it's all driven by fewer than 10 companies. Nvidia is lapping up the money of course, but once the AI companies stop buying GPUs in crazy numbers it's going to be a rocky ride down.

[–] Defaced@lemmy.world 13 points 23 hours ago (3 children)

Here's the thing: it kind of already has. The new AI push is related to smaller projects and AI agents like Claude Code and GitHub Copilot integration. MCPs are also starting to pick up some steam as a way to refine prompt engineering. The basic AI "bubble" popped already; what we're seeing now is an odd arms race of smaller AI projects, thanks to companies like DeepSeek pushing AI hosting costs so low that anyone can reasonably host and tweak their own LLMs without it costing a fortune. It's really an interesting thing to watch, but honestly I don't think we're going to see the major gains the tech industry is trying to push anytime soon. Take any claims of AGI and OpenAI "breakthroughs" with a mountain of salt, because they will do anything to keep the hype up and drive up their stock prices. Sam Altman is a con man and nothing more; don't believe what he says.

[–] cley_faye@lemmy.world 12 points 20 hours ago (2 children)

Any business pushing a "privacy first" product that works only on their servers, and requires full access to plaintext data to operate, should be seen as lying.

I've been annoyed by proton for a long while; they do (did?) provide a seemingly adequate service, but claims like "your mails are safe" when they obviously had to have them in plaintext on their server, even if only for compatibility with current standards, kept me away from them.

[–] EncryptKeeper@lemmy.world 10 points 19 hours ago (6 children)

they obviously had to have them in plaintext on their server, even if only for compatibility with current standards

I don’t think that’s obvious at all. On the contrary, that’s a pretty bold claim to make. Do you have any evidence that they’re doing this?

[–] DeathByBigSad@sh.itjust.works 6 points 19 hours ago* (last edited 19 hours ago) (2 children)

Incoming emails that aren't from Proton or PGP-encrypted (which is like 99% of email) arrive at Proton's servers via TLS, which they decrypt, at which point they have the full plaintext. This is not some conspiracy; this is just how email works.

Now, Proton and various other "encrypted email" services take that plaintext and encrypt it with your public key, store the ciphertext on their servers, and are then supposed to discard the plaintext, so that in case of a future court order they wouldn't have the plaintext anymore.

But you can't be certain they aren't lying, since they necessarily have access to the plaintext for email to function. So "we can't read your emails" comes with a huge asterisk: it only applies to mail sent between Proton accounts or other PGP-encrypted emails. Your average bank statements and tax forms are all accessible to Proton (you're only relying on their promise not to read them).
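(A minimal sketch of that "encrypt on arrival with the user's public key, then discard the plaintext" step, using the python-gnupg wrapper. The keyring path, recipient, and message are placeholders, and whether Proton's pipeline actually looks like this is precisely the part you have to take on trust.)

```python
# Sketch of the encrypt-at-rest step an "encrypted email" provider performs after
# receiving a normal (non-PGP) message over TLS. Requires `pip install python-gnupg`
# and a gpg binary on the host.
import gnupg

gpg = gnupg.GPG(gnupghome="/var/mail/keys")   # hypothetical keyring location
RECIPIENT = "user@example.com"                # hypothetical recipient key

plaintext = "Subject: Your bank statement\n\nBalance: ..."

encrypted = gpg.encrypt(plaintext, RECIPIENT, always_trust=True)
assert encrypted.ok, encrypted.status

ciphertext_to_store = str(encrypted)  # this armored blob is what hits the disk
del plaintext                         # the promise: the cleartext copy is dropped
```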

[–] EncryptKeeper@lemmy.world 15 points 19 hours ago* (last edited 19 hours ago) (5 children)

Ok yeah, that’s a far cry from Proton actually “having your unencrypted emails on their servers”, as if they’re not encrypted at rest.

There’s the standard layer of trust you need to have in a third party when you’re not self-hosting. Proton has so far proven that they do in fact encrypt your emails and haven’t given any up to authorities when ordered to, so I’m not sure where the issue is. The way you put it, I thought they’d been caught not encrypting them or something.

[–] Vinstaal0@feddit.nl 1 points 5 hours ago (1 children)

We need to call for an audit of Proton's policy and see if they actually do what they say. That way we can know almost for certain that everything is as good as they say.

[–] EncryptKeeper@lemmy.world 1 points 4 hours ago (1 children)

I mean, we know from documented events that Proton doesn’t store your emails in plain text, because there have been Swiss orders to turn over information, which they have to comply with, and they’ve never turned over emails, because they can’t.

[–] Vinstaal0@feddit.nl 1 points 3 hours ago (1 children)

Do you have a source for that? I know they handed over an IP address, but I haven't heard about them handing over an email.

[–] EncryptKeeper@lemmy.world 1 points 8 minutes ago

As far as I know they have not handed over any emails.

[–] pcrazee@feddit.org 1 points 13 hours ago (1 children)

Proton has always been shitty. They don't even give you the encryption keys. Always been a red flag for me.

Not your keys, not your encryption.

[–] Vinstaal0@feddit.nl 1 points 5 hours ago

For most people, having access to their own encryption keys will lead to data loss.

Most countries have systems in place for doing proper audits on companies, and those audits can be trusted. Audits of securities or financial reports are the most common ones, but you can also audit whether a VPN keeps logs (PureVPN has done this) and whether a company has access to your encryption keys or not.

We really need to normalise that kind of control to keep companies in check.

[–] DreamlandLividity@lemmy.world 82 points 1 day ago* (last edited 1 day ago) (13 children)

The worst part is that, once again, Proton is trying to convince its users that it's more secure than it really is. You have to wonder what else they are lying or being deceptive about.

[–] Vinstaal0@feddit.nl 2 points 5 hours ago

We really need to audit Proton

[–] hansolo@lemmy.today 76 points 1 day ago (25 children)

Both you and the author seem not to understand how LLMs work. At all.

At some point, yes, an LLM has to process cleartext tokens. There's no getting around that. Anyone who creates an LLM that can run 30 billion parameters over encrypted data will become an overnight billionaire from military contracts alone. If you want absolute privacy, process locally. Lumo has limitations, but goes further than duck.ai at respecting privacy. Your threat model and equipment mean YOU make a decision for YOUR needs. This is an option. It's not trying to be one-size-fits-all. You don't HAVE to use it. It's not being forced down your throat like Gemini or Copilot.
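(The "process locally" option in one snippet, assuming a local Ollama install and the `ollama` Python client; the model name is just an example of the open-weight models discussed elsewhere in the thread.)

```python
# Sketch: fully local inference, so the prompt never leaves the machine.
# Assumes Ollama is running and `ollama pull mistral-small` has already been done.
import ollama

reply = ollama.chat(
    model="mistral-small",
    messages=[{"role": "user", "content": "Draft a polite follow-up email."}],
)
print(reply["message"]["content"])
```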

And their LLMs: it's Mistral, OpenHands and OLMO, all open source. It's in their documentation. So this article straight up lies about that. Like... did Google write this article? It's simply propaganda.

Also, Proton does have some circumstances where it lets you decrypt your own email locally. Otherwise it's basically impossible to search your email for text in the email body. They already had that as an option, and if users want AI assistants, that's obviously their bridge. But it's not a default setup. It's an option you have to set up. It's not for everyone. Some users want that. It's not forced on everyone. Chill TF out.

[–] badelf@lemmy.dbzer0.com 14 points 23 hours ago (2 children)

Proton has my vote for fastest company ever to completely enshittify.

[–] EncryptKeeper@lemmy.world 19 points 19 hours ago

How have they enshittified? I haven’t noticed anything about their service get worse since they started.
