from the team:


Hi everyone,

Proton’s 2024 user survey shows that AI usage among the Proton community has now exceeded 50% (54%, to be exact). That figure rises to 72% if we also count people who are interested in using AI.

Rather than have people use tools like ChatGPT, which are horrible for privacy, we’re bridging the gap with Proton Scribe, a privacy-first writing assistant built into Proton Mail.

Proton Scribe allows you to generate email drafts from a prompt and refine them with options like shorten, proofread, and formalize.

A privacy-first writing assistant

Proton Scribe is a privacy-first take on AI, meaning that it:

  • Can be run locally, so your data never leaves your device.
  • Does not log or save any of the prompts you input.
  • Does not use any of your data for training purposes.
  • Is open source, so anyone can inspect and trust the code.

Basically, it’s the privacy-first AI tool we wished existed, so we built it ourselves. Scribe is not a partnership with a third-party AI firm; it’s developed, run, and operated directly by us, based on open-source technologies.

Available now for Visionary, Lifetime, and Business plans

Proton Scribe is available as a paid add-on for business plans, and teams can try it for free. It’s also included at no extra cost for all of our legacy Proton Visionary and Lifetime plan subscribers. Learn more about Proton Scribe on our blog: https://proton.me/blog/proton-scribe-writing-assistant

As always, if you have thoughts and comments, let us know.

Proton Team

  • fart_pickle@lemmy.world · 4 months ago

    Amazing thing; unfortunately, I won’t be able to use it. I’m on the Ultimate plan, which already costs quite a bit, and if Scribe is going to be a paid add-on I will stick to local ollama models. Not the most convenient thing, but it works.
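
    For context on what that local fallback can look like, here’s a minimal sketch (not Proton’s code) of asking a locally running ollama server to draft an email. The model name and prompt are placeholders, and it assumes ollama’s default REST API on localhost:11434.

    ```typescript
    // Minimal sketch: draft an email with a local ollama model.
    // Assumes `ollama serve` is running and a model (e.g. llama3) has been pulled.
    async function draftEmail(request: string): Promise<string> {
      const res = await fetch("http://localhost:11434/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model: "llama3",               // placeholder model name
          prompt: `Write a short, polite email that ${request}.`,
          stream: false,                 // return one JSON object instead of a stream
        }),
      });
      const data = await res.json();
      return data.response;              // ollama puts the generated text here
    }

    draftEmail("declines a meeting invitation for Friday").then(console.log);
    ```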

    • hanke · 4 months ago

      What? It’s not available for Ultimate users? I thought I was paying to have access to all features…

  • Adderbox76@lemmy.ca · 4 months ago

    Fucking hell, it’s exhausting trying to keep one step ahead of having this AI bullshit shoved into every service I use.

    I’m not against AI. I’m just against it being embedded in literally everything.

    If I want to “consult” an AI to have it look at my code for syntax errors or something like that, I’ll go to its website and use it from there, accepting that yes, that particular bit of code or text is going to be scraped.

    But the step from there to “always be reading everything I do” is fucking massive.

      • Bogasse@lemmy.ml · 4 months ago

        Yeah, that’s fair enough. But I have to say it’s still frustrating seeing everyone invest so much money in this exact same feature. I’m not sure we’ve had time to figure out how people actually use this; everyone is just frightened of being left behind.

  • bamboo@lemm.ee · 4 months ago

    Pretty sad seeing all the people getting mad about an optional, opt-in feature. I think this is pretty cool. If it’s free to use with my existing Unlimited plan I’ll probably use it regularly; otherwise, if I have to pay more, I’ll probably just keep using ChatGPT since I already pay for that.

  • scrion@lemmy.world · 4 months ago

    So it’s only available in the business plan, and at additional cost? Meh.

  • LordKitsuna@lemmy.world · 4 months ago

    So, just for clarity: even if this rolls out to all paid plans, this is opt-in specifically, right? Or at minimum it can be entirely disabled?

  • Kroxx@lemm.ee · 4 months ago

    Hey look, it’s one of the reasons I started my switch from Google manifesting in my new system. How wonderful! Can the AI winter fully arrive already?

    • Lumisal@lemmy.world · 4 months ago

      This is opt-in and you have to pay for it. Plus it runs locally. Nothing like Google.

      • oktux@lemm.ee · 4 months ago

        It’s enabled by default and can send your email drafts to their server. The first time you try to use it (by clicking the Scribe button), it asks whether you want to use the local version or the cloud version. It’s easy to disable it completely in Settings.

        It does not, and cannot, train on your inbox, due to end-to-end encryption.

        More info: https://proton.me/support/proton-scribe-writing-assistant

        I would prefer if the initial prompt included an option to disable Scribe completely, and a warning about the privacy implications of enabling it, but overall I think their approach is good enough for my privacy needs.

        • tyler@programming.dev · 4 months ago

          End-to-end encryption doesn’t mean it can’t be trained on your inbox, especially locally. It’s not encrypted at rest on your side; otherwise you wouldn’t be able to read it.

          That’s why Facebook’s whole “WhatsApp is e2e encrypted, we can’t see anything” line is and was a farce. They wouldn’t even make the claim in court. People even proved that data could be exfiltrated from WhatsApp, after it reached a user’s phone, over to the Facebook app, and boom, e2e didn’t matter at all.

          • oktux@lemm.ee · 4 months ago

            It sounds like your main concern is that once your inbox is decrypted by your local device it could be used by Proton to train Scribe or for some other (perhaps nefarious) purpose.

            For the first point, I think the technical challenge of creating a distributed machine learning algorithm, which runs locally on each user’s device and then somehow aggregates the results, is much more difficult than downloading and using an existing model like Scribe does currently, but I agree that it is theoretically possible. If Proton ever overcomes that challenge and offers that feature, I hope they handle it as I suggested above for Scribe: an option to disable it the first time you use it. As long as I could disable it, I would consider the risk minimal. As it stands today, I consider the risk negligible.

            For the second point, it’s true Proton could program their app (or their website) to send your decrypted inbox elsewhere. (That’s true of every email provider, unless sender and receiver have exchanged PGP keys, since email is a plaintext protocol.) I trust that they don’t, based on my assessment of the available info, including discussions like this. I certainly consider them much more trustworthy than Facebook/Meta.

            As a general point, I think a lot of security/privacy for services like Proton comes down to trust. It’s important to keep Proton honest and to keep ourselves informed. I’m glad we have communities like this to help us do that.

            • tyler@programming.dev · 4 months ago

              Sorry, I was not claiming that Proton was causing a problem here in any way. I was just refuting the point that e2e means they cannot train on your inbox. I don’t even use Proton but have been considering it. I do not mind the AI features.

              • oktux@lemm.ee · 4 months ago

                I appreciate you pointing out the limits and pitfalls of e2e encryption. It added important nuance to the thread. Thanks!

                • tyler@programming.dev · 4 months ago

                  You’re welcome. Honestly I wanted to point it out cuz I hate Facebook and WhatsApp 😂

      • Kroxx@lemm.ee · 4 months ago

        So that’s fair, and I completely understand that. My problems are: 1. how training data is obtained, 2. how this might change in the future, and 3. I just started my Proton switch about two months ago, and all of the Google AI integration is what broke the camel’s back for me.

        I wanted a platform where I didn’t have to constantly check how the AI is getting trained and handles privacy, which is now gone.

        • Lumisal@lemmy.world · 4 months ago

          1. Mistral LLM, which I recall is open source

          2. Probably not much, since they’re not doing a cloud service. You still have to set this up to work on your local server/computer if you want to use it.

          3. The local AI integration is, so far, the only AI use they seem to be planning to offer. And as they covered, it’s primarily designed for businesses. Among the non-AI things they’re planning to look into is a web browser, possibly built from scratch; the AI stuff, though, uses an existing model, so it’s not from scratch.

  • Gleddified@lemmy.ca · 4 months ago

    VPN Linux client is still barely functional years later…

    Keep releasing new products tho

  • geography082@lemm.ee · 4 months ago

    Please review the plans model; it is very wallet aggressive currently. For example, give customers the ability to create their own plans with just the services they want and need.

    • smeg@feddit.uk · 4 months ago

      very wallet aggressive

      I see you used this new AI to jazz up the word “expensive”!

      • geography082@lemm.ee · 4 months ago

        Not my mother tongue. And I like to invent expressions. Anyway, yes, it is very expensive, every month… even nowadays when we are being taxed to death.

    • Nelizea@lemmy.world (OP, mod) · 4 months ago

      Team’s answer:

      Our business audience was the most interested in a writing assistant, which is why we started gradually rolling it out, beginning with Business and Visionary plans. We will look into making it available to more users at a later date!

    • troed@fedia.io · 4 months ago

      The negative, since I didn’t see it mentioned:

      Chromium-based browser. Support for the Proton Mail desktop app will come at a later date.

      Is it technically not possible on Firefox? I would’ve expected a large overlap between caring-about-privacy and not-running-chromium amongst your customers :/

      • Nelizea@lemmy.world (OP, mod) · 4 months ago

        The team states the following regarding Firefox:

        Support for running language models locally is currently only available in the Firefox Nightly builds. In our testing with Firefox, we haven’t been able to get Proton Scribe to run reliably on a variety of devices. We will see how the situation evolves before adding support.

        • Broken@lemmy.ml · 4 months ago

          I’m good with this response. I’m a Firefox user, so I can’t yet make use of Scribe, but it’s a feature I didn’t expect and don’t have today, so I’m not missing out. For others with different threat models, if they can use it and enjoy it, then more power to them.

      • Pasta Dental@sh.itjust.works · 4 months ago

        I did not look at the source code, but I assume this uses something like WebLLM, which relies on WebGPU, which Firefox currently doesn’t support as well as Chromium does.
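
        For a rough illustration of the gap being described (this is not from Proton’s source, just a sketch): an in-browser LLM runtime like WebLLM needs WebGPU, so a page can only offer the local option when the API is actually present. Typings here are assumed to come from the @webgpu/types package.

        ```typescript
        // Minimal WebGPU feature check, roughly what an in-browser LLM runtime must do.
        // Stable Firefox does not expose navigator.gpu yet (only Nightly does), so this
        // returns false there and the app has to fall back or hide the feature.
        async function canRunLocalModel(): Promise<boolean> {
          const gpu = (navigator as Navigator & { gpu?: GPU }).gpu;
          if (!gpu) return false;                  // WebGPU API not exposed at all
          const adapter = await gpu.requestAdapter();
          return adapter !== null;                 // adapter can still be missing on some devices
        }

        canRunLocalModel().then((ok) =>
          console.log(ok ? "WebGPU available: local model possible" : "No WebGPU: fall back"),
        );
        ```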

      • merde alors@sh.itjust.works · 4 months ago

        ungoogled-chromium is a free and open-source variant of the Chromium web browser that removes all Google-specific web services. It achieves this with a series of patches applied to the Chromium codebase during the compilation process. The result is functionally similar to regular Chromium.

        I’ve read good things about Vivaldi, which is also Chromium-based.

        • fluckx@lemmy.world · 4 months ago

          I liked Vivaldi. It’s a good browser. I just switched to Firefox because the world needs more than a Chromium browser owned by a single company.

          • Gauff@piaille.fr · 4 months ago

            @fluckx @merde That’s my line of thinking as well. But isn’t it crazy that we do stuff “because the world needs it”? As individuals, and even as a group, given that we are such a small minority?