gvprtskvni: "...and now it's fucking back again"
No, it is not back. This is the compromise proposal, i.e. child sexual abuse reporting combined with voluntary chat control. This is almost certainly what will be accepted and become law going forward, replacing the existing soon-to-expire voluntary chat control law that has been in effect for many years, and introducing child sexual abuse reporting provisions.
The child sexual abuse reporting provisions are the part we want. They force service providers to make it easy for minors to report abuse, and they lay an EU-wide foundation for reported content to be removed from all service providers offering their service in the EU, legally requiring providers to remove content reported this way swiftly. Today, far too many social media platforms and messaging apps have no means to report abuse, or make reporting too hard, and do not act swiftly at all; and if content has spread to other platforms, there is no coordination for removal whatsoever. This will now change for the better (a rough sketch of what such a reporting flow could look like follows below).
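To make the reporting part concrete, here is a minimal sketch of an in-app abuse reporting flow with EU-wide removal coordination. Everything in it (the endpoint names, the coordination hub, the report fields) is my own assumption about how a provider might implement this, not anything specified in the proposal:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AbuseReport:
    reporter_id: str   # can be pseudonymous; no identity required
    content_url: str   # the reported message/post/media
    category: str      # e.g. "csam" or "grooming"
    created: datetime

def submit_report(report: AbuseReport) -> None:
    """Entry point behind an easy, always-visible in-app report button."""
    takedown(report.content_url)      # remove swiftly on this platform
    notify_coordination_hub(report)   # hypothetical EU-wide hub that fans
                                      # the takedown out to other providers
                                      # hosting copies of the same content

def takedown(content_url: str) -> None: ...
def notify_coordination_hub(report: AbuseReport) -> None: ...

# Example usage with a pseudonymous reporter:
submit_report(AbuseReport(
    reporter_id="anon-7f3a",
    content_url="https://example.invalid/post/123",
    category="csam",
    created=datetime.now(timezone.utc),
))
```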
Voluntary chat control has been in effect for many years already, and there have been no widespread complaints about it, since each service provider can decide for itself whether scanning private non-encrypted chats for CSAM and grooming attempts using AI tools is an appropriate measure or not. What happens now is that the temporary voluntary chat control provisions are made permanent. E.g., Discord will be allowed to continue scanning all chats going forward, will not be legally forced to stop doing so in March 2026, and can thus maintain its (illusion of a) reputation as a child-friendly chat platform.
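For a sense of what this kind of voluntary server-side scanning typically amounts to in practice: detection of known material is usually hash matching of uploaded media against a curated hash list, with classifiers layered on top for novel content and grooming. This is my own sketch, not anything from the proposal; real deployments use perceptual hashes such as PhotoDNA or PDQ rather than the plain cryptographic hash used here to keep the example self-contained, and all names below are hypothetical:

```python
import hashlib

# Hypothetical list of known-abusive-content hashes, populated from a
# vetted clearinghouse. Real systems use perceptual hashes so that
# near-duplicates match too; SHA-256 is used here only for simplicity.
KNOWN_CSAM_HASHES: set[str] = set()

def scan_attachment(data: bytes) -> bool:
    """Return True if the attachment matches the known-content list."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_CSAM_HASHES

def handle_upload(data: bytes) -> None:
    # Only non-end-to-end-encrypted content ever reaches this code path;
    # the server never sees plaintext from E2EE chats.
    if scan_attachment(data):
        report_to_authorities(data)   # hypothetical reporting hook
        block_distribution(data)      # hypothetical takedown hook

def report_to_authorities(data: bytes) -> None: ...
def block_distribution(data: bytes) -> None: ...

handle_upload(b"example image bytes")  # no match against an empty list
```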
This is the best we could hope for.
To be clear, the compromise proposal clearly states that no provision in it may be interpreted in a way that weakens end-to-end encryption or otherwise gives access to end-to-end encrypted data in any way at all. That is, client-side scanning has been scrapped.
Unlike what Patrick Breyer tries to make it seem like, there is also no ban on minors using social media or messaging apps. Rather the opposite: the proposal talks at length about what providers should do to ensure their platforms are safe for minors too, and it states that any restrictions imposed should be limited as far as possible in terms of which parts of the platform and which users they apply to. Likewise, there is no ban on internet anonymity in any sense; there are no provisions that mandate identifying yourself in any way.
The only downside is that age verification will be soft-mandated by this proposal, so we will see age verification being introduced. The proposal clearly states that age verification must be privacy-preserving, and that you should never need to reveal your identity to verify your age on any platform. If I read between the lines correctly, any service provider can opt out of age verification by simply assuming all users are minors, i.e. applying all protection measures mandated for minors to all users instead. This is already a commonly applied scheme for avoiding age verification in other parts of the world (a sketch of it follows below).
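A minimal sketch of that treat-everyone-as-a-minor scheme, assuming a hypothetical feature-gating layer; none of these names or feature sets come from the proposal:

```python
from dataclasses import dataclass

@dataclass
class User:
    id: str
    # None means age was never verified; under this scheme the safe
    # default is to treat the user as a minor, so no verification is
    # ever needed.
    verified_adult: bool | None = None

def minor_protections_apply(user: User) -> bool:
    """Apply minor protections unless adulthood was verifiably proven."""
    return user.verified_adult is not True

def allowed_features(user: User) -> set[str]:
    # Hypothetical feature sets; the point is that the restricted set
    # is the default, so the platform avoids age verification entirely.
    base = {"messaging", "reporting"}
    if not minor_protections_apply(user):
        base |= {"public_profiles", "adult_content"}
    return base

print(allowed_features(User(id="u1")))  # {'messaging', 'reporting'}
```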
Patrick Breyer also puts a lot of focus on the fact that the wording was changed from providers "must take reasonable measures" to prevent abuse to providers "must take all reasonable measures" to prevent abuse. This is a much stronger wording, as providers are no longer allowed to choose which measures to implement, but must implement all of them. But banning children and teens can in no way be considered a reasonable measure, and the provisions read to me as definitely not intending that. The provisions are also very clear that users should never need to identify themselves, so the statements about a ban on anonymity are not true either. The change of wording may read as obliging providers to implement the "voluntary" chat control measures, in effect making them no longer voluntary, but it does not: since end-to-end encrypted chats are exempt from scanning, providers can always implement end-to-end encryption instead, which would be a good way to improve privacy for all users while still safeguarding minors, provided appropriate reporting facilities exist.
The compromise proposal is not great, but I think it is close to the best we could hope for.
Disclaimer: I have not read all of the text; it is a long document. I read a little here and there, and did a few searches for keywords.