If you use HTTPS, the attacker can still see what websites you connect to, they just can’t see what you are sending or receiving. So basically they can steal your browsing history, which defeats the purpose of a commercial VPN for many users.
This is blatantly false. They can see the IP addresses and ports you connect to from the IP packets, and hostnames during the TLS negotiation phase (plus your DNS requests, if you don’t use custom DNS settings). The HTTP data itself is fully encrypted when using HTTPS.
If exposing hostnames and IP addresses is dangerous, chances are that establishing a VPN connection is as dangerous.
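You can see the hostname leak for yourself: Python’s ssl module will show you the raw ClientHello bytes it produces, and the server name (SNI) sits in them unencrypted. A minimal sketch, with example.com standing in for any site:

```python
import ssl

# Drive the handshake through in-memory buffers so we can inspect
# the bytes that would go on the wire before any encryption is set up.
ctx = ssl.create_default_context()
incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
tls = ctx.wrap_bio(incoming, outgoing, server_hostname="example.com")

try:
    tls.do_handshake()          # no server is answering, so this can't finish...
except ssl.SSLWantReadError:
    pass                        # ...but the ClientHello has already been written out

client_hello = outgoing.read()
print(b"example.com" in client_hello)   # the hostname is readable on the wire
```

The same bytes are what an on-path observer captures, which is exactly why HTTPS alone doesn’t hide *where* you’re connecting.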
Control of the DHCP server in the victim’s network is required for the attack to work.
This is not a VPN vulnerability, but a lower level networking setup manipulation that negates naive VPN setups by instructing your OS to send traffic outside of VPN tunnel.
In conclusion, if your VPN setup doesn’t include routing guards or an indirection layer, ISP controlled routers and public WiFis will make you drop out of the tunnel now that there’s a simple video instruction out there.
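The mechanism is easy to model. A minimal Python sketch of longest-prefix-match route selection (interface names and addresses are illustrative) shows why two /1 routes pushed by a rogue DHCP server via option 121 beat a VPN’s /0 default route:

```python
import ipaddress

# Toy longest-prefix-match routing: the most specific matching prefix wins.
def pick_route(routes, dst):
    dst = ipaddress.ip_address(dst)
    matches = [(net, dev) for net, dev in routes if dst in net]
    return max(matches, key=lambda m: m[0].prefixlen)[1]

vpn_only = [(ipaddress.ip_network("0.0.0.0/0"), "tun0")]  # VPN default route
attacked = vpn_only + [
    (ipaddress.ip_network("0.0.0.0/1"), "eth0"),    # pushed via DHCP option 121
    (ipaddress.ip_network("128.0.0.0/1"), "eth0"),  # pushed via DHCP option 121
]

print(pick_route(vpn_only, "93.184.216.34"))  # tun0 — inside the tunnel
print(pick_route(attacked, "93.184.216.34"))  # eth0 — leaked outside it
```

Together the two /1 prefixes cover the entire IPv4 space, so every packet matches a route more specific than the tunnel’s default.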
As we all know, siphoning of the power to the small percentage of people had never happened prior to capitalism.
Support for the QUIC and HTTP/3 protocols has been available since 1.25.0, including in the Linux binary packages.
https://nginx.org/en/docs/quic.html
2023-05-23 nginx-1.25.0 mainline version has been released, featuring experimental HTTP/3 support.
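For reference, the minimal HTTP/3 setup from the nginx QUIC docs looks roughly like this (certificate paths are placeholders):

```nginx
server {
    listen 443 quic reuseport;   # HTTP/3 over QUIC (UDP)
    listen 443 ssl;              # fallback HTTP/1.1 and HTTP/2 over TCP

    ssl_certificate     certs/example.com.crt;  # placeholder paths
    ssl_certificate_key certs/example.com.key;

    # Advertise HTTP/3 availability to browsers on the same port
    add_header Alt-Svc 'h3=":443"; ma=86400';
}
```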
It’s not dev code. It would also take a mere minute to check this before trying, and failing, to sound smart.
Even better, the dude forked because a security issue in an “experimental” but nonetheless released feature was responsibly disclosed.
Talk about an ego.
Please correct me if I’m wrong, but doesn’t this allow one to represent virtually any resource as a mail inbox/outbox with access through a generic mail app?
I’m working with a specialized healthcare company right now, and this looks like a way to represent patient treatment data as an intuitive timeline of messages, with a local offline cache in case of outages. Security of local workstations is a weak point of course, but when is it not…
Sorry, but you don’t get to claim groupthink while ignoring the state of Apache at the time nginx was released.
Apache was a mess of modules with confusing documentation, an arsenal of foot guns, and generally a PITA to deal with. Nginx was simpler, more performant, and didn’t have the extra complexity that Apache was failing to manage.
My personal first encounter was about hosting PHP applications in a multiuser environment, and god damn was nginx a better tool.
Apache caught up in a few years, but by then people were already solving different problems. Had nginx arrived merely a year later, it would have been lost to history; instead it arrived exactly when everyone was fed up with Apache just the right amount.
Nowadays, when people choose a web server, they choose one they are comfortable with. With both httpds being mature, that’s the strongest objective factor to influence the choice. It’s not groupthink, it’s a consequence of concrete events.
Cool. Here’s to no one starting to measure your solo work time in place of your project completion count 🍺
Moreover, “deep work” is a bullshit claim. Long solo sessions without communicating are not an indication of… anything, really. The moment “deep work” becomes a trend, some idiot will start measuring it, turning it into yet another counterproductive way to torture people.
Measure business outcomes and implement changes that don’t fall victim to Goodhart’s law. If a director can think of a way to game a measure, workers will think of ten.
The primary requirement for genocide is intent. Instead of throwing “maybes” around, spend a minute or two reading the convention in full rather than polluting the discourse.
Especially now that the intent is clearly stated by a government official.
Turns out, I do need therapy.
Sourcehut. The answer is sourcehut.
You don’t even need an account to submit patches, just configure git send-email.
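The setup is a short config fragment (the mail server, address, and list name below are placeholders for your own provider and project):

```shell
# One-time setup for git send-email
git config --global sendemail.smtpserver smtp.example.com
git config --global sendemail.smtpuser you@example.com
git config --global sendemail.smtpencryption tls

# Send the last commit as a patch to a sourcehut mailing list
git send-email --to="~owner/project-devel@lists.sr.ht" HEAD^
```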
I described a route to spoof DNS root authority that Russia and China can use already. Single root is not an advantage, it’s merely a different kind of implementation with different attack vectors.
When it comes to security, it is better to have multiple different implementations coalesce at the point of service delivery than to have a single source of truth. If everything is delivered via DNS, there’s your tasty target for a capable adversary. If there are multiple verification mechanisms, an attacker has to tailor an attack to each specific target.
I want cryptographic infrastructure I rely on to be the last resort for anyone capable of dealing with it.
You gotta love confident statements that don’t stand up to scrutiny.
DNSSEC keys are signed in the same recursive manner SSL certificates are. If I, as a government, block your access to the root servers and serve you my own, I can spoof anything I want. It’s literally the same bloody problem.
Chain of trust doesn’t disappear just because you use a new acronym.
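A toy model of the shared structure, with HMAC standing in for real signatures (all keys here are made up), shows why swapping the trust anchor defeats either system:

```python
import hashlib
import hmac

# Toy model: each zone's key is endorsed by its parent, much like
# DS/DNSKEY records in DNSSEC or intermediate CAs in a TLS chain.
def sign(parent_key, child_key):
    return hmac.new(parent_key, child_key, hashlib.sha256).digest()

def verify(parent_key, child_key, sig):
    return hmac.compare_digest(sign(parent_key, child_key), sig)

real_root = b"real-root-key"
com_key = b"com-zone-key"
ds_record = sign(real_root, com_key)
assert verify(real_root, com_key, ds_record)  # legitimate chain validates

# An adversary who replaces the trust anchor mints an equally "valid" chain:
fake_root = b"state-controlled-root-key"
forged_key = b"attacker-zone-key"
forged_ds = sign(fake_root, forged_key)
assert verify(fake_root, forged_key, forged_ds)  # validates against the fake anchor
```

Validation only ever proves consistency with whatever root you were handed; if the root itself is substituted, every signature below it checks out.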
When it comes to regulations, intent doesn’t matter when they enable abuse of power.
I don’t give a fuck if this is not aimed at spying. It trivially allows it, and that’s what matters.
I mean, Comic Code is pretty damn good.
Identification != Authentication
As obvious as this sounds, I’ve learned over the years that most people don’t understand what it means exactly.
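A minimal sketch of the difference (the username, password, and the plain SHA-256 “hash” are all illustrative; a real system would use a proper KDF like argon2 or scrypt):

```python
import hashlib
import hmac

# Toy user store: username -> password hash (illustrative only)
users = {"alice": hashlib.sha256(b"correct horse").hexdigest()}

def identify(username):
    # Identification: accepting a *claim* of who someone is.
    # Anyone can assert "I am alice" — nothing is proven here.
    return username in users

def authenticate(username, password):
    # Authentication: *verifying* the claim against a credential.
    if username not in users:
        return False
    candidate = hashlib.sha256(password).hexdigest()
    return hmac.compare_digest(users[username], candidate)

print(identify("alice"))                        # knowing a name proves nothing
print(authenticate("alice", b"guess"))          # claim rejected
print(authenticate("alice", b"correct horse"))  # claim verified
```

Logging, rate limits, and audit trails usually need identification; granting access needs authentication — conflating the two is how “we know who you are, so you’re in” bugs happen.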