Yikes, that’s rough!
Are you 100% sure it was a form from a bank?
Everything stinks of a scammer’s phishing form, leading to scam calls.
I expect the only time a bank is going to want your phone number is when you initially sign up with them. After that, they should know who you are and your contact details.
I almost got caught out by a “sorry we missed you” delivery message, until it asked for my date of birth.
Some of these random emails and SMS can catch you off-guard and seem legit
You can do a reverse proxy on the VPS and use SNI routing (the requested domain name is sent in clear text in the TLS handshake, even for HTTPS), then use Proxy Protocol to attach the real source IP to the TCP connection.
This way, you don’t have to terminate HTTPS on the VPS, and you can load balance between a couple wireguard peers so you have redundancy (or direct them to different reverse proxies or whatever).
On your home servers, you will need an additional frontend that accepts Proxy Protocol from the VPS (the Proxy Protocol header isn’t standard HTTP/S, so a normal HTTPS reverse proxy will drop the connection as unknown/broken).
This way, your home reverse proxy knows the original client IP and can attach it to the decrypted HTTP requests as X-Forwarded-For. Or you can do ACLs based on the original client IP. Or whatever.
I haven’t found a firewall that pays attention to Proxy Protocol headers, but that hasn’t really been an issue. I don’t really have a use case for it.
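The VPS side of this can be sketched as an HAProxy config fragment. This is only an illustration of the idea: the hostnames, backend names, and wireguard peer IPs are made-up placeholders.

```haproxy
# /etc/haproxy/haproxy.cfg (fragment) -- VPS side.
# TCP mode: TLS is NOT terminated here, we only peek at the SNI field.
frontend tls_in
    mode tcp
    bind :443
    # Wait (up to 5s) for the TLS ClientHello so the SNI is readable.
    tcp-request inspect-delay 5s
    tcp-request content accept if { req.ssl_hello_type 1 }
    use_backend home_site_a if { req.ssl_sni -i a.example.com }
    use_backend home_site_b if { req.ssl_sni -i b.example.com }

backend home_site_a
    mode tcp
    # send-proxy-v2 prepends the Proxy Protocol v2 header carrying the
    # real client IP; two wireguard peers give the redundancy mentioned above.
    server wg_peer_1 10.8.0.2:443 check send-proxy-v2
    server wg_peer_2 10.8.0.3:443 check send-proxy-v2 backup

backend home_site_b
    mode tcp
    server wg_peer_1 10.8.0.2:8443 check send-proxy-v2
```

On the home side, nginx can accept these connections with `listen 443 ssl proxy_protocol;` plus `set_real_ip_from` (set to the VPS’s wireguard IP) and `real_ip_header proxy_protocol;`, which is what recovers the original client IP for X-Forwarded-For.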
“Tests” is the industry name for the automated paging when production breaks.
DRM = Direct Rendering Manager (the kernel graphics subsystem, not Digital Rights Management)
I had no idea, was confused, and the article never expands the acronym/initialism.
Oh, no kidding.
I always thought immutable required the declarative installs.
I guess, immutable is more “containerised userland”?
Yeh, immutable distros… You can install software, it’s just that you have to declaratively define what software you want, then apply that as a patch.
You don’t just `apt install cowsay`, you have to create a file that defines the installation of cowsay.
This way, if you have to change how cowsay is installed, you tweak that patch file and reapply it.
If you have to wipe & reinstall (or get a new computer or whatever) you just apply all your patches, and the system is the same again.
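As a concrete sketch of that model, on a NixOS-style system (one example of a declarative distro; the file path and package set here are just illustrative) the cowsay “patch” looks like:

```nix
# /etc/nixos/configuration.nix -- declares what the system should contain
{ pkgs, ... }:
{
  environment.systemPackages = with pkgs; [
    cowsay   # instead of "apt install cowsay", you declare it here
  ];
}
```

Running `nixos-rebuild switch` applies the declaration, and re-applying the same file on a fresh install reproduces the same system.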
I think that’s how themes are distributed for VSCode, right?
With VSCode, everything is an extension.
But the vscode marketplace seems to have filters for themes, so there must be some way to differentiate them.
I think extensions need a permissions system
What makes this even more sneaky is that JetBrains has a theme called “Darcula”.
So, with a wider generic theme called Dracula and themes that duplicate JetBrains Darcula theme, it is no surprise that “Darcula Official” is being installed.
It’s more than just a typosquat
Edit:
But why can a theme make web requests?!
When learning C++, you hate C++. Then suddenly you get it, and love C++. Then you learn more C++, and you end up merely liking C++.
Oh man, spoilable items? Spoilable agriculture research packs?
That’s pretty intense
Just booted? Better wait 45 seconds and many failed searches, because DNS isn’t resolving.
Doesn’t matter that you are trying to launch a local program, it absolutely must delay the user experience until it can successfully resolve a DNS query.
Disgusting
I’m saying it’s false to apply Occam’s razor to this scenario and draw a conclusion that this is caused by non-human life.
I’m not assuming earth is unique. There have been many earth-like planets that have been discovered.
I’m not even assuming humans are unique, given all of space-time.
It is extremely unlikely that there exists intelligent life other than humans at this time (or within the window-function of time required for us to receive a transmission from however many million lightyears).
Like, it is vanishingly small. The insane series of events that has led to an intelligent species being dominant on a planet is ridiculous, to be honest.
In other words, humans are essentially unique at this point in “observable” time.
It is extremely likely that it is a natural phenomenon we don’t understand, or even an equipment malfunction, misinterpretation, miscalculation, etc.
We have discovered unknown signals, then learnt what they are. Humans don’t know everything.
We have discovered unknown signals, then realised it was a nearby microwave, or a dodgy connection, or whatever. Humans make mistakes.
The convenient explanation, if you don’t want to have to deal with a new research project, is probably “aliens”. But the simplest explanation is “a natural phenomenon we don’t understand yet”.
The chain of complexity required for sentient life to evolve to the point where it can create radio waves makes it astronomically improbable. Having that coincide with our ability to detect such a thing is even less likely.
The history of “we don’t know what this signal is or means” has always been “a new type/phase of star”.
The only assumption here is that life is rare, and advanced life is rarer still. Which is supported by all of our science so far
The simplest explanation is a new kind of star, or a new kind of star cycle.
We have seen interesting radio signals before; they have all been explained by some sort of star behaviour.
The simplest explanation is NOT the evolution of an entire other species that survives all the way through to advanced tech to send radio signals.
Sure, but what you are describing is the problem that k8s solves.
I’ve run plenty of production things from docker compose. Auto scaling hasn’t been a requirement, and HA was built into the application (so 2 separate VMs running the compose stack). Docker was perfect for it, and k8s would’ve been a sledgehammer.
It’s not a workaround.
In the old days, if you had 2 services that were hard-coded to use the same network port, you would need virtualization or a different server, and you’d have to make sure the networking for those was correct.
Network ports allow multiple services to use the same network adapter as a port is like a “sub” address.
Docker being able to remap host network ports to container ports is a huge feature.
If a container doesn’t need to be accessed outside of the docker network, you don’t need to expose the port.
The only way to have multiple services on the same port is to use either a load balancer (for multiple instances of the same service) or an application-aware reverse proxy (like nginx, HAProxy, Caddy, etc. for web things; I’m sure there are other application-aware reverse proxies).
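A minimal docker-compose sketch of both ideas, port remapping plus a host-header-aware proxy sharing one published port; the images and hostnames are placeholders:

```yaml
# docker-compose.yml -- two apps both listen on port 80 *inside* their
# containers; only the proxy publishes a host port.
services:
  proxy:
    image: caddy:2
    ports:
      - "80:80"              # host port 80 -> container port 80
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile:ro

  app_a:
    image: nginx:alpine      # no "ports:" entry, so it is reachable only
                             # on the internal docker network as app_a:80

  app_b:
    image: nginx:alpine
```

A Caddyfile entry like `http://a.example.com { reverse_proxy app_a:80 }` then routes by Host header, so both apps share the single published port.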
How the Linux kernel “made it” and is still free and open source is - imo - one of the pinnacles of humanity.
It’s inspired so much other software to adopt the same philosophy, and modern humanity/science/society stands on those shoulders.
I think science has missed that boat.
Or maybe that pinnacle came before the tools to support such an open-source atmosphere/community were around… So science didn’t miss the boat; it swam before the boat was built.
How do you vet papers that are being submitted?
If it is outside of your specific experience, how do you get someone else who is specialised to vet the paper?
When metrics become targets, they cease to be useful metrics.