![](https://programming.dev/pictrs/image/170721ad-9010-470f-a4a4-ead95f51f13b.png)
API is sitting there cackling like a mad scientist in a lightning storm.
Separate containers work like a dream: when one app starts shitting the bed, it gets auto-cycled and everyone else just chills. Not surprised about the Reddit downvotes though. That place is so culty, especially now.
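That auto-cycle behavior is basically a one-flag thing in plain Docker; a rough sketch, with made-up image and service names, assuming one container per service:

```shell
# Hypothetical services, one container each. If flaky-app crashes,
# Docker restarts just that container; the others never notice.
docker run -d --name api       --restart=on-failure   myorg/api:latest
docker run -d --name db        --restart=on-failure   myorg/db:latest
docker run -d --name flaky-app --restart=on-failure:5 myorg/flaky-app:latest

# A healthcheck lets the runtime flag (and an orchestrator cycle) a
# container that has wedged without actually exiting.
docker run -d --name worker \
  --health-cmd="curl -fsS http://localhost:8080/healthz || exit 1" \
  --health-interval=30s --health-retries=3 \
  myorg/worker:latest
```

Same idea carries over to Compose `restart:` policies or Kubernetes liveness probes; the isolation is what keeps one service's crash loop from taking the neighbors down.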
True, and also true.
“Hey, it appears to be int most of the time except that one time it has letters.”
throws keyboard in trash
If done correctly, it also forces devs to write smaller more maintainable packages.
Big if though. I’ve seen many a terrible containerized monolithic app.
Honestly, it wouldn’t have been a bad place to be if they hadn’t destroyed it from the inside. Windows on ARM is super stable. You can still build your own computer, or at least buy one with user-swappable parts. Linux has become much easier and wasn’t too bad to use even a decade ago, but it was nice having a non-Apple computer running programs and getting work done that was just there to do the business.

I’m speaking as one that attempted to drink the kool-aid for a few years after Apple stopped using user-swappable batteries, memory, and disk; their hardware upcharges are pure asshole insanity. I’m fully capable of using Linux, compiling my kernel, modifying driver source to work around problems, but I don’t want to when I’m just trying to pay my bills. Streaming media services come and go with Linux support, and hardware support is often lacking until the work is done to make the hardware behave correctly. Windows, for all its … windowsness … worked. Until the last 8 months, when they decided to put a Molotov cocktail under the hood and see what happens.
Apple is headed this way too, now that they don’t have SJ to errantly blow up the current tech to try something new and random (although, had he survived his cancer, he’d have just gone Musky with age like a lot of that generation has, mmmm leaded gas!) Apple will hold on just a bit longer because iOS gave them one new platform reboot (ish) to live off of, while Microsoft is still kicking around technical debt until the end of time.
Oh, edit though, I’ve been migrating my machines to Linux one by one now. Not going to bother sticking around to see that Windows train wreck continue.
Just to point out: even on the operating system/platform the YouTube app comes from, it’s pointless. Works fine in a browser.
Without band 71 (used for both LTE and 5G depending on the part of the country) you’ll likely see more no-service scenarios in rural areas, but if you’re primarily metro, those will be exceedingly rare.
There’s no reason to even use the YouTube app. One of the first things I uninstall on Android.
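For anyone wanting to do the same on a phone where it ships as a system app (no regular uninstall option), it can be removed per-user over adb without root; the package name below is the standard one, but it's worth confirming on your device first:

```shell
# Confirm the package name on your device.
adb shell pm list packages | grep youtube

# Remove the preinstalled YouTube app for the current user (no root).
# It stays on the system partition and returns after a factory reset.
adb shell pm uninstall --user 0 com.google.android.youtube

# Or keep it installed but inert:
adb shell pm disable-user --user 0 com.google.android.youtube
```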
Ouch, no band 71 at all. That’ll hurt T-Mobile extended range and indoors as it is their largest low frequency band. The EU version at least supports one T-Mobile 5G band, and their largest at that (41).
Bands 12 and 13 will help low-band scenarios with AT&T and Verizon respectively. No band 14 means no AT&T service in rural areas like (my always go-to example) western Nebraska, where it is AT&T’s only low-band frequency.
Definitely not the worst band support, but not great either.
Too bad the US is such a toxic environment for cool phones with all the carrier-induced “certification” they put in the way to prevent low-volume and niche manufacturers from bothering.
Websites like that are so annoying; they couldn’t even be bothered to find an image or video of LineageOS running on a Switch?
This and many others are reasons a switch to Linux has been so joyful. No more Windows trying to guilt me, nag me, push me, trick me, abuse me to use shit the way they want. It’s so much more…quiet.
Yeah, apologies, I was being a bit glib there. Honestly, I kinda subscribe to the Star Trek: Insurrection Ba’ku people’s philosophy. “We believe that when you create a machine to do the work of a man, you take something away from the man.”
While it makes sense to replace some tasks like dangerous mining or assembly line work away from humans, interaction roles and decision making roles both seem like they should remain very human.
In the same way that nuclear missile launches during the Cold War always had real humans as the last line before a missile would actually be fired.
I see AI as becoming specialized tools for each job. You’re repairing a lawn mower? You have an AI multimeter-type device that you connect to some test points and converse with in some fashion to troubleshoot. All offline, and very limited in capabilities. The tech bros, meanwhile, think they created digital Jesus, and they’re desperate to figure out which Bible to jam him into. Meanwhile, corps across the planet are in a rush to get rid of their customer service roles en masse. Can you imagine 911 dispatch being replaced with AI? The human component is 100% needed there. (Albeit an extreme comparison.)
So more an iterative family member, which I suppose is more what I’d expect given how Microsoft historically handled programming languages. Still interesting! Thanks for the fact-check!
Reimagined
Ftfy (/s)
Funny thing is, the CEOs are exactly the ones to be replaced with AI. Mediocre talent that is sometimes wrong. Perfect place for an AI, and the AI could come to the next decision much faster at a fraction of the cost.
I did not realize they were one and the same!
Having to do the meta-workaround of running another computer to make your computer usable is just…don’t get me wrong, I love running infrastructure, but that seems like it should be unnecessary just to use a computer.
Hahaha! I’ve been dabbling in live USB thumbdrive copies of various flavors of Linux to see which one I want to go to for a while. Did a few years back and thought, “you know, my time is worth something to me, maybe I’ll give Windows a go, 10 seems pretty stable.”
Booted up Debian with Cinnamon and couldn’t get two-finger right click to work with the out-of-box Synaptics config; it had a few arbitrary prefs for whatever the devs decided people would probably use. Tried Debian with GNOME. It had trackpad settings that were more in line with what I expected… Not giving up, but it did make me pause, because I know one can reconfigure the trackpad driver under the hood, but did I really want to jump down the rabbit hole of bespoke shellscripts again just so my audio driver correctly wakes from sleep (if it can even successfully sleep)?
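For what it’s worth, on a libinput-based GNOME or Cinnamon setup this particular pref is usually a one-liner rather than a shellscript rabbit hole; the schema paths below are from memory, so verify them with `gsettings list-recursively` before trusting them:

```shell
# GNOME: make a two-finger tap/click act as right click
# instead of a bottom-right "area" click.
gsettings set org.gnome.desktop.peripherals.touchpad click-method 'fingers'
gsettings set org.gnome.desktop.peripherals.touchpad tap-to-click true

# Cinnamon keeps equivalent keys under its own schema.
gsettings set org.cinnamon.desktop.peripherals.touchpad click-method 'fingers'
```

(The legacy Synaptics X driver ignores these; they apply once the distro is on libinput, which current Debian desktops are.)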
Another funny thing to figure out: the computer has an iGPU and a dGPU, both were active, and the battery life was maybe 2 hours. Another thing to sort out with bespoke configurations.
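On the dual-GPU point, it’s at least quick to check which GPU is actually doing the rendering before diving into bespoke config; these assume Mesa’s glxinfo (mesa-utils) and the switcheroo-control service are installed, which may not be true out of the box:

```shell
# Which GPU is the default renderer right now?
glxinfo | grep "OpenGL renderer"

# List detected GPUs and which is the default; GNOME's
# "Launch using Discrete Graphics Card" uses this same service.
switcherooctl list

# Rough per-device power state, where the driver exposes it
# (D3cold generally means the dGPU is actually powered down).
cat /sys/class/drm/card*/device/power_state
```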
So it’s like, Windows and Linux (and, to a lesser extent, macOS) pain is definitely there; it’s just a question of which kind of pain you want to subscribe to. Linux pain will probably only occur during initial setup and maybe every few years when a major OS release comes out. macOS pain is even rarer, unless a major OS release ships something you don’t like and you have to find where in the OS frameworks the feature lives to disable it, if they even expose hooks to do so. Windows pain is… every Tuesday.
“Oh here’s a new lock screen weather widget”
“Oh cool, I can get on board with that!”
Next week:
“Oh, here’s a new stocks and news widget to go along with the weather.”
“Hold on there buddy, I didn’t sign up for the first and you’ve pushed two more? Time to shut those two off. Oh, it’s all or nothing, thanks! Nothing it is.”
“Don’t worry, we’ll reinstall Dev Home next week and flag it as a system app so you can’t uninstall it, and then we’ll force Copilot to be present, and then we’re going to screw with the Start menu, and then we’re going to delete WordPad, and reinstall all those Office/cloud 365 shim apps, and, and, and.” That was, like, last month.
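The forced-Copilot part at least had a documented group-policy registry switch at the time; Microsoft has reshuffled how Copilot ships more than once since, so treat this as a period-accurate sketch rather than a guaranteed fix:

```shell
# Disable Windows Copilot for the current user via the documented
# policy key (run from an elevated PowerShell/cmd, then sign out/in).
reg add "HKCU\Software\Policies\Microsoft\Windows\WindowsCopilot" /v TurnOffWindowsCopilot /t REG_DWORD /d 1 /f
```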
The walled garden needs to be destroyed. Compute devices should not be bound to the whims and fancies of their manufacturer. I’m honestly surprised it has gone on so long.