Someday I will impress someone with this knowledge, and then I’ll look back at the time I’ve wasted scrolling Lemmy and tell myself it was worth it.
Why does your friend not just use a grocery delivery service?
I think it’s pretty easy to draw connections among “unable to leave the house out of fear,” “unable to work,” and “unable to afford food/grocery delivery services.”
Also, digital fingerprinting would easily identify individuals who would not deliver groceries to their immigrant neighbors. Pick out the more vocal members of the remaining people, find which ones are active in community social media groups, and you have a list of people that is much smaller than just “white people.”


If my 3600X has taught me anything, it’s that the AM4 platform is truly a long-haul legend.
For me, it’s rarely ever furries. Usually it’s the waifu posts that get the hammer from me, mostly because I scroll this thing in public and around people I know.


I’m struggling to see where the person you replied to ever suggested that it was Aldi’s original idea. It was Aldi’s decision to keep those things, which are common in Europe, in a market where they are not common (the US).
“I love several of their business decisions”
not
“I love Aldi’s idea of [insert list]”
That is the point they are making.


Grok must be tired of switching between mechahitler mode and trying to logically think through questions.
They’re just not compatible, and yet somehow they keep trying to force it.
(I know, LLMs do not have feelings or get tired)


Oh, certainly. The reason I focused on speed is that an idiot using a shoddy LLM may not notice its hallucinations or failures as easily as they’d notice its sluggishness.
However, the meaningfulness of the LLM’s responses is a necessary condition, whereas the speed and convenience are more of a sufficient condition (which contradicts my first statement). Either way, I don’t think the average user knows what hardware they need to leverage local AI.
My point is that this “AI” hardware gives a bad experience and leaves a bad impression of running AI locally, because 98% of people saw “AI” in the CPU model and figured it should work. And thus, more compute is pushed to datacenters.
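If anyone wants to put a number on “sluggish,” here’s a minimal sketch of how I’d eyeball local generation speed. It assumes llama-cpp-python and a GGUF model on disk; the model path and prompt are placeholders, not a recommendation.

```python
# Rough local-LLM throughput check: a minimal sketch, assuming llama-cpp-python
# is installed and a GGUF model exists at the (placeholder) path below.
import time

from llama_cpp import Llama

MODEL_PATH = "models/some-7b-model.Q4_K_M.gguf"  # placeholder path

llm = Llama(model_path=MODEL_PATH, n_ctx=2048, verbose=False)

prompt = "Explain in two sentences why SSDs are faster than hard drives."

start = time.perf_counter()
result = llm(prompt, max_tokens=128)
elapsed = time.perf_counter() - start

# llama-cpp-python returns an OpenAI-style completion dict with token counts.
completion_tokens = result["usage"]["completion_tokens"]
print(f"{completion_tokens} tokens in {elapsed:.1f}s "
      f"({completion_tokens / elapsed:.1f} tok/s)")
```

Single-digit tokens per second is roughly the kind of sluggishness I mean.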


Infinitely better, as long as your network and encoding are set up properly. At the very least, you won’t need the ice bricks.


I feel like CPUs marketed as “AI capable” are a sham. For the average user, it’s just going to feel slow compared to cloud compute, so all it really does is train people not to bother buying AI-labelled hardware for AI.


I heard that when you roll into a military base by accident, you can’t just turn around and leave. Did they have to let you in and escort your vehicle out?
I was thinking this. Given the choice, I’d vibe code it and start from scratch. As cool as it seems to work with FORTRAN, I’d probably hit a brick wall much sooner, and harder.


But did you drive up the stairs? It sounds like an exhilarating experience!
But what if the audio fucking up in Teams ends up costing a big man the wealth he’s entitled to?


Hah. I never even said the comment hurt me.
I’m just speaking on behalf of those who have been hurt. If you can’t recognize why someone might do that, you’re either a rape apologist or you’re an incel troll. Probably both.


They got a bit of a reprieve with the surge of SFF PCs, but not much.


I’ve never tried it, but this method looks interesting. I wonder if it would work with clear primer instead of paint.
https://www.instructables.com/Easy-Way-to-Smooth-PLA-No-Sanding-No-Chemicals/


Because some people have digital libraries but no hardware to run them on.
At the very least, this is a loss in gaming accessibility on the cost front, since a month of GeForce Now used to be a decent backup for when my gaming system was down (had to RMA a GPU) or a friend wanted to test the PC gaming waters.


…do we really need to use the word “raping” to talk about PC performance or can we agree that there are a hundred other words that fit better in that spot?
EDIT: Wooo free downvotes. Y’all are a bunch of snowflakes. I tried to make a point on behalf of others, since there are people who have traumatic lived experience with the concept of “rape” and would probably prefer not to be reminded of it. Nobody serious about computing is going to go out and say “this process is raping the performance” because it’s just not a good idea. I bet you (if in tech field) wouldn’t say it in front of your boss. But sure, call me sensitive and pull out a semantic argument.
I can smell you through your screen. Go take a shower and try being human.
4K vs 8K on a 49" screen across the room is going to be much less noticeable than 4K vs 1080p on a 24" screen a foot or two away (which is dancing right around the retina boundary).
I think an 8K 42" would make a great single monitor for productivity; I just can’t imagine driving 8K at idle is very efficient if there aren’t software/firmware solutions to recognize non-moving screens.
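Back-of-the-envelope version, if anyone wants to sanity-check the retina claim: compute pixels per degree for each setup and compare against the ~60 PPD usually quoted for 20/20 vision. The viewing distances below are my guesses, not measurements.

```python
# Quick pixels-per-degree (PPD) estimate for the screen setups above.
# ~60 PPD is the commonly cited 20/20-vision threshold; distances are guesses.
import math

def pixels_per_degree(horizontal_px: int, diagonal_in: float,
                      aspect: tuple[int, int], distance_in: float) -> float:
    """Horizontal pixels divided by the horizontal field of view in degrees."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horizontal_px / fov_deg

setups = {
    '49" 4K across the room (~96 in)': (3840, 49, (16, 9), 96),
    '49" 8K across the room (~96 in)': (7680, 49, (16, 9), 96),
    '24" 1080p at 2 ft (24 in)':       (1920, 24, (16, 9), 24),
    '24" 4K at 2 ft (24 in)':          (3840, 24, (16, 9), 24),
}

for name, args in setups.items():
    print(f"{name}: {pixels_per_degree(*args):.0f} PPD")
```

With those guesses, the 24" comparison straddles the ~60 PPD line (roughly 41 vs 82 PPD), while the 49" screen is already well past it at either resolution.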