Fushuan [he/him]

Huh?

  • 1 Post
  • 242 Comments
Joined 1 year ago
Cake day: July 1st, 2023

  • Because it was the west that decided to invade in 2014, it was the west that made Ukraine feel so fucking vulnerable for 8 years that they decided to turn to NATO, and then it was the west again who sent tanks into Kiev. Yeah.

    I’m not too blind to see that this benefits the US, but honestly, it looks like an opportunity for the west, not something they provoked. If Russia didn’t want the west to benefit so much from it, maybe they shouldn’t have invaded in 2014 just because the people of a separate sovereign country revolted against the Russian puppet president.


  • Shared pointers are used in multithreaded code. Imagine that you have a process controller that starts and manages several threads, which then run their own processes.

    Some workflows might demand that an object is instantiated from the controller and then shared with one or several processes, or one of the processes might create the object and then send it back via callback, which then might get sent to several other processes.

    If you do this with a raw pointer, you might end up in a race condition over when to free that pointer, and you will end up creating some sort of controller or wrapper around the pointer to manage which process is using the object and when it’s time to free it. That’s a shared pointer: they made the wrapper for you. It manages an internal counter for every instance of the pointer; when an instance goes out of scope the counter goes down, and when it reaches zero the object gets deleted.

    A unique pointer is for when, for whatever reason, you want processes to have exclusive access to the object. You might want the guarantee that only a single process is interacting with the object, because it doesn’t handle being manipulated from several processes at once. With a raw pointer you would need to code a wrapper that ensures ownership of the pointer and provides ways to transfer it, so that you know which process has access to it at every moment.

    In the example project I mentioned we used both shared and unique pointers, and that was in my first year of the job where I worked with C++. What was your job like that you didn’t see the point of smart pointers after 7 years? All single-threaded programs? Maybe you use some framework that makes the abstractions for you, like Qt?

    I hope these examples and explanations helped you see valid use cases.


  • It’s not really about the hardware, is it? The option you mentioned won’t enable an alternative app store, and it won’t enable access to Android app emulators (which would be a huge boon for the open source app offering). The level of trust iPhone users give to Apple is wildly higher than what Android users who tweak their phones give their manufacturers. It is what it is, but don’t delude yourself into thinking that it’s about what they do at the kernel level. It’s about the fact that they store tons of sensitive data on their American servers and that they have an obligation to share that data with the country, and as someone from Europe that doesn’t sit well with me.


  • That stalker would have to have access to your Google account to do so, and you are utterly fucked by that point if that’s the case. Like, why would they need to install a tracking app when Google’s find-my-phone feature just gives them the info? Anything the phone stores that isn’t recorded by Google pales in comparison to what they have access to with your account.

    That’s like saying that you’re saving money by buying a kilo of salt that lasts a year because it’s 50 cents cheaper. Yeah, you technically saved money, but it’s so irrelevant in the grand scheme of things that you shouldn’t even consider it.


  • Oh yeah, I’ve read and heard of plenty of people saying that they definitely notice it. I’m lucky enough not to, because most ARPGs don’t run at 60 FPS during intense combat, let alone 120 FPS on an RTX 3080 lmao.

    I was talking more about the jump to 240 Hz and beyond, where I find it surprising that people notice the upgrade during intense gaming encounters, not while calmly checking or testing. I guess there are people who do notice, but again, running games at such high frame rates is very expensive for the GPU and a waste most of the time.

    I’m just kinda butthurt that people feel like screens below 120 Hz are bad, when most games I play hardly run at a smooth 60 FPS, because the market will follow, and in a few years we will hardly have what I consider normal monitors, and the cards will just eat way more electricity for very small gains.