• 0 Posts
  • 35 Comments
Joined 1 year ago
Cake day: June 18th, 2023

  • That just goes with the territory of having an iPhone. When you bought that device, you signed on to a culture of consumption that is enforced by the developer of that device.

    The developer can’t force Apple to let them give it to you for free. Apple doesn’t tolerate free very well, and anything that is free on Apple is likely either a privacy nightmare or is paid for by some subscription you have with Apple.

    This isn’t a problem with the app. It’s a problem with Apple.








  • If you maintain public goods for the good of the public, you have a lot less crime. It’s precisely because such extreme wealth is not paying to maintain public goods that we have the crime.

    The people destroying this stuff are doing it because they have been robbed of a place in society and their futures have been foreclosed to them.

    Building hostile, anti-human infrastructure, housing that costs 60 hours of work a week to live in, and unaffordable food that the government subsidizes to make MORE expensive are all not-so-subtle ways to tell these people that society does not value them.





  • It depends on what kind of RAM you’re getting.

    You could get a Dell R720 with two processors and 128 gigs of RAM for $500 right now on eBay, but it’s going to be several generations old.

    I’m not saying the model takes up an astronomical amount of space; it doesn’t have to store movies or even high-resolution images. It also isn’t expected to know every reference, just the most popular ones.

    I have a 120 TB storage server in the basement, so the footprint of this learning model is not particularly massive by comparison. But it does contain this specific Joker image in full. It’s not something that could have been generated without the original to draw from.

    In order to build a bigger model, they would need not just more storage but a new way of connecting more and faster RAM to lower-latency storage. LLMs are the kind of software that is hard to subdivide and distribute across purpose-built arrays of hardware.
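    The size comparison above can be sketched as a quick back-of-envelope. The 120 TB figure is from my setup; the model sizes below are hypothetical round numbers (weights stored as 2 bytes per parameter in fp16), not measurements of any particular model:

    ```python
    # Back-of-envelope: model on-disk footprint vs. a 120 TB storage server.
    TB = 10**12
    GB = 10**9

    server_bytes = 120 * TB

    # Hypothetical model sizes: parameter count * 2 bytes (fp16 weights).
    models = {
        "7B params, fp16": 7e9 * 2,
        "70B params, fp16": 70e9 * 2,
    }

    for name, size in models.items():
        pct = size / server_bytes * 100
        print(f"{name}: {size / GB:.0f} GB ({pct:.3f}% of 120 TB)")
    ```

    Even the larger hypothetical model is a fraction of a percent of that server’s capacity, which is the point: the weights are small next to bulk storage, yet what limits scaling is how fast RAM can feed them, not disk space.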