Computers and the internet gave you freedom. Trusted Computing would take your freedom.
Learn why: https://vimeo.com/5168045

  • 3 Posts
  • 645 Comments
Joined 1 year ago
Cake day: June 7th, 2023

  • atomic has had a meaning in IT for a very long time, don’t pretend it’s some made-up bullshit. with this thinking we could just throw out the words mutable/immutable too. what, is my computer radioactive and I’ll get cancer from it? of course not, because the word means something different with computers, and people in the know (not even just professionals, because I’m not one) know it.

    atomic means that if multiple things are to change, they either all change at once, or, if the task failed, none of them change.
    sometimes these are called transactions; suse calls it transactional updates. but is that any better? now the complaint will be that suse must have transacted away all the money from your bank account!

    and distros are obviously not immutable, that’s just plainly misleading. we update them, someone does that daily. updating requires them to be mutable, to be modifiable.
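    a minimal sketch of what the all-or-nothing property looks like in practice (filenames here are made up): write the new version to a temporary file, then rename it over the old one. on POSIX filesystems the rename is the atomic step, so a reader sees either the old content or the new content, never a half-written mix.

```python
import os
import tempfile

def atomic_write(path, data):
    """Replace the file at `path` with `data`, all-or-nothing."""
    # the temp file must be on the same filesystem as the target,
    # because rename is only atomic within one filesystem
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "w") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # make sure the new content hit the disk
        # the atomic step: readers see the old file or the new one, never a mix
        os.replace(tmp, path)
    except BaseException:
        os.remove(tmp)  # failed: the old file is untouched
        raise
```

    this is the same idea scaled down: an "atomic" distro just applies the whole update off to the side and then switches over in one step, instead of mutating files in place.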






  • since your CPU has 16 threads (“cores”, but not really cores, you probably only have 8 of those), if a process uses up the full capacity of a single core, it will show 100/16 = ~6% cpu usage. In my experience looking for this really works… at least on windows, please don’t hurt me. it should on linux too, but there I don’t have it in such a visible place.

    this may not work as well though when your system is under a higher load and the process you’re looking for also has a higher CPU usage, like 30% or something.
    in this case you’ll want to look at the cpu usage of the individual threads of the processes with higher cpu usage. if a process has a thread with 6% cpu usage (in the case of a 16-hardware-thread cpu), then that process is at fault. by looking at the name of the thread you may even find out what its purpose is.
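    the arithmetic above as a tiny sketch: the overall percentage that one fully busy thread shows up as, when the tool (top, Task Manager, etc.) averages over all hardware threads. the function name is made up for illustration.

```python
import os

def single_core_ceiling(hw_threads=None):
    """Percent of *total* CPU that one saturated thread appears as
    when usage is averaged across all hardware threads."""
    n = hw_threads or os.cpu_count()  # default to this machine's count
    return 100.0 / n

# on a 16-thread CPU, a process pinning one core hovers around 6.25%
print(single_core_ceiling(16))
# on an 8-thread CPU the same situation shows as 12.5%
print(single_core_ceiling(8))
```

    so the tell-tale sign is a process stuck at almost exactly this ceiling: it wants more CPU but a single thread can only ever get one core’s worth.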






  • oh and if you’re interested in archiving, definitely check out the Archive Team!

    they are always running archiving projects, they even participate in preserving reddit content, and they have a connection with archive.org and the Wayback Machine.
    they maintain a virtual machine image that you can run at home, even on a simpler PC, to help with their projects. It does not actively consume much storage, only some network bandwidth. It’s basically a distributed archiving tool: a lot of people running it download all kinds of data (good for performance and to avoid restrictions) for the selected project and upload it to AT for preservation.


  • Oh my sweet summer child, 12 GB is not a lot! :)

    well, for one you can start saving webpages you found helpful, maybe your useful links collection or bookmarks, if you have any of those. I would recommend using Firefox and the singlefile addon, or the webscrapbook addon. feel free to look into their settings, but don’t let it overwhelm you; if you need to, take it in smaller pieces, there’s no shame in it.
    since this is mostly text, it shouldn’t take up that much space quickly, and it’s also very efficiently compressible! for example with 7zip.

    if you often watch videos, on YouTube or elsewhere, and you find something useful or otherwise worth preserving (entertainment is also a valid reason), you can grab those too. have a look at yt-dlp: it’s very versatile, very configurable, and not only for youtube.
    but this will easily take up a lot of space, videos are huge and not really compressible losslessly.
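    to give a feel for the compression point: a quick sketch using Python’s lzma module (the same LZMA family that 7zip uses by default). repetitive saved-page text shrinks enormously; random bytes, which stand in here for already-compressed video, don’t shrink at all. the sample data is obviously made up.

```python
import lzma
import os

# text-like data: saved pages are full of repetitive markup
text = ("<p>some saved article text, lots of repetitive markup</p>\n" * 2000).encode()
# video-like data: random bytes, a stand-in for already-compressed media
noise = os.urandom(len(text))

for name, data in [("text-like", text), ("video-like", noise)]:
    packed = lzma.compress(data)
    print(f"{name}: {len(data)} bytes -> {len(packed)} bytes")
```

    the text sample compresses to a tiny fraction of its size, while the noise comes out the same size or slightly bigger, which is why archiving video eats disk space no matter what you do.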

    other than that, have a look at the DataHoarder community: [email protected] (I hope the link works). for even more, you may check the datahoarder and opendirectories subreddits through libreddit/redlib