It was a great adventure. But yeah, that setup was on 24/7. Not because of compilation, but it definitely made a lot of this more feasible
Gentoo unstable was a little tiring in the long run. It was bleeding edge, but I often needed to downgrade because the rest of the libraries weren't ready
Gentoo stable was really great. Back then pulseaudio was quite buggy. Having a system where I could tell all applications and libraries to not even link to it (so no need to have it installed at all) made avoiding its problems really easy
But when my hardware got older and compiling LibreOffice started to take 4 hours, I remembered how nice it was on Slackware, where you just reinstall the package you broke and you're done
Arch looked like a nice middle ground: most things available as packages, a big focus on pure Linux configurability (plain /etc files, none of the Ubuntu (or SUSE?) "you need a working X.org to open distro-specific graphics card settings"), and the AUR for things that have no official packages. Turned out it was a match :)
Windows (~6 years) -> Mandriva (Mandrake? For I think 2-3 years) -> Ubuntu (1 day) -> Suse (2 days) -> Slackware (2-3 years) -> Gentoo unstable (2-3 years) -> Gentoo stable (2-3 years) -> Arch (9 years and counting)
The only span I'm sure about is the last one. When I started a job, I decided I didn't have the time to compile the world anymore. But the values after Windows sum up to 21 when they should be 20, so it's all more or less correct
If you want to access your computer from outside your LAN, it would be a good idea to at least secure it or, what is unfortunately the best option, learn to understand what you are doing
Coming back to the topic, though, I’d start with checking these out
The characters in the title are not regular ones, which makes it look like spam mail; there's no link, and the description sounds like corporate LLM output. If there really is a podcast somewhere, I think it deserves better
Two-way as in "upload too" or as in "another share from another device"?
https://f-droid.org/pl/packages/be.ppareit.swiftp_free/
https://f-droid.org/en/packages/com.daemon.ssh/
I create a directory, and that directory and its files become available to the network
So basically you want to set up an ftp server?
You mean like LocalSend, croc, Share via HTTP or ShareX?
I think I found an app some time ago that just set up an HTTP file server, so you could share whole directories. But I didn't have a use for it and forgot the name
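Not that app, but as a rough sketch of what such a server does, here's a minimal example using Python's standard library (would also work in something like Termux); the shared path is a placeholder:

```python
# Minimal sketch: serve one directory over HTTP so other devices on the LAN can browse it.
# Assumes Python 3.7+ (for the `directory` argument); "/path/to/share" is a placeholder.
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

handler = partial(SimpleHTTPRequestHandler, directory="/path/to/share")
HTTPServer(("0.0.0.0", 8000), handler).serve_forever()
```

Then any device on the network can browse the directory at http://your-ip:8000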
Unless there’s a bunch of people that create open-source firmware/HAL
But even then, I think it's still an improvement. Even if the firmware is proprietary, you could flash it onto a "titan-compatible" chip when yours dies, making your device independent of chip ownership, etc.
But the machine will not do the creative part. It can only fill in the time-sinks around our creative ideas. Ask an LLM to tell you a joke no one has ever heard before, and then google it. The creative part still has to come from humans
EDIT: and the truth is that we very rarely come up with something creative. We mostly just recombine combinations we've already encountered
trying to weasel out of putting some effort into something that sounds worth putting some effort into
But that depends on what they need it for
Personally, I don't see a difference between legalese boilerplate and a 10k-word story. But that discussion might lead us nowhere
What have you learned about text creation?
In many cases I don't want or need to learn that. I just need volume around the key points
Why is an LLM any different?
Let's say I want my RPG players to find a corporate mail that gives them some plot info. Why not ask an LLM to write the boilerplate around the info I want to give them? Just as an example
Let’s not put any effort into anything: the machine will do it for me
So you are not using a calculator, I presume? Only math done on an abacus is not being lazy?
If you want something local and open source, I think your main problem will be the number of parameters (the "B" thing, as in 7B). ChatGPT-3 is (was?) noticeably big, and open-source models are usually smaller. There is, of course, a debate about how much the size of the model matters and how much the quality of the training data affects the results. But when I did a non-scientific comparison about half a year ago, there was a noticeable difference between smaller models and bigger ones.
Having said all of that, check out https://huggingface.co/ ; it aims to be like GitHub for AI models. Most of the models are more or less open source; you will only need to figure out how to run one and whether you have bottlenecks on the Pi
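As a rough sketch of what "running one" can look like, here's a hedged example using the Hugging Face transformers pipeline. The model name is just an example of a small (~1B-parameter) model and the prompt is made up; swap in whatever fits your hardware:

```python
# Rough sketch: run a small open model locally with the Hugging Face transformers library.
# TinyLlama is just an example of a small model; on a Pi you'd likely want
# something even smaller or a quantized build instead.
from transformers import pipeline

generator = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")
result = generator(
    "Write a short corporate memo announcing a new coffee policy.",  # placeholder prompt
    max_new_tokens=120,
)
print(result[0]["generated_text"])
```

In practice, people running models on a Pi often skip the Python stack entirely and use quantized builds via something like llama.cpp, since it's much lighter on memory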
Haven't tested it, but it seems so. The Android client has the button too