I dunno, I mean are the train company allowed to take my money and then go “sorry we fell out with the fuel company so we’re just gonna keep your money and not take you to your destination. Soz babe x”
No one is suggesting that open source is inherently less secure
Unfortunately, I’ve met a number of people who genuinely do believe this! It’s the same demographic who don’t know how copy and paste works, or who take photos of stuff on their monitor instead of print-screening — and they tend to end up running large corporations even though they’re completely out of touch.
There is also a lot of “security by obscurity” in the corporate/fintech world - “it’s open source so everyone can see the code which makes it less secure”. The inverse is often true thanks to Linus’s Law.
This is as transparent as hell. It reminds me of a TV show where a bunch of idiots plot to murder someone so they decide that if they all pull the trigger together, none of them are “technically” the murderer. Of course, that just meant they were all culpable.
It’s only a few layers of abstraction above “we didn’t ban these books, we flipped a coin to decide whether to ban them and fate chose tails…”
Pathetic.
I used to agree until I saw corporations starting to fork open source projects to run them internally like the “I made this” meme.
If I spend months or years of my life toiling over a project and license it permissively with MIT or such, they can just swoop in one day and take it for free and be like “thanks, we’re going to make mega bucks off your code and give you nothing” (and yes this does happen https://www.elastic.co/blog/why-license-change-aws).
No, screw that! I’m gonna make my stuff AGPL and those guys can damn well pay me for my time if they want to use my stuff. Or, more cynically, they’ll do it anyway or go and reimplement it themselves in-house, knowing damn well I can’t afford an army of lawyers to actually do anything about it.
Counter-counterpoint: frontend JS code is often minified so that it’s smaller and more efficient to transfer to the browser. For FOSS projects you should still be able to get that code, unminified, from the project’s git repo — in the same way that desktop apps often ship as binary executables but you can still see the code that was compiled to build them if you find the source repo.
It does make things harder to debug for the average user, but it makes things faster/more efficient to run for most end users (and in the case of desktop or phone apps, it makes them possible to run without compiler toolchains that mom and pop likely wouldn’t be able to grasp).
The key thing isn’t whether the artifact the end user’s computer runs is readable and editable, but whether the code used to build that artifact is easily available and what restrictions there are on editing and redistributing it.
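A crude illustration of the minified-vs-source point: this toy “minifier” just strips comments and blank lines (real minifiers also rename variables, rewrite syntax, etc.), but it shows how the shipped artifact can be smaller and less readable while behaving identically to the source:

```python
import re

# Hypothetical "source" with comments a human would want to read.
source = """
# compute total with tax
def total(price, tax_rate):
    # tax_rate is e.g. 0.2 for 20%
    return price * (1 + tax_rate)
"""

# Toy minification: drop comments and blank lines. Smaller to ship, same behaviour.
minified = "\n".join(
    re.sub(r"#.*", "", line).rstrip()
    for line in source.splitlines()
    if re.sub(r"#.*", "", line).strip()
)

scope_src, scope_min = {}, {}
exec(source, scope_src)
exec(minified, scope_min)

# Both versions compute exactly the same thing.
assert scope_src["total"](100, 0.2) == scope_min["total"](100, 0.2)
print(len(source), ">", len(minified))
```

The point being: the minified artifact is what gets shipped, but the commented source is what you’d actually want in the repo.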
TL;DR The new method still requires his art.
LoRA is a way to add small low-rank adapter weights to a neural network that effectively allow you to fine-tune its behaviour. Think of it like a “plugin” or a “mod”.
LoRAs require training examples of the thing you are targeting. Lots of people in the SD community build them for particular celebrities or art styles by collecting examples of that celebrity or style from online.
So in this case Greg has asked Stable to remove his artwork, which they have done, but some third party has created an unofficial LoRA that does use his artwork to mod the functionality back in.
In the traditional world the rights holder would presumably DMCA the plugin but the lines are much blurrier with LoRA models.
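For the curious, the “low-rank” trick itself is easy to sketch: instead of retraining a big weight matrix W, a LoRA learns two small matrices A and B whose product is added on top. A toy numpy sketch (the dimensions, rank, and scaling factor here are all made-up illustration values, not anything SD actually uses):

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, r = 512, 512, 8   # r is the LoRA rank, much smaller than d
alpha = 1.0                     # scaling factor (hypothetical value)

W = rng.standard_normal((d_out, d_in))  # frozen base weights
A = rng.standard_normal((r, d_in))      # trainable "down" projection
B = np.zeros((d_out, r))                # trainable "up" projection, starts at zero

# Applying the LoRA: effective weights are the base plus a rank-r update.
W_eff = W + alpha * (B @ A)

# With B initialised to zero the model starts out unchanged...
assert np.allclose(W_eff, W)

# ...and after "training" nudges B, the update can never exceed rank r,
# which is why LoRA files are tiny compared to the full model.
B = rng.standard_normal((d_out, r))
update = alpha * (B @ A)
print(np.linalg.matrix_rank(update))
```

That’s why a LoRA can be distributed as a small separate file and bolted onto the base model — which is exactly what makes the takedown question so blurry.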
Megadeth - Hangar 18 https://youtu.be/rUGIocJK9Tc
It seems to be back now. I think it runs on a small server and quite often gets hug of deathed
Wow the enshittification is at full throttle across silicon valley! Guess those investors gotta get those returns now that interest rates are spiking!
Yeah that makes sense! I totally agree! Search is becoming pretty difficult these days!
API calls are almost always private between the caller and the endpoint (think Telegram bots or mobile apps). There isn’t really a technically feasible way for a crawler to somehow “infer” any kind of knowledge of how API calls are being used, unless the result has some publicly visible side effect (e.g. the program using the API generates a web page and uploads it somewhere crawlable). Google et al. go by how many links from other pages to the page of interest exist (inbound links), multiplied by a smattering of other signals like keyword quality, content length, etc.
That said, if you’re implying that the API changes mean that:
That is a plausible concern.
I think IPFS often unfairly gets lumped in with crypto bro shite but it seems to me like a pretty useful technology in many other contexts too.
I think it’s a fair concern. We’ve seen other parts of the fediverse successfully implement crowdsourced funding via Patreon and similar to keep Mastodon servers running, and I suspect if Lemmy remains “the place to be”, admins will have reasonable success with a similar model. Lemmy is super efficient and can support 100s of users on a single box, so I think if 1% of users paid like $5 a month you could probably still support the other 99% “for free”.
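Back-of-envelope, that maths works out — with the caveat that every number here is a made-up assumption (instance size, subscription price, and especially the $20/month server cost):

```python
# Toy sums for the "1% of users pay $5" model. All values are assumptions
# for illustration, not real Lemmy hosting costs.
users = 500            # a "100s of users on one box" instance
paying_fraction = 0.01
subscription = 5       # dollars per month
vps_cost = 20          # assumed monthly cost of a small server

income = users * paying_fraction * subscription
print(income, income >= vps_cost)  # 25.0 True — 1% of users cover the box
```

So even a small paying minority plausibly covers hosting, though admin time is the real unpaid cost.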
Yeah agreed, I felt like it just needs a bit more intelligence to auto-import and categorise data. They do have an auto-import plugin that uses bank APIs, but it’s tricky to set up and I always found it wasn’t all that reliable. I might go back and make some contributions to that project one day.
I spent a lot of time setting up firefly-iii, a really neat and feature-rich finance manager. It’s a really great piece of software by a very responsive and friendly dev but after about 6 weeks I still couldn’t get used to it and ended up going back to paying for YNAB.
I swear by memos now though - highly recommended. It’s like having a private twitter stream where you can send thoughts, notes and files that you want to store/refer back to.
Hey - I found the same thing WRT the docker files - the compose files from the official project are ever-so-subtly wrong.
Tagging a docker network as `internal` blocks outside network comms AFAIK, so the default compose file essentially puts the lemmy server inside its own little sandbox and prevents it from communicating with other servers.
The solution I found was to add lemmy to both the internal network and the external proxy network:
```yaml
## this is what the networks part looks like by default
networks:
  # communication to web and clients
  lemmyexternalproxy:
  # communication between lemmy services
  lemmyinternal:
    driver: bridge
    internal: true

# ... other stuff here

# the lemmy service inside your services: section
lemmy:
  image: dessalines/lemmy:0.17.3
  hostname: lemmy
  networks:
    - lemmyinternal
    - lemmyexternalproxy # this is the important addition
  restart: always
  environment:
    - RUST_LOG="warn,lemmy_server=info,lemmy_api=info,lemmy_api_common=info,lemmy_api_crud=info,lemmy_apub=info,lemmy_db_schema=info,lemmy_db_views=info,lemmy_db_views_actor=info,lemmy_db_views_moderator=info,lemmy_routes=info,lemmy_utils=info,lemmy_websocket=info"
  volumes:
    - ./lemmy.hjson:/config/config.hjson
  depends_on:
    - postgres
    - pictrs
```
Another thing I noticed was that the documentation binds nginx to port 80, but the docker-compose provided binds to port 8536, which is the default port lemmy seems to listen on. I bound 8536 to my host machine and use Caddy as a reverse proxy (because it does Let’s Encrypt for you, which is nice).
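For reference, the Caddy side of that setup is tiny — something like the below (the hostname is a placeholder; this assumes Caddy v2 and that port 8536 is published on the host as described above):

```
lemmy.example.com {
    reverse_proxy localhost:8536
}
```

Caddy fetches and renews the TLS certificate for that hostname automatically, which is what saves you the nginx + certbot dance.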
(Writing to you now from my self-hosted instance which I set up with the above notes)
In the early years Boston was essentially Tom Scholz, who played pretty much all of the instruments on their demo tape — with Brad on vocals, of course (RIP). Then they had to go out and hire a band to actually tour with!