Mossy Feathers (They/Them)


  • 0 Posts
  • 232 Comments
Joined 1 year ago
Cake day: July 20th, 2023





  • I’m… honestly kinda okay with it crashing. It’d suck, because AI has a lot of potential outside of generative tasks, like science and medicine. However, we don’t really have the corporate ethics or morals for it, nor do we have the economic structure for it.

    AI at our current stage is guaranteed to cause problems even when used responsibly, because its entire goal is to do human tasks better than a human can. No matter how hard you try to avoid it, even if you do your best to think carefully and hire humans whenever possible, AI will end up replacing human jobs. What’s the point in hiring a bunch of people with a hyper-specialized understanding of a specific scientific field if an AI can do their work faster and better? If I’m not mistaken, normally having some form of hyper-specialization would be advantageous for the scientist because it means they can demand more for their expertise (so long as it’s paired with a general understanding of other fields).

    However, if you have to choose between 5 hyper-specialized and potentially expensive human scientists and an AI designed to do the hyper-specialized task (plus 2~3 human generalists to design the input and interpret the output), which do you go with?

    So long as the output is the same or similar, the no-brainer would be to go with the 2~3 generalists and AI; it would require less funding and possibly less equipment - and that’s ignoring that, from what I’ve seen, AI tends to be better than human scientists in hyper-specialized tasks (though you still need scientists to design the input and parse the output). As such, you’re basically guaranteed to replace humans with AI.

    We just don’t have the society for that. We should be moving in that direction, but we’re not even close to being there yet. So, again, as much potential as AI has, I’m kinda okay if it crashes. There aren’t enough people who possess a brain capable of handling an AI-dominated world yet. Too many people see things like money, government, economics, etc. as some kind of magical force of nature rather than as human-made systems which only exist because we let them.




  • Maybe? I guess when I think of asocial vs antisocial, I think of someone who is apathetic towards social interaction vs someone who doesn’t like socializing and only does it because it tends to be a mandatory part of society. As such, I’d think someone who’s asocial would be indifferent to a host cancelling, while someone who’s antisocial would be happy or excited by it.

    Edit: I am aware that antisocial tends to have different connotations when used generally, but in a social setting I tend to think of it as being opposed to socializing.


  • The comic made me chuckle, but why has introversion become synonymous with being antisocial or asocial? My understanding is that it’s entirely possible to be highly social and introverted, because being introverted just means you gain energy from being alone. It doesn’t mean you hate social gatherings or don’t like having friends; it just means you discharge when socializing and recharge during alone time (and the opposite is usually true for extroverted people).

    I wanted to point that out as I seem to be a social introvert. I like socializing and love being invited to things (even if I’m not available or it’s something I don’t like doing, because it means someone remembered that I exist), but my battery wears out fairly quickly when doing so. Strangely, I used to be very extroverted, but at some point I switched to being introverted.

    Edit: I will say that the thing the comic gets right is that I usually won’t hold it against someone for cancelling. I don’t get excited about it, though, and I might even be disappointed if it was something I was looking forward to, but I’m usually okay with it.

    Edit 2: made a small addition in bold.


  • You’re the one contradicting yourself when you’re saying that linux requires a Translation layer. And the translations are not always 1:1. Please show me the benchmarks.

    How is this a contradiction? It seems like it’d be the opposite. Translation layers reduce performance as they translate programs from one system to another, so the fact that Linux can run games through a translation layer and still get performance as good as, or better than, Windows means that Linux is fast enough to make up for the translation layer’s performance penalty.

    Regardless, here are some benchmarks.

    From 2019, Windows 10 vs Pop!_OS:

    https://www.forbes.com/sites/jasonevangelho/2019/07/17/these-windows-10-vs-pop-os-benchmarks-reveal-a-surprising-truth-about-linux-gaming-performance/?sh=6035a5e65e74

    While these are all in 1080p, several are also running in translation layers. The ones running natively were faster on Linux, while the ones running in Proton achieved roughly the same performance. This was also 4~5 years ago, and Proton has improved a lot. Additionally, these were run on an Nvidia card using their proprietary drivers, and Linux is known to be AMD-biased.

    So here’s another one from a couple of years ago with Windows 11 vs Manjaro (benchmark totals for 4K, 1440p, and 1080p at the end): https://m.youtube.com/watch?v=xwmNLqJL7Zo

    While they found that games tended to perform better on Windows at 4K, they also found that games at 1440p were roughly the same, while 1080p averaged faster on Linux despite running in a mix of Proton, Proton-GE, and Wine. This is also a couple of years old, though, and while the average might be better on Linux, there were some pretty significant performance gaps at the top and bottom of the chart.

    Here’s a third one from about 6 months ago. This was pretty highly circulated on Lemmy, so I’m surprised you didn’t see it, but here it is:

    https://discuss.tchncs.de/post/5340976

    They claim to have seen an average 17% improvement on the games they benchmarked, and included a video of the benchmarks. There was a later benchmark where they claimed they got +20% performance using a tweaked version of Garuda Linux, but that required user tweaks and I’m mainly concerned with “un-tweaked” performance.


    Linux isn’t perfect, and if you want to play games with no hassle, then Windows is probably still your best bet. However, in situations where you’re trying to squeeze as much performance as you can out of an underpowered device, Linux just seems like the obvious choice. Standardized hardware lets you spend the time and effort to iron out bugs and deficiencies with fewer edge cases than you’d get with non-standardized hardware. I think that’s why SteamOS on the Steam Deck is so good: it runs on standardized hardware, so it’s easy for Valve to configure and optimize for user-friendliness because they don’t have to worry about ten billion different hardware configurations.

    Also, as a side note, I’ve found that older games just run better on Linux. They ironically tend to be way less of a hassle to get working. It’s because Wine (and, I think, Proton/Proton-GE) has compatibility for 16-bit programs, while modern 64-bit Windows doesn’t. There you have to run a virtual machine with Windows XP or earlier to run 16-bit programs, and I’ve found that to be a mess.
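    For what it’s worth, launching one of those old programs through Wine is about as simple as it gets. Here’s a rough sketch of how I’d script it (Python purely for illustration; the installer path is made up, and it assumes wine is installed and on your PATH):

        # Rough sketch: launching a 16-bit installer through Wine.
        # Assumes `wine` is on PATH; the path below is hypothetical.
        import subprocess
        from pathlib import Path

        installer = Path.home() / "old-games" / "SETUP.EXE"  # hypothetical 16-bit installer

        # Wine handles the Win16 calls itself, so no Windows 98 VM is needed;
        # a non-zero return code just means the program (or Wine) reported an error.
        result = subprocess.run(["wine", str(installer)], check=False)
        print(f"wine exited with code {result.returncode}")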

    Seriously, I cannot get a Windows 98 virtual machine set up on Windows 10 to save my life. It just won’t install properly in software like VMware, and I’ve had to resort to actual PC emulators (which are slow as fuck) to get 16-bit games to run on a modern Windows PC. I’ve read it has something to do with AMD CPUs? I don’t know what the specific issue is, though, just that it supposedly works fine on Intel but not AMD. However, I haven’t encountered that mess on Linux.

    Edit: as an amusing side-side note, I’m old enough that a number of my favorite games from when I was growing up can no longer run on Windows because they require a 16-bit OS (or a 32-bit OS with 16-bit compatibility). Despite that, my grandfather’s Hoyle card game, which is older than I am, still somehow runs flawlessly on Windows 10. What the fuck?


  • Many of the games are made to be run on windows, windows is still a effecient os, it’s just a lot of bloat, which can be disabled.

    A) As someone else pointed out, “bloat” and “efficient” are at odds with one another. You can argue that Windows is efficient in some areas and bloated in others, but “bloat” and “efficiency” are mutually exclusive when applied generally.

    B) Yes, most, if not all of it, can be disabled through registry edits and third-party hacks (there’s a rough sketch of one such registry tweak at the end of this comment). However, in my experience, the more you try to debloat Windows, the less stable it gets, and that instability sticks around even after updates reinstall/re-enable the bloat you removed, at which point you get to disable it all again and the OS gets even less stable. Granted, this was with Windows 10, but I imagine the same is more or less true for Windows 11.

    Also a lot of optimizations in nt has been done for gaming, features which are missing in the linux kernel, but there are RFCs to add nt like synchronization primitives, in the linux kernel.

    C) And yet, iirc, recent Linux vs Windows 11 benchmarks show Windows games running on Linux via Proton/Proton-GE anywhere from slightly slower to slightly faster than on Windows, despite requiring a translation layer to run, while Linux-native games typically run faster than their Windows counterparts.

    Windows is just that bloated.
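    As mentioned in point B, here’s a rough sketch of the kind of registry edit I mean, again in Python just for illustration. The key and value name are the commonly cited “disable consumer features” policy; I believe they’re correct, but treat the exact path and name as assumptions and double-check them before running (it needs admin rights, and it’s Windows-only):

        # Rough sketch of a debloat-style registry tweak (Windows-only, run as admin).
        # The key/value below are the commonly cited "disable consumer features" policy;
        # verify the exact path and name before relying on this.
        import winreg

        KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\CloudContent"

        # Create (or open) the policy key under HKEY_LOCAL_MACHINE.
        with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
            # 1 = disable the "suggested apps"/promoted content that updates like to re-add.
            winreg.SetValueEx(key, "DisableWindowsConsumerFeatures", 0, winreg.REG_DWORD, 1)

        print("Policy written; sign out or reboot for it to take effect.")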





  • They’ve come up with a way they could do it. I dunno why you’re mad about that; I just wanted to share an interesting tidbit I’d learned.

    My understanding is that the reason scientists like playing with the idea is that it’s more feasible than it immediately seems, and it’d solve some of the issues a Mars colony would have (increased solar radiation due to the low atmospheric density and weak magnetic field, as well as very low gravity).

    Would it be expensive? Yeah. We’re talking about colonizing another planet, though. It’s already going to cost hundreds of billions, if not trillions, to do.





  • Researchers are unsure about the causes for the behaviour, but theories include that it is a playful manifestation of the mammals’ curiosity, a social fad or the intentional targeting of what they perceive as competitors for their favourite prey, the local bluefin tuna.

    Wasn’t there a big thing about how they suspected these attacks might have started with a mother watching her baby get maimed and killed by a yacht? Like, I swear it was a thing, but I can’t find any articles about it and Wikipedia says roughly the same thing as the quoted paragraph.

    I will say though, that I found this bit from Wikipedia interesting:

    Researchers have also suggested that the behaviour could be a “fad”. Other such cultural phenomena among orcas have been short-lived, such as in 1987 when southern resident orcas from Puget Sound carried dead salmon around on their heads.

    Fascinating.