Due to the “being allergic” part this is unfortunately not possible for me. But thanks for trying to provide me with a different perspective.
Fair point. Although one could argue it’s fine here for comic purposes.
The same argument could be made about the statement “God’s perfect creation”.
But I’d argue that invoking a creationist god distances it from scientific contexts even more, while simple speech bubbles are fine because they carry less potential for ideological conflict.
Admittedly, I am also rather allergic to religions, which is why I am having a difficult time with that part of the meme.
Me too. It’s a science meme community after all.
There’s no evidence for gods though.
Shit! They found me! crawls away
Rebranding a Markov Chain stapled onto a particularly large graph
Could you elaborate how this applies to various areas of AI in your opinion?
Several models are non-Markovian. There are also plenty of models and algorithms where describing them as, or even comparing them to, Markov chains would simply be incorrect and unsuitable.
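To make the distinction concrete, here’s a toy sketch (the states, probabilities and the “rule” are made up purely for illustration): a Markov chain’s next state depends only on the current state, while a non-Markovian process may depend on the entire history.

```python
import random

def markov_step(state, transitions):
    """Next state depends only on the current state (Markov property)."""
    choices, weights = zip(*transitions[state].items())
    return random.choices(choices, weights=weights)[0]

def history_step(history):
    """Next value depends on the whole history seen so far -> non-Markovian
    over this state space."""
    return sum(history) % 3  # toy rule: every past value matters

transitions = {"A": {"A": 0.1, "B": 0.9}, "B": {"A": 0.5, "B": 0.5}}
state = "A"
for _ in range(5):
    state = markov_step(state, transitions)

history = [1]
for _ in range(5):
    history.append(history_step(history))
print(state, history)
```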
The level of your argumentation:
Are you a firefighter or a medical doctor? If not, you’re obviously in favour of fires, death and disease.
Why aren’t you donating all of your stuff to homeless people? Or are you happy all those people don’t have a home?
Why aren’t you saving the world already???
You know, demanding change and maybe showing some sort of protest does not mean you have to do those things exactly as you would like to see them done, especially if those efforts wouldn’t change anything on the larger scale and would rather cause a bunch of problems in your life.
I feel this. Fell into a similar rabbit hole when I tried to get real-time feedback on the program’s own memory usage, discerning stuff like reserved and actually used virtual memory. Felt like black magic and was ultimately not doable within the expected time constraints without touching the kernel, I suppose. Spent too much time on that and had to move on with no better solution than to measure/compute the allocated memory of the largest payload data types.
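For reference, the coarse version of that distinction can at least be read on Linux from /proc/self/status (VmSize vs. VmRSS). This is just an illustrative sketch assuming a Linux /proc filesystem, not the finer-grained breakdown I was after:

```python
def read_memory_status():
    """Return the kernel's view of this process's memory (Linux only)."""
    fields = {}
    with open("/proc/self/status") as f:
        for line in f:
            # VmSize: total reserved virtual memory; VmRSS: resident (actually used) memory
            if line.startswith(("VmSize:", "VmRSS:")):
                key, value = line.split(":", 1)
                fields[key] = value.strip()  # e.g. "123456 kB"
    return fields

print(read_memory_status())  # e.g. {'VmSize': '23456 kB', 'VmRSS': '6789 kB'}
```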
Where do you get this attitude that everything should be provided to you for free and you’re entitled to it?
From (non-capitalistic) utopian ideas, where humans try to be excellent to each other.
But consider you’re stranded in the wild. All technology lost in an accident. It’s just you, nature and your skills. How will you then know how many days the melons you’ve foraged will last, if you’ve found N of them and eat one a day? /j
researchers are paid by the university
Not necessarily. A lot are paid by external research grants.
Another one, Frontiers:
Doubt that for the next decade at least. However, we can already replace CEOs with AI.
A Chinese company named NetDragon Websoft is already doing it.
https://www.independent.co.uk/tech/ai-ceo-artificial-intelligence-b2302091.html
If you want to play with fire, Mr. Amazon Guy, don’t be surprised to get burned. :]
This again?
If we’re speaking of transformer models like ChatGPT, BERT or whatever: they don’t have memory at all.
The closest thing resembling memory is the accepted length of the input sequence combined with the attention mechanism. (If left unmodified, though, this leads to computation time that grows quadratically with the length of that sequence.) And since the attention weights are a learned property, it is in practice likely that earlier tokens of the input sequence get basically ignored the further they lie “in the past”, as they usually do not contribute much to the current context.
“In the past”: transformers technically “see” the whole input sequence at once. But they are equipped with positional encoding, which incorporates spatial and/or temporal ordering into the input sequence (e.g., the position of words in a sentence). That way they can model sequential relationships such as those found in natural language (sentences), videos, movement trajectories and other kinds of contextually coherent sequences.
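For the curious, here’s a toy numpy sketch of those two pieces (sinusoidal positional encoding plus plain scaled dot-product self-attention). Using the raw input as query, key and value at once is a simplification for illustration, and the shapes are arbitrary; this isn’t how any particular model is wired up:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Inject position information so ordering can be represented at all."""
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))  # (seq_len, d_model)

def self_attention(x):
    """Every token attends to every other token -> O(seq_len^2) weights."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                      # (seq_len, seq_len)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the sequence
    return weights @ x

seq_len, d_model = 8, 16
x = np.random.randn(seq_len, d_model) + positional_encoding(seq_len, d_model)
out = self_attention(x)  # (8, 16); the quadratic cost sits in the (8, 8) weight matrix
```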
Today on the internet: Fun with spherical geometry.
He doesn’t mean anything.
Progress takes time. Overton window and shit like that. Baby steps. Slow, but steady.
When no one was looking.