Imo turbulence is “unsolved” in the same way the 3-Body problem is unsolved. It’s chaotic.
What’s the meme origin?
I found a band called Flake Michigan that may be the source, but I don’t have Instagram to confirm.
but irrational numbers like pi
i is a rotational operator. So equations with i in them have pi encoded into them too.
Trying to expand a quantized rotation into a quantized linear coordinate is attempting to square a circle.
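To make the “rotational operator” point concrete, here is a minimal sketch (my own illustration, not from the linked paper): repeatedly multiplying by i walks a point around the unit circle in 90° steps, and Euler’s formula ties that rotation directly to pi.

```python
import cmath

# Multiplying by i rotates a point 90 degrees in the complex plane.
z = 1 + 0j
for step in range(4):
    print(step, z)   # prints 1, i, -1, -i: four 90-degree turns return to the start
    z *= 1j

# Euler's formula: e^(i*pi) = -1, so pi is "encoded" in any rotation written with i.
print(cmath.exp(1j * cmath.pi))  # ~(-1+0j)
```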
Yes, it does not make any sense. If the link above is what it appears to be from the summary, some students unknowingly attempted to square the circle.
Imaginary numbers are rotational operators.
You don’t need quantum mechanics to observe rotation in the real world.
It’s Algebra 2. I just checked, and only 6 states require it. Crazy. I was in a state that didn’t require it but had finished Calculus 2 by graduation.
Math is personalized in American schools. There are on-grade, advanced, GT, and accelerated tracks. Each level above on-grade corresponds to how many years ahead your math class is. Depending on how large your school is, GT and accelerated math students will take math with the grades above them.
On-grade would be quadratics in 9th grade.
Perhaps this is why these features will only be available on iPhone 15 Pro/Max and newer?
I’m not guessing. I linked to the article about the M3, which is much more powerful than the A17 Pro in the 15 Pro and has the same NPU.
Nothing AI about it.
Voice processing is AI and was done on Apple’s servers. Previously, only the keyword “Hey Siri” was handled locally. Onboard AI chips will allow this to be local. The actual queries will still go to the servers. Phones do not have the power to run a useful LLM locally, at least not with the near-instantaneous response times phone users expect. A 56-watt M3 Max with 128 GB of RAM does around 8.5 tokens/second.
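A rough back-of-envelope using that 8.5 tokens/second figure (the reply length is an assumed number for illustration) shows why that feels slow for a voice assistant:

```python
# Hypothetical latency estimate for a local LLM answering a Siri-style query.
tokens_per_second = 8.5   # M3 Max throughput cited above
answer_tokens = 100       # assumed length of a short spoken reply

latency = answer_tokens / tokens_per_second
print(f"~{latency:.0f} s to generate a {answer_tokens}-token reply")  # ~12 s
```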
That Google is the search engine means Google gets that valuable search data. So they pay to be the default search engine to get your data.
Because Apple’s lawyers will go ham.
Google pays Apple $20 billion a year to keep their search on Apple devices. The subtext of “search” is that Google pays Apple for your search data.
Apple has sold your data to Google for the right price, so there should be no expectation that they won’t do the same with other companies.
Which is exactly what I said. It’s not local.
That they are keeping the data you send private is irrelevant to the OP claim that the AI model answering questions is local.
Most requests are handled on-device.
Literally impossible.
“Hey Siri, what’s the weather forecast for tomorrow.”
< The Farmer’s Almanac that is in my local model says it will rain tomorrow. >
Well, most of the requests are handled on device
Doubt.
Voice recognition, image recognition, yes. But actual questions will go to Apple servers.
It’s the force vector he is solving for.
Smaller doesn’t need to be more complex. 3.5" drives weren’t more complex than 5.25" drives.
A smaller head means a smaller drive actuator. Less mass and a smaller size mean it can compensate much more quickly in response to detected vibration.
Back when full-height 5.25" drives were the norm, you couldn’t pick up your PC while it was running without causing an error. Those tiny CF-card-sized drives did fail, but they took extreme abuse compared to big drives.
SSD for boot, but not cost-effective for a NAS. Nor do I trust their longevity.
Smaller stuff has less mass and can therefore be more reliable.
There were portable MP3 players with mechanical hard drives that were reliable despite extreme abuse.
Is Putin aware of this map? Is he going to return Smolensk to its traditional Lithuanian owners?