Even worse is when they reply to themselves with "figured it out"
and say nothing else.
yea, but on the other hand this is clearly from a comedy skit, so I'm not sure this is worth the exercise in pedantry.
A lot of machine learning work is empirical so it was more a joke at the expense of oversimplifying the situation.
Real science is trying random stuff until you get slightly better performance out of your model and then creating contrived explanations for why you think it worked
btw yann lecun is the head of meta ai so this is just a couple of rich dickheads having a slap fight
i can imagine some kind of LRU cache being reasonably useful for this situation, assuming you have some latency hierarchy. For example, if the desktop has an SSD, an HDD, and some USB HDDs attached, you could have a smaller cache that keeps the most frequently accessed files on the SSD, then a bigger one on the internal HDD, with the USB HDDs as the ultimate origin of the data. Or even just use the SSD as the cache and treat everything else as the origin. I don't know if there's software that already does this kind of thing though.
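the single-tier version (SSD as cache, everything else as origin) can be sketched as a plain LRU keyed by file path. This is just an illustration of the idea, not real tiering software: `TieredFileCache`, `read_from_origin`, and the capacity-in-number-of-files limit are all made up here, and a real tool would track bytes, write data back to the fast tier, and survive reboots.

```python
from collections import OrderedDict

class TieredFileCache:
    """Toy LRU sketch: keep hot files on the fast tier, fall back to origin."""

    def __init__(self, capacity, read_from_origin):
        self.capacity = capacity                  # max number of cached files
        self.read_from_origin = read_from_origin  # slow-tier read function
        self.cache = OrderedDict()                # path -> data, ordered by recency

    def read(self, path):
        if path in self.cache:
            self.cache.move_to_end(path)          # hit: mark most recently used
            return self.cache[path]
        data = self.read_from_origin(path)        # miss: go to the slow tier
        self.cache[path] = data
        if len(self.cache) > self.capacity:
            self.cache.popitem(last=False)        # evict least recently used
        return data
```

so repeated reads of the same few files stay on the fast tier, and anything that falls out of the working set gets evicted back to reading from the slow tier.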
You may want to consider zipping files for transfer though, especially if the transfer protocol creates a new TCP connection for every file.
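a minimal sketch of the bundling idea using the standard library (the directory layout and file names here are made up; the point is just that many small files become one object on the wire):

```python
import pathlib
import shutil
import tempfile
import zipfile

# Create some throwaway small files to stand in for the transfer payload.
src = pathlib.Path(tempfile.mkdtemp())
for i in range(100):
    (src / f"file{i}.txt").write_text(f"contents {i}")

# Bundle the whole directory into one zip; shutil.make_archive
# appends the ".zip" extension itself.
dest = pathlib.Path(tempfile.mkdtemp())
archive = shutil.make_archive(str(dest / "bundle"), "zip", root_dir=src)

# One archive now holds all 100 files, so the protocol opens one
# connection instead of one per file.
with zipfile.ZipFile(archive) as zf:
    print(len(zf.namelist()))
```

whether the compression itself helps depends on the data, but even with already-compressed files the single-connection transfer usually wins over per-file round trips.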
Here we have a couple of people enjoying themselves and dickheads in the background taking pictures of them and posting it on the internet to laugh at.
I have never met a real LinkedIn poster before. They must be amazing in person.
if gravity was 33 orders of magnitude stronger we’d be having a bad time right now
took a little while to figure out that somewhere between four and ten appendages with a mouth on the front and a butthole on the back was the best layout.