Simiiformes is a clear and distinct clade.
Yes, but who says that specific clade maps to the colloquial word “monkey”?
Monkeys are a social construct. Like trees.
Yes, everything that can be expressed as letters is in the Library of Babel. Finding anything meaningful in that library, though, is gonna take longer than just writing it yourself.
Yeah, I’m not sure my reaction to them adding Pandas as a playable race (in the Warcraft III expansion) was that they were “really badass” as OP seemed to think.
The thing is, if Intel doesn’t actually get 18A and beyond competitive, it might be on a death spiral towards bankruptcy as well. Yes, they’ve got a ton of cash on hand and several very profitable business lines, but that won’t last forever, and they need plans to turn profits in the future, too.
Compared to the AMD FX series, the Intel Core and Core 2 were so superior that it was hard to see how AMD could come back from that.
Yup, an advantage in this industry doesn’t last forever, and a lead in a particular generation doesn’t necessarily translate to the next paradigm.
Canon wants to challenge ASML and get back in the lithography game, with a tooling shift they’ve been working on for 10 years. The Japanese “startup” Rapidus wants to get into the foundry game by starting with 2nm, and they’ve got the backing of pretty much the entirety of the Japanese electronics industry.
TSMC is holding onto finFETs a little bit longer than Samsung and Intel, as those two switch to gate-all-around FETs (GAAFETs). Which makes sense: those two never got to the point where they could compete with TSMC on finFETs, so they’re eager to move on to the next thing a bit earlier while TSMC squeezes out the last bit of profit from its established advantage.
Nothing lasts forever, and the future is always uncertain. The history of the semiconductor industry is a constant reminder of that.
I just mean does it keep offline copies of the most recently synced versions, when you’re not connected to the internet? And does it propagate local changes whenever you’re back online?
Dropbox does that seamlessly on Linux and Mac (I don’t have Windows). It’s not just transferring files to and from a place in the cloud, but a seamless sync of a local folder whenever you’re online, with access and use while you’re offline.
Intel got caught off guard by the rise of advanced packaging, where AMD’s chiplet design could actually compete with a single die (while having the advantage of being more resilient against defects, and thus higher yield).
Intel fell behind on manufacturing when finFETs became the standard. TSMC leapfrogged Intel (and Samsung fell behind) thanks to its undisputed advantage at manufacturing finFETs.
Those are the two main areas where Intel gave up its lead, both on the design side and the manufacturing side. At least that’s my read of the situation.
Does it do offline sync?
iCloud doesn’t have Linux or Android clients. It’s basically a non-starter for file sharing between users not on an Apple platform.
I don’t like the way Google Drive integrates into the OS file browsing on macOS, and it doesn’t support Linux officially. Plus it does weird stuff with the Google Photos files, which count against your space but aren’t visible in the file system.
OneDrive doesn’t support Linux either.
I just wish Dropbox had a competitive pricing tier somewhere below their 2TB for $12/month. I’d 100% be using them at $5/month for like 250 GB.
So with the case/mobo/power supply at $259 and the CPU/GPU at $329, you’ve got $11 left to work with for RAM and SSD in order to be competitive with the base model Mac Mini.
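The budget math, as a quick sketch (assuming the $599 base Mac Mini price mentioned elsewhere in the thread, and the part prices quoted above):

```python
# Rough parts-budget check against the base Mac Mini.
# Assumption: $599 base configuration, as mentioned later in the thread.
mac_mini_base = 599
case_mobo_psu = 259   # case + motherboard + power supply
cpu_gpu = 329         # CPU with integrated graphics

remaining = mac_mini_base - case_mobo_psu - cpu_gpu
print(f"${remaining} left for RAM and SSD")  # $11 left for RAM and SSD
```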
That’s what I mean. If you’re gonna come close to competing with the entry level price of the Mac Mini (to say nothing of frequent sales/offers/coupons that Best Buy, Amazon, B&H, and Costco run), you’ll have to sacrifice and use a significantly lower-tier CPU. Maybe you’d rather have more RAM/storage and are OK with that lower performing CPU, and twice the power consumption (around 65W rather than 30W), but at that point you’re basically comparing a different machine.
Ok, let’s put together a mini PC with a Ryzen 9700X for under $600. What case, power supply, motherboard, RAM, and SSD are we gonna get? How’s it compare on power, sound, form factor?
It’s an apples to oranges comparison, and at a certain point you’re comparing different things.
When I was last comparing laptops a few years back I was seriously leaning towards the Framework AMD. It was clearly a tradeoff between Apple’s displays, trackpad, lid hinges, CPU/GPU benchmarks, and battery life, versus much more built in memory and storage, a tall display form factor, and better Linux support. Price was kinda a wash, as I was just comparing what I could get for $1500 at the time. I ended up with an Apple again, in the end. I’m keeping an eye on progress with the Asahi project, though, and might switch OSes soon.
For the Mac Mini? The Apple Silicon line has always been a really good value for the CPU, compared to similar performance from Intel and AMD. The upcharge on RAM and storage basically made it break even somewhere around 1 or 2 upgrades, if you were looking for a comparable CPU/GPU.
For my purposes the M1 Mac Mini was cheaper than anything I was looking at for a low power/quiet home server, back in 2021, through some random Costco coupon for $80 off the base $599 configuration. A little more CPU than I needed, and a little less RAM than I would’ve preferred, but it was fine.
Plus having official Mac hardware allows me to run a Bluebubbles server and hack Backblaze pricing (unlimited data backup for any external storage you can hook up to a Mac), so that was a nice little bonus compared to running a Linux server.
On their laptops, they’re kinda cost competitive if you’re looking for high-DPI laptop screens, and there’s just not really a good comparison for that CPU/GPU performance per watt. If you don’t need or want those things then Macs aren’t a good value, but if you are looking for those things, the other computer manufacturers aren’t going to be offering better value.
You can’t just use an audio file by itself. It has to come from somewhere.
The courts already have a system in place: anyone seeking to introduce a screenshot of a text message, a printout of a webpage, a VHS tape with video, or just a plain audio file needs to get it admitted as evidence, with someone who testifies that it is real and accurate, and with an opportunity for others to question and even investigate where it came from and how it was made/stored/copied.
If I just show up to a car accident case with an audio recording that I claim is the other driver admitting that he forgot to look before turning, that audio is gonna do basically nothing unless and until I show that I had a reason to be making that recording while talking to him, why I didn’t give it to the police who wrote the accident report that day, etc. And even then, the other driver can say “that’s not me and I don’t know what you think that recording is” and we’re still back to a credibility problem.
We didn’t need AI to do impressions of people. This has always been a problem, or a non-problem, in evidence.
A camera that authenticates the timestamp and contents of an image is great. But it’s still limited. If I take that camera, mount it on a tripod, and take a perfect photograph of a poster of Van Gogh’s Starry Night, the resulting image will be yet another one of millions of similar copies, only with a digital signature proving that it was a newly created image today, in 2024.
Authenticating what the camera sensor sees is only part of the problem, when the camera can be shown fake stuff, too. Special effects have been around for decades, and practical effects are even older.
uh that was Siri’s fault
He’s a great guy, but sometimes a little hard to follow if you’re only taking part in one conversation at a time while he’s talking in two and listening to a third. He expects you to be on the ball in your own discussion when he jumps in to drop a tidbit or ask a question, like a chess master playing 4 games in the park at once.
If it’s like simultaneous chess, why isn’t the single thread sufficient context for everything that happens in that thread? It just sounds like the guy you’re describing has low cognitive empathy and doesn’t understand other people’s minds. At that point you’re just describing a neurodivergent person who may or may not be a genius in certain domains, while being a moron in this one domain that you’ve described.
Yeah, Netscape 4.0 was simply slower than IE 4.0. Back then, when a browser was a program that would actually push the limits of the hardware, that was a big deal.
For the news articles themselves, each of the major companies uses a major CMS, many of them developed in house or licensed from another major media organization.
But for things like journalist microblogging, Mastodon seems like a drop-in replacement for Twitter or Threads or Bluesky, one that could theoretically integrate with the existing authentication/identity/account management system they already use to provide logins, email, intranet access, publishing rights on whatever CMS they do have, etc.
Same with universities. Sure, each department might have official webpages, but why not provide faculty and students with the ability to engage on a university-hosted service like Mastodon or Lemmy?
Governments (federal, state, local) could do the same thing with official communications.
It could be like the old days of email, where people got their public facing addresses from their employer or university, and then were able to use that address relatively freely, including for personal use in many instances. In a sense, the domain/instance could show your association with that domain owner (a university or government or newspaper or company), but you were still speaking as yourself when using that service.