Good!
At risk of sounding like a shill, NewAtlas is a great source for exciting upcoming tech. I find myself reading it more these days.
Discord (to me) has better UX than any IRC I’ve ever experienced.
Email, on the other hand, is total baloney if it’s not interoperable. It’s why SMS/MMS is like a zombie that just won’t die, and telecoms are more cooperative than most of Big Tech.
for reasons that aren’t entirely clear beyond aesthetics and bragging rights
Oh my sweet summer child.
Maser drills: https://newatlas.com/energy/geothermal-energy-drilling-deepest-hole-quaise/
In a nutshell, it’s an economically brilliant idea: take hand-me-down microwave(ish) spectrum masers from fusion research, drill holes deep into the crust (leaning on fossil fuel industry expertise), then hook up the resulting steam to existing coal plants so you don’t have to build anything else. The coal plant gets free geothermal fuel, the drillers move on to the next site: everyone wins.
It’s taking a worryingly long time though. I hope it gets enough funding.
It’s because it isn’t a silo?
Discord, Slack and a bajillion similar apps do not meld with other apps. Email just happened to hit critical mass before “let’s try to get a monopoly” became the slogan of all tech, and collectively Big Tech is too stupid/hostile to replace it with some cooperative protocol.
iMessage is another pure example of this.
Certain subreddits used to be like this.
But all my favorites have taken one of two paths:
Get algorithmically deprioritized (due to a “bug,” as an admin told a mod) and hemorrhage users. The sub’s collective ‘intelligence’ drains away; the interesting intellectual discussions are gone. One such example: /r/localllama
The sub gets huge. Bots repost memes and bait as attention farms. It doesn’t feel like a small town anymore. Deeper discussions drain away in favor of shallow repetition of the same things over and over again. One example of this for me is /r/thelastairbender.
Grok is the laughing stock of the ML field. It’s horribly inefficient, its performance is not good for its size/compute, and its “leverage” (Twitter’s userbase, Tesla cars) is objectively at risk. They’re even more closed than OpenAI, much more than Google. They only exist because Elon burned billions on a shit ton of H100s and seemingly copied what others are doing.
xAI (so far) is a joke. That could change, but unless they do something interesting (like actually publishing a neat paper or model), if they were even public, I would short them.
The base M4 is a very small chip with a modest memory config. Don’t get me wrong, it’s fantastic, but it’s more Steam Deck/laptop than beefy APU (which the M4 Pro is a closer analogue to).
$1200 is pricey for what it is, partially because Apple spends so much on keeping it power efficient, rather than (for example) using a smaller die or older process and clocking it higher.
Yeah, that would be perfect!
Or (alternatively) they could majorly underclock a shrunken Series X chip to make it equivalent to an S.
Games are complex. Qualcomm/MS may tune it for the most popular titles, but I just don’t see how they can catch up to years of desktop GPU driver development.
Like… a wiki for memes? Some already exist AFAIK, even though they aren’t fully decentralized per se.
The problem is you need people documenting this stuff, like KYM presumably pays their staff to do, and good SEO/marketing to snag critical mass. That is a tall order for a volunteer Fediverse project of this nature, I think, as keeping up is many full time jobs.
Yeah, any framework with a “big” GPU is just so expensive.
Eh, yeah, and it’s backordered.
Ideally I’d like a full x16 slot too (or at least electrical x8), but perhaps that’s asking too much.
Also, is it even standard ITX?
I will believe it when I see it. I hope so.
Qualcomm makes a lot of hype/noise but historically tends to overpromise, and also makes some unforced blunders. But a real ARM competitor would be great.
It means emulation with pretty much every current title, plus graphics driver issues and sluggish games out the wazoo (as Qualcomm is very different from AMD/Intel/Nvidia).
ARM being more power efficient is also kind of a meme. Intel/AMD can be extremely good when clocked low (which they can do since there’s no emulation overhead), with both the CPU/GPU. Apple just makes x86 look bad because they burn a ton of money on power efficiency, but Qualcomm is more in the “budget” space. No one is paying $2K for an Xbox handheld like they would for an Apple product.
With a Qualcomm chip though… there will be some teething issues, best case.
These things are awesome.
My dream is:
One embedded onto an ITX board.
An SKU with a single 8-core CCD (ideally X3D?) but the full GPU.
Using Qualcomm chips
Oof.
Why didn’t they go AMD, or heck, even Intel? They have GPU-heavy APUs in the pipe that would mostly just work.
Intel, in particular, is not bad power-wise as long as they aren’t clocking chips to the very edge like they’ve been doing, and they won’t necessarily have the TSMC capacity constraint. That’s huge.
Hence she chases it down with Coca-Cola™
My interpretation of the NFT/Crypto Future argument is “They’re perfect! It’s just that humans have to stop behaving like humans!”