Plus, domain names should've gone left to right (root, TLD, domain, subdomain, and so on) instead of right to left.
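Just as a toy illustration of that ordering (a quick sketch, not real DNS syntax; the example domain is made up):

```python
# Toy illustration: today's right-to-left domain order vs. a hypothetical
# left-to-right order that reads root -> TLD -> domain -> subdomain.
def left_to_right(domain: str) -> str:
    # Reverse the dot-separated labels; the root label stays implicit.
    return ".".join(reversed(domain.split(".")))

print(left_to_right("www.example.com"))  # -> "com.example.www"
```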
My anecdotal observation is the same. Most of my friends in Silicon Valley are using MacBooks, including some at fairly mature companies like Google and Facebook.
I had a 5-year sysadmin career, dealing with some Microsoft stuff, especially identity/accounts/mailboxes through Active Directory and Exchange, but mainly did Linux-specific work on headless servers, with desktop Linux at home.
When I switched to a non-technical career field I went with a MacBook for my laptop daily driver on the go, and kept desktop Linux at home for about 6 or 7 more years.
Now, basically a decade after that, I’m pretty much only driving MacOS on a laptop as my normal OS, with no desktop computer (just a docking station for my Apple laptop). It’s got a good command line, I can still script things, I can still rely on a pretty robust FOSS software repository in homebrew, and the filesystem in MacOS makes a lot more sense to me than the Windows lettered drives and reserved/specialized folders I can never remember anymore. And nothing beats the hardware (battery life, screen resolution, touchpad feel, lid hinge quality), in my experience.
It’s a balance. You want the computer to facilitate your actual work, but you also don’t want to spend too much time and effort administering your own machine. So the tradeoff is between the flexibility of doing things your way versus outsourcing a lot of the things to the maintainer defaults (whether you’re on Windows, MacOS, or a specific desktop environment in Linux), mindful of whether your own tweaks will break on some update.
So it’s not surprising to me when programmers/developers happen to be issued a MacBook at their jobs.
Year of birth matters a lot for this experiment.
Macintosh versus some IBM (or clone) running MS DOS is a completely different era than Windows Vista versus PowerPC Macs, which was a completely different era from Windows Store versus Mac App Store versus something like a Chromebook or iPad as a primary computing device.
Installing MacOS on Intel Macs is really easy if you still have your recovery partition. It's not hard even if you've overwritten the recovery partition, so long as you have the ability to image a USB drive with a MacOS installer (which is trivial if you have another Mac running MacOS).
I haven’t messed around with the Apple silicon versions, though. Maybe I’ll give it a try sometime, used M1 MacBooks are selling for pretty cheap.
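For the Intel route above, here's a minimal sketch of scripting the USB-installer step from another Mac. The installer app name and the USB mount point are assumptions (swap in whatever version you actually downloaded); createinstallmedia is the tool Apple bundles inside the installer app, and it erases the target volume.

```python
# Sketch: build a bootable macOS USB installer from another Mac.
# Assumes the full installer app is already in /Applications and the
# target USB stick is mounted at /Volumes/MyUSB (hypothetical names).
import subprocess

INSTALLER_APP = "/Applications/Install macOS Ventura.app"  # assumed version
USB_VOLUME = "/Volumes/MyUSB"  # assumed mount point; will be erased

def make_install_media() -> None:
    # createinstallmedia ships inside the installer app and needs sudo.
    subprocess.run(
        [
            "sudo",
            f"{INSTALLER_APP}/Contents/Resources/createinstallmedia",
            "--volume", USB_VOLUME,
            "--nointeraction",
        ],
        check=True,
    )

if __name__ == "__main__":
    make_install_media()
```

Then boot the target Intel Mac while holding Option and pick the USB drive to run the installer.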
GamingChairModel@lemmy.world to Google@lemdro.id • Google announces 1st and 2nd gen Nest Thermostats will lose support in October 2025

I don't really mind when a cloud-connected device gracefully falls back to an offline-only device. It seems like it retains all of the non-cloud functionality: reading and setting temps, including on a schedule.
It'd be nicer if they gave people the option to run their own servers for the networked functionality, but it doesn't seem like they've reduced or otherwise held back on the offline functionality one would expect from a thermostat.
GamingChairModel@lemmy.world to Technology@lemmy.world • White House Says It Has Tech That Can 'Manipulate Time and Space'

I can travel forward in time at a rate of 60 seconds per minute, and I think the US government can, too.
> Which is such a high dollar count that this simply cannot be USD
So, I haven't used Windows on my own machines in about 20 years, but back when I built my own PCs, that seemed about right. I looked up the price history and hadn't realized that Microsoft reduced the license prices around Windows 8.
I remember 20 years ago, Windows XP Home was $199 and Professional was $299 for a new license on a new computer. Vista and 7 were similarly priced.
Since Windows 8, though, I just don’t understand their pricing or licensing terms.
GamingChairModel@lemmy.world to Technology@lemmy.world • Meta's AI research lab is 'dying a slow death,' some insiders say. Meta prefers to call it 'a new beginning'

I think back to the late 90's investment in rolling out a shitload of telecom infrastructure, with a bunch of telecom companies building out lots and lots of fiber. And perhaps more important than the physical fiber were the poles, conduits, and other physical infrastructure housing that fiber, so that the fiber itself could be upgraded as each generation of tech was released.
Then, in the early 2000’s, that industry crashed. Nobody could make their loan payments on the things they paid billions to build, and it wasn’t profitable to charge people for the use of those assets while paying interest on the money borrowed to build them, especially after the dot com crash where all the internet startups no longer had unlimited budgets to throw at them.
So thousands of telecom companies went into bankruptcy and sold off their assets. Those fiber links and routes still existed, but nobody turned them on. Google quietly acquired a bunch of “dark fiber” in the 2000’s.
When the cloud revolution happened in the late 2000’s and early 2010’s, the telecom infrastructure was ready for it. The companies that built that stuff weren’t still around, but the stuff they built finally became useful. Not at the prices paid for it, but when purchased in a fire sale, those assets could be profitable again.
That might happen with AI. Early movers overinvest and fail, leaving what they've developed to be used by whoever survives. Maybe the tech never becomes worth what was paid for it, but once it's built, whoever buys it for cheap might be able to profit at that lower price, and it might prove useful in a more modest, realistic scope.
GamingChairModel@lemmy.world to Technology@lemmy.world • Meta's AI research lab is 'dying a slow death,' some insiders say. Meta prefers to call it 'a new beginning'

For example, as a coding assistant, a lot of people quite like them. But as a replacement for a human coder, they're a disaster.
New technology is best when it can meaningfully improve the productivity of a group of people so that the group can shrink. The technology doesn’t take any one identifiable job, but now an organization of 10 people, properly organized in a way conscious of that technology’s capabilities and limitations, can do what used to require 12.
A forklift and a bunch of pallets can make a warehouse more efficient, when everyone who works in that warehouse knows how the forklift is best used, even when not everyone is a forklift operator themselves.
Same with a white collar office where there’s less need for people physically scheduling things and taking messages, because everyone knows how to use an electronic calendar and email system for coordinating those things. There might still be need for pooled assistants and secretaries, but maybe not as many in any given office as before.
So when we need an LLM to chip in and reduce the amount of time a group of programmers needs in order to put out a product, the manager of that team, and all the members of that team, need to have a good sense of what that LLM is good at and what it isn't. Obviously, autocomplete was a productivity enhancer long before LLMs came around, and extensions of that general concept may be helpful for the more tedious or repetitive tasks, but any team that uses it will need to use it with full knowledge of its limitations and of where it best supplements the human's own tasks.
I have no doubt that some things will improve and people will find workflows that leverage the strengths while avoiding the weaknesses. But it remains to be seen whether it’ll be worth the sheer amount of cost spent so far.
> You can't remove that double negative without making it incorrect
Sure you can: The IP Laws That Reinforce Enshittification.
I’m pretty sure every federal executive agency has been on Active Directory and Exchange for like 20+ years now. The courts migrated off of IBM Domino/Notes about 6 or 7 years ago, onto MS Exchange/Outlook.
> What we used when I was there 20 years ago was vastly more secure because we rolled our own encryption

Uh, that's now understood not to be best practice, because it tends to be quite insecure.
Either way, Microsoft's enterprise ecosystem is pretty much the default at all large organizations, and they have (for better or for worse) convinced almost everyone that the total cost of ownership is lower for MS-administered cloud stuff than for any kind of non-MS system for identity/user management, email, calendar, video chat, and instant messaging. Throwing in Word/Excel/PowerPoint is just icing on the cake.
GamingChairModel@lemmy.world to Technology@lemmy.world • Framework "temporarily pausing" some laptop sales because of new tariffs

They were largely unaffected by the tariffs targeting China, because US trade policy distinguishes between mainland China and Taiwan. Problem was that Trump announced huge tariffs on everyone, including a 32% tariff on Taiwan.
GamingChairModel@lemmy.world to Technology@lemmy.world • China launches HDMI and DisplayPort alternative — GPMI boasts up to 192 Gbps bandwidth, 480W power delivery

I wonder what the use case is for 480W, though. Gigantic 80" screens generally draw something like 120W. If you're going bigger than that, I would think the mounting/installation would require enough hardware and labor that running a normal outlet/receptacle out to it would be trivial.
GamingChairModel@lemmy.world to Technology@lemmy.world • Coin-sized nuclear 3V battery with 50-year lifespan enters mass production

> this battery can deliver 0.03mA of power
0.03 mA of current. Multiply that by the 3 volts and you get roughly 0.1 mW of power.
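For anyone who wants to check the arithmetic, a quick back-of-the-envelope in code:

```python
# Power (W) = current (A) x voltage (V).
current_a = 0.03e-3  # 0.03 mA, the figure quoted in the article
voltage_v = 3.0      # nominal cell voltage

power_w = current_a * voltage_v
print(f"{power_w * 1000:.2f} mW")  # -> 0.09 mW, i.e. roughly 0.1 mW
```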
GamingChairModel@lemmy.world to Technology@lemmy.world • Anthropic has developed an AI 'brain scanner' to understand how LLMs work and it turns out the reason why chatbots are terrible at simple math and hallucinate is weirder than you thought

This is pretty normal, in my opinion. Every time people complain about common core arithmetic there are dozens of us who come out of the woodwork to argue that the concepts being taught are important for deeper understanding of math, beyond just rote memorization of pencil and paper algorithms.
GamingChairModel@lemmy.world to No Stupid Questions@lemmy.world • What if Apple / other brands sold desktop chips?

Do you have a source for AMD chips being especially energy efficient?
I remember reviews of the HX 370 commenting on that. Problem is that chip was produced on TSMC’s N4P node, which doesn’t have an Apple comparator (M2 was on N5P and M3 was on N3B). The Ryzen 7 7840U was N4, one year behind that. It just shows that AMD can’t get on a TSMC node even within a year or two of Apple.
Still, I haven't seen anything really putting these chips through their paces and actually measuring real-world energy usage while running a variety of benchmarks. And benchmarks themselves only correlate to specific ways that computers are used and aren't necessarily supported on all hardware or OSes, so it's hard to get a real comparison.
> SoCs are inherently more energy efficient

I agree. But that's a separate issue from the instruction set. The AMD HX 370 is a SoC (well, technically a SiP, as the pieces are all packaged together rather than printed on the same piece of silicon).
And in terms of actual chip architectures, as you allude to, the design dictates how specific instructions are processed. That's why the RISC versus CISC concepts are basically obsolete. Chip designers are making engineering choices about how much silicon area to devote to specific functions, based on their modeling of how the chip might be used: multithreading, different cores optimized for efficiency or performance, speculative execution, specialized tasks like hardware-accelerated video or cryptography or AI or whatever else, and then deciding how that fits into the broader chip design.
Ultimately, I’d think that the main reason why something like x86 would die off is licensing reasons, not anything inherent to the instruction set architecture.
GamingChairModel@lemmy.world to No Stupid Questions@lemmy.world • What if Apple / other brands sold desktop chips?

> it's kinda undeniable that this is where the market is going. It is far more energy efficient than an Intel or AMD x86 CPU and holds up just fine.
Is that actually true, when comparing node for node?
In the mobile and tablet space, Apple's A-series chips have always been a generation ahead of Qualcomm's Snapdragon chips in terms of performance per watt. Meanwhile, Samsung's Exynos has always lagged even further behind. That's obviously not an instruction set issue, since all 3 lines are on ARM.
Much of Apple’s advantage has been a willingness to pay for early runs on each new TSMC node, and a willingness to dedicate a lot of square millimeters of silicon to their gigantic chips.
But comparing node for node, last I checked, AMD's lower-power chips designed for laptop TDPs have similar performance and power draw to the Apple chips on that same TSMC node.
GamingChairModel@lemmy.world to Programmer Humor@programming.dev • Bingo of crappy IT processes

> The person who wrote it has been gone for like four years
Four years? You gotta pump those numbers up. Those are rookie numbers.
GamingChairModel@lemmy.world to Technology@lemmy.world • Finally, a Linux laptop with a brilliant display and performance that rivals my MacBook (from Germany)

Yeah, Firefox in particular gave me the most issues.
Configuring each app separately is also annoying.
And I definitely never got things to work on an external monitor that was a different DPI from my laptop screen. I wish I had the time or expertise to contribute, but in the meantime I'm left hoping that the Wayland and DE devs find a solution that at least achieves feature parity with Windows or MacOS.
That’s why the best places to work tend to be the places where your CEO has had your job before.