Depends on the game. Apex, Riot, Ubisoft, and EA all ban VM players. A number of other companies do as well.
Easy way to get yourself banned in online games, just an FYI. Most online games will detect and ban virtual machines now, since they’ve become commonplace in cheat/hack communities.
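For the curious, one of the simplest signals is the CPUID “hypervisor present” bit, which most hypervisors expose to their guests and which Linux surfaces in /proc/cpuinfo. A minimal sketch of that one check (a toy illustration, not how any particular anti-cheat actually works; real systems layer many stronger heuristics on top):

```python
# Toy illustration of one VM-detection signal: the "hypervisor" CPUID flag.
# Most hypervisors (KVM, VMware, Hyper-V) set this bit for their guests,
# and Linux lists it among the CPU flags in /proc/cpuinfo.

def looks_like_vm() -> bool:
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags") and "hypervisor" in line.split():
                    return True
    except OSError:
        pass  # not Linux, or /proc unavailable
    return False

if __name__ == "__main__":
    print("Hypervisor detected" if looks_like_vm() else "Looks like bare metal")
```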
Reddit is dead to me, and given their stance on their APIs, should be dead to pretty much all hobbyists deeply interested in self-hosting.
Dude, you need to see how nasty the justice system in the US really is. They’re not your protectors, and they’re literally not required by law to be. The Supreme Court decided that decades ago.
I’d recommend against it. Apple’s software ecosystem isn’t as friendly for self-hosting anything; storage is difficult to add, RAM impossible, and you’ll be beholden to macOS running things inside containers until the good folks at Asahi or some other community project add partial Linux support.
And yes, I’ve tried this route. I ran an M1 Mac mini as a home server for a while (running Jellyfin and some other containers). It pretty consistently ran into software bugs (ARM builds are less maintained than x64 ones), and every time I wanted to do an update, instead of sudo whateveryourdistroships update and a reboot, it was an entire process involving an Apple account, logging into the bare-metal device, and then finally running their 15-60 minute update. Perfectly fine and acceptable for home computing, but not exactly a good experience when you’re hosting a service.
Wait… You want us to pay humans? - Every triple A gaming company since 2010.
Part of this is Apple’s fault. They were a key member of the Khronos Group, the consortium responsible for Vulkan, but chose to ship a proprietary graphics API (Metal) instead of just adopting Vulkan… Developers obviously don’t want to support an additional graphics API on top of what they already maintain (it’s significant effort), so you lose a lot of games that would’ve otherwise been only marginally expensive to port over.
A full year of multi month hikes across the world. I want to see it all and meet new people.
Nothing forever will feel oh so fast when you lose any frame of reference.
I pasted some links, but the DOE says groundwater will most likely be contaminated. Depends on who you trust and how willing you are to suffer radioactive contamination. Granted, it’s probably a better risk profile than, say… coal… But that doesn’t change the fact that we have no good long-term plan to store any amount of radioactive waste, and if history is your teacher, a plan will most likely never come to fruition.
Honestly, despite all of nuclear’s many benefits, there’s still no good action plan for the significant amounts of substantially dangerous waste it leaves behind. It’s hard to figure out a storage plan for an invisible poison seeping from a rock for the next 50,000 years.
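To put a number on that timescale: decay is exponential, so the fraction of a radionuclide left after time t is 0.5^(t / half-life). A quick sketch using Pu-239 (half-life of roughly 24,100 years) as an illustrative isotope:

```python
# Worked example of why waste timescales are so long: exponential decay.
#   fraction_remaining = 0.5 ** (t / half_life)
# Pu-239's half-life is ~24,100 years, so even 50,000 years out a
# meaningful fraction of it is still around.

HALF_LIFE_PU239 = 24_100  # years

def fraction_remaining(years: float, half_life: float = HALF_LIFE_PU239) -> float:
    return 0.5 ** (years / half_life)

for t in (1_000, 10_000, 24_100, 50_000):
    print(f"after {t:>6} years: {fraction_remaining(t):.1%} of the Pu-239 remains")
```

Even at the 50,000-year mark, nearly a quarter of the Pu-239 is still there.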
You know, almost every phone still has an IR blaster… It’s just not made available to you.
(Autofocus in cameras is largely done via an IR blaster and a corresponding receiver.)
Embrace. Extend. Extinguish.
Those words proved the folly of the “free as in freedom” open source many moons ago.
This ignores so much that has been fought for and done by so many politicians who actually have a desire to make things better. It’s honestly disgraceful that you’re so bitter you can’t see the good-faith efforts that have been made.
I will say I’ve never once had an issue with my M1 Pro 16", which I can’t say about any other laptop I’ve owned (be it battery swelling, software bugs, or “issues” one learns to live with, like sleep mode causing boot crashes or sleep mode draining the battery). Kinda amazing in hindsight.
Isn’t it all Unicode at the end of the day, so it supports anything Unicode supports? Or am I off base?
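Pretty much. Any runtime whose strings are Unicode code points supports whatever Unicode encodes; the encoding (usually UTF-8) just changes the byte count. A quick Python illustration:

```python
# If a string type is Unicode, "what it supports" is just "what Unicode
# encodes": Latin, accented, CJK, emoji, etc. all round-trip identically.

samples = ["hello", "héllo", "こんにちは", "👋🌍"]

for s in samples:
    encoded = s.encode("utf-8")          # bytes on the wire / on disk
    assert encoded.decode("utf-8") == s  # lossless round-trip
    print(f"{s!r}: {len(s)} code points, {len(encoded)} UTF-8 bytes")
```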
Not to defend Nvidia entirely, but there were real physical cost savings from actual die shrinks back in the day, since process-node improvements allowed such a substantial increase in transistor density. Improvements in recent years have been smaller, and now they have to use larger and larger dies to increase performance despite the process improvements. That leads to things like the 450 W 4090, despite it being significantly more efficient per watt, and it means they get fewer GPUs per silicon wafer, since wafer sizes are industry-standardized for the extremely specialized chip manufacturing equipment. Fewer dies per wafer means higher per-chip costs by a pretty big factor. That being said, they’re certainly… “proud of their work”.
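To make the dies-per-wafer point concrete, here’s a rough sketch using the standard dies-per-wafer approximation on a 300 mm wafer. The die areas below are ballpark figures for illustration, not exact Nvidia numbers, and yield losses are ignored:

```python
import math

# Rough dies-per-wafer approximation: usable wafer area divided by die area,
# minus an edge-loss term for partial dies at the wafer's rim:
#   DPW ≈ (π·d²/4) / A  −  (π·d) / sqrt(2·A)
# where d = wafer diameter and A = die area. Defect yield is ignored,
# so real good-die counts are lower still.

WAFER_DIAMETER_MM = 300  # industry-standard wafer size

def dies_per_wafer(die_area_mm2: float, d: float = WAFER_DIAMETER_MM) -> int:
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Illustrative comparison: a mid-size ~300 mm² die vs a ~600 mm² die
# in the 4090's class.
for area in (300, 600):
    print(f"{area} mm² die -> ~{dies_per_wafer(area)} candidate dies per wafer")
```

Doubling the die area more than halves the candidate dies per wafer, and since defects scale with area, the yield hit makes big dies disproportionately more expensive on top of that.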