I’ve heard of being blocked from a specific service like Xbox Live but never the whole of Microsoft. How would people get Windows updates? It’s crazy.
Mainly because energy and data centers are both expensive and companies want to use as little as possible of both - especially on the energy side. OpenAI isn’t exactly profitable. There is a reason companies like Microsoft release smaller models like Phi-2 that can be run on individual devices rather than data centers.
Yep, and somehow people who don’t know better are upvoting him. Not surprising for this platform.
Yeah it’s not always that simple. You haven’t been around long enough to see the stuff that can go wrong with installing Windows. For example I recently had Windows refuse to see both SSDs in a machine. All because of something called Intel VMD. Took me a handful of attempts before I found the problem.
When Windows installs work they are fairly simple if long, but when they don’t work oh boy.
The unplugging of internet to get a local account?
Also they disabled that for Windows Home.
Some Lemmy users are actually just wankers. I would like it if you all stopped. It’s especially great when I have people like you who probably aren’t even experienced in tech.
This actually happened to me recently and all it took was one firmware setting. So frustrating.
Actually no. It’s not Mint’s decision whether the install USB starts in UEFI or BIOS mode. That depends on what the firmware chose to boot and how the install medium is formatted. Some install media are set up only for BIOS booting, some only for UEFI, and some can do both. If the firmware detects that the medium supports both, it should choose UEFI first, but this depends on your firmware settings and on whether you pick an option at a boot menu, since boot menus let you override the default.

When it comes to actually installing the OS, most sane installation software will look at how it booted and install that way. If it detects it started under UEFI, it will configure the install for UEFI; likewise, if it started under BIOS, it will install for BIOS. How does it know? UEFI variables are one way: they can normally only be accessed if the system was started with UEFI.
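A minimal sketch of that "how does it know" check on Linux: the kernel only creates `/sys/firmware/efi` (where the UEFI variables live) when the machine was started through UEFI firmware, so installers can simply test for it. The function name here is mine; the sysfs path is the standard one.

```python
import os

def booted_with_uefi(sysfs_root="/sys/firmware/efi"):
    """Return True if the running Linux system was started via UEFI.

    The kernel only exposes /sys/firmware/efi (and the UEFI variables
    under it) when it was booted through UEFI firmware, so its absence
    on a BIOS/legacy boot is exactly the signal installers key off.
    """
    return os.path.isdir(sysfs_root)

if __name__ == "__main__":
    mode = "UEFI" if booted_with_uefi() else "BIOS (legacy)"
    print(f"This system appears to have booted via {mode}")
```

Run on a BIOS-booted machine the directory simply won’t exist, which is why this check is so robust.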
If you truly wipe a drive you wipe the partition table as well. You say the table is outside the file system formatting, and that’s sort of true, but they are both just data on the disk. Disks don’t care where the partition table ends and the file system begins. In fact you don’t even need a partition table at all: unlike some other systems, Linux will let you put a file system straight on the disk, the whole disk, with no partition table in sight. It’s not recommended, mind you, because it will freak Windows out if it sees it. Windows will see it as a blank disk and not-so-helpfully offer to format the thing. When I say format a disk, I mean the whole thing, partition table and all. It’s also not possible to make a partition-table-less disk bootable under UEFI. Under BIOS it is possible, though, as BIOS doesn’t read partition tables; it just needs a boot sector and that’s it.
Also, if you’re trying to change a disk from MBR to GPT and you don’t care about the data, you shouldn’t be converting it. You should be wiping the whole thing and making a new partition table, which is normally what the installer offers to do if you tell it to erase everything and install.
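To make the MBR/GPT distinction concrete, here’s a rough sketch of how a tool can classify the start of a disk just from its first two sectors: GPT puts the ASCII signature `EFI PART` at the start of LBA 1, while MBR puts the boot signature `0x55 0xAA` at offsets 510–511 of LBA 0. The function name and the synthetic byte strings are mine, and note this is a heuristic, since some file systems (e.g. FAT) written straight to a disk also carry the `0x55AA` signature.

```python
SECTOR = 512

def partition_table_type(first_sectors: bytes) -> str:
    """Classify the start of a disk as 'gpt', 'mbr', or 'none'.

    GPT is detected by the "EFI PART" header signature at LBA 1
    (byte offset 512); MBR by the 0x55 0xAA boot signature at
    offsets 510-511 of LBA 0. A disk with a file system written
    straight onto it usually has neither.
    """
    if len(first_sectors) >= 2 * SECTOR and first_sectors[SECTOR:SECTOR + 8] == b"EFI PART":
        return "gpt"
    if len(first_sectors) >= SECTOR and first_sectors[510:512] == b"\x55\xaa":
        return "mbr"
    return "none"

# Synthetic stand-ins for the first two sectors of real disks:
gpt_disk = bytearray(2 * SECTOR)
gpt_disk[510:512] = b"\x55\xaa"            # protective MBR in LBA 0
gpt_disk[SECTOR:SECTOR + 8] = b"EFI PART"  # GPT header in LBA 1

mbr_disk = bytearray(2 * SECTOR)
mbr_disk[510:512] = b"\x55\xaa"            # classic MBR boot signature

bare_disk = bytearray(2 * SECTOR)          # no table, e.g. fs on the whole disk

print(partition_table_type(bytes(gpt_disk)))   # gpt
print(partition_table_type(bytes(mbr_disk)))   # mbr
print(partition_table_type(bytes(bare_disk)))  # none
```

The "protective MBR" in the GPT example is also why old MBR-only tools see a GPT disk as one giant full partition instead of blank space.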
Edit: Getting downvoted for actually knowing how computers work and bothering to explain it. Shock horror.
UEFI generally won’t boot from MBR drives unless it’s in BIOS compatibility mode. The drive’s partition format isn’t determined by a firmware setting, though firmware settings can affect the boot process. I don’t think you actually understand what you’re talking about here. The easiest way to install OSes, both Windows and Linux, is by wiping the drive, which would have solved this issue. Dual-boot setups on a single drive normally have issues and will always be more complicated; it’s better to use two drives where possible in most cases. I suggest you read up on BIOS vs UEFI and how partition tables work if you want to do a complex setup like that.
Mint is known for having older kernels and therefore not supporting the latest hardware. They have a different edition for newer computers called the Linux Mint Edge edition. Something Arch-derived like CachyOS, or another distro using recent kernels, will always have the best support for bleeding-edge hardware. The CachyOS installer is also pretty friendly, though maybe not as much as Mint’s.
This isn’t true. Try Linux Mint or Ubuntu; their installers are much better. The installers used by Fedora, Red Hat, and even SUSE can be a bit weird.
They specifically say unbloated Windows as well, which, while it’s not as difficult as they make out, is still somewhat annoying.
I’ve recently had a Windows installer fail to see my NVMe drives until I changed some random UEFI setting, because it was missing a driver. Linux could see them just fine, as could Hiren’s BootCD.
It was an improvement but still not great. Ideally they would have kept the Windows 7 interface with maybe some upgrades like virtual desktops, then continued with all the under the hood improvements.
Yes, yes they can.
I mean, for one it supports a lot less hardware. Second, it’s significantly less reliable. Third, it has things like Copilot built in. I don’t know how people aren’t criticizing it more, frankly.
Someone using Linux Mint would be a good guess as I don’t think they default to Google.
Then you use DuckDuckGo like I do. Not every search engine has gone to complete shit. Google was just an example. Obviously it’s not the current meta in terms of search engines.
Are you honestly telling me there aren’t people asking basic questions that could be solved with a Google search? Don’t get me wrong, the kind of question you are talking about does exist, but that’s not what I am discussing here.
Yeah unfortunately this is a real issue. I also think it’s an issue that experienced users don’t really want to help newbies, especially those who can’t or won’t do research by themselves. Ideally experienced users would be more helpful, but at the same time that isn’t their job. There are many who learned Linux more or less on their own so it’s understandable they don’t want to help given they didn’t use any help when it was their turn. I think now that the community is growing this might start to change a bit, as the newcomers are more likely to have had help and be willing to help others.
I sometimes try to advocate for using Linux, and I don’t mind giving friends advice from time to time. That being said, I don’t want to be stuck answering stupid questions all the time that could have been solved with a Google search or a YouTube video. I have my own stuff to worry about, both technical and otherwise.
That’s why I think teaching new users how to access resources like man pages, GNU info pages, Google, and so on is the correct approach to take. It is empowering to have the skills to work through your own issues. That being said, I also think it’s important for experienced people to give advice on more complex questions.
People see AI and immediately think of ChatGPT. This is despite the fact that AI has been around far longer and does way more things including OCR and data mining. It’s never been AI that’s the problem, but rather certain uses of AI.
I’ve seen teachers use this stuff and get actually decent results. I’ve also seen papers where people use LLMs to hack into a computer, which is a damn sophisticated task. So you are either badly informed or just lying. While LLMs aren’t perfect and aren’t a replacement for humans, they are still very much useful. To believe otherwise is folly and shows your personal bias.
I am not talking about things like ChatGPT that rely more on raw compute and scaling than some other approaches and are hosted at massive data centers. I actually find their approach wasteful as well. I am talking about some of the open weights models that use a fraction of the resources for similar quality of output. According to some industry experts that will be the way forward anyway as purely making models bigger has limits and is hella expensive.
Another thing to bear in mind is that training a model is more resource intensive than using it, though that’s also been worked on.
Fair enough