technically the same as forgejo; codeberg is the main forgejo contributor and the org owning it
Yes it was; the release notes explicitly specify it for 1.32.4 and 1.32.5
Onyx uses Android for their OS, so there are pretty much no restrictions on book formats
don’t forget the additional ssl cert for the second domain (assuming it’s not a wildcardable subdomain)
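If you’re not sure whether the existing cert already covers the second name, one quick way to check is to list the DNS entries in the served certificate. A minimal sketch in Python (stdlib only; the hostname is just a placeholder, not anything from this thread):

```python
import socket
import ssl

def cert_dns_names(host: str, port: int = 443) -> list[str]:
    """Fetch a server's certificate and return the DNS names it covers."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # subjectAltName entries look like ("DNS", "*.example.com")
    return [value for kind, value in cert.get("subjectAltName", ()) if kind == "DNS"]

print(cert_dns_names("example.com"))  # a wildcard entry here covers subdomains
```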
the problem is less how often I have to charge my mouse and more that I’m almost always using it, and on top of that I keep forgetting to charge it overnight, so there’s no “convenient” time during the day to charge it when I wouldn’t be using it
given how often my mouse has to charge while I’m using it, I’d have a major issue with it
if water makes other things wet then most water is wet because it (usually) is surrounded by more water. qed
Last I checked (which was a while ago), “AI” still can’t pass the most basic of tasks, such as “show me a blank image”/“show me a pure white image”. The LLM will output the most intense fever dream possible, but never a simple rectangle filled with #fff-coded pixels. I’m willing to debate the potential of AI again once they manage to do that without those “benchmarks” getting special attention in the training data.
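For contrast, here’s what that “task” amounts to as ordinary code: a couple of lines with Pillow (assuming `pip install Pillow`; the size is arbitrary):

```python
from PIL import Image

# a pure white 512x512 image: every pixel is #ffffff
Image.new("RGB", (512, 512), "#ffffff").save("blank.png")
```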
You apparently have little interaction with regular users because one of the top problems a non-power user has is “oops I accidentally hit delete on this important file I don’t have a backup of”.
Not saying qbittorrent-nox of all things switching makes a ton of sense, but at least for desktop applications there is a very good reason why deleting things becomes a two-step process.
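As a sketch of what that two-step pattern looks like in code, the usual approach is a trash/recycle-bin move rather than a hard delete, e.g. via the third-party send2trash package (the filename is just an example, nothing from qbittorrent):

```python
# pip install send2trash
from send2trash import send2trash

# step 1: “delete” only moves the file to the OS trash, so it stays recoverable
send2trash("important_file.txt")
# step 2 happens later and deliberately: the user empties the trash,
# and only then is the data actually gone
```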
I’d argue in the states where you can lick those, you really shouldn’t, because it will instantly freeze your tongue off
I somewhat disagree that you have to be a data hoarder for 10G to be worth it. For example, I’ve got a headless Steam client on my server with my larger games installed (~2TB all in all, so not in data hoarder territory), which lets me install and update those games at ~8 Gbit/s (rough numbers below). That in turn lets me run a leaner desktop PC, since I can just uninstall the larger games as soon as I stop playing them daily, and it saves me time when Steam inevitably fails to auto-update a game on my desktop before I want to play it.
Arguably a niche use case, but it exists alongside other such niche use cases. So if someone comes into this community asking how best to implement 10G networking, I’ll assume they have (or at least think they have) such a use case on their hands and want to improve that situation a bit.
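The back-of-the-envelope numbers for the example above (assuming the link actually sustains those rates):

```python
# time to move a ~2 TB game library at different line rates
size_bits = 2e12 * 8                 # 2 TB expressed in bits
for rate_gbit in (1, 8):             # plain gigabit vs the ~8 Gbit/s seen on 10G
    minutes = size_bits / (rate_gbit * 1e9) / 60
    print(f"{rate_gbit} Gbit/s: ~{minutes:.0f} min")
# 1 Gbit/s: ~267 min (almost 4.5 hours); 8 Gbit/s: ~33 min
```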
Personally, going 10G on my networking stuff has significantly improved my experience with self-hosting, especially when it comes to file transfers. 1G can just be extremely slow when you’re dealing with large amounts of data, so I also don’t really understand why people recommend against 10G here of all places.
Yeah, they definitely could have been quicker with the patches, but as long as the patches come out before the articles they’re above average in how they handle CVEs; way too many companies out there just don’t give a shit whatsoever.
If I buy a switch and that thing decides to give me downtime in order to auto-update, I can tell you what lands on my blacklist. Auto-updates absolutely increase security, but there are certain use cases where they are more of a hindrance than a feature. Want proof? Not even Cisco does auto-update by default (and from what I’ve managed to find in this short time, neither does TRENDnet, which you’ve been speaking well of). The device deciding on its own to just fuck off and pull down your network is not in any way a feature their customers would want. If you don’t want the (slight) maintenance load that comes with a managed switch, don’t get one; get an unmanaged one instead.
So first of all, I see no point in sharing multiple articles that contain the same copy-pasted info; one of them would have been enough. That aside, again: patches were made available before the vulnerability was published, and MikroTik not pushing updates automatically is arguably more of a feature, since automatic updates cause network downtime via a reboot, which would be somewhat problematic for networking equipment. Could they have handled it better? Yes, you can almost always handle vulnerabilities better, but their handling was not so egregious as to warrant completely avoiding them in the future.
Can you elaborate on how their response was lacking? From what I found, the stable branch had a patch for that vulnerability available for several months before the first report, while the LTS branch had one available a week before the first article (arguably a brief period to wait before publishing news about the vulnerability, but not unheard of either).
MikroTik also offers a 2-year warranty since they legally have to, so no idea what you’re on about there. Also not sure what you think they sell other than networking gear, because for the life of me I can’t find anything other than networking-related stuff on their website.
Torrenting was created precisely to solve the bandwidth problem of monolithic servers. You very obviously have no idea how torrents (or PeerTube, for that matter) work.
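A toy model of the idea (plain Python, nothing like real BitTorrent piece selection): every peer that has received a piece can serve it to others, so total upload capacity grows with the swarm instead of being capped by one server.

```python
import random

# file split into 16 pieces; one seeder has everything, four peers start empty
pieces = set(range(16))
have = {"seeder": set(pieces), **{f"peer{i}": set() for i in range(4)}}
uploads = {name: 0 for name in have}

while any(have[p] != pieces for p in have):
    for p in have:
        missing = pieces - have[p]
        if not missing:
            continue
        piece = random.choice(sorted(missing))
        # any node that already holds the piece can serve it, not just the seeder
        src = random.choice([q for q in have if piece in have[q]])
        uploads[src] += 1
        have[p].add(piece)

print(uploads)  # upload counts end up spread across the whole swarm
```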
You mean they don’t already do that?
Earnings are incoming money before any expenditures
Pretty much, yes; codeberg integrates some additional services and branding on top (such as codeberg-pages for static page hosting or forgejo-runners for CI), but you can set those up yourself as well; it’s just extra work.
If you’re looking for an open alternative to github/gitlab, codeberg is imo definitely the way to go