• 6 Posts
  • 310 Comments
Joined 3 years ago
Cake day: June 18th, 2023


  • Oh, damn. You’re right.

    When I first saw this, I read through the readme, and it sounded pretty cool. Needless to say, I know nothing about physics.

    I didn’t suspect AI in the slightest, until I saw this comment thread.

    Now I’m pretty taken aback. Looking at it again, it should have been pretty obvious. I wonder what it was about the way it was presented that made me believe it and not suspect AI in the slightest, because that’s a mistake I don’t want to make again.

    Probably a combination of a passionate presentation, a topic I know nothing about combined with a topic I love (game engines), and my whole interaction being “this is pretty cool” and moving on. I did try looking for some actual sources about Tesla’s mythical “standard model” and found none, plus I got suspicious that the definition of “standard model” didn’t match what the text was talking about, but I just moved on. The conclusion I drew was “I wonder what will come out of it”, instead of “probably an LLM hallucination”, as it should’ve been.

    Oh well, I guess it’s time to properly lock in on actual textbook knowledge in the fields I’m interested in, because recognizing stuff like this in tutorials/posts and eventually books will only get harder, and relying on “I’ll research it on the internet when I need it” won’t really be feasible anymore.



  • I don’t really do courses anymore, but one thing that kind of matches the questions was playing through Turing Complete.

    It’s a game where you start with NAND gates and slowly build up from there: other gates, then a counter, an adder, single-bit memory, etc., where every puzzle uses the component designs you’ve built before. Eventually you work your way up to an ALU and RAM, add instructions, and connect it all up into a working CPU.

    It’s super fun, and even though hardware isn’t really something I usually look into, it has taught me a lot, way more than my college courses on CPU architecture. Plus, seeing (and, in later levels, actually programming) a CPU of your own design, using your own opcodes, is pretty cool.
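    That early build-up translates almost directly into code. As a rough sketch (the function names and the half-adder step are my own illustration, not the game’s exact puzzles), here is everything derived from NAND alone:

```python
# Every gate built from a single NAND primitive, mirroring the
# game's early levels.

def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

def xor_(a: bool, b: bool) -> bool:
    # The classic four-NAND XOR construction.
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# A half adder, the first step toward a multi-bit adder:
# returns (sum, carry).
def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    return xor_(a, b), and_(a, b)
```

    From here a full adder is just two half adders plus an OR on the carries, which is roughly how the game has you chain them.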




  • To add to this excellent answer, one thing that made me really understand and realize quite a lot about how CPUs actually work, and why most of the stuff is the way it is, was playing through the amazing “Turing Complete” puzzle game.

    The premise is simple - you start with a NAND gate and slowly build things up. You make AND/OR/NOT gates, and can then use your designs. Then you make a counter, and can use that. A one-bit memory cell. An adder. A multiplexer. All using the component designs you have already built before.

    Eventually, you build up to an ALU and RAM, until you end up with a working CPU. Later levels even add creating your own instruction set and assembly language, but I never really got far into that part.

    It’s a great puzzle game - you have clear goals, and everything is pretty approachable and very well paced. I had no idea how memory is done at the circuit level, but the game made me figure it out, and had hints when I got stuck.
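    That one-bit memory step comes out of a cross-coupled NAND pair. As a sketch in Python (this is the standard textbook gated D latch, my guess at the idea rather than the game’s exact solution):

```python
def nand(a: bool, b: bool) -> bool:
    return not (a and b)

class DLatch:
    """One bit of memory: the output follows `data` while `enable`
    is high, and holds the last value while `enable` is low."""

    def __init__(self) -> None:
        self.q = False
        self.q_bar = True

    def step(self, data: bool, enable: bool) -> bool:
        s = nand(data, enable)              # active-low "set"
        r = nand(nand(data, data), enable)  # active-low "reset" (NOT data)
        # The cross-coupled NAND pair settles within two passes.
        for _ in range(2):
            self.q = nand(s, self.q_bar)
            self.q_bar = nand(r, self.q)
        return self.q
```

    The feedback loop (each NAND’s output feeding the other’s input) is exactly the part that never clicked for me from lectures, and the game makes you wire it yourself.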

    And seeing a working CPU that you’ve designed from scratch is pretty cool, but most importantly - even though I’ve had courses on hardware, CPU architecture and the like in college, there’s a lot of stuff I kind of understood but that never really clicked. This game helped tremendously in that regard, and it was full of “aha” moments, finally connecting a lot of what I know about low-level computing.

    I’m not even into puzzle games that much, but this was just a joy to play. It was so fun I sat through it in one session, right up until I got to a complete CPU. I highly recommend it to anyone.



  • This is a really good point.

    This post is a great example of what skipping research and just trusting the first solution you find leads to.

    When you are researching the thing yourself, you usually don’t find the solution immediately. And if you immediately have something that seems to work, you’re even less likely to give up on that idea.

    However, even taking this into account (because the same thing can probably happen even if you do research the problem yourself - jumping to the first solution), I don’t understand how it’s possible that the post doesn’t make a single mention of any remote desktop protocols. I’m struggling to figure out how you would have to phrase your questions/prompts/research so that VNC/RDP - you know, the tools made for exactly the problem they are trying to solve - do not come up even once during your development.

    Like, every single search I’ve tried about this problem has immediately led me to RDP/VNC. The only way I can explain the ignorance displayed in the post is that they ignored it on purpose - lacking any real knowledge about the problem they were trying to solve, they simply jumped to “we’ll have a 60 FPS HD stream!”, and their problem statement never was “how to do low-bandwidth remote desktop/video sharing”, but “how to stream a 60 FPS low-latency desktop”.

    It’s mind-boggling. I’d love to see the thought and development process that was behind this abomination.


  • Uh, I’m pretty damn sure I have seen an office with hundreds of people, all connected remotely to workstations on an enterprise network, without any of the problems they are talking about. I’ve worked remotely from coffee shop Wi-Fi without any lag or issues. What the hell are they going on about? Have they never heard of VNC or RDP?

    But our WebSocket streaming layer sits on top of the Moonlight protocol

    Oh. I mean, I’m sitting on my own Wi-Fi, one wall between my laptop (it is 10 years old, though) and the computer running the Sunshine/Moonlight stream, and I run into issues pretty often even on a 30 FPS stream. It’s made for super low-latency game streaming, so that’s expected. It’s the completely wrong tool for this job.

    We’re building Helix, an AI platform where autonomous coding agents…

    Oh. So that’s why.

    Lol.









  • Well, Element seems to still be running the old version even after the update, so I’m just shutting the server down.

    I’m bummed that it took me 5 days to learn about it. Does anyone have tips on how to get early warnings for the tech you’re using? I’m guessing there’s a way with npm.

    Also, does anyone have tips on how to properly check your server for compromise? I’m guessing there are logs to check, and startup scripts to audit for persistence? Any tools that could help with that?



  • First time I’m seeing Uiua, and I like it. It’s kind of cute, even though I know I’ll probably never use it.

    However, seeing one of their goals being “code that is as short as possible while remaining readable” is kind of ironic, given how it looks and reads. But I don’t mind, it’s still pretty adorable.

    It looks like it’s hell to learn and write. It’s possible that once you learn all the glyphs (which IMO adds unnecessary complexity that goes against their goal of being readable), it might be easier to parse. I’m probably not the target audience, though.