Everything posted by tikker

  1. That could work, if compiling file A does not need knowledge of compiling file B. As soon as B needs to know about A you lose your ability to parallelise over A and B. It exists to some degree already: https://www.gnu.org/software/make/manual/html_node/Parallel.html and I found this exploring some aspects of parallelising GCC more: https://gcc.gnu.org/wiki/ParallelGcc The big question is what kind of operation is most common in compiling code and how would you parallelise and/or optimise that? "Compile a C program" is not the operation. Ideally you want something specific like "multiply two 3x3 matrices of integers". The former is vague while the latter you can design a circuit/component for.
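A minimal sketch of that constraint in Python (the "compile" step is just a stand-in function, and the unit names are made up for illustration): independent units can run in parallel, but a unit that depends on another must wait for it, which is exactly what caps the speedup.

```python
from concurrent.futures import ThreadPoolExecutor, wait

# Hypothetical translation units: B needs A's output, C is independent.
compiled = []

def compile_unit(name):
    # Stand-in for invoking a real compiler on one unit.
    compiled.append(name)
    return name

with ThreadPoolExecutor() as pool:
    # A and C have no dependencies, so they can compile in parallel...
    first_wave = [pool.submit(compile_unit, n) for n in ("A", "C")]
    wait(first_wave)
    # ...but B depends on A, so it can only start in a second wave.
    pool.submit(compile_unit, "B").result()
```

This is essentially what `make -j` does at the file level: the dependency graph decides which jobs may run concurrently.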
  2. I think a matrix can be seen as a 2D tensor. I'm not sure how much AI uses higher-order ones; I don't use AI in my coding and my uni maths courses were a while ago, but I used the term tensor to implicitly draw the parallel with e.g. tensor cores.
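To make the "order" idea concrete, here is a toy Python check (pure illustration, not how any real library does it): the order of a tensor corresponds to the nesting depth of the array, i.e. how many indices you need to address one element.

```python
def tensor_order(x):
    """Order (rank) of a nested-list 'tensor': count the nesting depth."""
    order = 0
    while isinstance(x, list):
        order += 1
        x = x[0]
    return order

scalar = 5                       # order 0
vector = [1, 2, 3]               # order 1
matrix = [[1, 2], [3, 4]]        # order 2 -- a matrix is a 2D tensor
cube = [[[1], [2]], [[3], [4]]]  # order 3
```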
  3. I think that varied amount of logic is one of the limitations. Hardware acceleration leverages doing a specific operation efficiently. For AI these are things like tensor operations, which you can clearly define like multiplications etc. If you have a large variety of logic, what would you have the hardware do specifically? GPUs also excel at doing things massively parallel, but if A depends on B then you inherently must do B before A and can't (effectively) parallelise.
  4. I did do the online "check my connection" thing from Sky and it has been back to quite a stable 7-8 down, 1 up since, so that 0.4 was a different problem indeed. Since I'm still in the 14-day cool-off period though I am interested in trying out pulse8, since they seem to offer fibre (to the cabinet) broadband and advertise speeds up to 78/20 for the same price.
  5. Thanks for all the responses and suggestions, folks. I am currently trying Sky, but frankly it's quite bad. At good times I reach the 7-8 Mbps advertised speeds, but this morning it was a whopping 0.4 if I was lucky. After checking with Openreach I did find two other providers that seem to deliver decent speeds: pulse8 and Your Co-op. I am not familiar with them, however. Has anyone heard of them or have any experience with them? They mention speeds in the 40-70 Mbps range, which is a whole lot better than what I have now, for roughly the same monthly price.
  6. Mass Effect: the whole trilogy, it's pretty good. Crysis: definitely 1. If you like the story and gameplay, 2 and 3 are nice, but the first installment was something else. Portal: I never played the second one, but the first one is fun and was iconic for its time. Resident Evil: I would definitely play the HD remaster of RE1.
  7. No 5G option from them at my location sadly. Ah that is good to know.
  8. That sounds attractive. I don't know why it didn't show up earlier, but when I check for coverage on their website I see I can get 4G from them for indeed £11 and then £22 after 6 months. I may keep this in mind as well. I have no experience with 4G/5G routers, so I'd plug the SIM card in there and then proceed as usual? Good point. They don't mention anything about latency in the Sky offer, but I guess it'll still be more reliable in the end. I do plan on gaming and video calls, so decent latency would be nice. Yeah, I tried pushing them to be less salesy about it, but all I got was "the cabinet is currently full, that's why it sucks; fibre soon (TM)". So in light of the above, would you then recommend taking the ADSL line and getting the Three 4G as an additional one? It would actually be a good bit cheaper than Starlink in both initial and monthly cost. That is pretty good. I'm currently spoilt with 300 up / 40 down, which is sufficient for my needs, but the speeds you show are totally acceptable (ignoring the ping, haha). I guess this confirms a cable line would still make sense to reduce latency.
  9. I'll be moving to the UK soon, and the maximum speed I'm currently offered at the place I'll likely be staying appears to be a whopping 7.8 Mbps down, and up in the single decimals as it stands now. I've been told Sky is planning to hook up fibre "[probably] within a year", but since I'm only there for two years initially that is quite a long, uncertain period. To that end, I was wondering what people's experiences are with Starlink? The agency is trying to convince me with "but this offer is a guaranteed minimum speed of 3.4, Starlink is whatever and whenever they feel like", the supposedly soon-to-be-connected fibre and stuff like that, but it's an 18-month minimum contract and it sounds a bit salespersony to be honest. It is a bit expensive at 75 GBP per month (ignoring initial purchase costs), but I'm hoping it beats the 7 Mbps for 35 GBP per month or worse... Any other suggestions are of course also welcome.
  10. That's pretty awesome. I'm curious about the longevity. Ignoring that the drives may cost a fortune, I'd love to store 100+ TB on a single disc, but with that amount of data on a single thing I would want it to be pretty robust if it's ever coming to home use. Going with the SI prefixes, peta is 10^15 and tera is 10^12. So in that case, if we divide 1 petabit by 8 we get 0.125 * 10^15 bytes, or 125 * 10^12 bytes, or 125 terabytes. If we take the binary "terabyte", the tebibyte (TiB), to be 1024^4 bytes, then we get ~1.1 TB for every TiB, which gives 125 TB / 1.1 TB/TiB = ~113.7 TiB. Not sure what conversion that calculator is doing, but this decimal/binary unit business while keeping the same labels is honestly the most confusing thing we've ever done with units.
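The arithmetic above, spelled out in Python so the decimal/binary split is explicit:

```python
petabit = 10**15           # SI: peta = 10^15 bits
byte_count = petabit // 8  # 1.25 * 10^14 bytes

tb = byte_count / 10**12   # decimal terabytes (TB)
tib = byte_count / 1024**4 # binary tebibytes (TiB)

print(tb)             # 125.0 TB
print(round(tib, 1))  # ~113.7 TiB
```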
  11. They don't. In the case of YouTube, the most likely explanation is that your TV announces to the network it's connected to that it supports casting. YouTube picks up on that and thus offers you the option. It's the hardware communicating its presence; YouTube the site doesn't know about the devices. Either they have been linked using a link code in the past, or they are simply announcing that they can be cast to and some code makes that available. My Nvidia Shield is on the same network as my PC and phone, for example, so both are able to cast to it. If I disconnect my phone from the Wi-Fi and switch to mobile data, the Shield disappears as well, because it's no longer on the same network, showing that the site doesn't inherently know my device is there. If you don't use it, maybe your TV (or whatever the device is) has an option to disable casting?
  12. The silicon "ingot" from which they are made is produced in a cylindrical shape, so you get circular discs:
  13. On the off chance this is useful to anyone else with an Nvidia Shield (in my case 2019 Pro model): the AI upscaling can mess up Steam in-home streaming performance big time. For months I had this strange issue where nothing over 24-30 FPS was stable and the lag would just start growing continuously to half a second and beyond, making everything unplayable. Turned the upscaling back from AI enhanced to basic and boom, back to 10-20 ms at most again.

  14. At the end of the day they serve similar purposes. A problem with a 12-hour clock, for example, lies with noon and midnight: is that 12:00 AM or 12:00 PM? There is no agreement on it. To complicate things further, AM and PM mean before or after noon, so noon itself fits neither. Similarly, midnight is as far from the previous noon as from the coming noon, so is it before or after noon? Strictly there can be no "12 AM" or "12 PM", just like there can be no 24:00 on the 24-hour clock. From a table on Wikipedia it even seems the US Government Publishing Office switched from calling noon 12 AM to calling it 12 PM in 2008, so it's not fixed either. The 24-hour clock removes that "12" ambiguity. The day starts at 00:00:00 and ends at 23:59:59. Something at 9:45 will never be confused with something at 21:45; a country 6 hours ahead will simply be at 15:45 instead of 9:45. You can also argue common sense and say a day has 24 hours, so a 24-hour clock fits it well.
  15. 12 AM vs PM always trips me up with midnight. We don't even really have the AM and PM qualifiers in a direct sense in Europe, as far as I know. When speaking, times are on the 0-12 scale, where it's either clear from context which one is meant or you just say "in the morning/afternoon". Writing and the like is 24-hour all the way. It wasn't taught to me as "military style" either, but simply as analogue and digital clock reading.
  16. I don't know about the watch, but the 24-hour clock is rather straightforward: the AM hours run from 00:00 to 11:59 and the PM hours from 12:00 to 23:59. If the number is larger than 12, subtract 12 and you have the time in PM. For example, 09:45 is 9:45 AM, while 21:45 is 9:45 PM.
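That rule as a small Python function (a sketch; the noon/midnight handling follows the usual digital-clock convention where 00:xx is 12:xx AM and 12:xx is 12:xx PM):

```python
def to_12_hour(hour, minute):
    """Convert a 24-hour time to 12-hour AM/PM notation."""
    suffix = "AM" if hour < 12 else "PM"
    h12 = hour % 12
    if h12 == 0:  # 00:xx -> 12:xx AM, 12:xx -> 12:xx PM
        h12 = 12
    return f"{h12}:{minute:02d} {suffix}"

print(to_12_hour(9, 45))   # 9:45 AM
print(to_12_hour(21, 45))  # 9:45 PM
print(to_12_hour(0, 30))   # 12:30 AM
```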
  17. I don't know anything about that file, sorry. I don't program in C, so I never have to deal with setting it up. That page does link to another one where the spec for that file is defined: https://clang.llvm.org/docs/JSONCompilationDatabase.html It seems clang with the -MJ argument might be able to generate it? It's not uncommon for this stuff to be more Linux-oriented (or Linux-first), so outside of IDEs like Visual Studio or VS Code you may be drawing the short straw a bit trying it on Windows.
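For reference, that file is just a JSON list of compile commands. A minimal hand-rolled example (the paths, file name and flags here are made up; in practice a build tool, e.g. CMake with CMAKE_EXPORT_COMPILE_COMMANDS=ON, or clang's -MJ, generates it for you):

```python
import json

# Hypothetical single-file project. clangd reads this database to learn
# the include paths and flags used for each translation unit.
entries = [
    {
        "directory": "/home/user/project",
        "file": "main.c",
        "arguments": ["clang", "-Iinclude", "-c", "main.c"],
    }
]

with open("compile_commands.json", "w") as f:
    json.dump(entries, f, indent=2)
```

Dropping a file like this in the project root is usually enough for clangd to pick it up.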
  18. OK, everything seems to be running and healthy, and clangd is recognised. It looks like you need to generate that compile_commands.json file they linked instructions for. I would try that next.
  19. Firstly, by not letting people you think will steal your data work on it. It sounds like your company needs to look into some data policies. Something like a "nothing leaves the premises" approach, where all work is done on a machine that requires some sort of verified login, stays local, is monitored all the time, is disconnected from any network, etc. There are many approaches.
  20. Okay, so stuff seems to be running. First I would try: open one of your source files and run :checkhealth and :LspInfo to see if things are good. The checkhealth command will show a general overview of the configuration and whether any errors are found there. LspInfo will tell you what kind of file it has detected, which servers you have configured and whether any LSPs are active for the file.
  21. Hmm, I've never used it on Windows, but I guess to some extent things should be similar. Let's start at the beginning and see which plugins are present. Assuming you are running the latest version, which uses lazy.nvim as a package manager instead of Packer, what does it show if you type :Lazy?
  22. What do you mean with "it doesn't know any include files"? Like it errors on them? I don't program C so I've never set it up for that. There might be something about the setup that needs to be tweaked for it judging from e.g. https://stackoverflow.com/questions/73395641/in-included-file-begin-code-h-file-not-found-occurring-in-neovim-lsp and https://github.com/neovim/nvim-lspconfig/blob/master/doc/server_configurations.md#clangd.
  23. Personal preference. It is meant to provide a basic experience on its own that you can customise to your liking through plugins. I don't want or need a file tree in vim, for example. Did you install the LSP for C as mentioned in this issue, for example? https://github.com/nvim-lua/kickstart.nvim/issues/268 Harpoon: https://github.com/ThePrimeagen/harpoon/tree/harpoon2 Takes a bit of tinkering to set up, but the Telescope + Harpoon combo has skyrocketed my development pace if I'm in the flow.
  24. For me it's great simply because it suits my programming workflow vastly better than Windows, and I don't own a Mac. For non-programming stuff like gaming or graphic design I still use Windows, since for the latter there isn't a satisfying alternative with Linux support for me. I would say that crown goes to Apple's ecosystem, because I consider human-centric technology to be technology that focuses on ease of use, not on configurability or the freedom to do whatever you want within the OS. The vast majority of users will never need that. The way things interact in Apple's ecosystem is pretty darn smooth as far as I've seen. It has gotten a lot better since its start, but running into support issues with Wayland is still not rare. Teams couldn't screenshare because of it, Barrier/Synergy didn't work with it, Nvidia is hit or miss. Now, a common thread in my experience is that there is no support (either because they simply haven't done it, or because the way things operate goes against Wayland's operating model), which relates to the next point: if something is not supported or implemented, then for the end user that still makes it a Wayland issue.
  25. These are not magic words, though. Open source is nice, but you still need leadership with a vision of where the project is going and a big team of developers and maintainers to keep the project going and maintained. That ties into the second point: "free" doesn't support the livelihood of the people working on the project. Either their employers need to allow a certain amount of work time to be spent on arbitrary community software (meaning money lost for the company), or they need to spend their free time on it. Either way, it is likely they will have limited time to spend on the project, or at least a lot less than e.g. Adobe's teams, whose literal job it is to develop their product. You are asking a project with substantially less money, less time and maybe less experience than Adobe to beat Adobe at their own game. That is not impossible, but it is at least very hard. More so because you would need to break the industry standard.