
Doldol

Member
  • Posts

    42
  • Joined

  • Last visited

Awards

This user doesn't have any awards

About Doldol

  • Birthday January 17

Profile Information

  • Gender
    Male
  • Location
    Belgium
  • Interests
    Technology
    Programming
    Gaming
    Food
  1. When I saw your YouTube video this sounded really exciting to me, up until I saw you guys were trying to do this in Node.js. Node.js is shiny, new and fancy, and it's fast, but you will be held back by your DB before you'll be held back by your web stack; caching is often the answer. My main gripe with Node.js is that it's hard to maintain: debugging is poop and there are no real standards, so it isn't great for large projects. IMO you guys should have gone with Python technologies: Django, Django Channels, Transcrypt, etc. And possibly write the transcoding servers in C++, C or even Cython (or just use a Python library that wraps ffmpeg)! (Again, whatever you pick, Python is excellent here as a glue language.) If your service goes anywhere and your developers aren't one-trick Node ponies, I guarantee you'll switch away from Node. Since 2010 (when Node was starting to gain traction) I've tried to give it a chance multiple times, but I can't live without exceptions, standards and proper documentation. I chuckled a bit when you guys mentioned agile development just after Node.js. Still, I wish you guys the best of luck! Attempting to compete with YouTube is very bold! Edit: Maybe give these articles a read; they're by someone who switched their project from Python to Node.js but eventually went back. https://blog.geekforbrains.com/why-im-switching-from-python-to-nodejs-1fbc17dc797a https://blog.geekforbrains.com/after-a-year-of-using-nodejs-in-production-78eecef1f65a
  2. Obviously... It's not really hard to hook up an 8-pin PCIe connector (150 W, plus 75 W from the slot, should be enough, and they could throw in another 6-pin), and the mobo is definitely wired for 16x in every slot and supports up to 32 PCIe lanes total, I believe. They could at least make a 3-port 8x card, no? Two data, one charge + data?
  3. This is kind of driving me nuts. I'm on X58 and looking to add some Type-C connectivity to my PC, and the best I can find are 2-port 4x cards. Am I just blind, or is no one making 16x cards for this? Why not? You don't really need 5 Type-C ports atm, but it's the new standard, and it's a waste to plug a 4x card into an 8/16x slot. A 16x PCIe 2.1 slot supports 64 Gbps, and since USB 3.1 is 10 Gbps, that means 6 ports at full speed are possible. But I'd be happy with 5: two 100 W charging ports and three data-transfer ports.
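     The bandwidth arithmetic above can be sanity-checked in a few lines of Python. This is a rough sketch: it assumes PCIe 2.x's 5 GT/s per lane with 8b/10b encoding (so 4 Gbit/s usable per lane) and ignores protocol overhead on both the PCIe and USB sides.

     ```python
     # Rough check: how many full-speed USB 3.1 Gen 2 ports fit in a 16x PCIe 2.x slot?
     # PCIe 2.x: 5 GT/s per lane, 8b/10b encoding -> 80% of the raw rate is usable.
     lanes = 16
     usable_per_lane_gbps = 5 * (8 / 10)          # 4.0 Gbit/s per lane
     total_gbps = lanes * usable_per_lane_gbps    # 64.0 Gbit/s for the whole slot

     usb31_gbps = 10                              # USB 3.1 Gen 2 signaling rate
     full_speed_ports = int(total_gbps // usb31_gbps)

     print(f"{total_gbps:.0f} Gbps usable -> {full_speed_ports} full-speed ports")
     # -> 64 Gbps usable -> 6 full-speed ports
     ```

     Real controllers would lose a bit more to overhead, so 5 ports is the safer design target anyway.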
  4. Sublime Text 3 But as pointed out before, you're going to have a hard time. I would get some Android ROM from somewhere and modify the Linux kernel to my needs.
  5. I'm of the opinion that you need the drive to want to do or understand something so badly that you persist and learn how. Learn the difference between a language and an implementation. If you're interested in the mechanics I suggest you stick with CPython (the Python interpreter written in C, what people most commonly refer to as "Python"), learn how it's implemented, compare it to PyPy and learn the differences, then move on to Cython (which is Python-like and compiles to C), then finally learn C/C++ and how that works. At that point you will know about interpreters, just-in-time compilers and regular compilers, with an implementation of a language to go with each. Everything is great at something, nothing is great at everything. That is very true for programming languages. And I bet that must be a quote from somewhere.
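     A quick way to see the language/implementation split in practice: the exact same script runs unchanged under CPython or PyPy, and the standard library can report which implementation is executing it (the output naturally depends on which interpreter you launch it with).

     ```python
     import platform
     import sys

     # The *language* is Python; the *implementation* is whatever runs this file.
     impl = platform.python_implementation()  # e.g. "CPython" or "PyPy"
     print(impl, sys.version_info[:3])
     ```

     Timing the same hot loop under CPython and PyPy is then a hands-on way to feel what a JIT buys you.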
  6. Just treat them like any other customer; maybe contact them to discuss what they're looking for and what their budget is, then adjust your service accordingly.
  7. I don't see a year mentioned. Hehe, sneaky.
  8. App nr 1xxx to include an FPS counter... WOW! They're never as good as in-engine ones, and people already run MSI Afterburner/GeForce Experience/AMD's thingy anyway. #notimpressed Or, for FPS only: cl_showfps 1
  9. That Razer Turret looks interesting, but I'm sad to see that they didn't use a Tegra K1 SoC. I've owned one for about 2 years now and, when possible, always use it when gaming on my laptop or Android tablet. While I've bought and used other controllers before and since (Ipega, Moga, Samsung, Logitech, 360 and PS3), none came close in accuracy, feel and durability. My only gripes with it are the lack of a shorter cable (because it has a proprietary connector) and the fact that the extra buttons can only be bound to existing buttons. If you're going to game on your TV, go with the Nvidia Shield Tablet; it has the most powerful SoC (Tegra K1) available.
  10. Mad Catz and their brave 300. That's a more-than-decent tablet right there... I'd buy it if it was 200 USD or less!
  11. YES, yes we do! Mhuhahaahahahaha! But seriously, we do. Trust me.
  12. I think with the GPU requirements you're falling into the ultraslim gaming notebook category. I own a 1st-gen MSI GS70 and use it for what you're planning to use the laptop for; I did need to buy an expensive external 19.5 V, high-amp battery, as the GS70 only makes it to 4.5 h with extreme battery-saving precautions. =[ Slim + Nvidia/AMD + 8 h battery life is a dream, sadly. There isn't enough space to fit both the GPU and a big battery in a slim form factor.
  13. Seconded! Especially the latest version!
  14. Well, not really. C# is not really meant for graphics, at least not 3D; that's mostly C/C++ domain. I would agree if we're talking about 2D/GUI. I think it's more accurate to say that if you want speed you want C/C++, and if you want language features you want C#. C# is by nature slower, which really matters in 3D graphics.
  15. As I said before I think this is a rebranded product, or they are probably very similar, so from another brand: http://www.amazon.de/Mushkin-Ventura-Ultra-USB-Flash-Laufwerk-MKNUFDVU120GB/dp/B00FSAHH5I/