igormp

Member
Reputation Activity

  1. Like
    igormp got a reaction from DailyProcrastinator in Sun Ultra 45   
    Hi everyone.
     
    I'll use this space as the build log for my project using an old Sun Ultra 45. For those who have never heard of it, the Sun Ultra line was a series of servers and workstations made by Sun before they were bought by Oracle, usually using SPARC CPUs instead of your regular x86 (however, some of the last models, such as the Ultra 40, had x86 CPUs).
     
    Since I'm not a fan of RGB or windows on cases, I was looking for a case that's as stealthy as possible: basically a black box with a power button. Sadly, case prices here are pretty steep due to the pandemic combined with a bad exchange rate, so I decided to retromod a nice old case instead. First I looked at a Power Mac G5, but those are still expensive and the internal space for an ATX build is basically non-existent, so I gave up on that idea.
     
    That's when I decided to go for the Sun Ultra 40/45, which has way more internal space and components that resemble an ATX-like setup (though the mobo and PSU still use a proprietary form factor), while looking sleek and more elegant than the Power Mac, IMO.
     
    Today the system I managed to buy arrived, with the original CPU (1x UltraSPARC IIIi), mobo and PSU (1000W):
     
    [photos of the system as it arrived]
     
    And here's a video of the fancy toolless panels (idk how to embed videos here).
     
    First I'll need to polish and clean it up, maybe even repaint the top part since it's heavily scratched.
    Then I'll try to add some mobo standoffs, since the original board isn't ATX. I'll also need to find a way to fit a regular PSU, and to mod the front panel so its USB and audio ports and power button work with a regular mobo.
    Lastly, I'll install some case fans along with some dust filters.
  2. Like
    igormp got a reaction from RTX 3090 in Communication between C++ and COM port   
    Eh, that makes things a bit more complicated, but this should work: https://github.com/wjwwood/serial
     
    Haven't tested it myself, though. On Linux it's as easy as reading the serial device like a regular text file, or using pyserial with Python instead of C++.
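     
    Just to show how simple the read-it-like-a-file route is on Linux, here's a minimal, untested sketch in plain C++ with POSIX calls. The device path (/dev/ttyUSB0) and 9600 baud are assumptions; adjust them for your adapter:
     
        // Open the serial device and dump whatever it sends to stdout.
        #include <fcntl.h>
        #include <termios.h>
        #include <unistd.h>
        #include <cstdio>

        int main() {
            int fd = open("/dev/ttyUSB0", O_RDONLY | O_NOCTTY);
            if (fd < 0) { perror("open"); return 1; }

            termios tty{};
            tcgetattr(fd, &tty);           // start from the current settings
            cfmakeraw(&tty);               // raw mode: no line editing/translation
            cfsetispeed(&tty, B9600);      // input baud rate
            tcsetattr(fd, TCSANOW, &tty);

            char buf[256];
            ssize_t n;
            while ((n = read(fd, buf, sizeof buf)) > 0)
                fwrite(buf, 1, n, stdout); // echo everything the device sends
            close(fd);
        }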
  3. Like
    igormp got a reaction from Dasher125 in AMD/NVidia MultiGPU setup   
    Yeah, sure thing, at least on Linux. Not sure how it works on Windows, but I can't see why it wouldn't work there either.
     
    You could try simply running the CUDA detection example while the AMD card is the main GPU to check that everything is working as it should.
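     
    If you don't have the CUDA samples around, the detection part boils down to a few runtime API calls. Here's a rough, untested stand-in for deviceQuery (build with nvcc); if it lists your NVIDIA card while the AMD one drives the display, the mixed setup works:
     
        // Ask the CUDA runtime which NVIDIA devices it can see.
        #include <cstdio>
        #include <cuda_runtime.h>

        int main() {
            int count = 0;
            cudaError_t err = cudaGetDeviceCount(&count);
            if (err != cudaSuccess) {
                std::printf("CUDA error: %s\n", cudaGetErrorString(err));
                return 1;
            }
            for (int i = 0; i < count; ++i) {
                cudaDeviceProp prop;
                cudaGetDeviceProperties(&prop, i);
                std::printf("Device %d: %s\n", i, prop.name);
            }
            return 0;
        }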
  4. Like
    igormp got a reaction from Edgar B in In need of RAM knowledge   
    Usually anything above ~3000MHz is considered overclocking; you'll just need to enable the XMP profile to get the rated speeds. It's usually really simple, but it depends on what CPU and mobo you'll be using. The same applies to 3200MHz, although it's way easier to reach that speed with most setups.
  5. Informative
    igormp got a reaction from kelvinhall05 in Linux AMDGPU F@H memory leak bug - need testers!   
    No issues here.
  6. Like
    igormp got a reaction from Electronics Wizardy in 512GB RAM Server Built Config Opinion   
    First of all, what do you need it for? Those are two wildly different specs.
     
    I'm not sure, but the RAM sticks in there seem to be registered ECC? If so, I believe they won't work with Ryzen, since it only supports unbuffered ECC (and even that depends on the mobo).
  7. Like
    igormp got a reaction from Luka95 in What GPU is doing when you watch a youtube video?   
    IIRC you can see the details of GPU usage in Task Manager, though I'm not sure it reports it correctly.
     
    Anyway, your browser is using your GPU's hardware decoder to play those videos without using your CPU. If you disable hardware acceleration, then you'll see some load on your CPU since it's now doing software decoding.
     
    It's not using the GPU cores themselves, but rather a separate block responsible solely for decoding media. The encoder/decoder on the 1060 and 1070 Ti is the same, so the load when playing a 4K video should be the same on both.
  8. Like
    igormp got a reaction from leadeater in M1 Mac owners are experiencing extremely high SSD writes over short periods of time, likely thanks to aggressive swap   
    That's related to the SMART protocol. Each "data unit" is equivalent to 512 bytes, and the reported value is in thousands of units. So basically multiply that value by 512,000 and you'll get the number of bytes written.
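     
    As a worked example (the data-unit count here is made up), a throwaway C++ snippet doing the conversion:
     
        // NVMe SMART "Data Units" count blocks of 1,000 x 512-byte sectors,
        // i.e. 512,000 bytes each. Convert a reading to bytes and TB written.
        #include <cstdio>

        int main() {
            unsigned long long dataUnits = 15000000ULL;        // example SMART reading
            unsigned long long bytes = dataUnits * 512000ULL;  // 1 unit = 512,000 bytes
            std::printf("%llu data units = %llu bytes (~%.1f TB written)\n",
                        dataUnits, bytes, bytes / 1e12);
        }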
  9. Informative
    igormp got a reaction from jrhaberland in M1 Mac owners are experiencing extremely high SSD writes over short periods of time, likely thanks to aggressive swap   
    That's related to the SMART protocol. Each "data unit" is equivalent to 512 bytes, and the reported value is in thousands of units. So basically multiply that value by 512,000 and you'll get the number of bytes written.
  10. Agree
    igormp got a reaction from Ben17 in I'm still mad… but buy it anyway - RTX 3060 Review   
    For games it's not really an upgrade for anyone with a 2060~2060 Super.
     
    However, it's a nice 30~50% faster than those cards when it comes to raw FP tasks, and it gets even more interesting when you factor in the 12gb of vram.
     
    I'll try to snatch 1 or 2 of those to go along with my 2060S (a man can dream lol)
  11. Like
    igormp got a reaction from SaudAlomari in Can your Android device run this game?   
    Sure, no problem
     
    I can't give you an opinion on that since I don't play many games, nor do I have any idea how powerful/weak my phone is compared to your average phone out there.
  12. Like
    igormp got a reaction from SaudAlomari in Can your Android device run this game?   
    Runs at ~40 fps on high and ~60 fps on medium on my S10e.
  13. Like
    igormp got a reaction from Vishera in Can your Android device run this game?   
    Runs at ~40 fps on high and ~60 fps on medium on my S10e.
  14. Agree
    igormp got a reaction from kelvinhall05 in How do file converters work.   
    Those don't use the GPU at all, and are not that CPU intensive. As mentioned above, give Calibre a try.
  15. Like
    igormp reacted to Kilrah in Is there still a demo scene?   
    This was awesome, and so are that team's other entries! Music is phat too...
    They do give the 3080 a run for its money at 4K60 😛
     
    Grr at browsers and anti-malware always flagging highly packed stuff...
  16. Like
    igormp got a reaction from Craftyawesome in Is there still a demo scene?   
    Sure, it's not what it used to be, but there are still competitions around, such as Revision and TRSAC (you can find many others on https://www.demoparty.net/)
     
    Here are some cool recent demos that I've seen:
     
    That has NOTHING to do with what the OP said. The demoscene is more of a hobby where people try to do nice stuff with constrained requirements. 
     
    Are you? Sounds like you're not into the whole culture and are just getting started in a compsci course or something like that.
     
    That's exactly what the current day demoscene tries to do, have a look at the things I linked above.
  17. Like
    igormp got a reaction from DriftMan in Is there still a demo scene?   
    Running an OS on a micro?
    Beware: a shitty Linux running on top of an ARM emulator running on an 8-bit ATmega1284p: https://dmitry.gr/?r=05.Projects&proj=07. Linux on 8bit
  18. Like
    igormp reacted to Xavier Yvonne Zanzibari0 in Is there still a demo scene?   
    Best place to go (imho) is pouet's demoscene site:
    https://www.pouet.net/

    My personal favorite is a 4k demo by RGBA & TBC called Elevated.
    I use it to test my decoder quality:
     
  19. Like
    igormp reacted to DriftMan in Is there still a demo scene?   
    Nah, it was just that I felt the OP was being unfair, comparing something that was totally possible and praised back in that era with something that IMO is "too hard" to pull off as amazingly now
    (with this comment ->)
    And I was pointing out (hence asking how much he knows about software development, and whether I should use an ELI5 explanation or a more in-depth one so as not to sound condescending) that even with all the power of today's hardware, making such amazing demos or software takes a lot of effort. You could probably do a simple 144p demo, but for today's standard 4K... well, I can't imagine how long it would take to code something like that.
     
    For example, I think one of the most amazing projects right now in terms of maximizing hardware lifespan is Puppy Linux, but there are fewer and fewer low-powered devices while people want more and more features every year. The two eras (back then vs now) are totally different; IMHO people are more concerned with FOSS development than with actual demoscene work.
     
    Now that I've read my post again, I don't think I expressed myself properly; I'm not a big fan of long posts, as I tend to repeat myself a lot when writing in English.
     
    But yeah, those hobbyists are still alive, just doing other stuff, like the nostalgic emulators and modding we are so fond of. I dive into ESP32 development from time to time, since I did a degree in embedded systems. I think there are more fields now, so the scene is a bit spread out: fewer people in animation, but way more across dozens of different new fields.
     
    Edit: apologies especially to @Spindel if my post looked like harassment; I think we might have misunderstood each other's posts.
  20. Agree
    igormp reacted to Moonzy in NVIDIA releases CMP lineup and reduces hashing rates on GeForce cards   
    Then you should be against it, not accepting compromises
    Because having a company dictate what we can and can't do with our hardware isn't the future I want
  21. Like
    igormp got a reaction from Taf the Ghost in New rumour suggests the "Super Switch" might be a pretty hefty upgrade afterall   
    Could be the Tegra Xavier. Volta is really similar to Turing, so it should be doable, and it's being widely used already.
    Another option is the Orin, but I don't think it's in production yet.
     
    Well, you can pull it off easily if all you want to do is upscale stuff. In the case of games, however, you want your raw GPU power free to render the actual game and have units dedicated to the upscaling step.
  22. Agree
    igormp reacted to trag1c in Socketed GPU?   
    I am not the least bit on board with this guy's socket idea, but he is correct in saying that a power supply's amperage value is the maximum number of amps that can be drawn from it within the thermal constraints of the design. At the end of the day, Ohm's law doesn't lie: I = V / R. The voltage out of your power supply is constant, so the only thing that can change the current draw is the resistance, which is also constant because it's determined by the sum of the series-parallel resistances of the entire device.
     
    TL;DR: PSU amp rating =/= current draw. The only two ways to change current draw are to increase voltage and/or decrease resistance.
     
    (Everything here is in theoretical terms. Real-world issues would still apply, but they aren't needed for the basis of this conversation.)
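     
    To put toy numbers on that point (values made up purely for illustration):
     
        // The rail voltage is fixed, so the load's resistance sets the current;
        // the PSU's amp rating is only a ceiling, not what actually flows.
        #include <cstdio>

        int main() {
            double volts = 12.0;             // fixed rail voltage
            double loadOhms = 0.8;           // effective resistance of the load
            double amps = volts / loadOhms;  // Ohm's law: I = V / R
            std::printf("The load draws %.1f A whether the rail is rated for 30 A or 60 A\n",
                        amps);
        }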
  23. Agree
    igormp got a reaction from DoctorNick in Chromebook charger safe for an s20+?   
    Both are USB-C, right? If so, then sure, no problems.
     
    Your S20+ will pull a max of 25W from the charger, no matter whether the charger is rated for 45W or 100W.
     
    If we're talking about USB-PD, then the device and charger should negotiate a common voltage that works for both. The baseline is 5V at startup, and then there's the whole handshake to get higher voltages and currents.
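     
    For illustration only (this is nowhere near a real USB-PD stack, and the profiles below are made up): the charger advertises a list of fixed voltage/current profiles, and the device picks the best one it also supports, falling back to plain 5V:
     
        // Toy model of the PD handshake: pick the highest common voltage.
        #include <algorithm>
        #include <cstdio>
        #include <vector>

        struct Pdo { double volts; double maxAmps; };  // one advertised profile

        int main() {
            // Hypothetical source capabilities of a 45 W USB-PD charger.
            std::vector<Pdo> charger = {{5, 3}, {9, 3}, {15, 3}, {20, 2.25}};
            std::vector<double> phoneSupports = {5, 9};  // e.g. a phone capped at ~25 W

            Pdo chosen = {5, 0.5};  // baseline: plain 5 V before any handshake
            for (const Pdo& p : charger) {
                bool supported = std::find(phoneSupports.begin(), phoneSupports.end(),
                                           p.volts) != phoneSupports.end();
                if (supported && p.volts > chosen.volts)
                    chosen = p;  // take the highest voltage both sides support
            }
            std::printf("Negotiated %.0f V @ %.2f A (up to %.0f W)\n",
                        chosen.volts, chosen.maxAmps, chosen.volts * chosen.maxAmps);
        }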
  24. Informative
    igormp got a reaction from Radium_Angel in Chromebook charger safe for an s20+?   
    Both are USB-C, right? If so, then sure, no problems.
     
    Your S20+ will pull a max of 25W from the charger, no matter whether the charger is rated for 45W or 100W.
     
    If we're talking about USB-PD, then the device and charger should negotiate a common voltage that works for both. The baseline is 5V at startup, and then there's the whole handshake to get higher voltages and currents.
  25. Agree
    igormp reacted to tikker in Why the Perseverance Rover computer is so "outdated"   
    As others have said, the fact that it needs to be super ultra mega reliable means tried and tested hardware. It also doesn't need an i9-10900K to do the relatively basic tasks that have to happen; plus, there is possibly a lot of room for hardware optimization, since the things it needs to do are well defined, effectively lowering the hardware requirements.
     
    Another point that I haven't yet seen mentioned is how long the development cycles for these things take. It's not like we decide to go to Mars next year and start building a rover; space missions are planned many, many years ahead. There are many design and validation phases before they even start building the thing, and the whole concept-to-mission process can easily take a decade or two. I had to look it up, but Perseverance started in 2012. So not only will you use reliable, "outdated" hardware, you are looking at reliable and outdated hardware from the point of view of 2012.
     
     
    Yet it probably could 😛 Given DOOM's 66 MHz processor and couple-dozen-MB storage requirements. My god, that would be an epic joke to me haha. Ultimate immersion: sending a rover to Mars to play DOOM on the planet.