Uttamattamakin

Member
  • Content Count: 623
  • Joined
  • Last visited

Reputation Activity

  1. Like
    Uttamattamakin reacted to AlexTheGreatish in Using 6000 CPU Cores for SCIENCE - HOLY S#!T   
    Lol probably never, but hopefully more videos like this one will be hitting LTT on occasion 
  2. Like
    Uttamattamakin reacted to AlexTheGreatish in Using 6000 CPU Cores for SCIENCE - HOLY S#!T   
    I contacted them out of interest. I found out they were roughly within driving distance, so I got in touch and set up a trip. They're really awesome about doing tours and have good learning resources.
  3. Agree
    Uttamattamakin reacted to AlexTheGreatish in Using 6000 CPU Cores for SCIENCE - HOLY S#!T   
    -Hyperloop vacuum doesn't need to be nearly as good
    -Hyperloop tunnels are being constantly pumped out to create a partial vacuum instead of being held under a vacuum
    -The LIGO tubes have been under vacuum for 10 years, thermal expansion hasn't been a problem (and it gets quite cold there in the winter)
    -Don't listen to Thunderf00t
  4. Agree
    Uttamattamakin reacted to GilmourD in Using 6000 CPU Cores for SCIENCE - HOLY S#!T   
    Was it a strike from Oracle because they get crazy about every copyright they own, like that Sun Microsystems logo somewhere towards the end?
  5. Informative
    Uttamattamakin got a reaction from KnightSirius in Using 6000 CPU Cores for SCIENCE - HOLY S#!T   
    Seriously, a pretty good video that explains what is going on in a good level of detail without using much mathematics. It is good enough to show to science students.
     
    One good thing to remember is that NOW LIGO's success is treated with hindsight as if it were a foregone conclusion. For many years it was treated as a large, worthwhile, but by no means essential project. All the glory and glamor went to the LHC and maybe dark matter detection.
     
    Since then LIGO has detected many, MANY gravitational waves. The LHC found the Higgs but NONE of the supersymmetric partner particles, and dark matter direct-detection searches have come up empty... yet.

    The coolest part of it for me, personally, was that I predicted the waveform they saw (DOI: 10.15200/winn.142574.40936). Check out slide 22 from my conference presentation (https://absuploads.aps.org/presentation.cfm?pid=12513), a plot of the waveform predicted by my hypothesis in 2014 (a combination of Mathieu functions). Compare the waveform I predicted to what was observed: pretty darn close. I can't know what went into the deliberations, but it is likely that without their detection I would not have been invited to speak at the APS and might not be an academic scientist. I would not be even peripherally involved in the LISA mission. Pretty decent for a broke, African American, transgender adjunct professor from Chicago.
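    For the curious, this is roughly what "a combination of Mathieu functions" can look like. A minimal SciPy sketch, NOT the model from the cited paper; the orders, weights, and the parameter q are illustrative assumptions:

        # Minimal sketch: superpose a few even Mathieu functions ce_m(x, q).
        # The orders, weights, and q are illustrative assumptions, not the
        # values from the cited work (10.15200/winn.142574.40936).
        import numpy as np
        import matplotlib.pyplot as plt
        from scipy.special import mathieu_cem

        x_deg = np.linspace(0.0, 720.0, 2000)   # mathieu_cem takes x in degrees
        q = 5.0                                  # Mathieu parameter (assumed)
        terms = [(1, 1.0), (2, 0.6), (3, 0.3)]   # (order m, weight) pairs (assumed)

        # mathieu_cem returns (value, derivative); keep only the function values
        wave = sum(w * mathieu_cem(m, q, x_deg)[0] for m, w in terms)

        plt.plot(np.deg2rad(x_deg), wave)
        plt.xlabel("phase (rad)")
        plt.ylabel("amplitude (arbitrary units)")
        plt.title("Superposition of even Mathieu functions")
        plt.show()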
     
    Humble bragging over now, I beg.
     
    ....By the by, got any old Titan V's that are kinda obsolete that you'd like to donate to science? Clearly some 4x4 tensor cores would be very useful to me for playing GTA Onli... advancing science. I mean, if you all aren't using them... IJS. Now that RTX is out, the Titan V is yesterday's news to most people. General relativity is all stated in terms of tensors; having hardware-level tensor computation ability would be a game changer.
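    On the "GR is all tensors" point: the core operation is contraction, which maps onto exactly the batched small-matrix math tensor cores accelerate. A toy NumPy sketch of a metric contraction (purely illustrative, not anyone's actual research code):

        # Toy illustration: ds^2 = g_{mu nu} dx^mu dx^nu over a batch of events.
        import numpy as np

        g = np.diag([-1.0, 1.0, 1.0, 1.0])   # Minkowski metric, signature (-,+,+,+)
        dx = np.random.randn(10000, 4)       # a batch of 4-displacements (made up)

        # einsum expresses the double contraction; on tensor-core hardware the
        # same access pattern runs as batched 4x4 matrix multiplies.
        ds2 = np.einsum("ij,ni,nj->n", g, dx, dx)
        print(ds2[:5])   # negative entries are timelike, positive are spacelike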
     
  6. Funny
    Uttamattamakin got a reaction from 8uhbbhu8 in Using 6000 CPU Cores for SCIENCE - HOLY S#!T   
  7. Informative
    Uttamattamakin got a reaction from Ezzy-525 in Using 6000 CPU Cores for SCIENCE - HOLY S#!T   
  8. Agree
    Uttamattamakin reacted to Ezzy-525 in Using 6000 CPU Cores for SCIENCE - HOLY S#!T   
    Methinks this was more a video that Alex wanted to do.
  9. Informative
    Uttamattamakin got a reaction from itswillum in Using 6000 CPU Cores for SCIENCE - HOLY S#!T   
  10. Like
    Uttamattamakin got a reaction from TechyBen in Worst Tech mistake you have ever made?   
    Back in the days when I built my PCs myself, I tried to POST test a CPU without any heatsink. In the time it took to get a POST beep, I saw a puff of smoke come off it. (I don't recall which CPU it was, but it was from before they put lids on them... it may have been a Pentium 4 or AMD Athlon era processor.)

    Second biggest mistake was that I threw away / lost my vintage Tandy computers. A working 1000 RL or Sensation II would be worth an unreasonable amount of money for what they were able to do.
  11. Agree
    Uttamattamakin reacted to Mira Yurizaki in Wallmart kills OVERPOWERED Prebuilt PCs - [UPDATE - They're back!... Now with a $400 discount!]   
    My first computer that I was allowed to tinker with was a cheapo HP Pavilion with a Pentium III-based Celeron. I certainly learned a lot despite never having built the computer.
     
    So I don't believe you need to build a PC to learn how to tinker with one.
  12. Agree
    Uttamattamakin got a reaction from Mira Yurizaki in Wallmart kills OVERPOWERED Prebuilt PCs - [UPDATE - They're back!... Now with a $400 discount!]   
    One of the things he did in that video (or does in a later one) is upgrade the mobo. This all reinforces my first impression: the Walmart PC is a good first computer for a young person getting into how computers work. (Yeah, sure, if they are around someone who knows what they are doing, they can buy parts and yada yada yada.) One of the best ways to figure out how to put something together is to have a working model that is cheap, and get experience taking it apart and putting it back together.

    Just my $0.02 US.
  13. Agree
    Uttamattamakin reacted to dtaflorida in Wallmart kills OVERPOWERED Prebuilt PCs - [UPDATE - They're back!... Now with a $400 discount!]   
    LOL, that's hilarious. It could only be funnier if it had been the exact same model but with the rebranding on it.
    I wouldn't even bother swapping the PSU if I did get one with the better mobos. The only mod worth doing is adding extra airflow. Overall a pretty good deal on the PC after the huge price drop.
  14. Agree
    Uttamattamakin got a reaction from ANNIHILATOR284 in Has PC tech become stagnant? I think so.   
    Agreed, agreed, agreed.

    I mean, once the smartphone came around and there was money to be made there, we stopped pushing for more. I think about the fact that Windows Vista had a more graphically intensive UI (compared to the hardware available) than Windows 10. You are right.

    Unless VR becomes a thing that is actually required to use some killer app or play the must-have games, who needs to upgrade? Unless having many, many cores becomes required, a quad-core i7 from 2018 will still be usable in 2038.
  15. Informative
    Uttamattamakin reacted to Mira Yurizaki in Has PC tech become stagnant? I think so.   
    By "cut down", all I've seen is dropping FP64 performance. FP64 performance isn't really necessary for consumer applications.

    The GeForce 10 series is the first time NVIDIA has not used an older GPU design for any of the SKUs. The GeForce 900 series was close, but the bottom-tier SKU used a first-generation Maxwell.

    EDIT: I should add, it's the first time in a while. After the GeForce4 MX fiasco, the GeForce FX, 6, 8, 9 (arguably), and possibly the 500 series did not use an older design at all.
    I'm pretty sure most consumers would almost immediately recognize the responsiveness improvement from an HDD to a SATA SSD. SATA SSD to NVMe? I'd argue most enthusiasts agree there's no appreciable improvement in responsiveness, and unless you actually have a use case that needs that bandwidth, it's not worth it.
     
    So as an example of where bandwidth doesn't really mean much, here's some data showing storage activity when booting Windows 7 (it was on a VM, otherwise the data wouldn't really be obtainable):

    [chart: disk activity during a Windows 7 boot]
    The SSD in this case is a SATA SSD. Notice that it doesn't even top out over 100 MB/s.
     
    Before you go "this is probably just a Windows thing": similar behavior happens on Linux Mint.

    [chart: disk activity during a Linux Mint boot]
    Yes, you could go "Aha! It reads at 200+ MB/s", well, okay, but only for a second. The rest of the boot is spent under 50 MB/s.
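    The poster doesn't say what tool produced those charts (it was a VM, so presumably some host- or guest-side profiler). For anyone who wants to reproduce the idea, here is a minimal sampler using psutil, one hedged possibility among many:

        # Sample whole-system disk throughput once per second. psutil is an
        # assumption here; the original data may have come from another tool.
        import time
        import psutil

        prev = psutil.disk_io_counters()
        for _ in range(30):                  # watch a ~30 second boot-ish window
            time.sleep(1)
            cur = psutil.disk_io_counters()
            read_mb = (cur.read_bytes - prev.read_bytes) / 1e6
            write_mb = (cur.write_bytes - prev.write_bytes) / 1e6
            print(f"read {read_mb:6.1f} MB/s   write {write_mb:6.1f} MB/s")
            prev = cur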
  16. Agree
    Uttamattamakin reacted to ewitte in Has PC tech become stagnant? I think so.   
    Well, how many normal consumers would recognize the difference between an NVMe and a SATA SSD, versus between ANY SSD and an HDD? Even budget enthusiasts mix and match all three types for different storage roles.
  17. Agree
    Uttamattamakin reacted to DeScruff in Has PC tech become stagnant? I think so.   
    Ding ding ding!
    - Obviously there are always going to be exceptions, but yeah, in terms of PCs a similar thing has happened with storage. Storage requirements have either flatlined or gone down.

    Video downloading is not nearly as common, and streaming has taken over. Music is a similar story, and even then those files haven't been getting larger unless the user moves to lossless formats. Image files have long since become negligible, and even those are often just put on social media and then deleted.
    The only exception is video games.

    I'm still kind of surprised the 250GB SSD I gave to my mom a while back hasn't been filled up. I was honestly expecting to have to hook up a second drive for storage by now.
  18. Agree
    Uttamattamakin reacted to Crunchy Dragon in Has PC tech become stagnant? I think so.   
    I still think that after that, chips will start to get larger, like we see on server grade platforms.
     
    Once we reach the end of Moore's Law, then we'd have to move to quantum tech or something like that, so unless companies like AMD and Intel are completely shelling out for R&D on that, it might be a while before quantum tech hits the consumer market.
  19. Informative
    Uttamattamakin reacted to Falkentyne in Has PC tech become stagnant? I think so.   
    Needed to add some things for clarification.
     
    Back during the old days (when 4 MB of RAM cost $200), we were indeed dealing with Moore's law with little respite. Parts were expensive as hell but fast for their time, only to be obsolete and too slow a few years later. And RAM was also a big problem (not just hard drive space).
     
    But then something happened.
    During the time that MMX was announced, Intel was releasing processor bins significantly faster than the lesser bins (yet far more expensive). Apparently there was too much demand for the cheaper parts, so a legendary S-spec came out: the Pentium 166 MMX, but not just any 166 MMX, a certain S-spec. Someone found that these were simply downclocked 233 MMXs (I don't remember if this was posted on Usenet or whichever forums existed at the time; overclockers.com?), but 100% of these processors would hit 233 MHz without failures (similar to 2600Ks hitting 4.5 GHz), although I do not remember if there was a Vcore jumper involved as well. You just had to use the 38 MHz bus jumper (was this half of a 75 MHz FSB? Or did that exist yet?). IDE was connected to this bus, and most hard drives had no issues at 38 MHz (or 75, if that was the doubled rate). In fact, most of these processors would also run at *262.5 MHz* with the 41.5 MHz (83 MHz??) bus jumper, giving you >233 MMX speeds for much, much cheaper. So it did seem that Intel was releasing downclocked (lower-multiplier) 233 MMX chips to fulfill market demand.
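    Side note: those half-remembered numbers are self-consistent under the usual core clock = FSB x multiplier relation. A quick sanity check (the 3.5x multiplier for the remarked parts is inferred from the story, not documented here):

        # Sanity check: core clock = FSB x multiplier.
        # A stock Pentium 166 MMX ran 66.6 MHz x 2.5; the "downclocked 233" story
        # implies the same silicon at 66.6 x 3.5. The quoted 38 / 41.5 MHz jumper
        # values are consistent with half-rate clocks of 75 / 83 MHz buses.
        for fsb in (66.6, 75.0, 83.3):
            print(f"FSB {fsb:5.1f} MHz x 3.5 = {fsb * 3.5:6.1f} MHz core clock")
        # -> about 233, 262.5, and 291.5 MHz, matching the figures above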
     
    This was the beginning of the legendary 50% overclocks, and the first processor capable of them.
     
    The problem was that even though these speed boosts were VERY noticeable (especially for MS-DOS games and arcade and console (SNES) emulators), your fun was short-lived, since Intel was quickly making processors faster than your old one's highest overclock (for example, the Pentium II 300 MHz). And then a killer game would come along to bring your CPU to its knees and force you to upgrade (aka the "hardware genie"), if not the video card too (never mind RAM and HD prices). Ignoring dogs like Ultima 7 (from 486 days) and Strike Commander (which required a Pentium and still ran like a dog), the 166 MMX killer was the great game Unreal.
     
    But you still had the 50% overclocks. The next "166 MMX" was the legendary Celeron 300A @ 450. This was more legendary because AFAIK all of them could reach this speed, while the 166 MMX never got that status because it relied on finding an EXACT S-spec (which meant buying in person, or hoping a retailer would read it to you over the phone). More 50% overclocks and more insane speed boosts. And more short-lived fun, as new products came out that were even faster and new games brought the last gen to its knees.
    Now you had Quake and Unreal Tournament and games I can't remember (Drakan: Order of the Flame?) doing the job (ignoring the video card wars and 3dfx again). But sure enough, another legendary 50% overclock came out: the Coppermine 600E @ 900. And that soon got eclipsed by the aborted Pentium III 1 GHz (then re-released), then the Socket 370 change and the P3 1.4S Tualatin, and then AMD jumped into the game too. Nasty times.

    Then came NetBurst. But what about those old Pentium IIIs? Well, more killer games destroyed them, forcing people to upgrade. Those were Medal of Honor: Allied Assault and the great one, Battlefield 1942, which had people scrambling for Pentium 4s. And ATI reached its glory days with the NVIDIA-killing 9700 Pro and 9800 Pro.
     
    Then a change happened which signaled the beginning of the end of Moore's Law: the double IPC increase of Conroe (Core) and the drop in clock speed, with a Core 2 X6800 being twice as fast as a Pentium 4 3.4C run at the same clock speed. But now the 50% overclocks were gone.
     
    Until Sandy Bridge. Since the only massive IPC boost was going from NetBurst to Core, you no longer had the huge gains anymore, just more cores and Hyper-Threading. But the 2600K brought back the large overclocks of years past (even if many processors couldn't hit 50%, and even if a large number of users saw degradation after years of 1.4V+ @ 5 GHz and had to back down a bit). But you no longer had the MHz (loosely tracking transistor count) doubling every 2 years and making old chips obsolete as new games were coded for the new ones, nor did you have the massive IPC boosts either. So the end result was those 2600Ks lasting longer than any computers in history, except the old 8-bit systems which saw long life (like the Apple IIs and Commodore 64s), needing just video card, storage, and RAM upgrades to remain viable. So that's what you have now: just more cores and small IPC bumps which eventually add up. But note that we were stuck at 4 cores and 8 threads on consumer chips for basically 7 years (if you count the X platform with Hyper-Threading but without the 6-core parts). Then suddenly, in the space of two years, Intel jumped to 6 cores in 2017 on consumer chips because AMD blindsided them with Zen, then 8 cores in 2018, something they had never done before, while moving their Xeon line to HEDT on the moar-corez front to fight Threadripper. And that's where we are today.
     
    While I won't call this AMD's Thunderbird/Athlon 2.0 moment, they are definitely pulling a 9700 Pro on Intel here.
     
     
  20. Like
    Uttamattamakin got a reaction from grss1982 in Has PC tech become stagnant? I think so.   
    Well... the fact that you can seriously say that is proof of the stagnation.

    Compare: a typical PC you could buy as a consumer in 1988 would likely have had an 8086 at 8 MHz, or a 286 at 10 if you were a real baller. The graphics would be 640x480 with four colors; IF you bought a PCjr or a Tandy 1000 series, 640x480 with 16 colors. Fast forward just a couple of years and you'd have a 386 at 25 MHz, 2 MB of RAM, and a 40 MB HDD, able to run Windows 3.1, get on the early World Wide Web, and just barely use a CD-ROM. The graphics would be 800x600 and 256 colors. Two more years and it would be 1024x768 and 16 million colors. Etc.

    What I am saying is: it used to be that an upgrade to a new computer meant being able to do things that were previously impossible.
     
    Compare the situation now. My mom's MacBook from 2008 can do everything my 2017 HP Spectre x360 can do; it just does it SLOWLY (but still fast enough to use). Compared to how big a leap an upgrade used to be, it is not really worth buying a new PC until the old one breaks.
  21. Agree
    Uttamattamakin reacted to Mira Yurizaki in Has PC tech become stagnant? I think so.   
    The problem with desktop computers, and likely extending to servers and other such computers, is that we've had decades to figure out what just works. One could say that going into this decade, practically all of the "obvious" innovations that won't break existing systems have been pretty much hammered out (and now we're finding out some of them aren't exactly free lunches). There are still plenty of untried methods for general-purpose computing, but it's been argued that much of the design of modern processors was built around how C works. With the way we've built our computing empire, the world will have to go kicking and screaming to use anything different.
  22. Agree
    Uttamattamakin reacted to Mira Yurizaki in Has PC tech become stagnant? I think so.   
    They weren't. The 90s were basically the only time exponential improvements happened.
     
    Also, funny thing: I don't think stagnation is exactly a new thing either. The early home computers basically used the same 6502 or Z80 for almost 10 years. That's stagnation.
     
    Because the developers were betting on a future where Intel's Tejas was a thing and 10 GHz CPUs would be possible. The game, when compared to modern software, is horribly designed. It was also using early implementations of graphical features that were more in the "prove it works" phase than the "make it perform well" phase.
  23. Agree
    Uttamattamakin reacted to corrado33 in Has PC tech become stagnant? I think so.   
    I mean... it's a good question. It's happening. We're reaching the limit of transistor size. Maybe we'll squeak out another decade, but still, SOMETHING needs to happen if computers are to continue to advance.
  24. Funny
    Uttamattamakin reacted to Mira Yurizaki in Has PC tech become stagnant? I think so.   
    And the problem I see with this is that gaming isn't going to magically get better with more cores. Playing an FPS like Doom, where you're in small rooms fighting maybe at most 5 enemies at a time, isn't that hard to run. And something like Ashes of the Singularity, where all of those cores actually do get some use, is a rare use case.
     
    I mean, maybe we could have a GTA that actually simulates what conditions in LA are like during rush hour... but that doesn't make the game fun. If I wanted to experience LA rush-hour traffic, I'd just go there myself.
  25. Funny
    Uttamattamakin reacted to Nicnac in Has PC tech become stagnant? I think so.   
    Oh, is this the weekly "have we reached the limit of Moore's law" post?