
tomoki

Member
  • Posts: 710
  • Joined
  • Last visited
Reputation Activity

  1. Informative
    tomoki reacted to APasz in What is this device?? - T3 Innovation Remote 5??   
    As far as I can tell, it's used in conjunction with some sort of analyser.
    With a few of these connected in different places, it lets someone figure out the electrical layout of the local coax.
  2. Informative
    tomoki reacted to brwainer in What is this device?? - T3 Innovation Remote 5??   
    This is part of a system to map out where the coax in a building runs and what path it takes. They forgot it when they cleaned up. It isn't doing anything without the tester unit.
     
    EDIT: this is the full system: https://www.t3innovation.com/coaxclarifier
  3. Like
    tomoki reacted to Donut417 in D-Link DGS1005D vs D-Link AC 750 DIR-819   
    A router has a wireless AP, switch, firewall, NAT, and DHCP server built in, so either would work. Your best bet would be the switch, because you can just plug and play. You don't want to hook multiple routers together due to cascading NAT; that can cause problems in itself, and you'd have to configure things to work properly without double NAT. Just get a decent gigabit switch; you can find them for like $20 on Amazon all day long. 
     
    Keep in mind you will be limited by your powerline adapters. I'm guessing you're going router to powerline to powerline to switch. The powerline adapters will be the weak link: while it might be a stable connection, you might not get full speeds through it. Just keep that in mind. 
  4. Like
    tomoki got a reaction from .Apex. in Question about Travel Adapter + Battery Bank   
    Sweet, awesome! Thank you for your insight! 
  5. Informative
    tomoki reacted to .Apex. in Question about Travel Adapter + Battery Bank   
    @tomoki
    Yes, it will work fine. Your battery pack will only draw what it needs; you can connect it to a 5V 10A charger and it will still only take 1.6A.
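     
    A quick worked aside (the 1.6 A figure comes from the post above; the general rule, added here for illustration, is that a charger's current rating is a ceiling on what it can supply, not a current it forces into the device):
     
    $$ I_{\text{drawn}} = \min(I_{\text{device}},\ I_{\text{charger,max}}) = \min(1.6\ \text{A},\ 10\ \text{A}) = 1.6\ \text{A} $$
     
    At 5 V that is $5 \times 1.6 = 8$ W into the pack, regardless of the charger's headroom.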
  6. Like
    tomoki reacted to rice guru in Help - Best IEM's for Under $150 USD???   
    Everyone seems to be hyping the Moondrop Kanas Pro on audio boards and in reviews. I know almost nothing about them, sorry; all I know is they fit around your budget.
  7. Like
    tomoki got a reaction from Mira Yurizaki in Cloning Drive Questions   
    Sweettttt, thank you for that! I didn't think the solution would be that simple... I thought I would have to use that program called Virtual CloneDrive or whatever. All I remember is that sheep icon!  Thanks!
  8. Informative
    tomoki reacted to Mira Yurizaki in Cloning Drive Questions   
    Use Samsung's Data Migration tool. It's straightforward to use and as long as the amount of data in the source can fit in the SSD, it'll automatically sort out partition creation and all that. Otherwise the only other option I've seen passed around is Macrium Reflect.
    Once you clone the drive, verify the M.2 SSD works by booting into it. Do not access the M.2 SSD while booted from the SATA SSD; that may cause issues down the road. If you're satisfied the clone is good, then reformat the SATA SSD, since this is the easiest way to clean it.
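     
    As an aside, here is a minimal sketch (in Python, with hypothetical paths and sizes) of the "used data must fit on the target" precondition mentioned above; the cloning tools themselves perform this check for you:
     
    ```python
    # Sketch of the capacity precondition for cloning: the *used* data on the
    # source must fit on the destination drive. SOURCE_MOUNT and TARGET_BYTES
    # are hypothetical example values, not settings from any cloning tool.
    import shutil

    SOURCE_MOUNT = "C:\\"        # hypothetical mount point of the source SATA SSD
    TARGET_BYTES = 500 * 10**9   # hypothetical advertised capacity of the M.2 SSD

    used = shutil.disk_usage(SOURCE_MOUNT).used
    print(f"Used on source: {used / 1e9:.1f} GB")
    if used <= TARGET_BYTES:
        print("Used data fits on the target, so a clone can succeed.")
    else:
        print("Target is too small; move or delete data before cloning.")
    ```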
  9. Informative
    tomoki reacted to kirashi in Chromecast Questions   
    Gotcha, sounds like the sites themselves could be the problem then, since you get interruptions on a computer too. It's certainly worth exploring streaming media boxes; however, that's well beyond my knowledge and efforts, and technically a grey area, so probably not kosher to discuss on the forums. (Unless we're talking purely hardware, and say Kodi / Plex software to stream from a local server.)
     
    Even with a streaming media player you can still run into random buffering issues though, since again the content isn't usually stored on a Content Distribution Network due to its questionably legal status. It's worth looking into if you're willing to research though, since I know it can be impossible to legally pay for certain content depending on the country you live in, which is ironic because the entertainment industry doesn't want people stealing the content either...  
  10. Informative
    tomoki got a reaction from kirashi in Chromecast Questions   
    Hahaha definitely not. They're streaming some Asian dramas off some random website, so loading times are long to begin with, but once it finishes its initial buffer, it loads consistently throughout the video, so there's no stopping during playback. The random website is riddled with advertisements, though, and the LG smart TV can't have ad-blocking extensions added; the ads take you to some other website regardless of what you do, so it's extremely annoying. I guess that's what free means haha. 
     
    We don't have a streaming box, but maybe that might be the solution as opposed to getting a Chromecast. I have zero knowledge of streaming boxes, though, really. 
     
    Thanks for your honest reply  
  11. Agree
    tomoki reacted to Crunchy Dragon in Der8auer predelidded cpus seem stupid   
    I honestly wouldn't buy one.
     
    I like the DIY aspect of my PC. If I need something done, I'll do it myself.
  12. Agree
    tomoki reacted to Phentos in Cpu upgrade   
    Stick with the 7700K. The marginal (if any) increase in performance on the 8600K is certainly not worth the added cost of a decent Z370 board.
  13. Agree
    tomoki got a reaction from ghorbani in What do people use for Website Creation?   
    All of these languages... Thanks for the input, guys, but if I were to begin programming (not just web design to begin with), where should I begin? Someone told me HTML... another told me Java... someone else told me Python, but Python actually looks really simple. 
  14. Agree
    tomoki reacted to WallacEngineering in The End of CPU Advancement on our Doorstep (Moore's Law and the 7nm Barrier) Discussion   
    I would like to start an interesting discussion regarding traditional PCs and CPU performance enhancements and why I feel the final major CPU releases are right around the corner. Now to start, I should explain that while I have quite a few resources to back up my theory, this is all still pure speculation, and the sources I will provide are based on speculation as well.
     
    So what is my theory exactly? That around 2020-2022, people will stop buying new PCs, and from there, like a domino effect, technology advancement for CONSUMERS will come to an end (let's make it clear that I am not talking about military technology or technology as a whole; I am convinced those areas will continue growing).
     
    So let's start off with why I even think this is possible. "Moore's Law" is an observation first recorded by Gordon Moore, whose 1965 paper described the number of transistors in an integrated circuit doubling every year for at least a decade. He was slightly off about the timeline but correct otherwise: the number of transistors in our CPUs has been doubling about every 15-18 months. This increase in transistors is what allows our CPUs to get more cores and faster clock speeds with each generation. The issue is that there is a point where increasing the number of transistors within a CPU becomes a physical impossibility. According to most computer scientists, computer enthusiasts such as you and I, and the tech community in general, those physical impossibilities will begin to occur after 7nm-based CPUs hit the shelves. If you simply google "Moore's Law" you will come across a wide variety of articles explaining the definition and then discussing the physical limits of Moore's Law and when we will no longer be able to advance traditional CPUs. Most of these articles come to the same conclusion: "Moore's Law will reach its physical limitation sometime in the 2020s". Take time to read the article below:
     
    [embedded article]
     
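    As a back-of-the-envelope check of the doubling claim above (using the ~18-month doubling period quoted in that paragraph):
     
    $$ N(t) = N_0 \cdot 2^{t/T}, \qquad T \approx 18\ \text{months} $$
     
    so over one decade the transistor count grows by roughly $2^{120/18} \approx 2^{6.7} \approx 100\times$.
     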
    So INTEL THEMSELVES said in 2016 that transistors can only continue to shrink for about five more years. Guess what AMD has already announced for 2020-2021? Zen 3, their first 7nm+ CPU. Put two and two together and you get... four. It's as simple as that.
     
    Now the reason 7nm seems to be the physical limit is that beyond 7nm, the transistors would produce more heat than the power they output, making the idea of 5nm CPUs an oxymoron, or paradox. I believe there is a small chance that scientists will figure out a way to produce 5nm CPUs, but the technology would be extremely difficult and expensive to manufacture, so much so that none of us consumers would be interested in upgrading from 7nm to 5nm.
     
    This next source I cannot provide because I can no longer find it. I saw a YouTube video about 18 months ago where a man who had worked at GlobalFoundries was finally legally allowed to speak about his time at the fab and what difficulties they were expecting in the coming years. He said something about particle accelerator technology being required to create transistors around 7nm or smaller. I know, I don't understand how a particle accelerator could have anything to do with the FinFET manufacturing process either, but that's what he said, and that's why he makes more money than I do lol.
     
    This next part is purely my own speculation, but it's regarding the price of 5nm CPUs if we get there. If we could purchase 5nm CPUs, they are likely to be so expensive that we would keep our 7nm CPUs instead. You think GPU pricing is bad right now? Try $1000+ entry-level Pentium CPUs, or i9 Extreme CPUs that cost tens of thousands of dollars. Yeah, I don't know about you, but I am staying WELL away from 5nm CPUs.
     
    Now you may be thinking: "Well, why not just increase the physical size of CPUs so more 7nm transistors can fit?" That is technically a great idea, but it's got a few issues. One, there needs to be room between the CPU and RAM for cooler mount clearance, so going much bigger than Threadripper isn't really possible, and if we start pushing the RAM slots out, then we start changing the ATX form factor standard, which, trust me, isn't going to happen. All cases, power supplies, and other accessories would need to be completely redesigned to fit these new motherboards, and all this work would be done for a technology that will only last a few more years anyway. The largest issue, however, would be heat output. The current i7-8700K is already a monster of a heat producer, and that's just a mainstream CPU. Imagine a CPU with more than double the 8700K's processing power! Heat production would likely be so intense that even the most expensive custom water loops would struggle to cool it, and don't even THINK about tying your GPU(s) into that loop; not gonna happen, especially as GPUs are likely to face the same issues as CPUs.
     
    Another issue that needs to be discussed is whether or not CPU upgrades are even necessary anymore, even TODAY! It is widely accepted that 8 threads of processing power is more than enough for even the most demanding games, and while productivity and content-creation software may be slower on something like my old Phenom II X4 970, it's still workable.
     
    The only reason, and I mean the ONLY reason, I just bought a Ryzen 5 1600X and related components is that the X4 970 suffers from having only 4 threads of compute power and sits on terrible green 1333 MHz RAM that is NOT overclockable. This means that while MOST of my daily computer use is still INSTANTANEOUS, such as loading programs, watching Netflix, or transferring files thanks to my SSD, it falls over in gaming physics. Firestrike graphics tests run very smoothly at 45 FPS, but as soon as the physics test starts, I tank straight down to around 10 FPS. CPU-intensive games like Kerbal Space Program and From the Depths simply aren't capable of running at decent FPS no matter how low I set the graphical settings.
     
    That, and I bought every single component used, so I managed to get a Ryzen 5 1600X system with 16GB of DDR4-3000 RAM, an R9 390X, a Samsung 960 EVO 250GB NVMe boot drive, and a Samsung 850 EVO 500GB 2.5" SSD, while reusing my PSU and case, for about $600. Now THAT is what makes this purchase justifiable.
     
    Say it's the year 2020. Zen 3 is out and you decide to pick up the new (just guessing on the specs here) Threadripper 4950X with 20 cores and 40 threads that can reach 6 GHz with some overclocking. How much of that processing power are you EVER going to use? When do you think that kind of compute performance will become obsolete and REQUIRE replacement? Software is a good two decades behind fully utilizing high-end hardware as things currently sit, and that's not even talking about something like Threadripper. It's likely to take two decades to fully utilize the i7-8700K in just about ANY software that isn't related to content creation or rendering!
     
    If you bought a CPU as powerful as the Threadripper 4950X I described, then there's a very good chance that, as long as the CPU physically survives and doesn't fail, you wouldn't need to replace it at ANY point in the remainder of your lifetime.
     
    EDIT/UPDATE:
    Let's examine this issue from another perspective. A lot of comments suggest that carbon nanotubes or quantum computing could be the solution, and that technology will not stop but continue on. Well, keep in mind I never said technology will stop; what I am suggesting is that we, as consumers, will not find these new technologies to be justifiable purchases, at least not at first.
     
    Assuming that the new technologies computer science tries to implement are extremely expensive in their first few years (which is likely, as every new technology is expensive when it is first implemented), we can basically expect most PC consumers, including enthusiasts, to have no reason to purchase these new technologies until they improve, mature, and come down in cost sometime in the future.
     
    Take a look at every single "big leap forward" in mankind's history. When the microchip was first invented, it took many years of development and research before everyday people like us were able to get an affordable computer in our homes. When the automobile was first invented, it took many years before everyday people could afford one for themselves. And it took us 20 years of testing space rockets before we actually put an astronaut on the moon.
     
    And that is exactly the point. When a REALLY big, game-changing technology is invented, it usually takes at least a decade, if not longer, of research, testing, maturation, and manufacturing efficiency before the vast majority of the consumer market gets a chance to actually own that technology.
     
    History has proven itself to repeat; we all know this. Intel JUST invested $5 billion USD in 10nm manufacturing. This suggests that more than likely they EITHER aren't even researching what to do after transistors reach their physical limit, or their R&D on the subject is very limited at this time, which means that come the 2020s, it probably won't be ready, and we will be forced to wait it out.
     
    So what we are talking about, then, is a strong possibility of another of mankind's "big leaps" as far as computing is concerned, probably in the early 2020s, which is only a few years from now.
     
    The "Lets just add more transistors and create physically bigger and more power hungry CPUs" idea is NOT "Forward-Thinking" This is more relatable to Jeremy Clarkson from Top Gear UK/ The Grand Tour's take on automobile performance: "More Power!!!" With companies as advanced as CPU manufacturers like Intel and AMD, I would probably guess that neither of them would be okay with using this approach. The motto of world of computing always has been and always will be "Smaller, Faster, Smarter"
     
    There is more evidence that we can observe for ourselves that the inevitable end is approaching. Let's take a look at console gaming. Why do you suppose the Xbox One and PS4 are still around? Why did Microsoft rename the One the "X" and simply fit the console with a few more Jaguar cores and AMD graphics compute units, rather than create an entirely new platform as they have in the past? Simply because the technology has NOWHERE TO GO; it's over! I believe we may see one more new generation of console from each manufacturer before the world of console gaming comes to a halt, just as PC technology is supposed to.
     
    Now let's take a look at automobiles. High-end cars today use heads-up displays that are basically high-definition televisions. Then you have insane technologies such as Range Rover's Terrain Response system, which can detect whether you are on asphalt, mud, snow, or sand and adjust power output to each individual wheel to keep you going pretty much no matter what. You can also actively adjust your ride height and suspension stiffness in a lot of cars today. There isn't much more we can do with a machine that uses four round pieces of rubber to move around. So what's next, flying cars? Well, yeah, they already exist and have existed for about a decade or so. But there are a number of MASSIVE problems with this technology that make me think it will NEVER, EVER become a reality for 99.9% of people. Firstly, it's EXTREMELY EXPENSIVE. It uses pivoting jet engines just like a real military VTOL aircraft does; do you have any idea how much those cost to buy, or to run? Let's just not even talk about it. Plus, do you really think your federal government is going to let you just fly around wherever you want? Ever heard of 9/11? Now imagine millions of idiot drivers behind the wheel of flying machines; yeah, 9/11 would look like a holiday in comparison. So NO, it will NOT EVER HAPPEN. On top of this, Google's self-driving cars already work. They were successfully tested in Arizona in 2017 across a few thousand test runs, with no accidents AT ALL; they are now a reality, and we should be seeing them on the streets by 2020.
     
    Now let's look at current displays. Displays have already hit their "Moore's Law wall"! A scientific test revealed that even the healthiest human eye cannot detect the difference between 1440p and 4K displays from more than a few inches away from the screen. So unless you like to sit so close to your monitor or TV that you can't even see half the screen and like to burn out your retinas, 4K is simply a marketing gimmick with no real benefit whatsoever. Some of you may know that I actually own a Samsung MU6500 curved 4K smart TV that I use as my monitor, and you may be wondering why. Well, my previous TV was a 7-year-old, crappy, low-quality, cheap Sanyo 1080p flat screen. The image quality was terrible, the colors were abysmal, and it was so old that I figured it would die soon, so I sold it before it kicked the bucket on me. Plus, I bought the Samsung on Black Friday 2017, so $500 for a $1000 TV? Sure, sign me right up. This display is also WAY better in terms of gaming capability. When you select "Game Console" as the source, it disables "Motion Rate" interpolation and most of the graphics-processing options, and even lowers input lag. I can honestly say I can't tell the difference between this TV and any high-refresh gaming monitor I have ever played on. It's THAT fast and smooth; way to go, Samsung! Anyway, here is the article that explains the scientific test:
     
    https://www.nbcnews.com/technology/enough-pixels-already-tvs-tablets-phones-surpass-limits-human-vision-2d11691618
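     
    For context, a standard acuity estimate (assuming a resolving power of about one arcminute, roughly $2.9 \times 10^{-4}$ rad, for a healthy eye; the 55-inch panel size is an illustrative example, not from the article):
     
    $$ d \approx \frac{p}{\theta} = \frac{0.32\ \text{mm}}{2.9 \times 10^{-4}\ \text{rad}} \approx 1.1\ \text{m} $$
     
    where $p \approx 0.32$ mm is the pixel pitch of a 55-inch 4K panel (about 1218 mm of width across 3840 pixels). Beyond roughly a metre, such an eye can no longer resolve individual 4K pixels.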
     
    I do understand that virtual reality still needs to increase in pixel count and has a while to go before its technology is maxed out. However, I, like many other gamers, prefer a display over VR and find VR to be an expense that simply isn't worth the money. How do you go from playing a high-intensity, fast-paced FPS into VR, where aiming is so difficult that you suddenly take ages to aim and fire your virtual weapon? Personally, any game that uses the VR motion controllers is pretty much worthless to me. I do agree, however, that some games look tempting in VR, games like Elite Dangerous or Forza Motorsport: since you are in a vehicle, you do not need the stupid motion controllers, and being inside the vehicle gives you a whole new level of immersion. But regardless, as discussed, GPUs are facing the same issues as CPUs, so good luck with 16K.
     
    So here we are then: all evidence points to the end of technology advancement as we know it, and it's expected to be very soon. What are we to do? Well, if you do the Moore's Law Google search, you will find that at the end of most articles they describe how quantum computing will be taking over the known world of computers, but if you have seen Linus's video where he takes a tour of a quantum computing company's headquarters, that technology is far from being available to the masses. So what do we do until then? Well, honestly? Nothing. Build a badass $3000+ PC around 2020 using a 7nm CPU and make it last you for the next 20 years as technology comes to a standstill. Just replace components as they fail and deal with what you have. It's what the rest of us are going to be doing anyway, so it's not like you will be running an outdated or slow PC. I know I plan to build my "FINAL PC EVER" when Zen 3 hits the shelves.
     
    So what do you guys think about all of this? Do you agree? Disagree? What do you think is next?
  15. Like
    tomoki reacted to porina in What do you hope your setup looks like in 5 years?   
    Display: lightweight AR headset with VR mode, 8k per eye, 240 Hz
    GPU: nvidia Hawking Titan XH HBM4 SLI
    CPU: Intel Wine Lake 64 core 8 GHz
    RAM: Octa-channel DDR5-12800
    Fast storage: Optane II
    Slow storage: Z-nand
    Networking: Gigabit Ethernet
    Power Supply: Corsair VS 450W (2023 edition)
  16. Agree
    tomoki reacted to vanished in How crazy would it be if someone created a usb to ddr2/ddr3/ddr4 adapter?   
    It would be very crazy, as there is no practical use that I can think of. The RAM and the USB interface cancel out each other's benefits. RAM is designed to be very fast but, more importantly, low latency, and USB is not optimized for either. Also, anything you store in the RAM would of course be erased when power is lost, so there wouldn't really be any point.
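     
    Rough, typical numbers (illustrative figures, not from the post) make the mismatch concrete:
     
    $$ \text{DDR4-2400, one channel: } 2400\ \text{MT/s} \times 8\ \text{B} \approx 19.2\ \text{GB/s} \qquad \text{vs.} \qquad \text{USB 3.0: } 5\ \text{Gb/s} \approx 0.5\ \text{GB/s} $$
     
    Latency is even more lopsided: DRAM responds in tens of nanoseconds, while a USB transaction round trip is on the order of tens to hundreds of microseconds, so the adapter would discard essentially everything that makes RAM worth using.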
  17. Agree
    tomoki reacted to Surasonac in OC Poll   
    That doesn't make any sense. Overclocking literally means the clock speed is over its design speed. If you are undervolting, then it's just undervolting: the clock speed does not change, therefore you are neither overclocking nor underclocking. You are, however, correct that there is usually headroom, some chips more than others. And I always recommend trying to get the best out of your system.
  18. Funny
    tomoki reacted to Sellin Hugs in The $60 1080ti   
    Hey, so I just bought this GPU from eBay. It was a Buy It Now listing; it says it's factory sealed, and the guy has 100% feedback ratings so far. I just bought it to find out if it's a fake or maybe I just scored. Who knows, but I invite you to follow along. If you have any tests I can run to verify this card is legit, please let me know. That is, if this even ships and I'm not getting scammed. 
  19. Agree
    tomoki reacted to Totallycasual in The $60 1080ti   
    Have you never heard the expression that if something seems too good to be true, it probably is?  
  20. Agree
    tomoki reacted to Herman Mcpootis in Question for future build   
    It's SODIMM; no way is that laptop gonna have a full-sized desktop stick of RAM.
  21. Agree
    tomoki reacted to Nena Trinity in New RAM causing crashes?   
    It's shit, get a new one... O3O
  22. Like
    tomoki reacted to AngryBeaver in need help building a new pc for a girl   
    Should probably start off with a dinner at a nice restaurant. A PC is a little spendy for a "friend."
  23. Funny
    tomoki reacted to FloRolf in need help building a new pc for a girl   
    you better not f#ck it up m8. Feelin' ya.
  24. Like
    tomoki reacted to Velcade in 8700k downclocking when running cinebench   
    What is your power management set to in Windows?  The Balanced plan will not apply the OC until it needs it; you'll have to make sure you're in High Performance mode.
  25. Informative
    tomoki reacted to Velcade in Overclocking 8700k with a Asus Prime z370a   
    You'd probably want to run it for a few hours.  Make sure things are stable for the typical duration of application use.  I ran my testing for 3 hours.