
Blogs

 

CDP Basics - Cisco Discovery Protocol

CDP - Cisco Discovery Protocol... A protocol that runs at Layer 2, which means it doesn't give a damn which Layer 3 protocol is running on the interface! The basic idea, from our point of view, is that CDP can obtain information about a neighbour such as its device name, interface, management IP and more.

A tiny bit of theory that isn't really required at the CCNP level is what CDP runs on. The media must support the Subnetwork Access Protocol (SNAP). Essentially this is a frame format that follows the common 802.3 Ethernet frame and adds a header (with some new fields) describing what type of information is in the next header (similar to how the old Ethernet frame format used the EtherType). The LLC header in our 802.3 frame has two fields (DSAP and SSAP), which are normally the same value, 0xAA in hex, meaning a SNAP header will follow the LLC header in our frame.

In the SNAP header below, we have two fields: the OUI (Organizational Code), which indicates a registered hex value for Cisco (0x00000C), and the PID (Protocol ID), which in our case will be CDP (0x2000):

[screenshot: capture of the SNAP header]

Also, notice that the frame is sent to a well-known multicast MAC address used not only for CDP but also by other Cisco protocols such as VTP, DTP, etc.

If we move over to the CDP message itself, take a look at the capture below:

[screenshot: capture of a CDP advertisement]

We can see the information that can be advertised via CDP, such as the version, TTL (aka holdtime), the name of the device, the VTP domain and more! A key thing with CDP is that it isn't a two-way communication. Devices that originate CDP advertisements just send them and don't have a care in the world about what happens after that!

This CDP advertisement was sent from R2 to R1, so let's have a look at what we can find in R1's CDP neighbor table by using the show cdp neighbors command.

[screenshot: show cdp neighbors output on R1]

Without any topology map or someone telling us, we can now conclude that R1 has a neighbour called 'R2' whose CDP advertisement we received on our Fa0/0. The problem is that a switch could actually sit between our routers, so we can't fully depend on CDP and conclude that our Fa0/0 is directly connected to R2's Fa0/0.

Now for some more theory before tweaking! CDP is enabled by default on Cisco devices, with a few default parameters:

Advertisements are sent every 60s
The hold timer is 180s
v2 advertisements are enabled by default (v1 pretty much doesn't send the VTP management domain)
CDP is enabled globally and on every interface

These are pretty much the only tweaks we can make to CDP: change the advertisement/hold timers, change the version of CDP we advertise, and disable it globally or per interface.

Globally enabling/disabling:
cdp run
no cdp run

Interface level:
cdp enable
no cdp enable

Changing the timers in global config:
cdp timer 20
cdp holdtime 60

Advertise v2 / don't:
cdp advertise-v2
no cdp advertise-v2

Another option we can configure with CDP is to alert us if CDP detects a duplex mismatch (since the duplex setting of the interface is carried in the CDP message):

[screenshot: duplex mismatch log message]

We can enable this logging in global config with:
cdp log mismatch duplex

The concern with CDP is that people feel it shares too much information in its messages, so they either disable it globally or disable it on specific interfaces such as edge ports, ports facing an ISP router/the internet, etc. With just a little of the information CDP exposes, someone can easily do a few searches for vulnerabilities affecting a specific IOS version, and so on.
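To put the knobs above together, here's a minimal, illustrative IOS sketch (the interface name is just an example) that keeps CDP running globally, tightens the timers, enables the duplex-mismatch logging and turns CDP off toward an ISP-facing port:

configure terminal
 cdp run
 cdp timer 20
 cdp holdtime 60
 cdp log mismatch duplex
 interface FastEthernet0/1
  no cdp enable
 end

You can then verify the timers with show cdp and check the per-interface status with show cdp interface.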
You can also perform CDP spoofing towards the multicast address with different values in each message and pretty much 'overflow' the CDP table with hundreds of CDP entries. A tool for generating CDP messages (as well as other frames such as BPDUs and more) is included in the Kali Linux distribution.

Here is a YouTube video that accompanies this post... Excuse my explanation of SNAP during the video, I was a bit all over the place!

BSpendlove

BSpendlove

 

On the complaint of "incremental CPU improvements"

A common complaint I see about Intel is that, because they didn't have much in the way of competition from AMD for several years, they were content with releasing each new generation of processors with only "incremental" performance updates, incremental being about 10%-15%. I wondered if in the past we were enjoying a period of great performance improvements, so I went looking around for benchmarks of processors from around the mid 2000s to the late 2000s/early 2010s.

I found out that both Intel and AMD were only offering what amounted to incremental IPC improvements, or perhaps just incremental improvements in general. The only exception was the jump from the Pentium D to Core 2.

Giant list of CPU reviews over the years

AMD Athlon 64 reviews (compare against the Athlon XP)
http://www.anandtech.com/show/1164
http://www.tomshardware.com/reviews/amd,685.html
https://www.extremetech.com/computing/55510-review-athlon-64-3400
http://hexus.net/tech/reviews/cpu/625-amd-athlon64-fx-51/

Core 2 Conroe reviews (compare against the Pentium D)
http://www.trustedreviews.com/reviews/intel-core-2-duo-conroe-e6400-e6600-e6700-x6800
http://www.anandtech.com/show/2045
http://www.guru3d.com/articles-pages/review-core-2-duo-e6600-e6700-x6800,1.html
http://www.pcstats.com/articleview.cfm?articleID=2097

AMD Phenom reviews (compare against the Athlon 64)
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/9218-amd-phenom-x4-9750-quad-core-cpu-review.html
https://www.bit-tech.net/reviews/tech/cpus/amd_phenom_x4_9850_9750_and_9550_b3_cpus/1/
http://www.legitreviews.com/amd-phenom-9900-processor-review-spider-platform_597

Core 2 Penryn reviews (compare against Core 2 Duo Conroe)
http://www.anandtech.com/show/2306 (this is more of a preview than a real review)
http://www.overclockersclub.com/reviews/intel_e8400/ (though it's a review of the C2D E8400, compare the C2Q Q9450 against the C2Q Q6600)
https://www.bit-tech.net/reviews/tech/cpus/intel_core_2_duo_e8500_e8400_and_e8200/1/
http://www.phoronix.com/scan.php?page=article&item=intel_c2d_e8400&num=1

Core i5 Lynnfield reviews (compare against the Core 2 Quad Penryn)
The i5 is used instead of the i7 because the i7 has HyperThreading.
http://www.anandtech.com/show/2832
https://www.techspot.com/review/193-intel-core-i5-750/ (note they have an overclocked score that might throw you off)
https://www.bit-tech.net/reviews/tech/cpus/intel-core-i5-and-i7-lynnfield-cpu-review/1/

AMD Phenom II reviews (compare against the AMD Phenom 9950 BE, but keep in mind the 0.2GHz clock difference in favor of the Phenom II)
http://www.anandtech.com/show/2702
http://www.tomshardware.com/reviews/phenom-ii-940,2114.html
https://www.bit-tech.net/reviews/tech/cpus/amd-phenom-ii-x4-940-and-920-review/1/

Maybe the most damning find: this pattern may have been the norm all along. I found an article from 2003 on Tom's Hardware benchmarking CPUs from the Pentium 100MHz all the way to then-contemporary Pentium 4s. All of the graphs seemed to indicate that 10%-20% IPC increases were the norm across generations of each CPU. You can read the article (starting from the test setup page) at http://www.tomshardware.com/reviews/benchmark-marathon,590-22.html

However, there are noticeable performance jumps, notably: Unreal Tournament 2003 does not run at even 30 FPS on Intel Pentiums and AMD K6s. However, the K6-III was released around the same time as the Celeron Mendocino and I can't figure out why it performs so much better.
The Celeron Mendocino was Intel's second attempt at on-die L2 cache, but the K6-III had on-die L2 cache as well. MP3 Maker Platinum and Main Concept (an MPEG-2 encoder) show huge increases in time on the K6-III/Celeron Mendocino and earlier processors. I would point to the programs using SSE, but the AMD Athlon Thunderbird performs just fine and it doesn't have SSE.

Gathering all of this data, I think the contributors that offered the biggest improvements over the past 20 or so years have been:

Better caching. On-die L2 saved the Celeron Mendocino.
Better front-end improvements, like what happened on the Athlon. Some more reading can be found at http://www.anandtech.com/show/355. This also massively helped Intel when going from the Pentium 4/Pentium D to Core/Core 2.
Improvements to FPU performance. This may explain AMD's performance oddities in some of the tests on older processors while Intel seemingly did not have issues.

I think the biggest takeaway, though, is that clock speed improved greatly in a short period of time, and for a while it was the best way to get more performance out of a processor. The Pentium P54C at 100 MHz was released in October of 1994. It took Intel until March of 2000, roughly six years later, to release the 1GHz Pentium III. This is a clock increase of about 150MHz per year, or a 1.523 times improvement per year on average. In March 2000, the 1GHz AMD Athlon was released. Four years later, a 2GHz Athlon was released. This is 250 MHz per year on average, but only a 1.19 times improvement per year. Intel took about 18 months to go from the initial 1.3GHz Pentium 4 to the first 2.6GHz Pentium 4, which is the most impressive improvement at 866.667 MHz per year. But getting to 3.8GHz, the fastest yet, took about 3 years from the first 1.9GHz part. This is about 633.333 MHz per year, or a 1.26 times improvement per year on average.

And another thing to point out: a case like the Pentium III 500MHz vs. 1GHz may not sound like much today, since 500MHz differences are common between processors, but back then it was a doubling of clock speed on the same architecture.
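For reference, here's roughly how those per-year figures fall out of the dates and clocks quoted above. This is just a back-of-the-envelope JavaScript sketch; the year spans are approximations of the release dates mentioned in the text:

// Annualized clock-speed growth from the data points quoted above.
function clockGrowth(startMHz, endMHz, years) {
    return {
        mhzPerYear: (endMHz - startMHz) / years,              // linear MHz added per year
        factorPerYear: Math.pow(endMHz / startMHz, 1 / years) // compounded improvement per year
    };
}

console.log(clockGrowth(100, 1000, 5.4));  // Pentium 100 (Oct 1994) -> Pentium III 1 GHz (Mar 2000): ~1.53x/year
console.log(clockGrowth(1000, 2000, 4));   // Athlon 1 GHz (Mar 2000) -> 2 GHz Athlon (~4 years): ~1.19x/year
console.log(clockGrowth(1300, 2600, 1.5)); // Pentium 4 1.3 GHz -> 2.6 GHz (~18 months): ~867 MHz/year
console.log(clockGrowth(1900, 3800, 3));   // Pentium 4 1.9 GHz -> 3.8 GHz (~3 years): ~1.26x/year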

Mira Yurizaki

Mira Yurizaki

 

Back to school budget gaming PCs

While you might have thought that I abandoned this place, I haven't. It's just that part prices are getting higher and higher on a pretty much weekly basis, and some components, such as GPUs and RAM, are shooting up in price due to limited stock and an overall shortage.

But no worries, I can still make pretty good budget builds for light to moderate gaming.

I'm also going to do a little twist: I'm going to give nicknames to the builds, not just "Budget build" or "Content creation monster".

$250 "Spectator"

$400 "Whistler"

$650 "Maltese"

$1000 "Zeus"

Thanks for viewing this and I hope you'll have a fun time with these builds! Comment with any suggestions for optimizing them!

Djole123

Djole123

 

Benchmarking Procedures

I haven't done anything that would require benchmarking in ages. The last change made to my old log file was in 2014, after I changed some case fans and wanted to check if they did anything to temps. This July I changed my graphics card, which is exactly the kind of change I would say anyone should run their own benchmark sequence after. So after any bigger change (CPU/GPU mainly), anyone should run the same benchmarks they ran before the change. I will come back to this in a moment. In my situation the need was really big: I went from a 2011 midrange card to a 2013 top-end card, something that on paper is a bit better than the 280X, the card I was eyeing for an upgrade a year ago. Yes, I'm talking about used cards. The upgrade was from a factory-overclocked GTX 560 Ti to a reference GTX 780. The benchmarks would look awesome, in numbers if not in actual gameplay, but that's more about the nature of the tests themselves.

So why this post? Well, so I can refer to it when someone asks for advice about benchmarking. Since my last run was 3 years ago and I haven't really played any new games, all my game benchmarks are pretty bad choices, like NFS Shift and Battlefield Bad Company 2. There might be Battlefield 3, but that's it. This new batch won't have any real gameplay, for two reasons. 1. I still lack good games for realistic benchmarking; Battlefield 4 is probably on the heavier side. So nah on games. And 2. I don't want to buy games just because I could get a good benchmark out of them. Even games with benchmark tools would be bought just for that, since I don't play 3rd-person single-player adventure/action games (Metro 2033, Tomb Raider, Witcher 3). This will be a list of benchmarks which are free and provide a good base for anyone looking to create their own sequence.

1. What is my "sequence"?

I use the term "sequence" to describe the procedure where I run multiple tests, check temps and mark down scores. Marking down scores and temps is good practice in general, not just for bragging, but to check how much the money you spend actually improves the system's performance. So my sequence involves having a few monitoring programs open, taking numbers down in Notepad and running several benchmarks one by one. Yes, it takes some time, about 1.5 hours for me. But you only need to do it once, and then you can refer back to the notes later if someone asks something. I've used my temp readings many times to give advice about high temps under stress tests, and about idle temps on my older hardware.

To the actual point. I have 10 programs, 12 tests, 1 main monitoring program with 2 others running, Notepad with a template for scores and temps, and FRAPS for one odd-one-out FPS reading. I cover the tests later. I happen to have 2 monitors, but all of this can be done on one; the 2nd is just good for keeping all the monitoring software on, like in this manner:

MSI Afterburner is present out of habit of looking at its graphs. I actually don't use it for GPU temp monitoring anymore, but the habit of looking at the fan and temp graphs remains. The main screen is where I watch the GPU temp while a test is running, mainly since the card is a new piece and I want to see how my fan settings are holding up. The main program here is RealTemp with its GPU temp monitoring open. I reset the Maximum readings after every test to get reliable readings for each test individually. Under them all is my normal main monitoring program, OpenHardwareMonitor. Notepad is on the main monitor since I don't need it until after a test finishes.

Things I do and would recommend: I have a habit of doing this after a cold boot. I would recommend rebooting before running the sequence.
It's the easiest way, since there's the least amount of extra software running in the background. For this sequence I added a step of closing all the extra stuff I have open: Skype, several driver utilities, basically everything except the multi-monitor manager, fan controls and the virus scanner. Some might close the virus scanner too, but I don't mind it. Then another reboot after all testing is done, to get everything working as normal. I would also recommend setting any fan profiles the way you are actually going to use them before running tests on new components; it makes for a more realistic comparison. As for temps, I take idle readings, which I record after the first test has run. The reason is that idle temps taken straight after boot will be lower than what you are going to see at any other time. For testing temps I use the max temp, as it's the most relevant. Since RealTemp shows temps for all cores, I use an easy-to-count average: take the highest and lowest core and split the difference, rounding upward. So 66C and 58C would be 62C (8/2=4, 58+4=62C), and 67C and 60C would be 64C (7/2=3.5, round up to 4).

2. Tests and score keeping

Let's start with the synthetic benchmarks. I will give some background on why I use the software listed here, where to get it and what settings to use (if needed).

3DMark https://www.futuremark.com/downloads/3dmark.zip
If you have had a gaming PC for some time, you know this software already. Futuremark's (yay for a Finnish company) 3DMark has been an industry standard for almost 20 years. It's a combined benchmark, meaning that it tests both the GPU and CPU within a single run. The first 3DMark I used was 3DMark03; I've also had 05, 06, Vantage and 11. 3DMark06 was used for a long time because of its DX9 support; until Vantage with DX10.1, it was the only way to test new hardware reliably. That's important: because of the way benchmarks work, if you change something, the score will also change, so comparing results between different versions of the software can cause issues. I first noticed this with FurMark.

But back to 3DMark. The free version has 3-4 tests. The most common is Fire Strike, which is aimed at current-gen gaming PCs. You can run it on lower-end hardware if you are like me and want before-upgrade scores to compare against. I also used the somewhat lighter Sky Diver, mainly because I couldn't get 06 running anymore. So let both tests run, mark the score in Notepad, mark the max temps for GPU and CPU, remember to let temps settle back to "idle" in between, and reset the max readings before running each test. That's it. The free version doesn't have any settings to toggle; the only thing I would like to toggle is the demo.

CINEBENCH R15
Yes, they actually have the product name in caps. Maxon makes professional 3D modeling and animation software as their main source of income, but Cinebench has become one reason their homepage gets constant traffic. It has 2 tests, one each for the CPU and GPU. The GPU test uses OpenGL, while the CPU test renders an image using all available threads. Cinebench gives some comparison against similar systems, but I wouldn't look at that graph too much. I also don't think of it as very taxing software. Run the tests for both CPU and GPU, with temp normalization in between.

This is among the software in this batch that focuses on a single part. I don't have a CPU-only benchmark yet, but I might look into that more. Intel Extreme Tuning Utility has CPU testing, but I don't know if it works with AMD. Anyway, having a whole benchmark for a single component has some advantages, like if you'd like to test the air-cooling myth about radiating GPU heat.
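For what it's worth, the core-temp averaging rule above is simple enough to script. Here's a tiny JavaScript sketch of it (my own illustration, not anything built into RealTemp):

// Average the hottest and coolest core, splitting the difference and rounding up, as described above.
function coreTempAverage(maxCoreTemp, minCoreTemp) {
    return minCoreTemp + Math.ceil((maxCoreTemp - minCoreTemp) / 2);
}

console.log(coreTempAverage(66, 58)); // 62
console.log(coreTempAverage(67, 60)); // 64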
So running a GPU-only test would raise only the GPU's temp notably, and do something to the CPU's temp as well. I don't look at utilization when I run these tests, but it could be one more thing to check if you want to gather extra data.

Catzilla http://www.catzilla.com/download
ALLBenchmark's test is different from the two above because it has very noticeable sound effects and music. Otherwise it's just another combined benchmark. I've used it since I heard about it from OC3D's TinyTomLogan. TTL is someone whose opinion on OC and CPU performance means a lot to me; I've picked up other go-to software from him too, such as OCCT, a stress-test program for the CPU. But back to Catzilla. The basic version only has the 540p benchmark, but you get the 720p one by creating an account on their site, which is easy with Google, Twitter or Facebook linking. The rest of the procedure is like before.

RealBench https://rog.asus.com/rog-pro/realbench-v2-leaderboard/
ASUS' RealBench is a combined test which uses real-life tasks for benchmarking: image manipulation, rendering, video encoding and multitasking. Besides giving a total score, it gives a score per test; I mark all of them down. I got to see how much the GPU was bottlenecking the CPU on CPU-heavy tasks. The result? Not that much, but some.

UserBenchmark
This is new to my lineup. I haven't looked much into what it actually measures, but it looks to be a lighter combined test. I would say it replaces Novabench, which I had in the earlier lineup. Results are given as percentages in three categories: Gaming, Desktop and Workstation. I mark those percentages down as the results.

Heaven and Valley
These two from Unigine are pure GPU tests. Heaven is what reviewers commonly use for GPU OC testing, temp testing and benchmarking. Valley is a bit heavier, so I'd say running them both is good practice. Like with 3DMark, there are more tests you can use. I used the highest presets for both: Extreme for Heaven and ExtremeHD for Valley.

FFXIV: Stormblood
Like I said earlier, I don't have real gameplay benchmarks in this set, so this benchmark for the 2013 game is the lower-end gaming test for me. It has a preset for Maximum settings, but I pumped those up a bit more; you can check my settings in the attached PDF. I use the score as the result, but you could also record FPS. (FFXIV_Stormbloo_benchmark_LoGiCalDrm_config.pdf)

Star Swarm http://store.steampowered.com/app/267130/Star_Swarm_Stress_Test/
This game-engine benchmark is free on Steam. Released in 2014 by Oxide Games, the guys behind Ashes of the Singularity, it simulates a space-battle game. There are a few options to simulate different styles of games; I used Attract with Extreme settings. The score is given as average FPS. You can select some other combination, the important part being that the same settings are used before and after any upgrade to keep the scores consistent.

demo2 https://files.scene.org/view/parties/2015/assembly15/demo/demo2_by_ekspert.zip
This one is something I cooked up myself. It's newer than both of the other gaming-style benchmarks, it's done with Unreal Engine 4, and I have no clue whether it has any relation to real-world performance or not. It's a demo made by a group called Ekspert for the Assembly LAN demo compo in 2015, in which it placed 2nd. There are a few remarks I want to make about demos and the demoscene before getting to the actual benchmarking part. The demoscene is all about digital art: animation, coding, graphics, music, indie game development. At least in Europe, many software and game-dev companies have their roots deep in the demoscene.
If you are doing the things I mentioned above and want to show off your skills by competing, maybe look into whether there are parties/compos held in your area. The two best-known Finnish companies with a demoscene background are Futuremark (surprise, surprise) and Rovio: one for making the first PC demo at a time when the Amiga and Commodore 64 were the main platforms, the other for making mobile games back when those were played on Nokia N-Gages.

Now back to the benchmarking part. demo2 doesn't have a built-in scoring system, so I've used FRAPS to calculate the average FPS. Running the 1080p version gives a warning about using FRAPS for recording purposes, but it loads just fine after that. I start the benchmark counter as soon as the demo starts and check the scores afterwards. Nothing more to it.

Others?
As I said along the way, I would like to have more modern game benchmarks, as well as a CPU-only benchmark, so I will be looking around for those and adding them here. If someone reading this has ideas about free or cheap games with included benchmarks, please let me know.

3. Scores and comparison

At this point you should have a raw-data text file, something like this one I'm using: benchmsrk.txt, which includes system specs for each test cycle. This is the file you update during the tests. Feel free to use it as a template or for comparison. But what now? Well, you can just compare results by eye, use the file to quickly refer back to temps, and so on. But what if you want to know how much better the system's performance is after an upgrade? That's where Excel (or Sheets, Calc and so on) comes in. Copying results to Excel (& co) can be annoying, but do it once to get the template correct, and maybe adjust the .txt file to make it easier in the future. Here's my .xlsx for reference, also free to be used as a template or for comparison: benchmark_LoGiCalDrm.xlsx

In the file I've got some extended system info and notes about the tests, which are pretty much the same as here. But the main thing is the +- column (Excel hint: add < ' > in front of the symbol to stop it being treated as a formula). It calculates how many percent better the new score is compared to the last one. It works best when the score changes by less than 100%, i.e. the new score isn't more than twice the old one. The formula used is pretty simple: =(<new>/<old>)-1, shown as a percentage. You can add color coding and such if needed. If the new value is more than twice as big, remove the -1 to read the result as a plain multiplier instead. (There's a small scripted version of this calculation sketched after the conclusion.)

There's some oddness in that file with the CPU temps. I had an issue with SpeedFan missing the fan profiles for the 2 front intakes, which I fixed after I had changed to the new GPU. So those are something to ignore.

4. Conclusion

I hope this helps those who are new to benchmarking. Note that this is just how I do things, and you should take it as a guide or advice and make it your own. I will be making some fixes along the way, but since I won't be getting a major upgrade for a few years, it's quite possible I'll be putting together another batch of tests when my next upgrade comes around.
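As referenced above, here's the +- column logic in plain JavaScript. It's just an illustration of the spreadsheet formula, nothing more:

// Percentage improvement of a new benchmark score over an old one,
// matching the spreadsheet formula =(<new>/<old>)-1.
function improvement(oldScore, newScore) {
    return (newScore / oldScore) - 1;
}

console.log(improvement(4000, 5200)); // 0.3 -> +30%
// If the new score is more than double the old one, the raw ratio reads better:
console.log(5200 / 2000);             // 2.6 -> 2.6x the old score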

LoGiCalDrm

LoGiCalDrm

 

LTT/Forum App

These questions seem to be more common now than ever. This is a list of links to all of those threads, or as many as I can find. Under each link are quotes from my reply and any official/semi-official answer, if given. Note that this list doesn't cover the times this has been asked in the old Features and Suggestions thread in its many iterations.

Btw, if you know of a thread which isn't listed, tag me (for new ones) or PM me (for old ones, >1 month).

*UPDATE*
July 8th: Moved 2016 and 2017 to their own posts. Side note: no posts that I've seen since May. A whole month. FPC app requests aren't counted.
May 9th: Moved 2015 to its own post. Removed some stats from the yearly posts. Considering a change in order (current is from newest to oldest).
Jan 5th: Moved 2014 to its own post. Added bookmarks to my browser for faster editing in the future.
Dec 31st: Moved 2013 to its own post, with an improved date system and more stats.

Latest LMG-level answer to the question:

Latest administrator-level answer to the question:

Latest moderator/dev-level answer to the question:

Stats (date & time use CET, which is -1 from my own timezone and +1 from UTC. Times in 24h format.)
Total topics created on the subject: 85
Yearly stats
- 2013: 17
- 2014: 14
- 2015: 13
- 2016: 27
- 2017: 14
Yearly average (until 2016): 17.8
Most in a month: 7 (May 2013)
Monthly average (until Dec 2016): 1.5
Most in a day... Just kidding. No, seriously: 2 (Jan 5, 2017)
Shortest time between topics: 15h 45min (Jan 5, 05:41 - Jan 5, 21:26, 2017)
Longest time between topics: 104d 7h 28min (Mar 17 - Jun 26, 2014)
Latest: Aug 2, 11:54, 2017
Oldest: Jan 3, 18:20, 2013

To-Do:
Change the dates to an easier-to-read format. (WIP)
Move each year to its own post and link it here. (WIP)

Dates in format DD/MM/YYYY (until 2016)

2017

2016

2015

2014

2013

LoGiCalDrm

LoGiCalDrm

 

Topics created in 2017

(Almost) ALL Topics talking about having LTT/Forums app in 2017   Main post with all statistics: 2017 Stats (so far)   Total topics created: 14 Most in month: 4 (January) Average in month: 2.4   Aug 2nd   July 19th   May 13th   May 8th   Apr 27th   Apr 12th   Apr 2nd   Mar 31st   Mar 5th   Feb 15th   Jan 23rd   Jan 21st   Jan 14th   Jan 5th    

LoGiCalDrm

LoGiCalDrm

 

Do You Need Antivirus for MacOS?

It is generally accepted that macOS is immune to viruses and malware. In general this is true, because over the entire history of the operating system, the viruses written for it can be counted on your fingers.

I want to understand why this is the case and what to do in order not to get infected by those rare instances of malware.

Among my acquaintances who use Mac devices, there is not a single person who has an antivirus, or who would say that they got malicious code or any virus. There are several reasons for this. Of course, one can argue that macOS is a Unix-like operating system and, therefore, invulnerable. We can reject this simplistic thought, since there are some viruses for macOS, which means that the system, like any other, is vulnerable.

The main reason there are so few viruses written for the Mac is that there are relatively few Mac devices. If we compare the number of Windows personal computers and Macs in the world today, it turns out that Mac devices make up only about 7%. Attackers who want to steal credit card numbers are more likely to be interested in the audience of Windows PC users because of their far larger number.

Some say that to protect your Mac from the hypothetical possibility of installing malicious software, you need to install an antivirus. This is the most logical solution, according to most users.

Here is the list of viruses that I found information about:

1982: Elk Cloner
1987: nVIR
1990: MDEF
1995-1996: Concept / Laroux
1998: SevenDust 666 / AutoStart 9805
2004 and 2006: Renepo / Leap-A
2007: RSPlug-A
2009: iWorkS-A Trojan
2011: MacDefender
2012: Flashback / SabPub

What we have here is only 10 malicious programs. In my opinion, only MacDefender represented a real threat to macOS users, as it targeted credit card numbers. As you can see, security updates have long since come out for these viruses, and they no longer represent a threat. Let's return to antiviruses. Given the information above, how often do you think you would need to update an antivirus database for the Mac? The answer is once a year or less.

Antivirus software for macOS is more harmful than useful. There is convincing evidence that Kaspersky AntiVirus very actively collects information about the user and sends it to its servers. There is also information that computers with Kaspersky AntiVirus can participate in DDoS attacks at the discretion of the developer of this antivirus. It's worth thinking about whether you need a Trojan-like program, and to pay for it on top of that.

You still have to be careful not to get malware on your Mac device. Some malicious software tools like Safe Finder may also collect your data or show unwanted ads.

Far101

Far101

 

Topics created in 2016

(Almost) ALL Topics talking about having LTT/Forums app in 2016   Main post with all statistics: 2016 Stats   Total topics created: 27 Most in month: 4 (April) Average in month: 2.25   Dec 31st Dec 18th Dec 12th Nov 14th Nov 12th Nov 9th Oct 26th Oct 16th Sep 21st Sep 11th Sep 2nd Aug 20th July 29th July 28th May 24th May 20th May 5th Apr 26th Apr 24th Apr 11th Apr 4th Mar 19th Mar 7th Feb 24th Feb 14th Feb 7th   Jan 20th (before forum update)

LoGiCalDrm

LoGiCalDrm

 

R9 Fury X upgrade

I recently purchased an R9 Fury X off of eBay for a couple of reasons: the first being the giant performance increase over my 380X, and the second being that I can now mine with the 380X.

Picture of FrostByte:

Picture of my WIP Ethereum mining rig/NAS:

And if any of you are interested in how the new card performs, I made a video on the topic:

Sylvie05

Sylvie05

 

Are split browsers productive?

Answer: no.

Reasoning: the App Store has about 6 or 8 split-window browsing apps. One of the most popular is Split by Savy Soda. The issue is that you have to go back and forth typing new searches. Ideally the app would mirror the first search to get both windows to the first Google results page, then un-mirror, allowing the user to tap through multiple links without pressing the back button, which wastes loading time. Then re-mirror and un-mirror, maybe with some AI or something. A partial solution to the "back load time" is another app which promises zero-latency back presses, called "Fastback". But there is no practical purpose for split windows; searching actually takes longer with all the back and forth. For background music, there are other apps that play in the background, so you do not need to sacrifice half your screen to play music.

Neotwoson

Neotwoson

 

June Build

PCPartPicker part list / Price breakdown by merchant
CPU: AMD - Ryzen 5 1600 3.2GHz 6-Core Processor  ($195.69 @ SuperBiiz)
Motherboard: ASRock - AB350M Micro ATX AM4 Motherboard  ($65.98 @ Newegg)
Memory: Team - Vulcan 16GB (2 x 8GB) DDR4-2400 Memory  ($99.99 @ Newegg)
Storage: PNY - CS1311 120GB 2.5" Solid State Drive  ($52.99 @ Best Buy)
Storage: Western Digital - Caviar Blue 1TB 3.5" 7200RPM Internal Hard Drive  ($48.44 @ OutletPC)
Video Card: Asus - GeForce GTX 1070 8GB Dual Series Video Card  ($399.99 @ B&H)
Case: Inwin - 301 Black MicroATX Mini Tower Case  ($76.98 @ Newegg)
Power Supply: SeaSonic - 520W 80+ Bronze Certified Fully-Modular ATX Power Supply  ($56.99 @ SuperBiiz)
Total: $997.05
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2017-06-17 03:07 EDT-0400

RadiatingLight

RadiatingLight

 

We should really stop chastising people who think PC building is "too hard"

The other day I found a Gamers Nexus video in my subscription feed: Steve Burke criticizing a magazine article on how PC building is "hard", and, to prove it wasn't, doing a speed build.

Admittedly I didn't watch the whole video, nor did I read the article in question. But my overall takeaway is this: to all of you who build or have built your own machines, stop saying it's easy as if building a PC were like operating an elevator or using a phone.

Now, ignoring the other aspects of building a PC that conveniently get ignored, like identifying your needs, planning your budget, researching parts, testing and, if needed, troubleshooting, of course it's easy to build a PC if you've done it before. But to a fresh newbie, it can still be a nerve-wracking experience that really isn't all that easy.

To put it another way, I think about the time when I was learning how to ride a motorcycle versus how I ride now. I was nervous about shifting gears except up when accelerating and down when coming to a stop, I had trouble with the clutch, hills were a problem, going even past 45 MPH scared me, and lane splitting was something I dared not do in moving traffic.

Now, none of that matters. I can shift by gut feeling. I have competent control of the clutch. I can manage most hills (though I'm sure San Francisco will humble me in a second). Highway speeds feel normal. I even lane split in moving traffic (and I survived LA).

These are all more or less "natural" to me now. I don't have to think about how to do the action, merely when to do it. So riding a motorcycle is easy to me. I think it's easy. But it's not.

Granted, building a PC isn't as complex or risky, but I wanted to illustrate a point. I'm sure most of you who built your own PC consulted the internet a few dozen times throughout each step of the process. I'm sure you spent a lot of time mulling over what's good for you and what's not. Or maybe you didn't and just copied someone else's build (though I'd argue you're sort of cheating). I'm sure that when something went wrong, you nearly needed a change of underwear or blew a few blood vessels.

If PC building were really easy for anyone, nobody would need to do or worry about any of this.

Mira Yurizaki

Mira Yurizaki

 

Web app development is actually a good way to learn multi-threading concepts

As a way to expand my skill set and give me something to do with my spare programming time, I've taken up learning what is called full-stack web development. In its most high-level description, that means dabbling in both the front end (client app, web page, etc.) and the back end (server, database, etc.) of the development process. To ease into this, because I didn't want to learn half a dozen languages, I've stuck with Node.js and MongoDB, as both use ECMAScript. Essentially, all that means is I have to learn JavaScript. This has been a key point in where I'm going with the title.

A full web application is made up of several independent pieces
Pieces like the server, the database, the client and others. These pieces are independent from each other, yet they still need to talk to each other and perform as a cohesive unit. If one of them breaks or isn't developed well, the whole application can have a bad time.

Multi-threading means asynchronicity
Synchronicity here means that everything runs in order, as you might expect while reading the code. Asynchronicity means that some code can execute right away, some code will be skipped, and some code will be put off to the side, only to be run later.

If there is something you run into a lot in JavaScript, it's that many things are asynchronous. You have a lot of things on a web page that are waiting for user input, but the web site can't freeze itself or force the user to do things in a specific order, because that degrades the user experience. On a server, when you make a database request, you don't want it hanging the entire server, because a database request can take a while. You want the server to go right back to serving other people's requests if the current one is going to take a while and can be put off to the side. In JavaScript, these situations are handled by mechanisms such as callbacks and Promises. For example, here's some client code I've written, requesting a list of groups from the server:

var getInfo = function(){
    // Making an HTTP GET request
    $http.get('/groups/getList?index=' + $scope.startIndex)
        .then(
            function(response){
                if (response.data.status === 30000){
                    $scope.groupList = $scope.groupList.concat(response.data.groupList);
                    $scope.startIndex += response.data.groupList.length;
                } else {
                    console.log("There was an error getting groups. Code:" + response.data.status);
                }
            },
            function(response){
                console.log("There was an error getting groups. Code:" + response.data.status);
            });

    // Script resumes from here after making the HTTP GET request
    console.log("I made a HTTP GET request!");
};

The first part has the client make an HTTP GET request to the server, to a particular URL for getting a list of groups. Since the server isn't going to respond instantaneously, the client puts this action off to the side and continues executing. What happens then is that the client executes console.log("I made a HTTP GET request!"); immediately afterwards. Essentially, since the data isn't available yet, it skips everything from the HTTP request's handler functions down to the comment about where the script resumes. When the server sends the data back to the client, a signal is generated to let the client know to go back and execute the code that was skipped.

This is a key point in multi-threaded programming: things need to be asynchronous. The application needs to be able to put tasks that are waiting on something else off to the side, and be signalled when the result comes in.
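The same idea shows up on the server side. Here's a tiny, self-contained sketch in plain JavaScript (no real database involved; the names are made up for illustration) of how a slow lookup can be wrapped in a Promise so the rest of the program keeps running until the result arrives:

// Pretend database lookup: resolves with a group list after a short delay.
function fakeDbLookup(index) {
    return new Promise(function(resolve) {
        setTimeout(function() {
            resolve(['group-' + index, 'group-' + (index + 1)]);
        }, 500); // simulate 500 ms of database latency
    });
}

fakeDbLookup(0).then(function(groups) {
    // Runs later, once the "database" has answered.
    console.log('Got groups:', groups);
});

// Runs immediately, before the lookup above has finished.
console.log('Kept serving other work while waiting for the database.');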
People may be sharing resources, and you need to manage them
An example is Wikipedia. What if two people are editing the same page at the same time, but adding different things to it? How do you handle both of them saving data? This is a common issue in multi-threaded programming. One way of handling it is the server holding a token that only one person can take: if the token is taken, nobody else can touch the page until the token is released. Another solution is that if someone submits their edit and then another person does, you can respond with "hey, someone made changes already." (There's a small sketch of that second approach at the end of this post.)

Web application development gives you more control in the debugging process
This makes it easier to understand how each piece interacts with the others.

One of the hardest things about developing and debugging multi-threaded applications is that a lot of things happen at the OS and processor level, and it can be hard to poke into those to figure out what's going on. For example, when you pause the execution of the application in the source code, you're not quite sure which thread you're on. When you're doing web application development, each instance accessing the web page can be thought of as a process or a thread in the overall application. In this way, you can pause one instance's execution and know exactly which one you're poking at.

In another sense, you may be able to control aspects behind the scenes, such as latency. It's hard to see how a local application will behave with hiccups without clunky methods, but you can certainly add latency to your network, or simulate it another way, without adding debug code.

It's practical and relatively easy to set up!
One of the hardest problems with learning something in software development is deciding what to do with it. I'm finding full-stack web development practical to learn because it touches on something we all use in our daily lives. So instead of trying to figure out a local application you could write that would be multi-threaded, try developing a web server and a client to go with it. The flip side is that it's very visible and you can see your results.

Now, I say it's relatively easy to set up, but it still has a learning curve. In my case, at the barest minimum you need to learn HTML and JavaScript; CSS is highly recommended. Then again, multi-threaded programming isn't a beginner-level concept either. But if you take an afternoon to study, you can certainly get a web server set up: https://www.tutorialspoint.com/nodejs/nodejs_express_framework.htm

My development setup at the moment is:
VirtualBox with Lubuntu
Node.js with Express as the server framework
MongoDB for the database. This is a document-oriented database, as opposed to a relational database like SQL.
Whatever text editor you want (I use Atom).
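As referenced above, here's a rough sketch of the "someone made changes already" idea, sometimes called an optimistic check. It's my own illustration in plain JavaScript, not code from any real wiki: each page carries a version number, and an edit is only accepted if it was based on the current version.

// Each page remembers which version it is on.
var page = { content: 'original text', version: 1 };

// An edit is accepted only if it was based on the latest version of the page.
function submitEdit(page, newContent, basedOnVersion) {
    if (basedOnVersion !== page.version) {
        return { ok: false, message: 'Hey, someone made changes already.' };
    }
    page.content = newContent;
    page.version += 1;
    return { ok: true };
}

// Two people grabbed version 1 of the page; only the first save goes through.
console.log(submitEdit(page, 'first edit', 1));  // { ok: true }
console.log(submitEdit(page, 'second edit', 1)); // { ok: false, message: '...' }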

Mira Yurizaki

Mira Yurizaki

 

Commentary on the Legality of Microsoft's Data-Harvesting in Windows 10

This is perhaps a more eloquent and elaborated presentation of what I wanted to say in the thread 7-times Microsoft MVP finds Windows 10 Enterprise collects too much data at minimum, calls for legal action.   The relevant links from that thread are these:   Windows, Spying, and a Twitter Rant Screenshots showing high levels of contact with Microsoft servers after employing all efforts to stop data-transmission Additional screenshots of further Microsoft server activity, discovered later     I would like to give some personal commentary on the subject that those links are about.
If a politician steals millions of taxpayer dollars, which is only a few cents from each person, they go to jail. So what about when Microsoft continuously piggy-backs on everybody's PC systems to enrich itself? Microsoft is using people's own hardware, software licenses, electricity, computing power, data, time, and private activity for non-sanctioned business use and for the profit of Microsoft's executives.   Microsoft's data-mining is no different from a virus that is distributed to people's PCs to mine digital coins using their CPU power, with the earnings being deposited in the e-wallet of the virus' creator. You could also look at it like someone setting up a mining farm but connecting all their systems to their neighbour's electricity supply - except that in the case of Microsoft's data-mining, they are not even using their own hardware, software licenses, and everything else, but those of the people whose systems are sending data to Microsoft... and so the coin-mining virus is the more suitable analogy.
Every aspect of Microsoft collecting data from people's PC systems and personally-owned Windows licenses is already established in law as illegal. But some people are taking a bit of time to work through the understanding that leads to that recognition, because software-license owners are traditionally just not on the lookout for stuff like this; they usually just focus on using their software, not on the technical legal aspects or ethical implications behind its operation. Also, Microsoft being a well-known company whose products people have used for years throws a lot of people for a loop, I think, because they are used to assuming that whatever Microsoft is doing must check out, somehow. Well, this doesn't. It's illegal from head to toe.   It's theft, but it's also unjust enrichment - the situation where one party makes a profit for itself at the unjust expense of others. A current unjust-enrichment case involves ZeniMax targeting Samsung over VR technology that ZeniMax claims belongs to it but is profiting Samsung.   If you unilaterally utilize somebody's property or copyrights to make yourself money, who is legally entitled to the proceeds? Legal entitlement goes to whoever owns the property and rights that the profit depends upon.
A person who argues that by using Windows 10 a person agrees to send Microsoft their data would be wrong, because sending Microsoft data is not essential to making use of the software functionality that was paid for when buying a Windows license, and so conditioning usage of the paid-for functionality on unrelated, Microsoft-profiting access to personal and private data would not pass the reasonable-person test.   Also, such an argument ignores the fact that the data sent to Microsoft doesn't come from only the owned software license, but also from a person's personally-owned hardware, electricity, computational time, uniquely-generated data, and personal/private activity - things Microsoft can no more be entitled to use than a car salesperson can claim that, because you bought a vehicle from them, they get to use your garage and everything in it, including the power source hooked up to it.
Additionally, the idea that Microsoft could exert any authority over an instance of the Windows OS after they've sold somebody the license that represents that OS instance is a violation of the first-sale doctrine, which makes clear that such authorities and privileges pass to the owner of the property - in this case, the owner of the software license and the instance of the OS it represents - once it is sold. And the SCOTUS has just made a unanimous, 8-0, re-affirmation of the principle that decision-making rights pass from the seller to the buyer at the first sale of an item.
I fully believe that seven-time Microsoft Most Valuable Professional award recipient Mark Burnett is right when he says "What we need to do is fix this, even if that means getting lawmakers involved. It can only get worse from here". Though I believe it is important for big reasons beyond simple control and security of the OS.   Microsoft is taking digital property, computational power, and electrical resources from everybody, is making non-licensed usage of people's hardware and the housing of that hardware, and is exploiting people's personal behaviour while those people are within their personal and private spaces (non-online activities). And in the process of violating Windows license-owners' rights over their property, resources, time, and behaviour, Microsoft is unjustly enriching its company and executives.   If action is not taken against those who commit these violations, then all established societal and legal notions of what property is, who holds decision-making rights over it, and how far a person can use their position to unfairly exploit others against their natural desire become argued against, and a precedent is established where a person's property is anyone's to use by unilateral decision, and a seller of goods can enslave and overrule aspects of people's own private lives and property as part of the conditions of their sale. Effectively, a sale becomes not a transaction of goods for money, but a mechanism for enslavement and subjugation, with the seller acting as if they held a commercial license over a plethora of the buyer's possessions and entitlements.
 
A person whose personal and private PC system environment (non-online spaces) is sending data to Microsoft through telemetry, data collection, and analytics of their behaviour is, in effect, an employee of Microsoft who does not get paid or receive any company benefits.

Delicieuxz

Delicieuxz

 

My Setup

Is this rig good for gaming at 1080p@60+fps highest preset? https://pcpartpicker.com/user/mohammed15/saved/

Ma name jeff

Ma name jeff

 

Simulated Processor - Creating an instruction Set

Hello,

So the other day I was bored and thought, "You know what, I want to learn how to design a working instruction set," and thus I have been working out the details for a simulated processor using my own custom instruction set, rather than getting some homework done that I really should be doing.

Here is what I currently have worked out:

Croltex SM8 (simulated processor) specs
------------------------------------
L1 Cache - 64 bytes
L2 Cache - 128 bytes
L3 Cache - 256 bytes
L4 Cache - 512 bytes (Because I want to have L4 cache)
Instruction Set - Multimode 8-bit (M8)

M8 Instruction set definitions
------------------------------
     /------------ Function group
     |    /--- Operation
     |    |
  0000 0000

Instruction set Data Tags (0000)
0000 0000 (00 00) = T-CBH (Marker - Command Begins Here)
0000 0001 (00 01) = T-CCH (Marker - Command Continues Here)
0000 0010 (00 02) = T-CEH (Marker - Command Ends Here)
0000 0011 (00 03) = T-VBH (Marker - Variable Begins Here)
0000 0100 (00 04) = T-LBH (Marker - List Begins Here)
0000 0101 (00 05) = T-SBH (Marker - String Begins Here)
0000 0110 (00 06) = T-IBH (Marker - Integer Begins Here)
0000 0111 (00 07) = T-FBH (Marker - Float Begins Here)
0000 1000 (00 08) = T-FBH (Marker - Boolean Begins Here)
0000 1001 (00 09) = T-OBH (Marker - Operator Begins Here)
0000 1010 (00 10) = T-NVH (Marker - Null Value Here)
0000 1011 (00 11) = T-PBH (Marker - Parameter Begins Here)
0000 1100 (00 12) = T-PEH (Marker - Parameter Ends Here)
0000 1101 (00 13) = T-FBH (Marker - File Begins Here)
0000 1110 (00 14) = T-FEH (Marker - File Ends Here)

Variable operations (0001)
0001 0000 (01 00) = V-CEV (Variable - Create Empty Variable)
0001 0001 (01 01) = V-LVC (Variable - Lock Variable Content)
0001 0010 (01 01) = V-UVC (Variable - Unlock Variable Content)
0001 0011 (01 03) = V-SVC (Variable - Set Variable Content)
0001 0100 (01 04) = V-GVC (Variable - Get Variable Content)
0001 0101 (01 05) = V-MVL (Variable - Make Locally Available)
0001 0110 (01 06) = V-MVG (Variable - Make Globally Available)

List operations (0010)
0010 0000 (02 00) = L-CEL (List - Create Empty List)
0010 0001 (02 01) = L-LLC (List - Lock List Content)
0010 0010 (02 02) = L-ULC (List - Unlock List Content)
0010 0011 (02 03) = L-SVC (List - Set Value's Content)
0010 0100 (02 04) = L-GVC (List - Get Value's Content)
0010 0101 (02 05) = L-MLA (List - Make Locally Available)
0010 0110 (02 06) = L-MGA (List - Make Globally Available)

Mathematical operations (0011)
0011 0000 (03 00) = M-SAO (Math - Standard Addition Operation)
0011 0001 (03 01) = M-SSO (Math - Standard Subtraction Operation)
0011 0010 (03 02) = M-SMO (Math - Standard Multiplication Operation)
0011 0011 (03 03) = M-SDO (Math - Standard Division Operation)
0011 0100 (03 04) = M-SEO (Math - Standard Exponential Operation)
0011 0101 (03 05) = M-SRO (Math - Square Root Operation)
0011 0110 (03 06) = M-SCO (Math - Standard Ceiling Operation)
0011 0111 (03 07) = M-SFO (Math - Standard Floor Operation)
0011 1000 (04 08) = M-SRO (Math - Standard Rounding Operation)
0011 1001 (04 09) = M-SAO (Math - Standard Absolute Operation)
0011 1010 (04 10) = M-SCO (Math - Standard Cosine Operation)
0011 1011 (04 11) = M-SSO (Math - Standard Sine Operation)
0011 1100 (04 12) = M-STO (Math - Standard Tan Operation)
0011 1101 (04 13) = M-ACO (Math - Anti Cosine Operation)
0011 1110 (04 14) = M-ASO (Math - Anti Sine Operation)
0011 1111 (04 15) = M-ATO (Math - Anti Tan Operation)

Comparison Operations (0100)
0100 0000 (04 00) = C-VAI (Comparison - Values Are Same)
0100 0001 (04 01) = C-VNI (Comparison - Values Not Same)
0100 0010 (04 02) = C-VTS (Comparison - Value Types are Same)
0100 0011 (04 03) = C-VTI (Comparison - Value Type is)
0100 0100 (04 04) = C-VTA (Comparison - Value Types Are)
0100 0101 (04 05) = C-FVG (Comparison - First Value is Greater)
0100 0110 (04 06) = C-FVL (Comparison - First Value is Lesser)

String operations (0101)
0101 0000 (05 00) = S-CVS (Strings - Concatenate Values as String)
0101 0001 (05 01) = S-CVL (Strings - Concatenate Values as List)

RAM Operations (0110)
0111 0000 (07 00) = R-SFD (Storage - Search for Devices)
0111 0001 (07 01) = R-SDI (Storage - Select Device ID)
0111 0010 (07 02) = R-GMC (Storage - Get RAM Capacity)
0111 0011 (07 03) = R-GMU (Storage - Get RAM Usage)
0111 0100 (07 04) = R-CNF (Storage - Create New File)
0111 0101 (07 05) = R-GFI (Storage - Get File ID)
0111 0110 (07 06) = R-GFS (Storage - Get File Size)
0111 0111 (07 07) = R-SAF (Storage - Select Active File)
0111 1000 (07 08) = R-RFC (Storage - Rewrite File Content)
0111 1001 (07 09) = R-AFC (Storage - Add File Content)
0111 1010 (07 10) = R-DFE (Storage - Delete File Entry)

Storage Operations (0111)
0111 0000 (07 00) = S-SFD (Storage - Search for Devices)
0111 0001 (07 01) = S-SDI (Storage - Select Device ID)
0111 0010 (07 02) = S-GSC (Storage - Get Storage Capacity)
0111 0011 (07 03) = S-GSU (Storage - Get Storage Usage)
0111 0100 (07 04) = S-SSF (Storage - Search Stored Files)
0111 0101 (07 05) = S-CNF (Storage - Create New File)
0111 0110 (07 06) = S-GFI (Storage - Get File ID)
0111 0111 (07 07) = S-GFS (Storage - Get File Size)
0111 1000 (07 08) = S-SAF (Storage - Select Active File)
0111 1001 (07 09) = S-SFN (Storage - Change File Name)
0111 1010 (07 10) = S-RFC (Storage - Rewrite File Content)
0111 1011 (07 11) = S-AFC (Storage - Add File Content)
0111 1100 (07 12) = S-DFE (Storage - Delete File Entry)

System Operations (1000)
1000 0000 (08 00) = P-CNP (Processor - Create New Process)
1000 0001 (08 01) = P-CNP (Processor - Kill Process)
1000 0010 (08 02) = P-GPL (Processor - Grab Process List)
1000 0010 (08 04) = P-GDL (Processor - Grab Device List)
1000 0000 (08 05) = P-LIM (Processor - Load Instruction from Memory)
1000 0001 (08 06) = P-PLI (Processor - Perform Loaded Instruction)
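Just to illustrate the function group / operation split in the M8 encoding above, here's a tiny JavaScript sketch (my own, not part of the author's simulator) that breaks an 8-bit instruction byte into its two nibbles:

// Split an 8-bit M8 instruction into its function-group and operation nibbles.
function decodeM8(byte) {
    return {
        group: (byte >> 4) & 0x0F, // upper four bits, e.g. 0011 = mathematical operations
        op: byte & 0x0F            // lower four bits, e.g. 0000 = first operation in that group
    };
}

console.log(decodeM8(0b00110000)); // { group: 3, op: 0 } -> M-SAO, Standard Addition Operation
console.log(decodeM8(0b00000011)); // { group: 0, op: 3 } -> T-VBH, Variable Begins Here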

SilicateWielder

SilicateWielder

 

Topics created in 2015

(Almost) ALL Topics talking about having LTT/Forums app in 2015   Main post with all statistics: 2015 Stats   Total topics created: 13 Most in month: 3 (August) Average in month: 1.1   Nov 7th Sep 10th Aug 21st Aug 10th Aug 6th Jul 11th Jul 1st Jun 21st Apr 7th Mar 24th Feb 26th Feb 24th Jan 16th

LoGiCalDrm

LoGiCalDrm

 

May Build

PCPartPicker part list / Price breakdown by merchant
CPU: AMD - Ryzen 5 1600 3.2GHz 6-Core Processor  ($218.55 @ OutletPC)
Motherboard: MSI - B350 PC MATE ATX AM4 Motherboard  ($84.99 @ B&H)
Memory: G.Skill - Flare X 16GB (2 x 8GB) DDR4-2133 Memory  ($98.88 @ Newegg)
Storage: Western Digital - Blue 250GB 2.5" Solid State Drive  ($82.37 @ NCIX US)
Storage: Western Digital - Caviar Blue 1TB 3.5" 7200RPM Internal Hard Drive  ($47.45 @ OutletPC)
Video Card: Zotac - GeForce GTX 1070 8GB Mini Video Card  ($339.99 @ Amazon)
Case: Phanteks - ECLIPSE P400 TEMPERED GLASS ATX Mid Tower Case  ($79.99 @ Amazon)
Power Supply: SeaSonic - S12II 520W 80+ Bronze Certified ATX Power Supply  ($44.90 @ Amazon)
Total: $997.12
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2017-05-05 11:59 EDT-0400

RadiatingLight

RadiatingLight

 

April Build

Note: this build is over budget, because I started this blog after I made the actual build, and prices have risen since.

PCPartPicker part list / Price breakdown by merchant
CPU: Intel - Core i5-7600 3.5GHz Quad-Core Processor  ($199.00 @ Amazon)
Motherboard: MSI - B250 PC MATE ATX LGA1151 Motherboard  ($88.99 @ SuperBiiz)
Memory: G.Skill - Ripjaws V Series 16GB (2 x 8GB) DDR4-2666 Memory  ($113.99 @ Newegg)
Storage: PNY - CS1311 240GB 2.5" Solid State Drive  ($89.99 @ B&H)
Storage: Western Digital - Caviar Blue 1TB 3.5" 7200RPM Internal Hard Drive  ($47.45 @ OutletPC)
Video Card: Zotac - GeForce GTX 1070 8GB Video Card  ($383.98 @ Newegg)
Case: DIYPC - Zondda-O ATX Mid Tower Case  ($40.88 @ Newegg)
Power Supply: Rosewill - 550W 80+ Gold Certified Fully-Modular ATX Power Supply  ($62.99 @ Jet)
Total: $1027.27
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2017-05-05 11:58 EDT-0400

RadiatingLight

RadiatingLight
