smjpl

Member
  • Content Count

    1,583
  • Joined

  • Last visited

About smjpl

  • Title
    Atom Heart Mother

Profile Information

  • Gender
    Male
  • Location
    Down by the river.
  • Interests
    Everything technology wise, games, motor sport, music (the old stuff though) and most things in-between.

Recent Profile Visitors

1,183 profile views
  1. Hmm, that's not good. On Ubuntu, open the terminal and enter gparted. This will open GParted, a tool for viewing the available partitions on your drive. If GParted isn't installed, you can install it by typing sudo apt-get install gparted
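For anyone following along, the steps above can be sketched as a short script that checks whether GParted is present before suggesting the install (assuming an apt-based Ubuntu/Debian system):

```shell
# Check whether the gparted command is available on this machine.
if command -v gparted >/dev/null 2>&1; then
    echo "gparted is installed"
else
    # It isn't; point at the apt install command (needs sudo to actually run).
    echo "gparted not found; install it with: sudo apt-get install gparted"
fi
```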
  2. Messing around with a couple of distros: openSUSE w/ KDE, Debian, and Ubuntu w/ GNOME. TBH they all seem pretty similar to me. A bit of a UI change, but other than that I can't see much reason to choose one over the other beyond the look of the environment. Debian looks pretty boring. openSUSE (KDE) looks nice. Ubuntu (GNOME) looks childish, but it's not bad. I do like some of the extensions for GNOME; the drop-down terminal is nice.
  3. That's fair enough. I'm not sure of the price plans. If it does cost nearly the same, then I can see the point in what you said. I (EU) have an unlimited "with fair usage" mobile plan, which essentially states: use 15GB or more and we're going to throttle you. I pay €20 a month, compared to approximately €50/month for home broadband packages. For €20 a month (phone/data costs) I don't expect to use mobile data like unlimited home data.
  4. I will quote my question again. This is essentially what you implied when you said "some people might use that as their only internet connection".
  5. This is about tethered data, not mobile data. So should you be allowed to tether your home router to your phone and use it as unlimited home broadband? This is the issue they are talking about. Not people using 2TB of data on their mobile device.
  6. That low-end super-budget laptop seemed shit. For that price range, one might be better off going second hand; there can be some decent deals out there, especially if you look for old business-grade laptops. I picked up a Lenovo T400 (Core 2 Duo T9400, 4GB, 160GB) for €100. I'm currently using it to build a website and it runs like a dream: never stalls or gets sluggish, quick to boot, and the battery isn't bad either. My initial budget was €200, which doesn't leave much in terms of brand-new hardware. I was looking at first-gen i5 (560M) Dells and Lenovos for €200 - €260, but decided to go with the T400 for €100 and put an SSD and 4GB of RAM in for an extra €100. Once I got it, though, it was so good for the price that putting in the SSD and RAM wasn't worth doubling its cost. So I got a €100 bargain.
  7. Black Ops 2, COD4 (MW1) or Black Ops 1. MW3 is full of hackers and basically unplayable; I'm assuming MW2 and WaW are the same. Of these CODs, COD4 will have the most players online, BO2 is the most modern (in terms of game features), and BO1 is a good balance of both: it has more features than COD4 and rentable dedicated servers like COD4, but fewer players than COD4. For me in Europe, I find BO2 to be lacking in players. Not sure if it's the same in the US (or wherever you are), but it can be annoying for certain match types. I usually try to play BO2, but if there aren't enough players online I go to BO1. Failing that, I go to COD4.
  8. I never said that we should be benchmarking at 1080p, 1440p or 4K. All I'm doing is showing the differences in some games when you do bench at those resolutions. On that note, from a practical point of view, if you are considering this CPU for a gaming machine, you'd probably want a high-end GPU and a 1440p monitor or higher (well, I would recommend it if you are planning on spending $1000 on the tower). But that doesn't mean you need an SLI setup; I'd think the majority of people wouldn't get SLI for that system. I wouldn't. In fact, I run a GTX 670 at 1440p. I have to drop settings, but it does the job for now; I'll upgrade in a while. And I really don't think it is unacceptable to run a GTX 980 on a 1440p monitor with recent(ish) games. That is a real-world scenario; I am not trying to skew results. On the point of "same argument when people demanded 'real world benchmarking' when most reviewers knew that 720p told a better story": there is no "better story" when you are talking about gaming performance. There are only different sides to the same story, and I believe you failed to show that. I have not shown it completely either; I am just highlighting that it is not that simple. It depends on your entire system.
  9. Yes, you are correct in saying there is a significant improvement, but that is at 1080p. Move to 1440p and the numbers start to even out quite a bit. Saying you get a 42% improvement out of the box does not tell the full story; actually, it is quite a misleading comment. Here are the findings from PCPer, who conducted a 2600K vs 6700K comparison.
  10. Same here. Employers only look for overall graduation grade (first class honours, second class grade one, etc.) when you are getting your first job. Once you get a bit of experience, that will stand to you a lot more. Agreed. Engineering is so broad that finding an engineering hobby that you could consider getting involved in (practical applications), would do much more for you than being theoretically good in a lot of different subjects.
  11. You should do what you want to do; that is the first piece of advice. I say this because any subject instantly becomes ten times harder if you don't enjoy learning about it. Do you know exactly what is involved in electrical engineering, and also what exactly your father's company does? BTW, when I say exactly, I mean make sure you aren't confusing electrical engineering with something similar, like electronic engineering. If you are in any way unsure about the type of engineering you like, I would advise you to try to get onto an open engineering course, so that you can choose a specific branch of engineering after one year.
  12. Who? The ISP changing their DNS servers themselves? Well, yeah, they obviously would, but have they put countermeasures in place to ensure it doesn't happen again? Or was it a band-aid fix?
  13. Personally, I don't think you should watercool the GPU. I'd say stick with the H80, or if you really want to, get an H100 for the CPU and use the H80 on the GPU with an NZXT Kraken bracket or something like that. There are legitimate reasons to go full custom loop, but it is a lot of money that you are not going to get back in any way. If you have multiple cards, a lot of noise, a lot of heat and a lot of money, definitely go for it. If you are breaking your balls to pay for everything, it ain't worth it. You are going to need to upgrade that GPU in a few years (how many depends on you), and you'd basically be throwing extra money at it now that you are not going to get back. Like @Brynjar said, you should really upgrade your GPU to the latest and greatest before you buy an expensive waterblock for it.
  14. Maybe, but 2 million households without internet is still a significant amount of damage, especially if it becomes a recurring issue. And for some people, changing the DNS settings on their modem may be about as feasible as changing the engine in their car. Unless they have someone on the phone telling them what to do, step by step (which would cost a shitload for 2 million people), it ain't going to happen.
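For what it's worth, on a Linux client the change itself is only a few commands; the sketch below assumes NetworkManager's nmcli and a connection profile named "Home" (both the tool and the profile name are assumptions, not from the post), and points DNS at Google's public resolvers as an example:

```shell
# List connection profiles to find the one to modify ("Home" is assumed here).
nmcli connection show

# Override the ISP-supplied DNS servers with public resolvers (Google's, as an example).
nmcli connection modify "Home" ipv4.dns "8.8.8.8 8.8.4.4"
nmcli connection modify "Home" ipv4.ignore-auto-dns yes

# Re-activate the connection so the new DNS settings take effect.
nmcli connection up "Home"
```

The hard part isn't the commands; it's that a non-technical user has no way of knowing this is even an option, which is the point above.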
  15. Wow! Anon is stepping up their game, not in terms of difficulty (although it may have been a lot more difficult), but in terms of potential targets and the resulting effect. Taking on a media company is expected, a government website is ballsy, but an ISP takes it to a whole new level. It takes it from a very localised attack to a full-scale attack affecting millions of people.