
Zulkkis

Member
  • Posts

    312
  • Joined

  • Last visited

Reputation Activity

  1. Agree
    Zulkkis got a reaction from kirashi in Solar Panel Taxes?   
    I was initially confused, but now I think I get it. If I understand it correctly, there is a very realistic reason for it, though I'm still puzzled as to why the billing can be that dumb to begin with. We have a fixed cost + usage-based system, which actually reflects the costs.
     
    There are two parts to operating the grid: the grid infrastructure and the power generation. The infrastructure is a fixed cost; the wires don't wear out faster just because you're using electricity all the time. The power generation part of your bill, on the other hand, should be billed by actual electricity use.
     
    Case A:
     
    Let's say the cost of the grid is $15 / month per person. Then we add a $1 / UsageWatt charge for the electricity (a unit chosen to make the math easy, not for realism).
     
    A home without panels uses 10 UsageWatts per month. The final bill will be $15 + $10 = $25 / month.
    A home with solar panels uses 4 UsageWatts per month. The final bill will be $15 + $4 = $19 / month.
     
    This reflects the actual costs accumulated per user fairly well.
     
    Case B:
     
    Under another billing system, we bill by UsageWatts alone. In this system, the price of electricity is $2.5 / UsageWatt, with no fixed cost.
     
    A home without solar panels still uses 10 UsageWatts per month, and the final bill is still 10 * $2.5 = $25 / month.
    A home with solar panels is still using 4 UsageWatts per month, and the final bill is 4 * $2.5 = $10 / month.
     
    In case A, the solar panel home saves $6 / month; in case B, the savings are $15 / month, a 150% increase. In reality, under case B the solar panel home is paying less than it should if the billing reflected actual costs.
     
    Case C:
     
    This is exactly what is going on here: the power company only bills consumption, but uses different UsageWatt rates for solar panel homes and non-solar panel homes. For normal homes the cost is $2.5 / UsageWatt, but for solar panel homes the cost is increased to $5 / UsageWatt.
     
    Now the non-solar panel home is still billed for $25 / month...
    But the solar panel home is billed for $20 / month.
     
    This reflects the actual costs much better (much like case A), although it still isn't perfect. Case A is really how electricity should be billed, if we were to bill by actual costs.
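    If it helps, here is a quick sketch of all three schemes in Python. The figures (the $15 fixed cost, the $1 / $2.5 / $5 rates) are just the example numbers from above, nothing more:

    def bill_case_a(usage_watts, fixed=15.0, rate=1.0):
        # Case A: fixed grid cost plus metered generation cost.
        return fixed + rate * usage_watts

    def bill_case_b(usage_watts, rate=2.5):
        # Case B: everything folded into one per-unit rate, no fixed cost.
        return rate * usage_watts

    def bill_case_c(usage_watts, has_solar, rate=2.5, solar_rate=5.0):
        # Case C: consumption-only billing, but a higher rate for solar homes.
        return (solar_rate if has_solar else rate) * usage_watts

    for usage, solar in [(10, False), (4, True)]:
        print(bill_case_a(usage), bill_case_b(usage), bill_case_c(usage, solar))
    # -> 25.0 25.0 25.0  (home without panels)
    # -> 19.0 10.0 20.0  (home with panels)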
     
    The only problem with billing by actual cost is that the price of the electricity itself can be a relatively small part of the total bill, and infrastructure costs can make up the majority of it depending on the case, so it does not create an effective price incentive to actually save energy, since you can only save so much. Electricity costs vary a lot depending on geographical location.

     
    For power and water that is correct, because power needs to be generated, and water needs to be pumped. Both are finite.
     
    For communications, it doesn't actually make any sense. If the usage rate is actually just the bandwidth you are paying for, then it is fine. Data caps, however, have no place in any kind of telecommunications network. The only data cap there should be is [time * the speed you paid for = maximum amount of data through the network].
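    The bracketed formula is trivial to sanity-check; a minimal sketch in Python (the 100 Mbit/s line and the 30-day month are just an illustration, not anyone's actual plan):

    def max_data_tb(speed_mbps, days):
        # time * the speed you paid for = maximum data through the network
        bits = speed_mbps * 1e6 * days * 24 * 3600
        return bits / 8 / 1e12  # bits -> terabytes

    print(max_data_tb(100, 30))  # ~32.4 TB: the most a 100 Mbit/s line can even move in a month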
     
    Unless you are aware of some ground-breaking battery technology that nobody else is, using batteries is nowhere near as efficient as using the grid. This is the primary problem with renewables. Batteries can help somewhat, but they are still mainly used for redundancy, and for good reason.
     
    Economically, such an investment doesn't make sense. For some perspective: the average American household uses 30 kWh of power each day, while Tesla's PowerWall 2.0 costs $5500 and stores up to 14 kWh of electricity. And that is before accounting for batteries being less efficient than the grid, or for the fact that they don't get better as they age, rather the opposite.
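    Back-of-the-envelope, using those same figures. Note that the one-full-cycle-a-day, ten-year, zero-degradation lifetime is my own generous assumption, not a Tesla spec:

    price_usd = 5500.0     # PowerWall 2.0 price from above
    capacity_kwh = 14.0    # stated storage capacity
    daily_use_kwh = 30.0   # average American household

    # Units needed just to cover one day with no grid at all:
    print(daily_use_kwh / capacity_kwh)         # ~2.1 PowerWalls

    # Storage cost per kWh, assuming one full cycle a day for 10 years
    # and no degradation (generous, as noted above):
    cycles = 10 * 365
    print(price_usd / (capacity_kwh * cycles))  # ~$0.11 / kWh before you buy any electricity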
     
    Can't really blame you though, since there are plenty of people who have fundamental misconceptions about what electricity is. Take this from the blogger:
     
     
    Just for the record, the PowerWall 2.0 is not designed to take you off the grid. It is only intended to get more efficiency out of your solar panels, because even though the efficiency of batteries still sucks compared to the grid, having solar panels generate electricity during the day, when nobody is at home to use said electricity, is even less efficient.
  2. Informative
    Zulkkis got a reaction from Shakaza in Tim Sweeney believes that Microsoft will harm Steam with Windows 10 updates   
    Everyone who thinks that Tim is on the path of tinfoil-hattery should at least understand the point he is trying to make. He is saying that the current direction gives Microsoft all the rope they need to hang the developers if they choose to do so.
     
    Microsoft's market position is so strong that, metaphorically, Microsoft has put a chair under us that we all stand on. Now, with these mandatory new update policies, UWP, etc., Microsoft puts a rope around our necks. Microsoft is naturally in a very strong negotiating position, and if they don't get what they want, they could just kick the chair out from under us.
     
    Whether Microsoft will ever kick that chair is not really the question here. Similar things have happened before, though: Microsoft fear-mongered about OpenGL support when they launched Vista, claiming OpenGL commands would be translated into DirectX equivalents (causing a slowdown), and pushed the Xbox, creating a situation where people would rather program for DirectX than OpenGL. Microsoft also pulled something similar with ActiveX during the browser wars.
     
    But that is not Sweeney's key argument. His argument is that the chair (the monopolistic market position) is already pretty powerful, and considering that, we should definitely not be allowing Microsoft to tie a rope around our necks. I think it is a perfectly reasonable proposition. Whether the kick would ever actually happen is another matter, just as possessing nuclear weapons doesn't mean you're actually going to use them.
  3. Like
    Zulkkis got a reaction from Lolzious in Who is the fanatic, really? Yes, it's one of those threads again   
    Shadowplay = AMD Game DVR
    GeForce Experience = AMD Raptr Gaming Evolved. I do admit that the nVidia equivalent looks classier, but it is mostly just a skin.
    6 GB of VRAM is mostly unnecessary for games; with proper compression even most open-world games should use less. A valid concern, but not a sticking point; you have seen the numbers.
    Lower power consumption? I would understand if this were the R9 390(X) vs the GTX 980 / 970, but at least Tom's Hardware measured roughly equal power numbers for the Fury X and the 980 Ti (Fury X higher peak, lower average, within roughly 20 W).
    Drivers are a toss-up; neither is significantly better than the other. nVidia has actually had pretty terrible drivers lately, though there are some games that can perform worse on AMD. The driver difference lives inside your head.
     
    Personally, I would also buy a (good, non-reference) 980 Ti over a Fury X. That said, the Fury X is not useless:
     - FreeSync displays are cheaper, and in practical, real-world use the technology is effectively equal to the more expensive G-Sync
     - Smaller systems have an easier time with the Fury X, as long as the case of your choice can accommodate the AIO
  4. Like
    Zulkkis got a reaction from iCrushDreams in Will a dual molex to 8 pin and 6 + 2 pin power a 8+8 pin GPU?   
    If your PSU doesn't have the cables, that's a good sign that you shouldn't use it for the GPU.
  5. Like
    Zulkkis got a reaction from Darkman in Is 50% off a PSU a good deal? Or should I wait until Black Friday?   
    We can't say anything quite yet. While a process node shrink and more energy-efficient memory can help with power management, the savings in the power budget will definitely be spent somewhere else. At least one of the camps is probably going to have the "more power-hungry card", particularly if you are looking forward to overclocking the cards.
  6. Like
    Zulkkis got a reaction from ApolloEleven in R9 290 Black Screen Crash   
    Well, try reinstalling the drivers, safe mode and all that. Just purge them out, reinstall whichever version you'd like, then test.
     
    I have to say that I am not completely convinced of the stability of these cards. I have an R9 290 and it is giving me black screens too, though mine is different in that it gives them under heavy 3D load. Reducing temps helps, but it doesn't remove the crashes, only decreases their frequency. I cannot test my system with another power supply, but my other GPU, which is close but not quite equal in power draw, manages just fine.
     
    At one point I had issues booting into Windows (straight to a black screen) after I changed some power-saving setting in Afterburner. Getting into safe mode, sweeping the drivers and starting from a clean slate fixed that, but not the instability.
     
    I just think that they are very complex cards, that there is a higher chance of getting a dud, and that it is all a lottery. My case is kind of iffy because I bought a used GPU that still had some warranty; I now have to pursue a claim (the warranty is tied to the product, not the owner, at least legally) and hope that I don't end up with a paperweight. The crashes also became more obvious as time passed. I originally tested the card with Heaven + Fire Strike; I had some crashing in Tomb Raider, but since it was the only software doing that at the time, I chose not to care. A mistake, since it appears elsewhere too, even in relatively lightweight use; it just gets more common the harder you stress the card.
     
    I'm rolling back to my old GTX 580 for the summer if this doesn't get sorted out... it has become a big nuisance. From what I've seen on the internet, the problem is just very common; some people claim to have found fixes, but I believe most of them are temporary, since the crashing, at least for me, is generally random, and the frequency has increased over time and under heavy load.
  7. Like
    Zulkkis got a reaction from PineappleHolidays in The Fury X is here ! +EU pricing   
    Yeah, you can actually find 980 Tis for less if you just go for the reference model (though I'd have to order one from Germany), and a little more gets you some of the custom models.
     
    Neither company will have my money right now, though. The R9 290 I got was too good a deal. I hardly bought any games from the summer sale because of all the backlog; I'll work through that, and after the summer ends I'll check what the market has to offer.
  8. Like
    Zulkkis got a reaction from Castdeath97 in Are 2560x1080 monitors worth it?   
    I would get a 2560 x 1440 instead.
  9. Like
    Zulkkis got a reaction from BenR31415 in AMD 4GB means 4GB   
    It's not a bug, it's an intentional hardware decision that only came to light just now.
     
    What is worrisome is that it took people noticing something fishy and actually starting to test before nVidia came forward about this. And it's not a bug in the sense that a BIOS update could fix it: no firmware can make the physical changes to the PCB needed to provide full performance in high VRAM-usage situations.
     
    It's still a great card and all, but for some bigger towers a 290 or 290X can make more sense now at 1440p+ resolutions, particularly when you start talking about multiple cards.
     
    It's hilarious and that's all. They don't even mention nVidia in the tweet at all, you have to be "in the know" to get the joke.
     
    If anything, the "Fermi" and "going to find something green in there" types of videos are bad, because they pretty much paint the other camp as idiots and throw fuel on the fanboy fire. Even if the joke was kind of meh-funny, I still think it's better to leave those kinds of videos out.
  10. Like
    Zulkkis got a reaction from AnnoyedShelf in Anyone else disappointed with R9 300/Fury lineup?   
    They are a tier lower. It's just a change in the naming scheme, kind of like how the HD 6870 was a slower card than the HD 5870. I agree that it is silly, but it is semantics at its finest. Think of the 390X as a 380X and leave it at that.
     
    The current pricing of the 390X doesn't make much sense, but it is only the MSRP anyway.
     
     
    TSMC failed to deliver a new node, so what can you do. At least the 980 Ti and the Fury cards are progress. The 980 was pretty much a side-grade against the 780 Ti in a lot of games (in games where driver support could be considered good on both).
     
     
     
    Yeah, though I am not sure how well air cooling will work for the card. The power-budget headroom given by HBM was used to increase performance, which is fine, but the card is so small right now that it is hard to imagine how heavy a cooling solution you'd need and how you would go about implementing it on that card. You can't just slap three fans on it unless you make it twice as long as it needs to be.
     
    Overall, I'd say it is both good and bad. SFF PCs just got a lot more interesting. On the other hand, we are not dealing with gigantic performance leaps, but then we've been stuck on 28 nm forever; I mean, look at nVidia... we had to wait until the 980 Ti to get something at relatively sane prices, and it's not earth-shattering either. I'd say the worst thing is that AMD just one-upped the $200 segment with the 4 GB 285; I wish it were a Tonga Pro instead, or something. I don't really mind the rebrands, the cards are still fine, really, but they can't be sold at that MSRP.
     
    I guess by the time we are finally able to move off the current process node, we will get real improvements: HBM2, a 14(?) nm process, new architectures(?), so that should all be good. Expensive, though. But it should be much less of a joke in terms of performance. The current situation is a perversion of a 28 nm process that has clearly overstayed its welcome.
  11. Like
    Zulkkis got a reaction from VortexS in Titan x, z, or Tesla K80 for Blender?   
    A $10k computer for "how to" videos at work? How were you trained to do this? Have you worked on professional stuff? If you have been working with that laptop for so long, I can tell you that even a single R9 290 can manage the BMW scene in about a minute using the latest master builds (though the result is not 100% identical to CUDA's, since not everything is supported yet), and even a GTX 580 will manage it in under 2 minutes.
     
    Anything you can get will most likely be overkill. I'd get a ~40" 4K monitor first, 1-2 extra monitors, and a good Cintiq, and then worry about the computer with whatever you have left.
     
    Workstation cards are useless here; you can't use iray in Blender, so the driver features are meaningless. A Quadro would enable 3D Vision in OpenGL if you need that for some reason, plus better support on Linux for multiple monitors and whatnot. You only do single-precision work, and ECC RAM is useless for this.
     
    Your i7 is most likely faster than your GPU at rendering, btw.
  12. Like
    Zulkkis got a reaction from Tataffe in NVIDIA Could Capitalize on AMD GCN Not Supporting Direct3D 12_1   
    Compensating developers to get exclusive / mono-optimized features into a game, instead of the developer genuinely choosing the solution of its own volition over a universally useful one (or implementing GameWorks on top of the universal solution), is definitely the kind of behavior we should endorse. :rolleyes:
     
     
    TXAA = Is it AA or a blur filter?
    SSAO > HBAO+
    PhysX = the CPU-only parts don't really matter, the GPU-only parts are particle whoop-de-doo effects that even Carmack deemed garbage, and the physics that matter must not be proprietary anyway, since they tie directly into the gameplay.
     
    There is just no reason for anyone to use a proprietary FX library that is tied to a hardware vendor and that developers cannot optimize / modify themselves, unless they're being compensated for doing so.
     
    Anyway, I think there are too many conflicting sources on what level of DX12 support there is right now. It won't be long until we see how it really ends up.
  13. Like
    Zulkkis got a reaction from Khvarrioiren in NVIDIA Could Capitalize on AMD GCN Not Supporting Direct3D 12_1   
    Waiting for tessellation 2.0
     
    A stupidly high-res skybox that is an uncompressed image and reaches under the map as well.
  14. Like
    Zulkkis got a reaction from alfaomega in My mom scolded me for electric bill spike due to R9 290   
    The GTX 970 is better for the most part, but it isn't cheaper. I'd take it over the R9 290, for sure - but when you look at the market, in most countries the GTX 970 is significantly more expensive.
     
    Either OP lives in a wonderland where power is stupidly expensive and left the card running 24/7 while not using the older nVidia system at all, or his mother is overpaying for electricity, or the power bill increase is not down to the change of GPU alone. Or he just wants to bash the manufacturer, considering he's also saying that the card "killed" those power supplies.
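    Some back-of-the-envelope numbers to show the scale; the 130 W load-power gap, the four hours of daily gaming and the $0.12 / kWh rate are all my assumptions, purely for illustration:

    delta_watts = 130     # rough load-power gap vs the older card (assumption)
    hours_per_day = 4     # daily gaming time (assumption)
    usd_per_kwh = 0.12    # typical US retail electricity rate (assumption)

    monthly_kwh = delta_watts / 1000 * hours_per_day * 30
    print(monthly_kwh * usd_per_kwh)  # ~$1.9 a month -- hardly a bill "spike"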
  15. Like
    Zulkkis got a reaction from mvitkun in ACX 2.0 cooler coming for Titan X   
    $70 for an ACX?
     
    At least offer the same AIO you do for the GTX 980; that's $100 and much better performance.
     
    Accelero over this any day of the week.