
fulminemizzega

Member
  • Posts: 72
  • Joined
  • Last visited

Reputation Activity

  1. Like
    fulminemizzega got a reaction from FairPumpkin in My Sponsor says Slower is… Better?   
    They lost the opportunity to win the 2024 worst step-down converter award 😈
  2. Like
    fulminemizzega reacted to LinusTech in What about a European Warehouse ?   
I don't consider our community to be comprised of 'insignificant nobodies'. I enjoy interacting with people here on the forum and elsewhere. Any executive who is too far removed from the day-to-day to interact with the individuals who support their business should be fired on the spot.

    With that said, I think it's clear that those executives shouldn't be obligated to take business advice from every individual who offers an opinion.
  3. Agree
    fulminemizzega got a reaction from ValelaV in Thread for LTT Labs Test Suggestions   
    Suggestion: Create a Reference Guide for Review Metrics
    Hello, I know that maybe this post should be in the other huge thread that is full of angry people, but I think this is more of a suggestion/request for the Labs team. This is really just a long example about GPU testing, because that's what I'm thinking about now, but it can be applied more broadly (I think).
    In GPU reviews, testing used to be simpler - just FPS in some games, thermals, power draw. Then we got more advanced metrics like noise levels, min/max/average FPS, 1% lows, frame times, and more will surely come, to even better understand and evaluate a product. For example, I saw a recent GN video explaining a new Intel tool to measure "GPU busy" and driver overhead. They spent ~30 minutes just on what the metric means (I'm sure they could have made a shorter version with less chatting in between slides).
    I understand LTT has to target a broad audience, so hard concepts/metrics often get simplified, have to be omitted, or maybe just belong in written form.
    But even current metrics like 1% lows aren't that intuitive: you can explain them as the lowest 1% of FPS values, but that's hand-wavy. The lowest 1% of FPS values are... what? A range of values expressed as a single value? Well, kind of, you get what I mean, higher is better, let's just move on.
     
    The underlying concepts of cumulative distribution functions and probability density functions are not easy; sure, in an academic context this is just offensively simple, but it is my belief that these concepts are not well understood elsewhere. I think that if I went point blank (very mean of me, I know) asking reviewers online to explain (without handwaving) what a percentile of a probability distribution is, I'd get many "uh... ah...".
    I'd not fault many of them (depends on what follows after the "ah...") because while it is quickly defined in a math context (a simple inequality involving the inverse CDF, child's play!), each of those words carries a great amount of "meaning" and, as it often is, an easy math concept is just an elegant and concise way to express a deep and complex idea.
    This 1% low is just the best example I have to pitch my request; I hope I've shown what I mean by "not an easy concept".
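    To make that concrete, here is a minimal sketch (with made-up frame times, and numpy assumed) of how a "1% low" can be computed. Note that even the definition varies between reviewers: some take the 1st percentile of the FPS distribution, others average the slowest 1% of frames.

```python
import numpy as np

# Made-up per-frame times (ms), standing in for a real benchmark capture
rng = np.random.default_rng(0)
frame_times_ms = rng.gamma(shape=9.0, scale=1.2, size=10_000)
fps = 1000.0 / frame_times_ms

avg_fps = fps.mean()

# Reading 1: the 1st percentile of the FPS distribution,
# i.e. 99% of frames rendered faster than this value
p1_low = np.percentile(fps, 1)

# Reading 2: the average FPS over the slowest 1% of frames
worst_1pct = np.sort(fps)[: len(fps) // 100]
p1_low_avg = worst_1pct.mean()
```

    Both numbers summarize the left tail of the distribution in a single value, which is exactly the "range of values expressed as a single value" hand-wave above.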
    Here is the suggestion/request:
    I'd like you to make a reference "guide", somewhat similar to the definitive guide to building a PC, covering every metric that you wish to publish, going into "enough" detail. The goal is to educate us viewers so we better understand these concepts instead of relying on vague explanations, to have a good reference to be pointed to or to refer to when needed, and to clearly understand what it is that your numbers actually mean. With this reference, you could start publishing even more complex metrics (like the GPU busy example above) since people will understand them. It will raise the quality of discussion even more.
    We are already way past "it goes faster so it is just better" with most of what you review, and most of us (I think) had to adapt a long time ago to the idea that many metrics are needed to understand GPUs before buying one; it already is a higher quality of discussion than what can be found in other fields.
     
    Adding to this, if you could show us how you found the errors you will be correcting in older videos, this may lead to better feedback from the viewers, as we will also know better what to look for, or have a better way of looking at the data you publish, instead of just staring at it (sure, my fault, I sometimes do stare). Even reading a chart is not exactly easy; it is not just a bunch of lines thrown in a rectangle, otherwise there would be no need for someone to comment on them.
    Maybe a better understanding of the various metrics from the viewers may also lead to better feedback when errors crop up.
     
    I wish you the best of luck with all your endeavors. I believe this team will continue improving LMG content regardless. Even though it may not pay wages, I appreciate what you've done over these many years. I hope you will get back on track as soon as possible, as I believe you strive for the goal Gary once stated to "not be questioned" about Labs data.
  4. Like
    fulminemizzega reacted to LinusTech in Gamers Nexus alleges LMG has insufficient ethics and integrity   
    There won't be a big WAN Show segment about this or anything. Most of what I have to say, I've already said, and I've done so privately.

    To Steve, I expressed my disappointment that he didn't go through proper journalistic practices in creating this piece. He has my email and number (along with numerous other members of our team) and could have asked me for context that may have proven to be valuable (like the fact that we didn't 'sell' the monoblock, but rather auctioned it for charity due to a miscommunication... AND the fact that while we haven't sent payment yet, we have already agreed to compensate Billet Labs for the cost of their prototype). There are other issues, but I've told him that I won't be drawn into a public sniping match over this and that I'll be continuing to move forward in good faith as part of 'Team Media'. When/if he's ready to do so again I'll be ready.

    To my team (and my CEO's team, but realistically I was at the helm for all of these errors, so I need to own it), I stressed the importance of diligence in our work because there are so many eyes on us. We are going through some growing pains - we've been very public about them in the interest of transparency - and it's clear we have some work to do on internal processes and communication. We have already been doing a lot of work internally to clean up our processes, but these things take time. Rome wasn't built in a day, but that's no excuse for sloppiness.

    Now, for my community, all I can say is the same things I always say. We know that we're not perfect. We wear our imperfection on our sleeves in the interest of ensuring that we stay accountable to you. But it's sad and unfortunate when this transparency gets warped into a bad thing. The Labs team is hard at work creating processes and tools to generate data that will benefit all consumers - a work in progress that is very much not done and that we've communicated needs to be treated as such. Do we have notes under some videos? Yes. Is it because we are striving for transparency/improvement? Yeah... What we're doing hasn't been done in many years, if ever... and we would make a much larger correction if the circumstances merited it. Listing the wrong amount of cache on a table for a CPU review is sloppy, but given that our conclusions are drawn based on our testing, not the spec sheet, it doesn't materially change the recommendation. That doesn't mean these things don't matter. We've set KPIs for our writing/labs team around accuracy, and we are continually installing new checks and balances to ensure that things continue to get better. If you haven't seen the improvement, frankly I wonder if you're really looking for it... The thoroughness that we managed on our last handful of GPU videos is getting really incredible given the limited time we have for these embargoes. I'm REALLY excited about what the future will hold.
     
    With all of that said, I still disagree that the Billet Labs video (not the situation with the return, which I've already addressed above) is an 'accuracy' issue. It's more like I just read the room wrong. We COULD have re-tested it with perfect accuracy, but to do so PROPERLY - accounting for which cases it could be installed in (none) and which radiators it would be plumbed with (again... mystery) would have been impossible... and also didn't affect the conclusion of the video... OR SO I THOUGHT...
     
    I wanted to evaluate it as a product, and as a product, IF it could manage to compete with the temperatures of the highest end blocks on the planet, it still wouldn't make sense to buy... so from my point of view, re-testing it and finding out that yes, it did in fact run cooler, made no difference to the conclusion.
     
    Adam and I were talking about this today. He advocated for re-testing it regardless of how non-viable it was as a product at the time, and I think he expressed really well today why it mattered. It was like making a video about a supercar. It doesn't matter if no one watching will buy it. They just wanna see it rip. I missed that, but it wasn't because I didn't care about the consumer.. it was because I was so focused on how this product impacted a potential buyer. Either way, clearly my bad, but my intention was never to harm Billet Labs. I specifically called out their incredible machining skills because I wanted to see them create something with a viable market for it and was hoping others would appreciate the fineness of the craftsmanship even if the product was impractical. I still hope they move forward building something else because they obviously have talent and I've watched countless niche water cooling vendors come and go. It's an astonishingly unforgiving market.
     
    Either way, I'm sorry I got the community's priorities mixed-up on this one, and that we didn't show the Billet in the best light. Our intention wasn't to hurt anyone. We wanted no one to buy it (because it's an egregious waste of money no matter what temps it runs at) and we wanted Billet to make something marketable (so they can, y'know, eat).
     
    With all of this in mind, it saddens me how quickly the pitchforks were raised over this. It also comes across a touch hypocritical when some basic due diligence could have helped clarify much of it. I have a LONG history of meeting issues head on and I've never been afraid to answer questions, which lands me in hot water regularly, but helps keep me in tune with my peers and with the community. The only reason I can think of not to ask me is because my honest response might be inconvenient. 
     
    We can test that... with this post. Will the "It was a mistake (a bad one, but a mistake) and they're taking care of it" reality manage to have the same reach? Let's see if anyone actually wants to know what happened. I hope so, but it's been disheartening seeing how many people were willing to jump on us here. Believe it or not, I'm a real person and so is the rest of my team. We are trying our best, and if what we were doing was easy, everyone would do it. Today sucks.
     
    Thanks for reading this.
  5. Funny
    fulminemizzega got a reaction from 05032-Mendicant-Bias in The BEST GPUs NVIDIA Ever Made… and the WORST   
    Thanks, this video gave me a different perspective. None of this information is new, or at least I should know most of it, but I've always narrowed my view to the "nearest" products (meaning the current vs previous chip, across the two manufacturers, in a reasonable price bracket); seeing it all in one place does make a difference.
    Even if this video is focused on Nvidia, the thing that stuck with me at the end is that (to me) it looks like AMD's GPU division is alive by a miracle.
  6. Agree
    fulminemizzega got a reaction from Spotty in I Await Sony’s Cease and Desist   
    I understand from YT comments that this thing was quite a pain for devs... but it looks AMAZING. The front bezel, all the connectors, buttons, LEDs... the HDD trays! It has a kind of Hi-Fi gear feel.
  7. Agree
    fulminemizzega reacted to little_santa in Build a PC while you still can   
    A few potentially misleading claims / bits of misinformation in this one.

    RE: ISA's effect on efficiency: Apart from restrictions on memory consistency, there really isn't much you can directly tie to a chip being aarch64 or x86. See: https://research.cs.wisc.edu/vertical/papers/2013/hpca13-isa-power-struggles.pdf
     
    Sure, the decode is more complicated, but current x86 machines all use RISC-like uops internally. A lot of the decode overhead is eliminated through Intel's use of a trace cache that stores decoded instructions rather than actual instructions.
     
    While Apple Silicon did come from ARM's ISA license, unlike Qualcomm and others, the microarch and physical design were done entirely in house by Apple. Most Qualcomm designs are modifications of existing ARM soft cores.
  8. Informative
    fulminemizzega reacted to hishnash in I want to love Apple, but they’re making it hard   
    NAND already comes in a package; it is already a stack of dies. If Apple wants to just be able to order NAND, it's much simpler to buy it already in a package rather than buying raw dies and then paying for custom packaging. That is why I suspect the controller unit is layered beneath the NAND package (it could be very small and even embedded within the PCB); someone will de-solder these at some point and we will see. That said, I suspect we will see these in the parts list Apple said they would sell.
  9. Agree
    fulminemizzega reacted to hishnash in I want to love Apple, but they’re making it hard   
    In particular, any system with a reasonably powerful iGPU/APU will benefit a lot from having memory on package. The industry accepts that GPU memory can't be socketed, since you just can't get the bandwidth needed within a reasonable amount of space and power for even a low-end GPU. I think we might well see a range of chips from AMD and Intel over the next few years that have on-package memory; Intel has done this before for some iGPUs, but in those cases limited that memory to be used only by the GPU, which wastes a lot of power.
  10. Agree
    fulminemizzega reacted to BondiBlue in I want to love Apple, but they’re making it hard   
    I think you underestimate what a lot of normal users do. 
  11. Like
    fulminemizzega got a reaction from hishnash in How did Microsoft screw this up? - Surface Pro X (SQ2) vs M1 Macbook Air   
    For many years it's been said that the ARM architecture would also be used in laptops and servers, and Apple M1 MacBooks could be one major "milestone" reached (I'm using quotes because the M1 uses only the ARM ISA, not ARM IP). I like to think that there is yet another segment, the desktop.
    Let's say that in a few years, x86 CPUs will not be as common in those segments as they are today, and sales volumes will be much smaller. This could mean that x86 CPUs will be too expensive for consumers.
    Then, what kind of chips will be left for the desktop (if it still exists)? Cut-down server CPUs (something like Threadripper?), or "bottom of the barrel" mobile silicon?
    ...maybe the only difference is going to be soldered CPUs even there (and a new huge pile of software issues).
  12. Informative
    fulminemizzega reacted to SolarNova in RTX 3080 - Can it Run CRYSIS (Remastered)??   
    Crytek expanded too fast, opened too many dev studios in multiple locations. There have been vids on them in the past by other YouTubers that explain it all.
     
    I remember buying my 1st 'beast' PC back then: a QX6850 CPU and dual 8800 GTX cards in SLI.
    Could it run Crysis ..yes ..yes it could .. and it felt gooood lol.
  13. Like
    fulminemizzega reacted to Spindel in Is Apple's Betrayal the END of Intel?   
    Let me finish your post for you: ”...tin foil hat stupid”
  14. Like
    fulminemizzega got a reaction from Video Beagle in Is Apple's Betrayal the END of Intel?   
    It is not. Apple has done OSS for a very long time, without making much noise about it. Parts of macOS and iOS are open source and available (even if they have no legal reason to release them) here: https://developer.apple.com/opensource/
    Other than WebKit, there is LLVM/Clang that is huge.
    Aside from this, their hardware is proprietary. From a repair point of view: common PCB failures are (by far) connectors, then power delivery/controllers. Fried CPUs are not common, so using an in-house built chip will make repair more difficult... to a point. Is it that easy to buy mobile Intel chips from a reputable source?
    The big issue is no schematics and sourcing replacements parts.
    This switch just makes sense (which is why this move is not surprising): Apple chips are able to reach Intel performance, and Apple is making SoCs, where they can integrate the CPU, GPU, memory controller, T2 storage, display controllers and so on, allowing them to reduce PCB size and cost while increasing power efficiency and performance. It costs a lot of money upfront. To think that someone at Apple thought "let's stop using Intel to make repairs even harder" is just... I don't know.
  15. Informative
    fulminemizzega reacted to mmda in I said I wouldn't cry... - $35K Abyss Headphone Setup   
    The Grace SDAC is based on the ODAC design.
  16. Agree
    fulminemizzega reacted to Belgarathian in I said I wouldn't cry... - $35K Abyss Headphone Setup   
    So many snake oil products were sent; all he needed was the DAC and amp.
     
    If he had shit power in the building, he could have run it through a UPS but otherwise the amp and DAC should have been able to isolate mains noise themselves if they were engineered correctly, no need for all the snake oil cables and power supplies.
     
    The DAC is an R2R Ladder DAC, which means that they believe they can beat DAC chip OEMs like AKM, TI, and ESS, or save a cent, but it usually ends up resulting in garbage with lots of noise, distortion, and other weird oddities. The spec for the DAC is 0.008% THD+N, which equates to ~81 dB, so the DAC doesn't even meet the CD spec of 96 dB... Not sure where they got their stated 130.5 dB of dynamic range, as it doesn't correlate to any of their other specs. If it is true though, then yes... it's a chart-topping DAC, so long as they haven't stuffed up the fundamentals. I'm sceptical as hell though.
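    For reference, converting a THD+N figure quoted in percent to dB is just 20·log10 of the ratio; a quick sketch:

```python
import math

def thd_n_percent_to_db(percent: float) -> float:
    """Convert a THD+N figure quoted in percent to dB relative to full scale."""
    return 20 * math.log10(percent / 100)

print(round(thd_n_percent_to_db(0.008), 1))  # -81.9, matching the ~81 dB above

# The usual 16-bit "CD spec" figure for comparison: 20*log10(2**-16) is about -96.3 dB
```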
     
    Going by the specs, the amplifier looks okay, delivering tons of power, which is usually the single most important metric for an amp given that planar magnetic headphones are notoriously difficult to drive.
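    The "tons of power" point follows from standard sensitivity math; the figures below are hypothetical, not the specs of the headphones in the video:

```python
import math

# Hypothetical planar-magnetic figures (NOT the actual product's specs):
sensitivity_db_per_mw = 88.0   # dB SPL produced by 1 mW
impedance_ohm = 50.0
target_peak_spl = 110.0        # loud transient peaks

# Every +10 dB over the 1 mW sensitivity costs 10x the power
power_mw = 10 ** ((target_peak_spl - sensitivity_db_per_mw) / 10)  # ~158 mW
voltage_vrms = math.sqrt(power_mw / 1000 * impedance_ohm)          # ~2.8 Vrms
```

    An insensitive planar can demand hundreds of milliwatts for clean peaks, which is why amp power headroom matters more here than with typical dynamic headphones.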
     
    The Drop THX 789 and a Topping D50 would probably outperform it for less than $1000, or even an RME ADI-2 all-in-one, for example.
     
    On the positive side, the headphones are very, very good. Just a shame they paired them with a crap chain.
  17. Informative
    fulminemizzega reacted to Emily Young in Petabyte Project is FULL! Time to upgrade...   
    It definitely was something stupid: we didn't realize the drive wasn't fresh... The problem was just that it was already formatted. It wasn't talking about physical sector size; it was talking about the formatted cluster size.
     
    WHY it wouldn't just nuke the drive when adding it to a zpool or otherwise say "hey, there's data on this drive, you sure about this fam" is beyond me.
  18. Agree
    fulminemizzega reacted to mynameisjuan in Petabyte Project is FULL! Time to upgrade...   
    Drive isn't working, *hacker scene*, fixed.
     
    So you're just going to blue ball us like that and not tell us why 117 was not working.
     
    It's as bad as people who create a post asking for help and then respond with "nvm fixed it"
  19. Agree
    fulminemizzega reacted to Emily Young in Macs are SLOWER than PCs. Here’s why.   
    I feel like the point was kind of missed here... The point isn't "lol apple sux", it's "Apple has made deliberate decisions to prioritise certain things over the performance of their computers, to the detriment of the people who might actually want to use them for what they're typically advertised for." It was an investigation into the what, and speculation as to the why. We even said up front that Apple's not the only company guilty of this, but even ignoring the form factor vs performance debate, little things like their fan curves aren't helping, and it's been a major issue in the past where chips just got too hot on the board and the BGAs just let go.
     
    I own one of those affected 2011 Macbook Pros, and I happen to really like macOS - The problem I have is that when I bought my Macbook, there wasn't anyone out there who had the combination of battery life, form factor, and performance Apple offered, and that's since changed dramatically (if you ignore the GPU overheating issue). Apple is quite obviously attempting to hang on to that "magical" middle-ground, and as a result they're sacrificing their actual performance levels to achieve that, but not in a way that laypeople would typically notice. As we said in the video, part of this is Intel's fault - Apple had this design on the books with the expectation that we'd be on 10nm by now, but that's been delayed and delayed and delayed. But they could at least make an attempt to not run the machines at the redline at all times...
     
    The point being, Apple could have largely defanged this critique quite easily with a more aggressive fan curve... But the fans take about a minute to hit maximum speed while the die temperature is hot enough to boil water, and the Core i9 Macbook Pro's launch fiasco proved that they don't have the overhead to ramp the fans up that slowly. That kind of thermal performance is straight-up unacceptable, especially for a professional machine.
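    Mechanically, a "more aggressive fan curve" is just a mapping from die temperature to fan duty that starts ramping earlier and reaches 100% sooner; a toy sketch with invented thresholds:

```python
def fan_duty(temp_c: float, ramp_start: float = 60.0, ramp_end: float = 90.0) -> float:
    """Linear fan curve: idle duty below ramp_start, 100% duty at ramp_end and above."""
    idle = 0.30
    if temp_c <= ramp_start:
        return idle
    if temp_c >= ramp_end:
        return 1.0
    # Linearly interpolate between idle and full speed
    return idle + (1.0 - idle) * (temp_c - ramp_start) / (ramp_end - ramp_start)
```

    Lowering ramp_start and ramp_end shifts the whole curve left, trading fan noise for lower sustained temperatures; the values here are purely illustrative.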
     
    So TL;DR, what I hope to accomplish with this is to get Apple to rethink their thermal solution, even if it's just ramping the fans up sooner. The cooler on the Macbook Pro in particular is tiny, and the only way I can see them expecting the machine to survive long-term is if it's rarely hit with a sustained load. Apple has been and in many ways still is better than that (see: iPhone), and unless they want to lose that image, this needs to change. Maybe it'll take a move to the A-series ARM processors. If you recall, there were similar issues with thermal performance and processing performance with the later PowerPC processors, and IBM couldn't hit the targets Apple wanted... Kinda feels the same this time around with Intel, don't you think?
  20. Funny
    fulminemizzega got a reaction from JCHelios in Here’s why iPhones are better   
    This analogy is mindblowingly flawed. Is a car without an engine still a car? Sure, it still works downhill.
    Now, is a smartphone with a locked, trusted, secure, but single, source of software not a smartphone anymore? If not, I can't but agree with you.
    No 3rd party app SOURCES. iPhones, like every other smartphone, thrive with 3rd party software. The same software you can get for an Android device, you can find on the App Store; this is another flawed analogy. What you do not get is the possibility to install software from untrusted sources.
    Now, I do understand why users would like to get software from other sources, but I do also understand what the appstore accomplishes.
    I like to see it this way: Apple is acting as a "root" trusted source for software, and it makes sense to trust Apple because you already trust them to build the device you buy. They provide services, bandwidth, rules for app developers and enforce them (like it or not), and you do pay for that. Developers get a single place to sell their software, and they also pay for everything Apple does on their behalf (taking a cut from every sale).
    There is someone checking (to the best of their ability) that the software you get from that single trusted source is safe (enough?). And this is huge, because phones contain more personal information than other devices; phones are communication devices, the actual devices you use to do most of your communications (unless you only do emails).
    What is the alternative? More app stores? So now you have to trust even more people to do the "right" thing, and they have to earn a living. Disabling all the added safety you get, to install untrusted software, on such a sensitive device? I can't afford to own 2 phones, one with trusted software, another with random APKs. I can see a solution; it probably is re-inventing Windows, by having trusted publishers, and pop-ups warning you about trusting unsafe software developers, not reading and always clicking yes.
    Is this any better? Finally, how is apple's track record on security? Better, worse, same as Windows? Better, worse, same as Android? Does this approach work or not?
    I understand your point, you have less control on your device (I avoid the word complete, because you do not have complete control anyway). What do you get in return?
    How is "Android is mostly open source" an argument for Android being better? Does being open source just make it better? When and where have you seen a common Android device without closed source parts? There are closed parts even down to the kernel, with binary blobs for hardware support (the first that comes to mind is GPU blobs), and no one knows (besides whoever made them) what they do.
    And what is this "kernel" thing in Android phones? When you make a device, you choose an SoC (that is CPU, GPU, "chipset", ...). The kernel version you get is what the SoC vendor gives you; it is not the latest, and it is significantly different from upstream Linux. Then you add your stuff to it (drivers, more blobs, whatever is needed). Then you ship your device. You make your kernel sources available online. Nobody (besides you) reviews the changes you made, because nobody other than you is using them; nobody other than you knows if your stuff is secure; you do not get the help of the Linux community with your custom version.
    Now, a new vulnerability shows up. If your SoC vendor gives you a patch, great. You now have to apply it for every device you shipped, and it may not be that easy, because you may need to change the stuff you previously added. Your SoC vendor does not give you a patch? Don't expect to be able to take an upstream one, because your kernel is both too different from upstream and too old, meaning even more different; you will have to write it yourself. Multiply that for every device you ship. Every manufacturer has to do this dance, by themselves. Does this seem maintainable? Juicy source here: https://youtu.be/RKadXpQLmPU?t=1113 (at least 5 minutes), then demo: https://youtu.be/RKadXpQLmPU?t=2662
    As you can see, releasing sources is not enough. Open source is better when it works, not when everybody does their own thing, duplicating work.
    All of this is only about the kernel, then there are all the other things that android is made of.
    Probably a laser gun?
  21. Like
    fulminemizzega got a reaction from dalekphalm in Here’s why iPhones are better   
  22. Agree
    fulminemizzega got a reaction from RockyRZ in Here’s why iPhones are better   
    This analogy is deeply flawed. Is a car without an engine still a car? Sure, it still works downhill.
    Now, is a smartphone with a locked, trusted, secure, but single source of software not a smartphone anymore? If not, I can't help but agree with you.
    No 3rd party app SOURCES. iPhones, like every other smartphone, thrive on 3rd party software. The same software you can get for an android device you can find on the App Store, so this is another flawed analogy. What you do not get is the ability to install software from untrusted sources.
    Now, I do understand why users would like to get software from other sources, but I also understand what the App Store accomplishes.
    I like to see it this way: Apple acts as a "root" trusted source for software, and it makes sense to trust Apple because you already trust them to build the device you buy. They provide services, bandwidth, and rules for app developers, and they enforce those rules (like it or not), and you pay for that. Developers get a single place to sell their software, and they also pay for everything Apple does on their behalf (Apple takes a cut from every sale).
    There is someone checking (to the best of their ability) that the software you get from that single trusted source is safe (enough?). And this is huge, because phones contain more personal information than other devices; phones are the actual devices you use for most of your communication (unless you only do email).
    What is the alternative? More app stores? Now you have to trust even more people to do the "right" thing, and they all have to earn a living. Disabling all that added safety just to install untrusted software, on such a sensitive device? I can't afford to own 2 phones, one with trusted software and another with random APKs. I can see a solution, but it probably amounts to re-inventing Windows: trusted publishers, plus pop-ups warning you about untrusted software developers that nobody reads before clicking yes.
    Is this any better? Finally, how is Apple's track record on security? Better, worse, or the same as Windows? Better, worse, or the same as Android? Does this approach work or not?
    I understand your point: you have less control over your device (I avoid the word "complete", because you do not have complete control anyway). What do you get in return?
    How is "android is mostly open source" an argument for android being better? Does being open source just make it better? When and where have you seen a common android device without closed source parts? There are closed parts even down to the kernel, with binary blobs for hardware support (GPU blobs are the first that come to mind), and no one knows (besides whoever made them) what they do.
    And what is this "kernel" thing in android phones? When you make a device, you choose an SoC (that is, CPU, GPU, "chipset", ...). The kernel version you get is whatever the SoC vendor gives you; it is not the latest, and it is significantly different from upstream Linux. Then you add your own stuff to it (drivers, more blobs, whatever is needed). Then you ship your device and make your kernel sources available online. Nobody (besides you) reviews the changes you made, because nobody other than you uses them; nobody other than you knows if your stuff is secure, and you do not get the help of the Linux community with your custom version.
    Now, a new vulnerability shows up. If your SoC vendor gives you a patch, great. You now have to apply it to every device you shipped, and it may not be that easy, because you may need to change the stuff you previously added. Your SoC vendor does not give you a patch? Don't expect to be able to take an upstream one, because your kernel is both too different from upstream and too old (meaning even more different); you will have to write it yourself. Multiply that by every device you ship. Every manufacturer has to do this dance by themselves. Does this seem maintainable? Juicy source here: https://youtu.be/RKadXpQLmPU?t=1113 (at least 5 minutes), then demo: https://youtu.be/RKadXpQLmPU?t=2662
    As you can see, releasing sources is not enough. Open source is better when it works, not when everybody does their own thing, duplicating work.
    All of this is only about the kernel; then there are all the other things that android is made of.
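    To get a feel for how far a shipped vendor kernel can lag upstream, here is a minimal sketch of version comparison; both version strings are purely illustrative, not tied to any real device:

```python
# Compare a hypothetical vendor kernel version against a hypothetical
# upstream LTS release. Version strings are split into numeric components
# so that tuple comparison orders them correctly (4.9 < 4.19 < 6.6).
def parse_version(v):
    return tuple(int(x) for x in v.split("."))

vendor = "4.9.337"    # illustrative: many devices shipped on 4.x vendor kernels
upstream = "6.6.30"   # illustrative upstream LTS at time of writing
print(parse_version(vendor) < parse_version(upstream))  # -> True: vendor lags upstream
```

    The gap is the whole problem: every security fix landing on the right side of that comparison has to be hand-carried across it, per device, by the manufacturer.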
    Probably a laser gun?
  23. Like
    fulminemizzega got a reaction from Chokondis in Here’s why iPhones are better   
  24. Agree
    fulminemizzega got a reaction from paddy-stone in Small business redundant NAS in blackouty situation   
    Back to your question: the lowest price I can find on Amazon for a 3 TB HDD is 60€, which leaves you 220€ for everything else. What can you buy with 220€?
    Definitely no RAID card and no UPS. I really don't know if at this point you should build a NAS, or instead just wait until you have a bigger budget (and maybe fewer electrical issues?). You are asking about a NAS, so here is what I think. FreeNAS recommends Intel, so everything here applies to Intel.
    Choose: if you want to upgrade to ECC RAM later on, you will have to spend more money right now, but you would get a "server" motherboard right now.
    Looking at your use case, you don't need it, but if you can afford it, go for it. If you start reading the FreeNAS forums about ECC vs. no ECC you will go crazy and start buying triple-redundant power supplies. If you want to dig deeper into the subject, see http://jrs-s.net/2015/02/03/will-zfs-and-non-ecc-ram-kill-your-data/ and https://blog.codinghorror.com/to-ecc-or-not-to-ecc/ and read the linked papers for some actual data.
    At that point you should be able to estimate, on average, how many errors your RAM will see per year. I expect your lights will go out more often than that.
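    As a back-of-envelope sketch of that estimate; the per-GiB error rate below is purely illustrative (published measurements vary by orders of magnitude between studies, DIMMs, and workloads), so plug in a figure from the papers linked above:

```python
# Back-of-envelope RAM error estimate. The rate is an assumed, illustrative
# value, not a measured one: scale the amount of RAM by an errors-per-GiB-
# per-year figure taken from whichever study you trust.
def errors_per_year(gib_of_ram, errors_per_gib_per_year):
    return gib_of_ram * errors_per_gib_per_year

# e.g. 8 GiB at a hypothetical 0.5 correctable errors per GiB per year
print(errors_per_year(8, 0.5))  # -> 4.0
```

    If that number comes out at a handful of correctable errors per year, power outages really are the bigger threat on this budget.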
    If you decide against ECC, then look for the cheapest motherboard + CPU combination you can find; look for an Intel Pentium: https://forums.freenas.org/index.php?resources/hardware-recommendations-guide.12/
    Now, be careful, don't go too cheap on the motherboard: at the very least check that it has gigabit Ethernet.
    If you go for future ECC support, then try to buy a Supermicro motherboard; there's a nice IPMI controller onboard.
    Buy 8 GB of RAM and find the cheapest case; you just need one fan for your 2 disks. Buy a power supply from a reputable brand, and don't go too cheap, because that will reduce your hardware's lifespan. I think that is all 
     
    When your lights go out, you may lose data only if you are writing to the NAS at that moment; at worst you lose a small portion of the video file being written. It will not corrupt the whole file (thanks to how ZFS works).
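    The reason is copy-on-write: ZFS never overwrites live data in place, so a crash leaves either the old state or the new one. As a loose analogy (this is a user-space sketch, not how ZFS is implemented), the same idea shows up in the classic write-then-rename pattern:

```python
import os
import tempfile

def atomic_write(path, data):
    # Write to a temp file in the same directory, then atomically swap it
    # into place. A power loss mid-write leaves either the old file or the
    # new one, never a half-written mix; loosely analogous to ZFS's
    # copy-on-write, where old blocks stay valid until new ones are committed.
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=directory)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # force the new data to disk first
        os.replace(tmp, path)     # atomic rename on POSIX
    except BaseException:
        os.unlink(tmp)
        raise

atomic_write("video.part", b"new footage")
```

    ZFS applies this principle at the block level for every write, which is why a mid-write outage can truncate the file being recorded but not corrupt what was already on disk.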
    If you can find a used server for around 220€ that you can use, that could be a good idea; I can't recommend anything specific at the moment.
    Since your budget is so tight, why don't you keep waiting and just use an external drive bay to plug and unplug HDDs, maybe with file sharing on Windows? It is ugly, and you will have to make redundant copies yourself, but it's way cheaper. Later on you will be able to spend more money on a NAS that will be more useful in the long run. Think about it: if you could spend about 500€ on everything except HDDs (let's stick to 8 GB of ECC RAM and an i3 CPU), you would have a system that will last you years, and you can keep adding drives until you have no bays left in your case. Make some projections: try to estimate how fast you are going to fill those HDDs. If you write 10 GB a day, that's roughly 3.65 TB a year, so 3 TB will last you less than a year.
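    The projection above is one line of arithmetic; the write rate is whatever you estimate for your own workload:

```python
# Rough capacity projection: how many days until the pool is full?
# Uses decimal units (1 TB = 1000 GB), since that's how drives are sold.
def days_until_full(capacity_gb, gb_written_per_day):
    return capacity_gb / gb_written_per_day

# 3 TB filled at an estimated 10 GB/day
print(days_until_full(3000, 10))  # -> 300.0 days, i.e. under a year
```

    Run it with your real daily write estimate before committing to a drive size.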
    Don't rush your decision
  25. Agree
    fulminemizzega got a reaction from leadeater in Small business redundant NAS in blackouty situation   