GoldenSound

Member
  • Posts

    139
  • Joined

  • Last visited

Reputation Activity

  1. Like
    GoldenSound got a reaction from nuwang69 in Which IEMs to buy from this list?   
    I would strongly recommend looking at the TruthEar Zero RED.
    At the $50 price point, these are far and away the best of anything I've tested.
  2. Agree
    GoldenSound got a reaction from Mr. Rabbit in Gamers Nexus alleges LMG has insufficient ethics and integrity   
    This is a situation with a lot of issues at play: a heavy mix of major and minor ones, some with clear causes and others that are likely one-off problems and perhaps don't need to be as big a deal as they have become. But for me, the biggest issue has been the response to the concern about accurate testing. 

    Saying that it isn't worth spending $500 of time to retest a product because it wouldn't change the conclusion is simply not OK. It just isn't. You either test something properly or you don't publish the testing at all. You are not hobbyists; you are a professional media organization with a duty to fairly assess a manufacturer's products and to provide fair and accurate information to viewers.

    To be clear, I actually AGREE that the conclusion about value almost certainly wouldn't have changed, but that's beside the point. LMG is a big company, and viewers rely on LMG to provide accurate information. If you do not do that, it brings into question ALL the information you provide. And if it happens consistently, the problem is exacerbated. Even more so if you then say you don't think that's a problem!

    I myself produce reviews of audio products consisting of both subjective evaluation AND extensive objective testing. There have been times when I've been testing something and realized that I'd made a mistake, something was configured wrong, or there was some other issue. It can be REALLY frustrating to spend an entire day or longer testing something only to realize you made an error and need to redo that work.

    It doesn't matter if I don't think the conclusion would change; if my mission is to provide accurate and useful data, that's what I'm going to do. If I make a mistake in testing, that sucks, but it's my mistake, and it isn't fair to the viewers or the manufacturer if I publish it KNOWING there is a problem. 

    Even if you feel strongly that the faulty data doesn't matter, in that situation you just make a video on that basis. Explaining your view with no data, and stating that no data-driven results could change your opinion, is MUCH better than providing bad or incorrect data.

    It's one thing to catch a mistake after a video goes out, but a video should never ever be published if you KNOW that there is a mistake in the information/data being provided.

    Media outlets and reviewers rely on trust. That is built up over time, but can be lost quickly.

    If you make a mistake, that's fine, it happens, just fix it.

    If you consistently make errors, you will lose trust.

    If you then say that you don't think those errors are worth your time to fix, you will lose trust ten times faster, because whether you intended it or not, a significant portion of your audience will read that as "I don't care".

    If your current workflow does not allow you to catch and correct mistakes, your workflow has a problem and you need to fix it. It's as simple as that.
  3. Agree
    GoldenSound got a reaction from murixbob in Gamers Nexus alleges LMG has insufficient ethics and integrity   
  4. Agree
    GoldenSound got a reaction from Biohazard777 in Gamers Nexus alleges LMG has insufficient ethics and integrity   
  5. Agree
    GoldenSound got a reaction from ben025 in Gamers Nexus alleges LMG has insufficient ethics and integrity   
  6. Agree
    GoldenSound got a reaction from Alexeygridnev1993 in Gamers Nexus alleges LMG has insufficient ethics and integrity   
  7. Agree
    GoldenSound got a reaction from BlackSmokeDMax in Gamers Nexus alleges LMG has insufficient ethics and integrity   
  8. Like
    GoldenSound got a reaction from pinemach in Gamers Nexus alleges LMG has insufficient ethics and integrity   
  9. Agree
    GoldenSound got a reaction from Davikar in Gamers Nexus alleges LMG has insufficient ethics and integrity   
  10. Like
    GoldenSound got a reaction from dinkostinko in What do drivers do in headphones?   
    Drivers are the actual part of the headphone that moves, converting electrical energy into acoustic energy.

    The most common type is a 'dynamic' driver, which is what you see in most speakers and headphones: a coil of wire around a magnetic core, with the coil attached to a diaphragm.
    When current is applied (this is what your headphone cable is connected to), the voice coil moves, moving the diaphragm and creating sound.

    There are other types of driver, however, such as planar magnetic, electrostatic, ribbon, and balanced armature. But the vast majority of headphones and speakers use dynamic drivers; it's typically only once you start looking at the higher end of the market that you'll encounter the other types.
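    The "coil in a magnetic field" idea above boils down to the Lorentz force on the voice coil. As a rough, illustrative sketch (the numbers below are invented round figures, not specs from any real driver):

```python
# Toy model of a dynamic driver: the voice coil sits in the magnet's
# field and experiences a Lorentz force F = B * l * i (flux density x
# wire length in the gap x signal current), which pushes the diaphragm.
# B and l are made-up values for illustration only.

B = 1.0   # magnetic flux density in the gap, tesla (assumed)
l = 5.0   # effective length of coil wire in the field, metres (assumed)

def coil_force(current_a: float) -> float:
    """Force on the voice coil (newtons) for a given signal current."""
    return B * l * current_a

# A 10 mA signal current gives a small push on the diaphragm; the
# alternating music signal wiggles it back and forth to make sound.
print(coil_force(0.010))  # prints 0.05
```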
  11. Informative
    GoldenSound got a reaction from person123456789 in What do drivers do in headphones?   
  12. Informative
    GoldenSound got a reaction from Psittac in Max headphone Impedance supported by Motherboard   
    Headphone impedance is very often misunderstood, and there's a bit of history to it.

    From an electrical standpoint, there is no 'max impedance' an output will support. In fact, higher impedance loads are EASIER to drive due to requiring less current for the same voltage.
    So you don't need to worry about your headphone impedance causing any issues there.

    The headphones that are actually genuinely hard to drive are the low-impedance, low-sensitivity planars, as these draw significant amounts of current (some extreme examples, like the Hifiman Susvara, are often run from speaker amplifiers).
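    The "less current for the same voltage" point falls straight out of Ohm's law. A quick sketch of my own (the power and impedance figures are arbitrary examples, not measurements of any particular headphone):

```python
import math

def drive_requirements(power_mw: float, impedance_ohm: float):
    """Return (volts RMS, milliamps RMS) needed to put power_mw into a load.

    From P = V^2 / R and I = V / R; reactance is ignored for simplicity.
    """
    p_w = power_mw / 1000.0
    v = math.sqrt(p_w * impedance_ohm)
    i_ma = v / impedance_ohm * 1000.0
    return v, i_ma

# The same 100 mW into a 300 ohm dynamic vs a 20 ohm planar:
print(drive_requirements(100, 300))  # about (5.48 V, 18.3 mA) - voltage-hungry
print(drive_requirements(100, 20))   # about (1.41 V, 70.7 mA) - current-hungry
```

    The high-impedance load wants roughly four times the voltage but a quarter of the current, which is why it poses no "max impedance" problem for an output, while the low-impedance planar is the one that strains an amp's current delivery.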
  13. Agree
    GoldenSound got a reaction from Spuriae in Max headphone Impedance supported by Motherboard   
  14. Agree
    GoldenSound got a reaction from CTR640 in What is better: Dolby Atmos headphones or 7.1 headphones   
    Both are just software trickery to try to emulate a surround sound result (usually poorly).
    The actual hardware is no different at all.
  15. Agree
    GoldenSound got a reaction from OfficialTechSpace in What is better: Dolby Atmos headphones or 7.1 headphones   
  16. Like
    GoldenSound got a reaction from OfficialTechSpace in Can somebody send me a .flac song?   
    Yeah I was pretty disappointed too 😞
    The MQA stuff was annoying but at least you could turn it off/opt out.
    Now you can't and so it's not really a lossless service for a huge number of tracks.

    A real shame
  17. Informative
    GoldenSound got a reaction from OfficialTechSpace in Can somebody send me a .flac song?   
    Unfortunately the Tidal HiFi tier is also MQA, just with flagging removed so your DAC/player does not recognise it as such. But you are actually being served the exact same files as on the Master tier.
    This means that for any song marked with the 'MASTER' label, you actually cannot stream it losslessly on Tidal at all.

    Qobuz or Amazon HD are better alternatives until Tidal addresses this. (If they ever do)
     
    https://goldensound.audio/2021/11/29/tidal-hifi-is-not-lossless/
  18. Agree
    GoldenSound got a reaction from eskamobob1 in DAC / AMP that have EQ?   
    Generally you'd want to avoid hardware EQ unless you NEED to use it. Especially if you aren't willing to spend several grand on pro level gear.

    For an all in one with EQ, you'd want to look at the RME ADI-2. That has a lot of DSP features built in including parametric EQ.

    If you want to use a different DAC, one of the MiniDSP products would work.
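    For a sense of what a single parametric EQ band actually computes under the hood, here is a sketch of a peaking filter using the widely used RBJ Audio EQ Cookbook formulas (the sample rate, frequency, gain, and Q below are arbitrary examples, not settings from any of the devices mentioned):

```python
import math

def peaking_biquad(fs: float, f0: float, gain_db: float, q: float):
    """Coefficients (b0, b1, b2, a1, a2) of one parametric-EQ band.

    Peaking-filter formulas from the RBJ Audio EQ Cookbook,
    normalized so that a0 = 1.
    """
    A = 10 ** (gain_db / 40)            # amplitude ratio at the peak
    w0 = 2 * math.pi * f0 / fs          # centre frequency in radians/sample
    alpha = math.sin(w0) / (2 * q)      # bandwidth term
    b0, b1, b2 = 1 + alpha * A, -2 * math.cos(w0), 1 - alpha * A
    a0, a1, a2 = 1 + alpha / A, -2 * math.cos(w0), 1 - alpha / A
    return (b0 / a0, b1 / a0, b2 / a0, a1 / a0, a2 / a0)

# One band: a -3 dB cut at 3 kHz, Q = 1, running at 48 kHz.
coeffs = peaking_biquad(48000, 3000, -3.0, 1.0)
print(coeffs)
```

    Hardware EQ boxes and software (MiniDSP, the DSP in the ADI-2, or any PC parametric EQ) all run cascades of biquads like this; the quality differences come from the converters and implementation, not the math.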
  19. Agree
    GoldenSound got a reaction from LionSpeck in Got my studio headphones, now what?   
    It's worth noting that the Windows mixer itself is not bit-perfect, even if the sample rate is set correctly. It's tricky to get good quality audio through the Windows mixer.
    If you want to play music at the highest quality, with no alteration of the data at all, you need to use a player that supports an 'exclusive mode' output (WASAPI Exclusive, or ASIO if your DAC has an ASIO driver).

    Players like foobar2000 and MusicBee are free and can do this. Or, if you're wanting something higher end and are happy to pay, Roon and Audirvana are fantastic.
    Tidal has WASAPI Exclusive support, and Qobuz has both WASAPI Exclusive and ASIO support.

    Spotify is Windows mixer only. There used to be a third-party client called 'fidelify' which added WASAPI Exclusive support, but to my knowledge it's no longer maintained.

    Exclusive mode bypasses Windows audio mixing, automatically adjusts the sample rate of your DAC depending on the content being played, and, if using ASIO, also supports DSD content if you have any and your DAC supports it.
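    To illustrate why any software mixer volume below 100% breaks bit-perfect playback, here is a toy sketch of my own (not Windows internals): scaling integer samples forces a requantization, so the bits sent to the DAC no longer match the source file.

```python
# Any software volume below 100% rescales and requantizes every sample,
# so the bits reaching the DAC no longer match the file: by definition,
# not bit-perfect. Exclusive mode avoids this by bypassing the mixer.

def scale_16bit(samples, volume):
    """Apply a linear software volume to 16-bit integer samples."""
    return [round(s * volume) for s in samples]

original = [30000, -12345, 7, -1]
scaled = scale_16bit(original, 0.98)   # e.g. mixer slider at 98%

print(scaled)              # [29400, -12098, 7, -1]
print(scaled == original)  # False: the data has been altered
```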
  20. Like
    GoldenSound reacted to LionSpeck in Got my studio headphones, now what?   
    This is false; setting the volume to 98 (or any) % won't avoid clipping, it will only reduce the resolution of your DAC by 2%. That 2% volume reduction (a linear scale, then quantized in digital) happens after your player; therefore, if you're clipping, you're clipping before that linear map, and the distortion is scaled along with everything else. DACs don't clip; their 100% is Windows' 100%.
    The only way you can have a "DAC clip" (it doesn't happen in the DAC, but still in Windows) is if you boost your audio somewhere in your digital path, perhaps with EQ or ReplayGain. Again, since Windows' volume is applied afterwards, this will simply scale the clipped audio to 98%, retaining the distortion.
     
    Clipping is a kind of distortion that happens when your signal hits the limits of amplitude of your carrier:
    - in digital, if the value tries to go beyond the maximum (or minimum) value representable in 16/24 bits; this happens only if there's some audio boosting happening (EQ, ReplayGain or something similar).
    - in analog, if the value tries to go above or below the maximum available voltage swing of the audio signal chain. The component with the lowest limit will cause the clipping; generally it's either an OpAmp or a transistor. This happens if the audio coming from the DAC / preamp overloads the gain stage (very rare) or if you're overloading the power stage by attempting to drive low-sensitivity drivers at higher volumes. Typically, high-impedance headphones have this kind of issue, as they require high-amplitude signals.
    Power, in audio devices*, isn't limited by current, but by how much amplitude you can get before too much distortion or clipping.
     
    *except when you're dealing with <8Ohms loudspeakers and power amplifiers; some amps (especially tube, OTLs and BJT amps) may not be able to drive lower impedance loudspeakers for current reasons. With headphones there's no such problem.
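    The point above can be sketched numerically. This is a toy Python model (the gain and signal values are invented for illustration): the boost-then-clip happens before Windows' volume, so turning the volume down afterwards only scales the already-distorted waveform:

```python
import math

FULL_SCALE = 1.0  # digital full scale (0 dBFS)

def boost_and_clip(samples, gain):
    """EQ/ReplayGain-style boost; anything past full scale is truncated (clips)."""
    return [max(-FULL_SCALE, min(FULL_SCALE, s * gain)) for s in samples]

def windows_volume(samples, volume):
    """Windows' volume is a linear scale applied AFTER the player/EQ."""
    return [s * volume for s in samples]

# A sine near full scale boosted by +6 dB clips hard...
sine = [0.89 * math.sin(2 * math.pi * k / 48) for k in range(48)]
clipped = boost_and_clip(sine, 2.0)
# ...and turning the Windows volume down to 98% afterwards only scales the
# already-flattened waveform; it cannot undo the clipping.
out = windows_volume(clipped, 0.98)
print(max(out))  # 0.98: the flat top is preserved, just slightly quieter
```

    To actually avoid the clipping, the gain reduction has to happen before the boost (e.g. a negative preamp in the EQ), not after it.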
  21. Agree
    GoldenSound got a reaction from H713 in Headphone   
    Personally I would say yes.
    Having tried the HD820, I (and many others) feel that it does not deliver on quality for the price at all, due to its, uh... well, unfortunately extremely wonky frequency response



    Something like the HD800S for comparison:



    If I were you, I personally would recommend selling and swapping to something like a Hifiman Arya if possible.

    Or if you're happy to dunk more money into it, something higher maybe (though your DAC/Amp will be a bigger factor then)
  22. Like
    GoldenSound got a reaction from Linus william in Headphone   
  23. Agree
    GoldenSound got a reaction from Spuriae in Headphone   
  24. Like
    GoldenSound got a reaction from Entropy. in For use with DAC. Usb vs optical, what do you use? What is better?   
    TLDR: You'll usually get better quality from USB. Optical specifically is often best avoided.
    The 'bits are bits' argument doesn't hold in audio, because it ignores timing aspects specific to digital audio transmission protocols.

    Long Version:
    So there is a lot of misunderstanding surrounding digital audio, with the 'bits are bits' argument often being used.

    It's true in the sense that the actual CONTENT of the signal, the audio data itself, the 1's and 0's will either reach the DAC intact, or they will not.
    There are no 'audiophile bits' and as far as data integrity goes, optical SPDIF, coaxial SPDIF, AES, USB, I2S or anything else will be able to get the exact same data to the DAC.

    BUT, the issue with audio is that it is not purely about data integrity. TIMING is important too. If you have exactly the same audio content (i.e. exactly the same PCM samples) but the samples are converted at slightly incorrect times, you'll get distortion.
    This is called jitter.
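    As a rough illustration of why timing matters, here's a toy Python sketch (the 1 µs jitter figure is wildly exaggerated versus any real DAC, purely for clarity): converting the exact same sine samples with timing errors produces an error signal even though the intended data is identical:

```python
import math, random

random.seed(0)
FS = 48_000          # sample rate (Hz)
F = 1_000            # test tone (Hz)
JITTER_RMS = 1e-6    # 1 us of timing error per sample -- exaggerated for clarity

def sample_tone(jitter_rms):
    """Convert the same 'data' (a 1 kHz sine) with timing errors on each sample."""
    out = []
    for n in range(FS // 10):
        t = n / FS + random.gauss(0.0, jitter_rms)  # intended instant + jitter
        out.append(math.sin(2 * math.pi * F * t))
    return out

ideal = sample_tone(0.0)
jittered = sample_tone(JITTER_RMS)
# Identical sample VALUES were intended, but the timing error shows up as an
# error signal (noise/distortion) in the resulting output.
error_rms = math.sqrt(sum((a - b) ** 2 for a, b in zip(ideal, jittered)) / len(ideal))
print(f"error RMS: {error_rms:.2e}")
```

    Note the error scales with signal frequency too, which is why jitter is often discussed in terms of high-frequency content.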
     

     
    USB is an 'asynchronous' protocol.
    This means that the timing is actually controlled by the DAC.
    A super simplified explanation is that audio data from your PC is sent to your DAC, often in chunks at a time.
    This is put into a buffer on the USB receiver of the DAC, and then the DAC uses its own internal clock (crystal oscillator) to determine when to convert these samples.

    When the buffer is getting low, the DAC instructs the PC to hand it some more data, and the PC sends more to put into the buffer ready for conversion.

    The DAC is dependent on its own clocks for timing accuracy, and whether you plug this DAC into a Mac, a PC, a Pi, or any other device, the jitter performance will be absolutely the same. The timing with which the source device sends data to the DAC makes absolutely no difference so long as the DAC always has the next sample in the buffer ready to convert.
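    A toy sketch of that flow (not a real USB audio stack; the chunk size and refill threshold are made-up numbers), the point being that conversion timing is driven entirely by the DAC side:

```python
from collections import deque

# Toy model of asynchronous USB audio: the source hands over chunks on demand;
# the DAC drains the buffer on its OWN clock, so source timing never affects
# when samples are converted.

CHUNK = 64          # samples per USB transfer (hypothetical size)
LOW_WATER = 32      # refill threshold (hypothetical)

buffer = deque()
sent = 0

def source_send_chunk():
    """PC side: hand the DAC the next chunk of samples when asked."""
    global sent
    buffer.extend(range(sent, sent + CHUNK))
    sent += CHUNK

source_send_chunk()  # prime the buffer
converted = []
for tick in range(1000):         # each tick = one DAC master-clock period
    if len(buffer) < LOW_WATER:  # DAC asks the host for more data
        source_send_chunk()
    converted.append(buffer.popleft())  # conversion timed by the DAC's clock

assert converted == list(range(1000))  # same data, DAC-controlled timing
```

    As long as the buffer never runs dry, the host's delivery timing is irrelevant.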


    SPDIF is a 'synchronous' protocol. And unlike USB, the source you use DOES have a direct impact on the quality of the output of the DAC, even if the data itself is completely identical.
    Why? Because with SPDIF, the clock signal is actually sent by the source device (PC), NOT the DAC's own clocks.

    Data is sent in a constant stream, accompanied by the clock signal, and so if there is additional jitter on that given clock signal, the DAC itself will be converting samples with less accurate timing and the analog output will be less accurate.

    We can show this quite easily by measuring the output of the DAC.
    This for example is the Schiit Yggdrasil, playing a 'J-Test' file, when fed SPDIF from a good source:


     
    And then this is that same DAC, playing EXACTLY the same file, with no data integrity issues, but with a poorer quality SPDIF source that has higher jitter:


     
    These are not digital simulations or readings, these are measurements of the actual XLR analog output of this DAC.
     
    Many DACs will have methods of attempting to get rid of jitter from a poor source device.
    One common method is a PLL; however, a PLL does not eliminate jitter, and it's more accurate to describe what it does as jitter attenuation. How much it can attenuate depends on both the design of the PLL itself (performance of PLLs varies MASSIVELY) and the reference clock used, as the PLL can only ever be as accurate as the clock you're using as a reference.
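    A very rough numerical sketch of that idea (a first-order low-pass on phase error standing in for a real PLL; all figures are invented for illustration), showing that the recovered clock's jitter is reduced but floors out at the reference clock's own noise:

```python
import math, random

random.seed(1)

# Toy PLL behaviour: 'alpha' stands in for loop bandwidth; 'ref_jitter' is the
# noise floor set by the reference clock. The loop ATTENUATES incoming jitter
# rather than eliminating it.

def recovered_clock_jitter(incoming_jitter, alpha, ref_jitter, n=5000):
    phase = 0.0
    errors = []
    for _ in range(n):
        measured = random.gauss(0.0, incoming_jitter)   # jittery SPDIF edges
        phase += alpha * (measured - phase)             # loop tracks slowly
        local = random.gauss(0.0, ref_jitter)           # reference clock noise
        errors.append(phase + local)
    return math.sqrt(sum(e * e for e in errors) / n)

incoming = 1e-9   # 1 ns RMS on the SPDIF input (made-up figure)
out = recovered_clock_jitter(incoming, alpha=0.05, ref_jitter=1e-11)
print(f"in: {incoming:.1e}  out: {out:.1e}")  # reduced, but not zero
```

    Narrowing the loop bandwidth (smaller `alpha`) attenuates more of the incoming jitter, but the output can never be cleaner than the reference clock itself.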

    So unfortunately, in audio, bits are not always bits. And other aspects of the digital source can absolutely make a difference in quality even if the 1's and 0's themselves are no different.
    This is why there is an entire product category called 'digital to digital converters' (DDCs), which take audio data from your PC and output it over I2S, AES, SPDIF etc. with exceptionally low levels of jitter. The idea is that you can use one of these to gain an improvement in jitter performance over your DAC's internal clocks when using USB. And in many cases they work quite well.

    An example is the Singxer SU-6: https://shenzhenaudio.com/products/singxer-su-6-xmos-xu208-cpld-femtosecond-clock-usb-digital-interface

    And the last thing to mention would be noise. As @H713 mentioned, optical offers galvanic isolation, meaning there is no actual electrical connection between the source and DAC (since it's just light). This prevents any sort of noise being conducted from the noisy, beefy gaming PC to the DAC, which can degrade performance.

    Some DACs will have this internally but typically only at the higher end of things.
    But DDCs like the Singxer SU-6 will almost always include this feature as well, so they can provide an exceptionally low-noise output on SPDIF/AES/I2S even if your PC is horrifically noisy.
    There are also products to do this on USB such as the Intona 7055-C https://intona.eu/en/products/7055-c

    Generally though, the noise advantage of optical is not worth the often drastic hit to jitter performance, ESPECIALLY if you're using a PC. Seriously, most motherboard optical outs that I've tested were beyond awful; it's a miracle some DACs can even successfully lock to them.
     
     
     
  25. Informative
    GoldenSound got a reaction from Rauten in For use with DAC. Usb vs optical, what do you use? What is better?   