LAwLz

Member
  • Posts

    19,151
  • Joined

  • Last visited

Reputation Activity

  1. Funny
    LAwLz reacted to Erioch in YouTube ads during WAN show replay   
    Ads? During the WAN Infomercial show? Outrageous!
  2. Agree
    LAwLz reacted to Holmes108 in NVIDIA CEO Jen-Hsun Huang: Don't learn computer science. The future is human language (AI code generation)   
    Which could still very well lead to a drastic reduction in programming jobs. So Jensen might still not be way off base.
  3. Like
    LAwLz reacted to wanderingfool2 in Microsoft fixes the Teams app on Windows 11   
    Hmm, neat. Well, I stand corrected on that... although after a while I just stopped looking for it. Still, the fact that it took something like 2 years is inexcusable.
     
    I agree, and I think the answer is that a lot of people don't try to be efficient (at least at the workplaces I've noticed) and most people stick to a web browser, which has a good amount of tabs. I actually find some of my daily tasks are always slower when having to switch between Windows 11 style windows or with the "always combine" taskbar setting.
     
    Still, in Windows 11 I'm faced daily with annoyances that Windows 10 didn't have. XP -> 7 brought tons of features, so if I moved back from 7 -> XP my life would be worse; 7 -> 10, same deal, I wouldn't return to 7. But 10 -> 11, every day I wish I was back on 10. I can't really think of any redeeming qualities that would be deal breakers.
  4. Agree
    LAwLz got a reaction from Heats with Nvidia in TV brain= hacked firestick?   
    It's most likely next to impossible. I kind of doubt the ribbon cable just sends a regular HDMI signal to the display. These things are typically quite "non-standard". 
     
    If you were to make it work, you'd probably need to solve far more difficult challenges than getting it powered on. 
    Since a lot of things will probably be very specific to your particular model of TV, it's most likely too much to ask someone to help you. It would just require way too much time and effort. 
  5. Agree
    LAwLz reacted to wanderingfool2 in Microsoft fixes the Teams app on Windows 11   
    Honestly, Windows 11 was one of the worst quality of life regressions I've experienced.  I'm glad that they fixed it, but there is just so much wrong with the way Windows 11 behaves.
     
    I'm still waiting for them to bring back the expanded taskbar stuff. I do like the icons being small, but man, it gets really annoying having 2 windows open and needing to bring my mouse down, click, move up to find the window, click, and then go about my way. The whole tab solution doesn't make practical sense if I need two windows open at the same time to interact with each other.
     
    Excel, for the longest time, needed you to open another instance to create two windows... it's like they never went through a common user workflow.
     
    Finally, there's Windows Explorer, which they rewrote and still didn't make it so that the freezing of one window doesn't lock up all the others. Let's say you access a file on a drive that's spinning up, but the drive is an HDD with a folder containing thousands of log files that can't be cleaned, so you have to wait 1-2 minutes while it tries loading and all your other Explorer windows are locked up... or better yet, you access a network drive where the computer was turned off and it takes a minute for Explorer to become responsive again.
     
    I am glad they fixed the Teams thing, but there are so many more day-to-day things that Microsoft really should be fixing, things that should never have existed in the first place in a final released product.
  6. Agree
    LAwLz reacted to StDragon in Microsoft fixes the Teams app on Windows 11   
    New Teams is for business (enterprise) use only at this time, so I don't think MS cares about this from a PR perspective as this isn't launched for the general consumer.

    As to why they've done this with two versions in parallel: that's because MS rewrote the Teams client from Electron (which is very slow) to WebView2 (more responsive) instead. New Teams was buggy as hell (most of which has been shaken out by now). So they couldn't just replace Classic Teams with New Teams, but they also needed feedback from the user base of Classic Teams. MS thought this was the best way to transition per their roadmap.

    I'm not arguing the opinion of the matter, just stating facts as stated by Microsoft.
  7. Informative
    LAwLz got a reaction from NobleGamer in WPA3 (WAN Show Comment)   
    Except you weren't keeping it simple because what you said was just straight up wrong.
    As I have said time and time again, the frequency does not matter at all. The 2.4GHz and 5GHz networks that are usually sent out by typical consumer routers? They have full access to each other. Putting IoT devices on the 2.4GHz network and other devices on the 5GHz network does not provide any additional security at all, because they are the same network. 
     
     
     
    Judging by your comments in this thread I suspect your reasoning is wrong, and now you're giving advice to others that at best gives them a false sense of security while actually providing zero security benefits. At worst you might be doing that on top of making their network less functional and more limited.
     
     
    Excuse me but... What are you on about?
    You are wording things very weirdly, but I assume you are talking about how having, let's say, an 802.11g device on a network capable of 802.11n will slow things down for all the wireless-N devices. Is that what you are talking about? Because that is completely unrelated to everything you have said so far. It has nothing to do with security, it has nothing to do with frequency, and it is not a reason for segmenting IoT devices onto 2.4GHz and other devices onto 5GHz. What you are talking about is simply not related to frequency.
     
    It's also not because the AP has to "constantly switch speeds" that it slows things down... The switching of "speed" (as in, the protocol used) is instantaneous. The reason why older devices might slow down speeds for newer devices is because they require more airtime to transmit the same amount of data. Since Wi-Fi is a shared medium, everyone has to wait for the transmission to be over. The slower the device is, the longer time the others have to wait (or rather, the more time slots it will be allocated to send the same amount of data as a faster device).
    Beacon frames are also sent at the lowest mandatory data rate so that also slows things down since they use up airtime too.
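     
    To put rough numbers on the airtime point, here is a minimal back-of-the-envelope sketch in C (the 54 Mbps and 300 Mbps figures are just example peak PHY rates for 802.11g and 802.11n, and all protocol overhead is ignored):
     
        #include <stdio.h>
     
        /* Rough airtime comparison: how long the shared channel is busy
           moving the same 1 MB payload at two different PHY rates. */
        int main(void) {
            double payload_bits = 1e6 * 8.0;   /* 1 MB of data            */
            double rate_g = 54e6;              /* 802.11g peak, bits/s    */
            double rate_n = 300e6;             /* 802.11n example, bits/s */
     
            printf("802.11g: %.1f ms of airtime\n", payload_bits / rate_g * 1000.0);
            printf("802.11n: %.1f ms of airtime\n", payload_bits / rate_n * 1000.0);
            return 0;
        }
     
    In other words, the 802.11g client keeps everyone else waiting roughly five to six times longer for the same amount of data, and the band it sits on never enters that calculation.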
     
    But again, that has nothing to do with the frequency it operates on. It is not a reason to keep IoT devices on a 2.4GHz network and other devices on a 5GHz network. It might be a reason why you would like to put them on different SSIDs, but that is unrelated to frequency as I have said time and time again.
  8. Agree
    LAwLz got a reaction from NobleGamer in WPA3 (WAN Show Comment)   
    Except that's not what you should be doing... Because you shouldn't assume a piece of the spectrum is "tainted". You should assume a network, which is independent of what frequency you transmit it on, is "tainted". Just because a device on, let's say, the SSID "IoT" is vulnerable does not mean the SSID "Office" is vulnerable because it is broadcast on the same frequency. A vulnerable device might compromise the network itself, but that does not inherently extend across all devices sharing the same spectrum. Spectrum has nothing to do with it. In most networks, a compromised host on the 2.4GHz spectrum might extend to the 5GHz spectrum as well, because they are probably the same network. Having separate 2.4GHz and 5GHz networks, with a firewall between them, is a very strange and uncommon setup (because it is a poor, inflexible design that doesn't offer any benefits).
     
    The reason I keep pushing on this is that it seems like you think, or at least people reading your post might think, that 2.4GHz is its own "network", which it isn't. What you say makes as little sense as saying "Cat 5e is tainted, so you should keep that on a separate network" or "100Mbps Ethernet is not secure, so you should keep that on a separate network".
    The big issue I have with what you are saying is that you are partially right for the wrong reasons, and as a result, you might mislead people into assuming "devices that are on 5GHz = more secure", which is not true. 
     
    The frequency has absolutely nothing, none at all, zero, to do with how secure something is, and a single network can be (and usually is) available on multiple frequencies, just like a device with a Cat 5e cable might be connected to the same network as a device with a Cat 6 cable. 
    I don't even agree with the premise you built this entire argument on, the idea that unsecure IoT devices only support 2.4GHz. I have given you an example of possibly the most popular IoT device and it supports 5GHz. So your argument falls apart even on that front.
     
    The part about you having two access points and broadcast two separate networks from each is also a red flag in my mind, but I won't go into that now.
     
    I just want to leave the discussion by saying: group and isolate devices based on purpose and "security rating", not "frequency they support". It makes no sense to do the latter. If you want to make an "IoT" network, you can do that. It might be a very good idea to isolate those devices since they tend to be less than stellar security-wise. But it is totally fine (and probably a good idea) to have that network be available on both 2.4GHz and 5GHz. Likewise, your "secure Wi-Fi" should probably be available on both 2.4GHz and 5GHz as well. It's the SSID that determines the perimeter of the network, not the frequency. You should absolutely not assume that 5GHz devices are secure and 2.4GHz devices are not secure either. Both can be either good or bad security-wise. Isolate your devices based on potential security risks, not hardware capabilities. 
  9. Agree
    LAwLz got a reaction from NobleGamer in WPA3 (WAN Show Comment)   
    The security of your Wi-Fi shouldn't be based on range though, and you could always turn down the 2.4GHz transmit power if you wanted less range. 
    What you're advocating is basically security through obscurity, and that's not real security. 
     
    But that's the wrong way to think, because you shouldn't isolate clients based on frequency. You isolate them based on function. 
     
     
    Honestly, you're thinking about this all wrong. Frequency has absolutely nothing to do with security. 
    If you want to isolate your IoT devices then that's fine, in fact it's probably good to do that. But you should isolate them based on the fact that they are IoT devices, not because they connect to 2.4GHz. You could have a 5GHz IoT network if you wanted as well. 
     
    What you should be doing, assuming your network devices support it, is this:
    Make a separate SSID based on function/purpose. If you want an IoT network then make that. 
    Then map that to a specific VLAN.
    It shouldn't be based on frequency. You can have that SSID be transmitted on any frequency you want: 2.4GHz, 5GHz or even 6GHz.
    The benefit of doing things this way is that you can have the same network be available on wired connections (just have that VLAN be available on the switch), you can have a 5GHz capable IoT device connect to the network too if that's the better option. You can even have your IoT devices automatically decide if they should connect to 2.4GHz or 5GHz based on load or interference. It will also result in less management overhead because you no longer need to create duplicate configs whenever you want one network on multiple bands. 
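     
    For what it's worth, here is a minimal hostapd-style sketch of that idea (purely illustrative; the interface names, bridge names, VLAN numbers and passphrases are placeholders, and your own router or controller UI may expose the same thing very differently):
     
        # one radio, two SSIDs, each mapped to its own bridge/VLAN
        interface=wlan0
        driver=nl80211
        hw_mode=g
        channel=6
     
        # "secure" network for trusted devices (e.g. bridged into VLAN 10)
        ssid=HomeSecure
        wpa=2
        wpa_key_mgmt=WPA-PSK
        rsn_pairwise=CCMP
        wpa_passphrase=change-me
        bridge=br-lan
     
        # second BSS on the same radio for IoT devices (e.g. bridged into VLAN 20)
        bss=wlan0_iot
        ssid=HomeIoT
        wpa=2
        wpa_key_mgmt=WPA-PSK
        rsn_pairwise=CCMP
        wpa_passphrase=change-me-too
        bridge=br-iot
     
    The same two SSIDs can then be broadcast again by a second instance on the 5GHz radio; the isolation and firewall rules live on the bridges/VLANs, not on the bands.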
     
    Are you worried that someone on the street will sit there and try to hack your network because it's broadcasting at 2.4GHz? Lower your transmission power and make sure your network and devices have decent security implemented on them. Since the actual security mechanisms don't differ between 2.4GHz, 5GHz and 6GHz, your networks should be equally secure to one another. If you're worried about someone attacking your 2.4GHz network then you should be equally worried about the 5GHz network, since that should be using the same security practices. It's not like the key exchange or encryption algorithm differs between the two, unless you deliberately made the SSID you mapped to 2.4GHz worse. 
     
     
    The idea that you have one 2.4GHz network that transmits on 2.4GHz and one 5GHz network that transmits on 5GHz is something you probably picked up from using consumer grade all-in-one routers, and those two networks are typically bridged to each other on those devices anyway, so it's not like it's a security thing.
    The frequencies you transmit on should just be a parameter you apply to an SSID. It shouldn't be seen as a separate network. 
  10. Agree
    LAwLz got a reaction from NobleGamer in WPA3 (WAN Show Comment)   
    Stop listening to Linus for tech advice.
    Not sure where he got the idea that WPA2 is easy to crack from. He is not right though if that's what he said in the video. 
    WPA2 is, when configured properly, very secure. There are a few pitfalls you can fall into when configuring WPA2, but most vendors have good defaults, so unless you deliberately start changing settings you don't understand (like switching from CCMP to TKIP), you will be fine.
     
     
     
    Frequency has nothing to do with it. 2.4GHz, 5GHz and 6GHz are equally secure. It's true that some IoT devices, which may be insecure, might only support 2.4GHz, but it's not the network that's insecure in those cases. Thinking that isolating the 2.4GHz clients from the 5GHz clients will increase security is completely wrong.
    If you want to isolate your IoT devices then you isolate them based on device type/purpose/security practices, not the frequency they use to connect. Especially since plenty of IoT devices, such as the Raspberry Pi, do support 5GHz these days.
     
     
    You probably have quite a few devices that support it.
    All your up-to-date Windows machines support it. Your Android devices running Android 10 or higher support it. 
    Your iOS devices running iOS 13 or later also support it.
    (There may or may not be some old hardware that has trouble with WPA3, but in theory, it does not require any new hardware. All WPA2 hardware should be able to support WPA3. In most cases, you should only have to update the software like your OS and it should work).
  11. Informative
    LAwLz got a reaction from Lightwreather in NVIDIA CEO Jen-Hsun Huang: Don't learn computer science. The future is human language (AI code generation)   
    Two thoughts I had when I saw this video, and I am a bit scared that nobody else has pointed this out.
    1) The video is super heavily edited. I am not even sure this is what he said because it is so heavily edited. Does anyone have the original video? In that 1-minute video I counted something like 18 cuts. That the large number of cuts doesn't ring an alarm bell for anyone else is, in my opinion, a pretty big alarm bell in itself. Are people just so used to jump cuts that they don't notice them? Each and every cut could be taking out important context that might add nuance, or it could be stitching together words to form sentences that were never even said to begin with.
     
    2) Is he really saying "don't learn computer science"? 
    The way I interpret what was said is that while we used to hear people say "everyone should learn how to program", we are quickly moving towards a world where few people will need to learn how to code. That we should aim to make tools that let everyone "program" without knowing a programming language. 
    That people who used to need to learn programming to make their computers do something that was actually related to something else (like biology) will hopefully be able to just focus on the biology part in the future. Instead of needing to learn how to code because they need to write a program to check some protein reaction or whatever, they will hopefully in the future be able to just focus on learning the biology part and let the computer handle the "how to create a program to check this" part.
     
    That we are moving away from "everyone needs to learn programming because programming is a skill every job needs" to "you don't need to learn programming if that's not going to be your primary focus".
     
    I might be misinterpreting what he said, but I think that is a fairly logical explanation and view considering some particular word choices and context.
  12. Like
    LAwLz reacted to wanderingfool2 in NVIDIA CEO Jen-Hsun Huang: Don't learn computer science. The future is human language (AI code generation)   
    I've seen ChatGPT spit out code for a fairly mundane thing that would pass most tests but had fatal flaws in it [ones humans wouldn't make]... I'm not sure I would put it at "absolutely" just yet.
     
    Where I stand on this topic: I think we will always need to learn computer science, because there will always be edge cases with any technology and you will always need people to figure things out or check that the logic is actually proper... with that said, I think as a whole AI will greatly reduce the amount of time needed to complete tasks, and for non-critical things you could get by with just testing/verifying the results... especially for workloads that fit the concept of validating being easier than "solving" [e.g. if you solve a sudoku, it's easy for someone to validate, but solving it takes time].
     
    What AI really allows us to do though is to create works that are almost there, and then the computer science people will step in and clean things up/remove bad logic.
     
     
    I agree with your comment on blanket statements, and would like to add that AI has both good and ugly sides to it. AI tools can help search and refine code, and spot errors that a human might not really notice [like the Heartbleed bug, a simple human oversight that an AI would be able to quickly catch]. Then we get to the ugly:
        #include <stdio.h>
     
        /* Overflow check written via signed overflow, which is undefined behaviour in C. */
        int overfloweg(int x) {
            if (x + 100 < x)
                printf("overflow error\n");
            return x + 100;
        }
     
    The above is technically valid(ish), but the check relies on undefined behaviour; a programmer who is familiar with it would understand what is happening. On at least one compiler with optimizations enabled, though, the compiler doesn't "understand" it and optimizes the code so that the if statement never runs... thus allowing an overflow.
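     
    For contrast, here is a sketch of the same check written so that it doesn't rely on signed overflow (the overfloweg_safe name is just for illustration); because the comparison happens before any overflow can occur, the optimizer can't legally remove it:
     
        #include <limits.h>
        #include <stdio.h>
     
        /* Same intent, but no undefined behaviour to optimize away. */
        int overfloweg_safe(int x) {
            if (x > INT_MAX - 100) {
                printf("overflow error\n");
                return INT_MAX;   /* or signal the error some other way */
            }
            return x + 100;
        }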
     
    Humans and AI are fallible, but putting the two together under the right conditions can be a wonderful thing.
     
    I don't have to spend hours typing out the skeleton/basic files that I need to create for a project. I just spend 5 minutes with the AI to spit out all that work and then 5 minutes to verify what it did was correct enough, and continue with the portions I know it can't handle or wouldn't do properly. It saves lots of time. The way I would put it, code written by AI allows the developer to focus on the more important aspects of their work.
  13. Like
    LAwLz reacted to TatamiMatt in At a bit of a loss   
    @jaslion @Blasty Blosty @venomtail @LAwLz @Somerandomtechyboi
     
    Just to update anyone who's interested in this, I ended up relaying this info (he has already bought the game on Steam but hasn't been able to run it, so his play time is under 2 hours and it can still be refunded).


     
    Thanks for the help, and let me know if you have anything else you think would be good to add!
     
    (Marking a solution on these posts is always a pain, everyone's been helpful 😅)
  14. Agree
    LAwLz got a reaction from mr moose in Long, but excellent, article by Steven Sinofsky (in charge of Windows when it went through the EU wringer) on Apple's DMA compliance   
    Sinofsky is a massive twat. I am not surprised that he is strongly against the DMA, a piece of legislation trying to keep giant tech companies from abusing their positions of power to lock users into their ecosystems. That is exactly what Sinofsky was trying to push (and in some regards succeeded) when he was at Microsoft.
     
    I also think he is being a bit silly or maybe disingenuous when he says the legislation is "clearly aimed at specific US companies" and then goes ahead and lists companies like Samsung, ByteDance, Alibaba, AliExpress, Booking.com, and Zalando as also being affected. Is he aware that those companies aren't American? I mean, it is Sinofsky we're talking about so I wouldn't be surprised if he doesn't know Samsung isn't from the US...
     
    Maybe the issue isn't that "the legislation is aimed at US companies because the EU is evil and wants to harm America!" but rather that "a lot of the big companies that are abusing their power are from the US"?
     
     
    There is so much bullshit in this article it's not even funny.
    Things like claiming Apple has never abused their position of power and that no consumer has been harmed by the way Apple acts. I would argue that the 30% cut Apple takes is an abuse of their position. Especially since they forbid developers from telling users about, for example, cheaper rates on their own website. Telling developers "you are not allowed to tell your users that they can subscribe to the service without also paying us, Apple, and we will take away everything from you if you do" is not exactly a friendly and non-abusive way of handling your users or developers. Sinofsky might think that's not abusive because he looks up to Apple a lot and wanted Microsoft to be like Apple, but if he is going to claim that's perfectly fine, good and not an abuse of power then he is in my eyes a dumbass.
  15. Like
    LAwLz got a reaction from TatamiMatt in At a bit of a loss   
    "1080p60fps" doesn't really tell us much.
    There is a big difference between 1080p60fps when using upscaling and running at lowest detail settings, and if you are running at native resolution with everything maxed out. It's also a big difference if we are talking about average FPS or minimum FPS.
     
    The recommended specs for 1080p 60 FPS at medium settings are an Nvidia RTX 2060 or AMD RX 6600 XT with a Ryzen 7 3700X and 16GB of RAM.

    Judging by forum posts I've seen, it seems like a Ryzen 5 3600 with an RTX 2060 is enough for 50-60 FPS with some settings turned down at 1440p.
    Assuming that's okay for you, a brand new PC ends up being about 650-700 dollars. Here is a build I quickly threw together as an example:
    https://pcpartpicker.com/list/HmCW89
    Please note that that's something I quickly threw together. There may be room for improvements.
     
    You could also try and reuse some parts to get away cheaper. The PSU and the case might be reusable. That would save 110 dollars from my build.
    It might be possible to reuse the motherboard, RAM and CPU as well, although at this point I don't think that would be a great idea. They are quite old and would potentially hold you back.
  16. Like
    LAwLz got a reaction from porina in NVIDIA CEO Jen-Hsun Huang: Don't learn computer science. The future is human language (AI code generation)   
    Two thoughts I had when I saw this video, and I am a bit scared that nobody else has pointed this out.
    1) The video is super heavily edited. I am not even sure this is what he said because it is so heavily edited. Does anyone have the original video? In that 1-minute video I counted something like 18 cuts. That the large number of cuts doesn't ring an alarm bell for anyone else is, in my opinion, a pretty big alarm bell in itself. Are people just so used to jump cuts that they don't notice them? Each and every cut could be taking out important context that might add nuance, or it could be stitching together words to form sentences that were never even said to begin with.
     
    2) Is he really saying "don't learn computer science"? 
    The way I interpret what was said is that while we used to hear people say "everyone should learn how to program", we are quickly moving towards a world where few people will need to learn how to code. That we should aim to make tools that let everyone "program" without knowing a programming language. 
    That people who used to need to learn programming to make their computers do something that was actually related to something else (like biology) will hopefully be able to just focus on the biology part in the future. Instead of needing to learn how to code because they need to write a program to check some protein reaction or whatever, they will hopefully in the future be able to just focus on learning the biology part and let the computer handle the "how to create a program to check this" part.
     
    That we are moving away from "everyone needs to learn programming because programming is a skill every job needs" to "you don't need to learn programming if that's not going to be your primary focus".
     
    I might be misinterpreting what he said, but I think that is a fairly logical explanation and view considering some particular word choices and context.
  17. Agree
    LAwLz got a reaction from Sauron in Long, but excellent, article by Steven Sinofsky (in charge of Windows when it went through the EU wringer) on Apple's DMA compliance   
    Sinofsky is a massive twat. I am not surprised that he is strongly against the DMA, a piece of legislation trying to keep giant tech companies from abusing their positions of power to lock users into their ecosystems. That is exactly what Sinofsky was trying to push (and in some regards succeeded) when he was at Microsoft.
     
    I also think he is being a bit silly or maybe disingenuous when he says the legislation is "clearly aimed at specific US companies" and then goes ahead and lists companies like Samsung, ByteDance, Alibaba, AliExpress, Booking.com, and Zalando as also being affected. Is he aware that those companies aren't American? I mean, it is Sinofsky we're talking about so I wouldn't be surprised if he doesn't know Samsung isn't from the US...
     
    Maybe the issue isn't that "the legislation is aimed at US companies because the EU is evil and wants to harm America!" but rather that "a lot of the big companies that are abusing their power are from the US"?
     
     
    There is so much bullshit in this article it's not even funny.
    Things like claiming Apple has never abused their position of power and that no consumer has been harmed by the way Apple acts. I would argue that the 30% cut Apple takes is an abuse of their position. Especially since they forbid developers from telling users about, for example, cheaper rates on their own website. Telling developers "you are not allowed to tell your users that they can subscribe to the service without also paying us, Apple, and we will take away everything from you if you do" is not exactly a friendly and non-abusive way of handling your users or developers. Sinofsky might think that's not abusive because he looks up to Apple a lot and wanted Microsoft to be like Apple, but if he is going to claim that's perfectly fine, good and not an abuse of power then he is in my eyes a dumbass.
  18. Agree
    LAwLz got a reaction from Blasty Blosty in TV brain= hacked firestick?   
    It's most likely next to impossible. I kind of doubt the ribbon cable just sends a regular HDMI signal to the display. These things are typically quite "non-standard". 
     
    If you were to make it work, you'd probably need to solve far more difficult challenges than getting it powered on. 
    Since a lot of things will probably be very specific to your particular model of TV, it's most likely too much to ask someone to help you. It would just require way too much time and effort. 
  19. Agree
    LAwLz reacted to RejZoR in Apple will allow iOS app downloads directly from websites in the EU   
    I mean, Samsung has Auto Blocker, which blocks everything that isn't authorized and official. If you want pure security you can enable it and it will refuse to allow any sideloading entirely. But you can also choose not to, if you want to sideload. Why can't Apple just do that instead of always being so stupidly pedantic about "their ways"?
  20. Agree
    LAwLz reacted to BrandonTech.05 in Nvidia now offering day passes for $4 and $8. But at least we got G-Sync and Reflex right? Right?   
    No? It's just a marketing strategy. There is no brain hijacking and there are no dark patterns. Respectfully, in this case you are using those terms wrong. They aren't forcing you to use this product, and if you think it's a bad deal then it's not for you and you can just not buy it. 
  21. Agree
    LAwLz got a reaction from 8tg in Looking for advice on weight loss   
    That sounds great! Losing weight is really hard and if you have been at it for 5 months then you are doing a fantastic job.
    Although 1000 calories, if accurate, seems too low to me. It's important not to starve yourself too much. Not really because of the "starvation mode" (more on that later) but because you will feel really bad and it may have other health complications.
     
     
    "Starvation mode" is a real thing, but it is not really what people typically assume or describe it as.
    At the end of the day, calories in and calories out is all that matters. If you consume fewer calories than you expend then you will lose weight. Anything else would be breaking the laws of thermodynamics.
     
    "Starvation mode" is not the reason why people typically stop losing weight or recommend against doing extreme diets. The reason why those diets can result in you not losing weight is because the temptation to eat a lot of food becomes too great and you feel like shit. When your body doesn't get enough energy it will respond by making you feel bad in an attempt to make you eat more. A lot of people who say they stop losing weight are simply eating more because the cravings become too big. They don't actually enter the "starvation mode" we often hear about.
    Starvation mode is a real thing that has been observed, but it only occurs in extreme situations. One study I found was about soldiers during World War 2 who expended over 3,000 calories a day and ate around 1,800 calories a day, and they didn't enter "starvation mode" until they were at around 5% body fat. 
     
    In your case there are a few likely scenarios for why you haven't lost weight which are unrelated to starvation mode.
    It might be that you have been eating more even though you don't think you have. There have been studies where cameras were placed around people's houses, and what was observed was that people ate a lot more than they thought, and then simply didn't remember it.
    Another explanation could be that your exercises have prompted your body to start building muscles. 
     
    Depending on how overweight you are, it might be a good idea to start taking measurements instead of strictly looking at the scale. The scale only tells you weight, but measurements will also give you an idea of progress in terms of fat vs muscle mass.
    It might just be, as you suggested, that you are gaining weight in terms of muscle and losing weight in terms of fat. It's really hard for us to tell, and it might even be hard for you to tell if you are just using a scale to track progress.
     
     
     
    This is probably awful advice that could potentially be dangerous.
     
    Calories mean everything when it comes to losing weight.
    If you eat fewer calories than you use then you will lose weight, regardless of whether those calories come from Big Macs or vegetables and chicken.
     
    It's not recommended to eat Big Macs and other fast food stuff for various reasons, but the idea that their calories are "worse" for the purpose of losing weight is not true.
    You should avoid them because they are very calorie-dense and thus won't make you feel as full as the equivalent amount of calories from "better foods". It will also be hard to fit in the other things that are important to eat to feel good without exceeding your calorie budget, but that's it. It's not that 1500 calories of fast food will make you more overweight than 1500 calories of vegetables. They will provide your body with just as much energy as each other, and fat is stored energy.
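     
    To illustrate with some made-up intake/expenditure numbers (and the commonly cited rough figure of about 7,700 kcal stored per kg of body fat), here is a small sketch; notice the arithmetic never asks where the calories came from:
     
        #include <stdio.h>
     
        /* Rough energy-balance illustration with assumed example numbers. */
        int main(void) {
            double intake = 1500.0;          /* kcal eaten per day (example)   */
            double expenditure = 2200.0;     /* kcal burned per day (example)  */
            double kcal_per_kg_fat = 7700.0; /* rough, commonly cited figure   */
     
            double daily_deficit = expenditure - intake;
            double weekly_loss_kg = daily_deficit * 7.0 / kcal_per_kg_fat;
     
            printf("Deficit: %.0f kcal/day -> about %.2f kg of fat per week\n",
                   daily_deficit, weekly_loss_kg);
            return 0;
        }
     
    Whether that 1500 kcal came from Big Macs or from chicken and vegetables changes nothing in that calculation; it only changes how full you are likely to feel while doing it.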
     
     
    It sounds to me like you are blindly recommending your diet and workout routine to OP without even considering if they are in the same situation as you. Diet and training need to be tailored to the specific person based on their goals. If you want to bulk up and gain weight in muscles then your diet will look very different from someone who wants to lose a bunch of weight in terms of fat. 
  22. Agree
    LAwLz got a reaction from saintlouisbagels in Getting 9 year old to play games other than Roblox?   
    So if I understand correctly, she wants to play Roblox with you but you don't want to play that game. 
    So now you're asking for help finding games that both of you might enjoy? 
     
     
     
    By the way, I think you should be more careful about how you word your posts. Calling your daughter slurs, being demeaning towards the things she likes and such is not exactly a good way to come across as "dad of the year". I think that's why a lot of people seem to react a certain way towards your posts. 
  23. Agree
    LAwLz reacted to porina in Bundesnetzagentur requires provider to supply telecommunications services (Germany) 10 mbit download 1.7 mbit upload   
    A bit more context would help. It sounds like this is a property that is relatively detached so there is no existing infrastructure reaching it, even if there is some nearby. If there isn't an explicit universal access requirement, then costs can be high. It sounds like this is a push towards something like universal access.
     
    The minimum requirements listed might seem low, but for remote locations it could potentially be provided over the mobile network or even LEO satellite, pricing aside.
  24. Agree
    LAwLz got a reaction from wanderingfool2 in Looking for advice on weight loss   
    That sounds great! Losing weight is really hard and if you have been at it for 5 months then you are doing a fantastic job.
    Although 1000 calories, if accurate, seems too low to me. It's important not to starve yourself too much. Not really because of the "starvation mode" (more on that later) but because you will feel really bad and it may have other health complications.
     
     
    "Starvation mode" is a real thing, but it is not really what people typically assume or describe it as.
    At the end of the day, calories in and calories out is all that matters. If you consume fewer calories than you expend then you will lose weight. Anything else would be breaking the laws of thermodynamics.
     
    "Starvation mode" is not the reason why people typically stop losing weight or recommend against doing extreme diets. The reason why those diets can result in you not losing weight is because the temptation to eat a lot of food becomes too great and you feel like shit. When your body doesn't get enough energy it will respond by making you feel bad in an attempt to make you eat more. A lot of people who say they stop losing weight are simply eating more because the cravings become too big. They don't actually enter the "starvation mode" we often hear about.
    Starvation mode is a real thing that has been observed, but it only occurs in extreme situations. One study I found was about soldiers during World War 2 who expended over 3,000 calories a day and ate around 1,800 calories a day, and they didn't enter "starvation mode" until they were at around 5% body fat. 
     
    In your case there are a few likely scenarios for why you haven't lost weight which are unrelated to starvation mode.
    It might be that you have been eating more even though you don't think you have. There have been studies where cameras were placed around people's houses, and what was observed was that people ate a lot more than they thought, and then simply didn't remember it.
    Another explanation could be that your exercises have prompted your body to start building muscles. 
     
    Depending on how overweight you are, it might be a good idea to start taking measurements instead of strictly looking at the scale. The scale only tells you weight, but measurements will also give you an idea of progress in terms of fat vs muscle mass.
    It might just be, as you suggested, that you are gaining weight in terms of muscle and losing weight in terms of fat. It's really hard for us to tell, and it might even be hard for you to tell if you are just using a scale to track progress.
     
     
     
    This is probably awful advice that could potentially be dangerous.
     
    Calories mean everything when it comes to losing weight.
    If you eat fewer calories than you use then you will lose weight, regardless of whether those calories come from Big Macs or vegetables and chicken.
     
    It's not recommended to eat Big Macs and other fast food stuff for various reasons, but the idea that their calories are "worse" for the purpose of losing weight is not true.
    You should avoid them because they are very calorie-dense and thus won't make you feel as full as the equivalent amount of calories from "better foods". It will also be hard to fit in the other things that are important to eat to feel good without exceeding your calorie budget, but that's it. It's not that 1500 calories of fast food will make you more overweight than 1500 calories of vegetables. They will provide your body with just as much energy as each other, and fat is stored energy.
     
     
    It sounds to me like you are blindly recommending your diet and workout routine to OP without even considering if they are in the same situation as you. Diet and training need to be tailored to the specific person based on their goals. If you want to bulk up and gain weight in muscles then your diet will look very different from someone who wants to lose a bunch of weight in terms of fat. 
  25. Like
    LAwLz reacted to Stahlmann in Looking for advice on weight loss   
    So far, I haven't had any negative effects like feeling hungry, lack of energy, etc. I'm generally happier with myself than before, probably because of the amount of weight I've lost and the muscle I've already built up. Not to mention the compliments I get from colleagues and family, which no doubt helps to keep me motivated.
     
    Interesting, so I can basically rule that out since I still have more than enough body fat. I figured it probably wasn't happening because I don't have any of the typical symptoms.
     
    I try not to buy snacks in the first place, so I'm pretty sure this isn't a problem since I don't have much food at home other than my cooking ingredients. And if I do have a snack in the evening, I will still roughly count the calories to try to stay within 1500 kcal for the day.
     
    That's probably what's happening because I'm seeing significant muscle growth in the mirror. I just didn't think I'd be at a point where it would be at a similar rate as fat loss.
     
    Makes sense. It seems I'm at a weight now where I can't see my progress by just looking at the number on the scale. So, I'll basically continue as usual, but take more measurements besides just body weight and see where it goes in the coming weeks.