
LAwLz

Member
  • Posts

    19,151
  • Joined

  • Last visited


About LAwLz

  • Birthday Feb 11, 1993

Contact Methods

  • Discord
    LAwLz#8319

Profile Information

  • Gender
    Female
  • Location
    Sweden
  • Interests
    Anime/manga, networks, some gaming, tabletop RPGs and posting on forums.
  • Occupation
    Consultant (networking)
  • Member title
    (´・ω・`)

System

  • CPU
    Intel i5-13600K
  • Motherboard
    Gigabyte Z690 UD (DDR5)
  • RAM
    ADATA XPG 32GB DDR5 5200MHz CL38
  • GPU
    Gigabyte RTX 3070
  • Case
    Fractal Design Define R5
  • Storage
    2TB Samsung 970 EVO Plus
    4TB Crucial P3 Plus
  • PSU
    Corsair RM750X
  • Display(s)
    Samsung C49RG9x
  • Cooling
    Noctua D15
  • Keyboard
    Corsair K95 (Brown switches)
  • Mouse
    Logitech G502
  • Sound
    Sennheiser HD650 - FiiO E9
  • Operating System
    Windows 11
  • Laptop
    Lenovo ThinkPad X1 Carbon Gen 10
  • Phone
    Samsung Galaxy S22 Ultra

Recent Profile Visitors

47,985 profile views
  1. Holy shit, Nvidia is killing it right now at their GTC.

    Blackwell looks to be a monster in terms of AI performance. If these numbers are correct, then it is insane.

     

     

    It will probably become Nvidia's most successful product launch ever.

    1. Agall

      Agall

      I have a feeling the GeForce side of things is going to get sidelined, especially in rasterization.

    2. LAwLz

      LAwLz

      16 minutes ago, Agall said:

      I have a feeling the GeForce side of things is going to get sidelined, especially in rasterization.

      I too feel like that is a risk. Hopefully it doesn't happen, and I suspect that Nvidia will still bring out some major gaming improvements for the 50 series.

       

      On the bright side, even in the worst case scenario where they mostly ignore the gaming market (IMO unlikely), it leaves a big opportunity for AMD to take back some market share. AMD seems mostly focused on trying to take AI customers from Nvidia though, so the absolute worst case scenario for gamers is that neither company makes big gaming improvements next gen.

  2. Beeper, the app that was in the news recently for having iMessage support on Android, has just released a preview version of their new "Beeper Universal Chat". 

     

    I just want to warn everyone not to use this, though. Why? Because it works by acting as a relay. Essentially, it signs in to your accounts on their servers and then sends and receives messages for you. The app they released then connects to their server, where your messages are stored.

     

    This is a really big security risk for several reasons. Not only is signing in to your accounts on someone else's server a bad idea, it also breaks the end-to-end encryption some of the chat services use. There is a small sketch of why further down in this thread.

    1. Senzelian

      Senzelian

      Who uses more than one messenger app anyway? Especially now that the EU has said that messenger apps have to be interoperable.

    2. Eigenvektor

      Eigenvektor

      That sounds like a bad idea on so many levels.

       

      Even if you never exchange any type of private information via chat, even if they never sell it, even if they never get breached, you still risk losing your account by breaking Apple's ToS.

    3. LAwLz

      LAwLz

      Since I forgot to include a source, here it is:

       

      Quote

      How does Beeper Cloud connect to encrypted chat networks like iMessage/Signal/WhatsApp?

      When sending and receiving Signal, iMessage and WhatsApp messages, Beeper Cloud's web service acts as a relay. For example, if you send a message from Beeper to a friend on WhatsApp, the message is encrypted on your Beeper Cloud client, sent to the Beeper Cloud web service, which decrypts and re-encrypts the message with WhatsApp's proprietary encryption protocol.

      ⚠️ Using native chat apps independently may be more secure than connecting to other encrypted chat networks with Beeper Cloud.⚠️
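
      To make the quoted relay model concrete, here is a minimal sketch in Python. It is not Beeper's actual code; the keys, names, and message are made up purely for illustration, and Fernet stands in for whatever protocols the real services use. The point is that a relay which decrypts and re-encrypts necessarily sees the plaintext, whereas true end-to-end encryption keeps the key on the two endpoints only.

      # Minimal illustration (not Beeper's actual code) of end-to-end encryption
      # versus a decrypt-and-re-encrypt relay. Requires the 'cryptography' package.
      from cryptography.fernet import Fernet

      # --- True end-to-end: only the two endpoints hold the key ---
      e2e_key = Fernet.generate_key()               # shared only between Alice and Bob
      alice, bob = Fernet(e2e_key), Fernet(e2e_key)
      ciphertext = alice.encrypt(b"meet at 7")
      # Anyone forwarding `ciphertext` cannot read it; only Bob can:
      assert bob.decrypt(ciphertext) == b"meet at 7"

      # --- Relay model (what the quoted FAQ describes) ---
      key_to_relay = Fernet.generate_key()          # sender <-> relay leg
      key_to_network = Fernet.generate_key()        # relay <-> chat network leg
      sender = Fernet(key_to_relay)
      relay_in, relay_out = Fernet(key_to_relay), Fernet(key_to_network)

      hop1 = sender.encrypt(b"meet at 7")           # protected only up to the relay
      plaintext_on_relay = relay_in.decrypt(hop1)   # the relay sees the plaintext here
      hop2 = relay_out.encrypt(plaintext_on_relay)  # re-encrypted for the next hop

      print(plaintext_on_relay)  # b'meet at 7' -- whoever runs the relay can read or log it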

       

  3. Pretty sure it's not a UWP app. I haven't heard of any issues from the client teams regarding the rollout of the new Teams. I've heard that some customers have had issues with the new Teams client, but we have successfully rolled it out for several customers. Just uninstall the old Teams client once you have verified that the new one works well.
  4. The creators have full control over when ads are shown, how many, and which type. If you see unskippable ads, it's because the creator has them enabled on the video. If you see midroll ads, it's because the creator has them enabled. At least that's the case with uploaded videos. It might be different for live streams, but I am 99% sure I've seen streams where someone says they need a toilet break, and then an ad shows up perfectly timed. I doubt that's a coincidence. It might also be that YouTube sprinkles in ads every once in a while by default if the creator doesn't manually add ads themselves.

    Edit: Found a source that confirms what I said. https://support.google.com/youtube/answer/7385599?hl=en

    If you think LTT live streams contain a lot of ads, it's because they (the creator) chose to include a lot of ads, because more ads = more money. You can place ads manually if you want, and you can let YouTube insert ads for you. If you do the latter, you have several options for how often ads should appear.
  5. This was added back a while ago (one or two Moment updates ago), although in a way that I think is worse than in Windows 10. The width of the taskbar button is based on the length of the window title, so you end up with varying widths, and programs with shorter window names end up smaller and harder to hit. I still use Explorer Patcher to get that particular feature back, but at least we have a decent option built into Windows now. It feels bad that it took them two years to partially fix something that wasn't broken until they fucked it up.

    I don't understand how someone can use Windows with the "always combine" feature. Do people not run multiple copies of the same program? It isn't the default so I assume people just put up with it.

    I don't really get the hate the new Teams client gets for how it was installed. I thought it was great that they let you run both apps alongside each other, especially in the beginning when certain features didn't work in the new client. Once you had everything working in the new client you could just uninstall the old one. It was only when you had both installed and accidentally clicked the old one that you ran into annoyances. I admit I ran into that a few times, but I prefer that over not being able to easily revert back.
  6. The output will still be human-readable code that someone can check. The actual process of generating the code can be viewed as a "black box", but neither the input (the words a human feeds it) nor the output (code written in programming languages) will be a black box. Maybe I am misunderstanding you, but it sounds to me like you think this would create a new programming language that humans wouldn't understand. That is not what Jensen is saying, nor is anyone else saying that.

    I meant "absolutely" as in "I could absolutely see myself using it in some scenarios". I would say it depends on what you are going to use the code for and how much you can test it. I think this is exactly what Jensen meant with his comment.
  7. It depends on when we are talking about and what the scenario is. Would I trust the first thing ChatGPT spits out today when I ask it to write the firmware for a pacemaker? No. Would I trust code written by ChatGPT to do a fairly mundane thing if I test it first? Absolutely. I think it is foolish to make a blanket statement like "I wouldn't trust code written by AI" because it depends. I don't think code written by humans is inherently safer to blindly trust. It also depends on which code generator we are talking about.

    Back in the 1940s and earlier, people were scared of trusting automatic elevators. They wanted a human to operate the engine that caused the elevator to go to a certain floor and to accelerate/brake. I am not saying we should blindly trust AI today just because people were scared of stupid stuff in the past, but it is important to understand that technology progresses, and what we perceive as dangerous and/or scary today might be completely safe and commonplace in the future. There are a lot of statements regarding future technology that sound very silly to us today. As a result, I am usually reluctant to make any absolute statements about the future progress of technology.
  8. Rumors have it that Qualcomm will launch a new Snapdragon 8s Gen 3 chip soon.

     

    The naming is extremely similar to the Snapdragon 8 Gen 3, which is their flagship SoC. The difference is the "s" after the 8.

     

     

    According to rumors, the s variant will be a slower version of the non-s chip.

    Not just lower clocks, but also a different core configuration (one less middle core, one more small core) and a smaller GPU.

     

     

    If it turns out to be true then I'd say Qualcomm is being quite deceptive. Not a fan.

    1. Lurick

      Lurick

      The s in the 8s is for slow and stupid 😄

      I agree though, very deceptive indeed

  9. "1080p60fps" doesn't really tell us much. There is a big difference between 1080p 60 FPS when using upscaling and the lowest detail settings, and 1080p 60 FPS at native resolution with everything maxed out. It also makes a big difference whether we are talking about average FPS or minimum FPS (see the quick sketch below this post).

    The recommended specs for 1080p 60 FPS at medium settings are an Nvidia RTX 2060 or AMD RX 6600 XT with a Ryzen 7 3700X and 16GB of RAM. Judging by forum posts I've read, it seems like a Ryzen 5 3600 with an RTX 2060 is enough for 50-60 FPS with some settings turned down at 1440p. Assuming that's okay for you, a brand new PC ends up being about 650-700 dollars. Here is an example of a build I quickly threw together: https://pcpartpicker.com/list/HmCW89

    Please note that that's something I quickly threw together, so there may be room for improvements. You could also try to reuse some parts to get away cheaper. The PSU and the case might be reusable, which would save 110 dollars from my build. It might be possible to reuse the motherboard, RAM and CPU as well, although at this point I don't think that would be a great idea. They are quite old and would potentially hold you back.
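
    To illustrate the average vs. minimum FPS point, here is a quick sketch. The frame times are made-up numbers purely for illustration, not benchmark data from the game in question:

    # Made-up frame times: a run can average ~60 FPS while its worst frames stutter badly.
    frame_times_ms = [15.0] * 95 + [50.0] * 5          # mostly smooth, with a few long frames

    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

    # "1% low": average FPS over the slowest 1% of frames, a common way to report minimums
    slowest = sorted(frame_times_ms, reverse=True)[:max(1, len(frame_times_ms) // 100)]
    low_1pct_fps = 1000 * len(slowest) / sum(slowest)

    print(f"average: {avg_fps:.0f} FPS, 1% low: {low_1pct_fps:.0f} FPS")
    # average: 60 FPS, 1% low: 20 FPS -- the average alone hides the stutter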
  10. Two thoughts I had when I saw this video, and I am a bit scared that nobody else has pointed this out.

    1) The video is super heavily edited. I am not even sure this is what he said because it is so heavily edited. Does anyone have the original video? In that one-minute video I counted something like 18 cuts. That the large number of cuts doesn't ring an alarm bell for anyone else is, in my opinion, a pretty big alarm bell in itself. Are people just so used to jump cuts that they don't notice them? Each and every cut could be taking out important context that might add nuance, or it could be stitching together words to form sentences that were never even said to begin with.

    2) Is he really saying "don't learn computer science"? The way I interpret what was said is that while we used to hear people say "everyone should learn how to program", we are quickly moving towards a world where few people will need to learn how to code, and that we should aim to make tools that let everyone "program" without knowing a programming language. People who used to need to learn programming to make their computers do something related to another field (like biology) will hopefully be able to just focus on the biology part in the future. Instead of needing to learn how to code because they need to write a program to check some protein reaction or whatever, they will hopefully be able to just focus on learning the biology and let the computer handle the "how to create a program to check this" part. We are moving away from "everyone needs to learn programming because programming is a skill every job needs" towards "you don't need to learn programming if that's not going to be your primary focus".

    I might be misinterpreting what he said, but I think that is a fairly logical explanation and view considering some particular word choices and context.
  11. I am not sure Tim Cook is the most suitable tech CEO for Riley to impersonate. Maybe Bobby Kotick would be a better fit?
  12. Sinofsky is a massive twat. I am not surprised that he is strongly against the DMA, a piece of legislation trying to keep giant tech companies from abusing their positions of power to lock users into their ecosystems. That is exactly what Sinofsky was trying to push (and in some regards succeeded at) when he was at Microsoft.

    I also think he is being a bit silly, or maybe disingenuous, when he says the legislation is "clearly aimed at specific US companies" and then goes ahead and lists companies like Samsung, ByteDance, Alibaba, AliExpress, Booking.com, and Zalando as also being affected. Is he aware that those companies aren't American? I mean, it is Sinofsky we're talking about so I wouldn't be surprised if he doesn't know Samsung isn't from the US... Maybe the issue isn't that "the legislation is aimed at US companies because the EU is evil and wants to harm America!" but rather that a lot of the big companies abusing their power are from the US.

    There is so much bullshit in this article it's not even funny. Things like claiming Apple has never abused their position of power and that no consumer has been harmed by the way Apple acts. I would argue that the 30% cut Apple takes is an abuse of their position, especially since they forbid developers from telling users about, for example, cheaper rates on their website. Telling developers "you are not allowed to tell your users that they can subscribe to the service without also paying us, Apple, and we will take away everything from you if you do" is not exactly a friendly and non-abusive way of handling your users or developers. Sinofsky might think that's not abusive because he looks up to Apple a lot and wanted Microsoft to be like Apple, but if he is going to claim that's perfectly fine, good, and not an abuse of power, then he is in my eyes a dumbass.
  13. It's most likely next to impossible. I kind of doubt the ribbon cable just sends a regular HDMI signal to the display. These things are typically quite "non-standard". If you were to make it work, you'd probably need to solve far more difficult challenges than getting it powered on. Since a lot of things will probably be very specific to your particular model of TV, it's most likely too much to ask someone to help you. It would just require way too much time and effort.
  14. Nvidia will be hosting the 2024 GTC (GPU Technology Conference) event on March 18 at 1 PM PDT.

     

    Please remember that GTC is for enterprise products. Don't expect new RTX cards to launch at this event. It will most likely involve a lot of AI-related news, though we might see the Blackwell architecture show up in some datacenter GPUs.

     

     

  15. Except you weren't keeping it simple, because what you said was just straight up wrong. As I have said time and time again, the frequency does not matter at all. The 2.4GHz and 5GHz networks that are usually broadcast by typical consumer routers? They have full access to each other. Putting IoT devices on the 2.4GHz network and other devices on the 5GHz network does not provide any additional security, at all, because they are the same network. Judging by your comments in this thread I suspect your reasoning is wrong, and now you're giving advice to others that at best gives them a false sense of security while actually providing zero security benefits. At worst you might be doing that on top of making their network less functional and more limited.

    Excuse me, but... what are you on about? You are wording things very weirdly, but I assume you are talking about how having, let's say, an 802.11g device on a network capable of 802.11n will slow things down for all the wireless-N devices. Is that what you are talking about? Because that is completely unrelated to everything you have said so far. It has nothing to do with security, it has nothing to do with frequency, and it is not a reason for segmenting IoT devices onto 2.4GHz and other devices onto 5GHz.

    It's also not because the AP has to "constantly switch speeds" that things slow down... The switching of "speed" (as in, the protocol used) is instantaneous. The reason why older devices can slow down speeds for newer devices is that they require more airtime to transmit the same amount of data. Since Wi-Fi is a shared medium, everyone has to wait for the transmission to be over. The slower the device is, the longer the others have to wait (or rather, the more time slots it will be allocated to send the same amount of data as a faster device). Beacon frames are also sent at the lowest mandatory data rate, so they use up airtime too. The quick airtime math in the sketch below shows the effect.

    But again, that has nothing to do with the frequency it operates on. It is not a reason to keep IoT devices on a 2.4GHz network and other devices on a 5GHz network. It might be a reason to put them on different SSIDs, but that is unrelated to frequency, as I have said time and time again.
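
    Here is the quick airtime math mentioned above, as a back-of-the-envelope sketch. It ignores preambles, ACKs, and contention, and the PHY rates are just example numbers rather than measurements of any particular network:

    # Rough airtime sketch: how long a station occupies the shared channel to move
    # the same payload at different PHY rates. Ignores preambles/ACKs/contention.
    payload_megabits = 8  # 1 MB of data

    for label, rate_mbps in [("802.11g  @  54 Mb/s", 54),
                             ("802.11n  @ 150 Mb/s", 150),
                             ("802.11ac @ 433 Mb/s", 433)]:
        airtime_ms = payload_megabits / rate_mbps * 1000
        print(f"{label}: ~{airtime_ms:.0f} ms of airtime")

    # While the slow station transmits, every other station on that channel waits.
    # The penalty comes from the data rate and airtime used, not from which
    # frequency band (2.4GHz or 5GHz) the SSID happens to be on.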