Verizon's Customer Conversations LEAKED!

A flaw in the chat system on Verizon's website has reportedly been leaking customers' information, potentially for months. Addresses, phone numbers, account numbers, and other personal/private information were exposed.

 

Verizon's response: “We’re looking into an issue involving our online chat system that assists individuals who are checking on the availability of Fios services. We believe a small number of users may have seen a name, phone number, and/or a home or building address from an unrelated individual who had previously used this chat system to enter that information. Since the issue was brought to our attention, we’ve identified and isolated the problem and are working to have it resolved as quickly as possible.”

 

TL;DR

  • Flaw discovered in the chat system on Verizon's website
  • The flaw exposed personal/private information (addresses, phone numbers, account numbers, etc.)
  • Check your security/profile & change your password just in case

 

Article: https://hotforsecurity.bitdefender.com/blog/verizon-leaks-customer-conversations-personal-data-through-flawed-chat-window-on-its-website-24708.html


Can someone move this post to "Tech News"?

Accidentally posted this in the wrong section... my bad :/


Sounds like caching to me. The downside to web caching is that when it goes wrong the results can be highly interesting, and these issues are also very hard to pick up without relying on users reporting them.

 

You can engineer very good security into your web application, and all that effort can be undermined by a simple front-end caching issue that pulls previous session data into a new/different session.
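To make the failure mode concrete, here is a minimal hypothetical sketch (Node/TypeScript; the route, header, and names are all made up, not Verizon's code) of a response cache keyed only by URL, so one user's personalized reply gets stored and replayed to the next visitor:

```typescript
import * as http from "http";

// In-memory response cache keyed only by URL -- this is the bug: nothing
// about the key identifies *whose* session the cached body belongs to.
const responseCache = new Map<string, string>();

http
  .createServer((req, res) => {
    const user = req.headers["x-session-user"] ?? "anonymous"; // hypothetical header
    const key = req.url ?? "/"; // should also include a session/user scope
    let body = responseCache.get(key);
    if (body === undefined) {
      body = `Chat transcript for ${user}`; // personalized content
      responseCache.set(key, body); // cached without any session scoping
    }
    // The next visitor to the same URL now receives the first user's data.
    res.end(body);
  })
  .listen(8080);
```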


18 minutes ago, leadeater said:

Sounds like caching to me. The downside to web caching is that when it goes wrong the results can be highly interesting, and these issues are also very hard to pick up without relying on users reporting them.

 

You can engineer very good security into your web application, and all that effort can be undermined by a simple front-end caching issue that pulls previous session data into a new/different session.

Aren't those things supposed to be encrypted?



25 minutes ago, like_ooh_ahh said:

Aren't those things supposed to be encrypted?

Doesn't matter, it's a server-side issue with server-side caching. If it goes wrong and feels like giving you someone else's data, well, RIP, you're going to see it lol. It's happened to us: I can't remember which way round, but either Prod or Dev got pointed at the wrong instance of Redis caching, because a file restore brought back a configuration file which is itself a cached configuration file holding the Redis server connection settings. People started seeing all kinds of weird stuff.

 

I believe it was the Dev environment being pointed at the Prod Redis, most likely given the setup. The correct procedure during a restore is to delete that cache file, so that on process startup the application reads the stored configuration and creates a new cached configuration file for the Redis connection settings; if the file already exists, it just reads it. Refreshing the Dev environment is just restoring a copy of Prod (actually it's a FlexClone, but for the sake of easy understanding think of it as a restore), but you need to do other things after that, like change the database connection settings, change the Redis settings, and delete that file.
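For anyone curious, that cached-configuration pattern looks roughly like this (a hypothetical TypeScript sketch; the file paths and names are invented for illustration, not our actual setup):

```typescript
import * as fs from "fs";

// Hypothetical paths for illustration only.
const CACHE_FILE = "/var/app/redis-conn.cache.json";
const STORED_CONFIG = "/var/app/config.json";

interface RedisSettings {
  host: string;
  port: number;
}

function loadRedisSettings(): RedisSettings {
  if (fs.existsSync(CACHE_FILE)) {
    // Fast path: trust the cached settings as-is. After a restore/clone this
    // file can still point at the *source* environment's Redis (the bug).
    return JSON.parse(fs.readFileSync(CACHE_FILE, "utf8"));
  }
  // Slow path: read the authoritative stored configuration, then cache it.
  const settings: RedisSettings = JSON.parse(
    fs.readFileSync(STORED_CONFIG, "utf8")
  ).redis;
  fs.writeFileSync(CACHE_FILE, JSON.stringify(settings));
  return settings;
}

// The correct post-restore/refresh step: delete the cache file so the next
// startup rebuilds it from the (already corrected) stored configuration.
function afterEnvironmentRefresh(): void {
  if (fs.existsSync(CACHE_FILE)) fs.unlinkSync(CACHE_FILE);
}
```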


8 minutes ago, gabrielcarvfer said:

Just like Steam a few years ago. That's what you get for caching dynamically rendered stuff instead of rendering/serving static pages and refreshing them as necessary. =/

Well, it can depend on what it is. Going back to the source data or generating the required content can be expensive, so scaling out resources to do that can be cost-prohibitive or not even technically feasible; that is why caching came to be, to help solve those types of issues. A lot of things just need a complete redesign too, but most things were created and maintained before many of today's options were possible or ready to use, and people hate reinventing the wheel when the one they have is round enough and works.
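The basic pattern being defended here is cache-aside: do the expensive work once and serve from the cache until the entry expires. A minimal sketch (the renderAvailabilityPage function and TTL are hypothetical stand-ins):

```typescript
// Tiny in-memory cache with a TTL; in practice this would be Redis, a CDN,
// memcached, etc.
const pageCache = new Map<string, { value: string; expires: number }>();

// Hypothetical stand-in for an expensive backend lookup or render.
async function renderAvailabilityPage(address: string): Promise<string> {
  return `Fios availability for ${address}`;
}

async function getPage(address: string): Promise<string> {
  const hit = pageCache.get(address);
  if (hit && hit.expires > Date.now()) {
    return hit.value; // cheap path: serve the cached copy
  }
  const value = await renderAvailabilityPage(address); // expensive path
  // Note the key is the user-specific input, so one user's result is never
  // served for another user's request -- the scoping the chat cache lacked.
  pageCache.set(address, { value, expires: Date.now() + 60_000 }); // 60s TTL
  return value;
}
```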

 

But honestly, web design and web applications are just pure hell, which is why I stay as far away from them as possible. They're the best example of a spaghetti mess of standards, with literally everyone simultaneously ignoring them and doing their own thing.


16 hours ago, leadeater said:

 

But honestly, web design and web applications are just pure hell, which is why I stay as far away from them as possible. They're the best example of a spaghetti mess of standards, with literally everyone simultaneously ignoring them and doing their own thing.

Most HTML5 "web 2.0" stuff became pure garbage once jQuery entered the scene.

 

Like you have layers of stuff people don't understand built on top of layers of bugs. Some of jQuery's functionality eventually found its way into the browser, so "responsive" web pages became a thing, but at the same time, things like "SSL everywhere" completely obliterated the ability to cache anything. You send an encrypted page once, and then you send micro-updates via JSON to minimize the overhead on the server and the browser, but that comes at a great cost to the client device.
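That update pattern looks something like this in the browser (a hypothetical sketch; the endpoint and payload shape are made up for illustration):

```typescript
// The HTML shell was served once (and is cacheable); afterwards only small
// JSON deltas cross the wire, shifting the rendering work onto the client.
interface Delta {
  id: string;   // DOM element to update
  text: string; // new content for that element
}

async function pollUpdates(): Promise<void> {
  const res = await fetch("/api/updates.json", { cache: "no-store" });
  const deltas: Delta[] = await res.json();
  for (const delta of deltas) {
    const el = document.getElementById(delta.id);
    if (el) el.textContent = delta.text; // the client does the re-rendering
  }
}

// Small, frequent updates instead of re-sending whole pages; cheap for the
// server, but the rendering cost now lands on the client device.
setInterval(pollUpdates, 5_000);
```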

 

If your server serves nothing but static pages (and I still maintain a website that is like this), the site is super fast, except where the ads drive the performance into the toilet. I've only ever found one (also static) site that was faster than the ones I was running, and it was only faster because it was literally hosted in my city, unlike the sites I was managing, which were out in California.

 


On 12/28/2020 at 6:36 AM, leadeater said:

Sounds like caching to me. The downside to web caching is that when it goes wrong the results can be highly interesting, and these issues are also very hard to pick up without relying on users reporting them.

 

You can engineer very good security into your web application, and all that effort can be undermined by a simple front-end caching issue that pulls previous session data into a new/different session.

You'd hope so, since the alternative would indicate some VERY bad programming.


