Pictures Of A Woman On Toilet Leaked Online Via Roomba Vacuum

According to multiple news outlets, around 15 screenshot images taken by a Roomba robotic vacuum cleaner have been leaked on social media. The screenshots in question were acquired from owners in France, Germany, Spain, the US and Japan. The main concern is that the screenshots appear to include a woman on the toilet, and that they were leaked to private Facebook and Discord groups.

 

The main cause of the leak was gig workers in Venezuela who were tasked with labelling audio, photo and video data in order to train the AI for iRobot's Roomba vacuums.

 

iRobot has stated that the units were for testing and development purposes only and were given to staff who agreed to share their data. The main concern is that iRobot offloaded this data to low-cost contract workers who are more relaxed when it comes to legalities and security.

 

Summary

Leaked screenshots from Roomba test units have appeared on private social media groups, shared by low-cost contract workers.

 

Quotes

Quote

Images of a woman seated on the toilet with her shorts pulled down to her thighs, captured by a robot vacuum cleaner, were shared on private Facebook and Discord groups, as reported by MIT Technology Review.

Quote

The pictures were among 15 screenshots taken from recordings made by special development versions of the Roomba J7 series of robot vacuum cleaners in 2020.

Quote

The images were shared on social media by gig workers in Venezuela, whose job it is to label audio, photo and video data to train the company's artificial intelligence.

My thoughts

After seeing the whole Eufy scandal and their security issues, it has really come to a point where people need to open their eyes and actually look behind the machines at how these big tech corporations are handling your data. I know these were test devices, but the fact that iRobot offloaded this sensitive data to less secure contract workers really shows you the lack of care and due diligence companies have with your data. It's about time governments wake up, actually realise the dangers these companies pose, and hold them accountable for their actions.

 

Sources

https://ca.finance.yahoo.com/news/roomba-says-leaked-pictures-including-194856679.html?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&guce_referrer_sig=AQAAAKh819ZEOzGuwANvbZR7RgMrKE-gnjujeGSXUG5z9x66XgvhK1Oeaoz679v-3Ov-7vqO1rMX91qLaM_RDtfWKDWKEN12iZpKlLuxKhNoRDpo1GRBdtLG8FiGkKTj0Os9aY703nkXlTMwGr5xvTHd8cWkQG-Hg3Xp8YJaHl2-ZDd9

 

https://metro.co.uk/2022/12/22/intimate-photos-taken-by-roomba-vacuums-leaked-online-17983006/

 

https://www.dailymail.co.uk/news/article-11562599/Robot-vacuum-cleaner-took-photos-woman-toilet-images-ended-Facebook.html


31 minutes ago, V3ptur said:

It's about time governments wake up, actually realise the dangers these companies pose, and hold them accountable for their actions.

Sorry, but that is exactly what they want. In general it is easier to just buy data from a broker than to hack into your devices one by one. That being said, the real scandal, if you will, is that those devices probably listen to wifi and other signals and record your passwords if unchecked. So ask yourself what is worse: some pics of your behind on the internet that no one can identify you by, or all your login credentials shown to underpaid and morally flexible IT staff? If you want privacy, don't get "smart" devices. And start reading the EULAs of those devices. It is scary what you allow them to do and where you allow them to send that data.


40 minutes ago, V3ptur said:

The main cause of the leak was gig workers in Venezuela who were tasked with labelling audio, photo and video data in order to train the AI for iRobot's Roomba vacuums.

 

40 minutes ago, V3ptur said:

offloaded this sensitive data to less secure contract workers

So the roomba went somewhere it shouldn't have while someone was there. The workers were meant to label pictures. In human fashion they instead sent the pictures to other people, but the tech is what we blame here and not the people?

40 minutes ago, V3ptur said:

but the fact that iRobot offloaded this sensitive data to less secure contract workers really shows you the lack of care and due diligence companies have with your data. It's about time governments wake up, actually realise the dangers these companies pose, and hold them accountable for their actions.

Governments can make laws. People will break them. Companies can make contractors sign contracts saying they're handling confidential data and that the contractors can't share it, but people will still do what they want to do. 

 

I'd also like to point out a big pair of quotes that you seemed to skip over to make the story fit the way you wanted.

Quote

iRobot told the review that the development version of the Roombas were given to employees and paid contractors who signed contracts recognising that their data, including video, would be sent to the company for training purposes.

The company also said that the cleaning robots had a bright green 'video recording in progress' sticker, and it was the responsibility of those involved to 'remove anything they deem sensitive from any space the robot operates in, including children,' the review reported.

After the leaked data was flagged, iRobot CEO Colin Angle told the MIT Technology Review: 'iRobot is terminating its relationship with the service provider who leaked the images, is actively investigating the matter, and [is] taking measures to help prevent a similar leak by any service provider in the future.'

Quote

Even though the images did not come from Roomba customers, consumers often opt into getting their data monitored once they purchase "smart" devices as part of company privacy policies. Smart-device makers sometimes analyze the data, which can sometimes include personal or sensitive details, to train algorithms to improve their products.

 

James Baussmann, an iRobot spokesperson, confirmed the photo leak to Insider. When asked for further comment, Baussman referred Insider to a blog post from iRobot chairman and CEO Colin Angle.

 

Angle wrote that the test robots contained hardware and software modifications that were never available on the market to consumers. And Baussmann told Insider that "iRobot has strict data processing agreements in place with our service providers that require sensitive data be treated as confidential information."

But it's "technology bad", not "contractors who did sex crimes bad".

I'm not actually trying to be as grumpy as it seems.

I will find your mentions of Ikea or Gnome and I will /s post. 

Project Hot Box

CPU 13900k, Motherboard Gigabyte Aorus Elite AX, RAM CORSAIR Vengeance 4x16GB 5200MHz, GPU Zotac RTX 4090 Trinity OC, Case Fractal Pop Air XL, Storage Sabrent Rocket Q4 2TB, CORSAIR Force Series MP510 1920GB NVMe, CORSAIR Force Series MP510 960GB NVMe, PSU CORSAIR HX1000i, Cooling Corsair XC8 CPU block, Bykski GPU block, 360mm and 280mm radiators, Displays Odyssey G9, LG 34UC98-W 34-inch, Keyboard Mountain Everest Max, Mouse Mountain Makalu 67, Sound AT2035, Massdrop 6XX headphones, Go XLR

Oppbevaring

CPU i9-9900k, Motherboard, ASUS Rog Maximus Code XI, RAM, 48GB Corsair Vengeance LPX 32GB 3200 mhz (2x16)+(2x8) GPUs Asus ROG Strix 2070 8gb, PNY 1080, Nvidia 1080, Case Mining Frame, 2x Storage Samsung 860 Evo 500 GB, PSU Corsair RM1000x and RM850x, Cooling Asus Rog Ryuo 240 with Noctua NF-12 fans

 

Why is the 5800x so hot?

 

 


27 minutes ago, IkeaGnome said:

So the roomba went somewhere it shouldn't have while someone was there. The workers were meant to label pictures. In human fashion they instead sent the pictures to other people, but the tech is what we blame here and not the people?

What I would blame is iRobot making these images available to random gig workers without consulting the owners of the robot and having them review the images first... why does a fucking vacuum cleaner even need the ability to take and store pictures of your house? Even if the victim was part of some trial run, that's no good reason to just dump their pictures to a third party without any checks.

29 minutes ago, IkeaGnome said:

Governments can make laws. People will break them.

Damn, I guess murder might as well be legal then...

Don't ask to ask, just ask... please 🤨

sudo chmod -R 000 /*


14 minutes ago, Caroline said:

But sex crime lmao what, they did wrong but that's blowing it out of proportion.

Depends on where you live.

Quote

https://www.akleg.gov/basis/statutes.asp#11.61.123

(a) A person commits the crime of indecent viewing or production of a picture if the person knowingly
     (1) views, or views a picture of, the private exposure of the genitals, anus, or female breast of another person; or

     (2) produces a picture of the private exposure of the genitals, anus, or female breast of another person.

 (b) Each viewing of a person, and each production of a picture of a person, whose genitals, anus, or female breast are viewed or are shown in a picture constitutes a separate violation of this section.

 (c) This section does not apply to the viewing or production of a picture conducted by a law enforcement agency for a law enforcement purpose.

 (d) In a prosecution under this section, it is an affirmative defense that the viewing or production of a picture was conducted as a security surveillance system, notice of the viewing or production was posted, and any viewing or use of pictures produced is done only in the interest of crime prevention or prosecution.

 (e) In this section,
     (1) “picture” means a film, photograph, negative, slide, book, newspaper, or magazine, whether in print, electronic, magnetic, or digital format; and

     (2) “private exposure” means that a person has exposed the person's body or part of the body in a place, and under circumstances, that the person reasonably believed would not result in the person's body or body parts being viewed by the defendant or produced in a picture; “private exposure” does not include the exposure of a person's body or body parts in a law enforcement facility, correctional facility, treatment institution, designated treatment facility, juvenile treatment facility, or juvenile detention facility; in this paragraph,
          (A) “correctional facility” has the meaning given in AS 33.30.901;

          (B) “designated treatment facility” has the meaning given in AS 47.30.915;

          (C) “juvenile detention facility” and “juvenile treatment facility” have the meanings given in AS 47.12.990;

          (D) “treatment institution” has the meaning given in AS 47.14.990.

 (f) The provisions of this section do not apply to acts
     (1) that may reasonably be construed to be normal caretaker responsibilities for a child, interactions with a child, or affection for a child; or

     (2) performed for the purpose of administering a recognized and lawful form of treatment that is reasonably adapted to promoting the physical or mental health of the person being treated.

 (g) Indecent viewing or production of a picture is a
     (1) class B felony if the person violates (a)(2) of this section and the person shown in the picture was, at the time of the production of the picture, a minor;

     (2) class C felony if the person
          (A) violates (a)(1) of this section and the person viewed
               (i) was, at the time of the viewing, a minor; or

               (ii) in the picture was, at the time of the production of the picture, a minor; or

          (B) violates (a)(2) of this section and the person shown in the picture was, at the time of the production of the picture, an adult;

     (3) class A misdemeanor if the person violates (a)(1) of this section and the person viewed
          (A) was, at the time of the viewing, an adult; or

          (B) in the picture was, at the time of the production of the picture, an adult.

That's a felony where I live. 

12 minutes ago, Sauron said:

What I would blame is iRobot making these images available to random gig workers without consulting the owners of the robot and having them review the images first... why does a fucking vacuum cleaner even need the ability to take and store pictures of your house? Even if the victim was part of some trial run, that's no good reason to just dump their pictures to a third party without any checks.

Give the Daily Mail article a read.

https://www.dailymail.co.uk/news/article-11562599/Robot-vacuum-cleaner-took-photos-woman-toilet-images-ended-Facebook.html

Quote

They were taken by a development version of the cleaning robot and sent to Scale AI, a software start-up with contracted workers from across the globe.

Their job is to categorise data which is then used to train artificial intelligence, such as the robot vacuum. 

This helps them to more accurately map their surroundings in the home, avoid obstacles, understand the size of a room and tailor its cleaning pattern better.

Quote

iRobot told the review that the development version of the Roombas were given to employees and paid contractors who signed contracts recognising that their data, including video, would be sent to the company for training purposes.

The company also said that the cleaning robots had a bright green 'video recording in progress' sticker, and it was the responsibility of those involved to 'remove anything they deem sensitive from any space the robot operates in, including children,' the review reported.

They knew they were recording. The contractors mishandled that information.

 

12 minutes ago, Sauron said:

Damn, I guess murder might as well be legal then...

Not entirely what I said. We can make whatever we want illegal. People are still going to do what they want. That doesn't make it right, and it's still punishable, but it doesn't stop the act from happening, now does it?



14 minutes ago, IkeaGnome said:

They knew they were recording. The contractors mishandled that information.

That's not the point. They should have been able to review the information before it was sent off to whomever.

16 minutes ago, IkeaGnome said:

Not entirely what I said. We can make whatever we want illegal. People are still going to do what they want. That doesn't make it right, and it's still punishable, but it doesn't stop the act from happening, now does it?

If it is illegal and actively punished there's a chance it will also become unprofitable and companies will stop doing it. Regardless, a law not preventing every instance of someone breaking it isn't a good enough argument to say it shouldn't or doesn't need to exist. If it's unenforceable or has unintended negative consequences, then we can argue over whether it's worth the tradeoff...



2 hours ago, Sauron said:

why does a fucking vacuum cleaner even need the ability to take and store pictures of your house? 

VSLAM (visual simultaneous localisation and mapping) plus obstacle avoidance.


Why the fuck does a roomba need a camera that can take pictures?



7 hours ago, Arika S said:

Why the fuck does a roomba need a camera that can take pictures?

Answered that above. If you've ever had a vacuum robot, you know it can get tangled in wires, socks, towels, and other things people leave on the floor. In the case of animals, it can run over poop and spread it everywhere, which isn't any fun.

 

So they started including cameras to identify and avoid those obstacles. This usually works with an AI, and that's simply how it's done: manually tagging images for supervised training.
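The tagging-then-training pipeline described above can be sketched in a few lines. Everything here is purely illustrative (the label set, frame IDs and helper name are invented for the example), but it shows the shape of the work those gig labellers do: a human assigns a class to each camera frame, and the (frame, label) pairs become the supervised training set.

```python
# Illustrative sketch only: labels, frame IDs and the helper name are
# invented for this example. A human assigns a class to each camera
# frame; the resulting (frame, label) pairs are the training data.
from collections import Counter

LABELS = {"cable", "sock", "pet_waste", "furniture_leg", "clear_floor"}

def label_frame(frame_id: int, label: str) -> dict:
    """Record a single human annotation for one camera frame."""
    if label not in LABELS:
        raise ValueError(f"unknown label: {label}")
    return {"frame": frame_id, "label": label}

# A labeller tags a batch of frames; this dataset is what a model
# would later be trained on.
annotations = [
    label_frame(101, "sock"),
    label_frame(102, "cable"),
    label_frame(103, "clear_floor"),
]

# Class distribution: the kind of sanity check run before training.
counts = Counter(a["label"] for a in annotations)
print(counts["sock"])  # 1
```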


10 hours ago, Forbidden Wafer said:

Answered that above. If you've ever had a vacuum robot, you know it can get tangled in wires, socks, towels, and other things people leave on the floor. In the case of animals, it can run over poop and spread it everywhere, which isn't any fun.

 

So they started including cameras to identify and avoid those obstacles. This usually works with an AI, and that's simply how it's done: manually tagging images for supervised training.

Stuff lying on the floor can be more accurately identified and avoided with a one-dimensional lidar, without even requiring training data or guesswork to figure out how far away the object is. Do I really need to know exactly what is in the way in order to avoid it? No. I only need to know that it's there and where it is.



The Roomba was labelled as recording, and the people agreed to the recording. If the woman didn't sign anything about it, then it's not iRobot's fault; it would be on whoever let it roam around.

 

Haven't checked OP's links, but I posted about this last Wednesday as a Status Update because of the sheer clickbait titles, when it's clearly stated that these are development robots that will never be seen as consumer products and that the testers signed agreements about it.

Quote

iRobot confirmed that these images were captured by its Roombas in 2020. All of them came from “special development robots with hardware and software modifications that are not and never were present on iRobot consumer products for purchase,” the company said in a statement. They were given to “paid collectors and employees” who signed written agreements acknowledging that they were sending data streams, including video, back to the company for training purposes. According to iRobot, the devices were labeled with a bright green sticker that read “video recording in progress,” and it was up to those paid data collectors to “remove anything they deem sensitive from any space the robot operates in, including children.”

https://www.technologyreview.com/2022/12/19/1065306/roomba-irobot-robot-vacuums-artificial-intelligence-training-data-privacy/

CPU: AMD Ryzen 3700x / GPU: Asus Radeon RX 6750XT OC 12GB / RAM: Corsair Vengeance LPX 2x8GB DDR4-3200
MOBO: MSI B450m Gaming Plus / NVME: Corsair MP510 240GB / Case: TT Core v21 / PSU: Seasonic 750W / OS: Win 10 Pro


46 minutes ago, Sauron said:

Stuff lying on the floor can be more accurately identified and avoided with a 1 dimensional lidar... without even requiring training data or guesswork to figure out how far the object is. Do I really need to know exactly what is in the way in order to avoid it? No. I only need to know that it's there and its position.

Not reliable with cheap lidars. Some objects are thin enough that they won't be properly caught or identified correctly. I bought one with a top-mounted lidar thinking it wouldn't smash into my grandma's crystal cabinet, but it did, because of the cabinet's thin legs. And those are about 1.5cm thick.
 

 

