Showing results for tags 'user data'.



Found 11 results

  1. Australia-based AmazingCo Exposed User Data Through Unsecured Database
These days, incidents of firms leaking user data through unprotected databases are on the rise. Once again, a similar report surfaced online as the Australian firm AmazingCo exposed user data publicly. The exposed information included personally identifiable customer data.
AmazingCo Exposed User Data
According to Jeremiah Fowler of Security Discovery, AmazingCo exposed user data through an unsecured database; he shared details of his findings in a blog post. The researcher noticed an open Elastic database without a password that contained detailed customer records. Scratching the surface revealed that the database belonged to the Australian firm AmazingCo. The leaky database had a folder entitled ‘customers’ with 174,000 records. Regarding the type of information exposed, Fowler found 212,220 records in total, including many user names, emails, phone numbers, internal notes, and other sensitive details… IP addresses, ports, pathways, and storage info that cybercriminals could exploit to reach deeper into the network. The leaked data also included customer feedback tied to personal information. Each record was connected to the client’s real personally identifiable data, and the files also included internal notes on the clients, their events, and any challenges AmazingCo’s staff experienced. Most of the details linked back to children’s parties and wine tours. It remains unconfirmed how long the database leaked the information; based on the indexing date, the researcher assumes it may have stayed available for at least six to seven days.
Database Went Offline
Fowler discovered the unprotected database on May 11, 2019, after its indexation on May 6, 2019. Upon noticing the database, he quickly notified AmazingCo the same day. Two days later, on May 13, 2019, he confirmed that the database had gone offline. Although the company acted quickly to resolve the matter, it never actually replied to the researcher’s notifications. AmazingCo is an Australian event planning firm located in Melbourne. The company provides services for various parties, family gatherings, wine tours, etc., and also operates outside Australia, specifically in the USA and New Zealand. The firm claims to have “over 35,000 experiences delivered” involving more than 1 million organizers and attendees. Source
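The article describes an open Elastic (Elasticsearch) database reachable without a password. As a rough, hedged illustration of why that matters, the sketch below shows how anyone who can reach an unauthenticated Elasticsearch instance can list its indices and pull records over plain HTTP; the host, port, and index name here are hypothetical placeholders, not details from the incident.

# Minimal sketch: reading from a hypothetical unauthenticated Elasticsearch host.
# "example-db.local" and the index name "customers" are placeholders, not the
# real endpoint from the article. With no password set, the standard REST API
# answers these requests for anyone who can reach the port.
import requests

BASE = "http://example-db.local:9200"  # hypothetical exposed host

# List every index and its document count (the _cat API returns plain text).
print(requests.get(f"{BASE}/_cat/indices?v", timeout=10).text)

# Pull a few documents from a hypothetical "customers" index.
resp = requests.get(f"{BASE}/customers/_search", params={"size": 5}, timeout=10)
for hit in resp.json().get("hits", {}).get("hits", []):
    print(hit["_source"])  # each record: names, emails, internal notes, etc.

The point is not these particular calls but that a default, unauthenticated Elasticsearch deployment exposes its full REST API; putting it behind authentication, TLS, or a firewall closes exactly this kind of path.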
  2. An open database exposed at least 11 million photographs after the Theta360 photo sharing system run by Ricoh was breached. “The data breach exposed thousands of users’ photos, many of whom chose to keep their images private,” according to a blog post from vpnMentor, whose researchers, Noam Rotem and Ran Locar, discovered the database. “The breach did not expose users’ most personal information, but in many cases, we located their usernames, first and last names, and the captions they wrote in the exposed database.” While the researchers couldn’t directly access users’ social media accounts through the system, they said the exposed information included user names, usernames, each photo’s universally unique identifier (UUID), captions, and privacy settings. The UUIDs allowed access to any exposed photo, and in some cases the researchers could easily connect a username in the database to the user’s social media account. Rotem and Locar discovered the leak on May 14 and contacted Theta360 on May 15, receiving a response that same day. By May 16, Theta360 had closed the leak. “Exposing personal photos publicly is a major violation of customer privacy,” said Jonathan Bensen, CISO and senior director of product management at Balbix, giving Ricoh the nod for taking immediate action but noting that “organizations should not be relying on third-party researchers to detect this kind of vulnerability.” Bensen added that it’s impossible for humans alone to monitor all assets that may be vulnerable to attack or exposure, but machine learning and artificial intelligence tools can—and should—be leveraged by organizations to continuously monitor for risk and vulnerabilities. Source
  3. Facebook sues Ukrainian browser extension makers for scraping user data Facebook said the malicious extensions were installed by more than 63,000 users. [Image: Funnytest.pro, one of the sites cited in the Facebook civil complaint. Credit: ZDNet] Facebook has filed a suit against two Ukrainian developers for creating Facebook apps and browser extensions that harvested user data and injected ads into users' timelines. The two developers cited in the lawsuit Facebook filed late Friday, March 8, are named Gleb Sluchevsky and Andrey Gorbachov, both based out of Kiev and working for a company called the Web Sun Group. According to court documents, Sluchevsky and Gorbachov ran at least four web apps that provided quizzes on various topics. The web apps were advertised and shared on Facebook, but they were hosted on a multitude of third-party websites such as megatest.online, supertest.name, testsuper.su, testsuper.net, fquiz.com, and funnytest.pro. Named "Supertest," "FQuiz," "Megatest," and "Pechenka," the web apps were mainly advertised toward Russian- and Ukrainian-speaking audiences, and enticed users with themes such as "Do you have royal blood?", "You are yin. Who is your yang?" and "What kind of dog are you according to your zodiac sign?", among many others. Sluchevsky and Gorbachov ran their scheme between 2016 and 2018, Facebook said. Once users landed on these sites, they'd be prompted to enable push notifications in their browsers, which at later points would prompt the user to install various browser extensions. These extensions contained malicious code that would scrape the user's profile for public and non-public data, and insert authentic-looking ads into victims' timelines. Other social networking sites were also targeted, but Facebook didn't name the other victimized sites in its civil complaint. The extensions were promoted on at least three official browser stores and sent user data back to servers in the Netherlands under the two suspects' control. In total, Facebook said that the malicious extensions were installed more than 63,000 times. "Defendants used the compromised app users as a proxy to access Facebook computers without authorization," Facebook said. The company is now seeking an injunction and restraining order against the two developers to prohibit them from creating any more apps targeting Facebook users, and is also requesting financial relief for its efforts in investigating the defendants' operation, as well as restitution of any funds the two made through the scheme. The Daily Beast and Law360 first reported the lawsuit on Friday. This is Facebook's second lawsuit of this kind. A week before, on March 1, Facebook sued four companies and three people in China for operating a network that sold fake accounts, likes, and followers on Facebook and Instagram. Source
  4. Updated: Google is preparing a patch for late April 2019. Some of the suspicious PDF files exploiting this bug don't appear to be malicious in nature. A security firm said this week that it discovered PDF documents exploiting a Google Chrome browser zero-day. The vulnerability allowed attackers to collect data from users who opened PDF files inside Chrome's built-in PDF viewer. Exploit detection service EdgeSpot, the company that found the files, says the PDF documents would contact a remote domain with information on the user's device, such as IP address, OS version, Chrome version, and the path of the PDF file on the user's computer. This phone-home behavior did not take place when researchers opened the same PDF files in desktop PDF viewer apps, such as Adobe Reader and others, but was limited to Chrome only. The company said it spotted two distinct sets of malicious PDF files exploiting this Chrome bug, with one series of files being circulated circa October 2017, and the second set in September 2018. The first batch of malicious PDF files sent user data back to the "readnotify.com" domain, while the second sent it to "zuxjk0dftoamimorjl9dfhr44vap3fr7ovgi76w.burpcollaborator.net," researchers said. There was no additional malicious code in the PDF files that EdgeSpot discovered. However, collecting data on users who open a PDF file can aid attackers in fine-tuning future attacks and exploits. But in a conversation with ZDNet after the publication of this story, Mac malware security expert Patrick Wardle explained that the first batch of files that EdgeSpot detected wasn't meant to be malicious in nature, despite exploiting the Chrome bug. He said they were assembled using ReadNotify's PDF tracking service, which lets users track when someone views their PDF files and has been around since 2010. "What the researchers 'uncovered' is just a document tagged by ReadNotify," Wardle told us, "but yes, Chrome should alert the user." There is no information available on the second set of PDF files (the ones circulated in September 2018) and their nature: whether they were assembled by a threat actor, whether they're just tests, or whether they were generated for benign user tracking purposes. For its part, EdgeSpot said it notified Google over the Christmas holiday last year, when it first discovered the documents. The Chrome team acknowledged the zero-day and promised a fix for late April. "We decided to release our finding prior to the patch because we think it's better to give the affected users a chance to be informed/alerted of the potential risk, since the active exploits/samples are in the wild while the patch is not near away," researchers said in a blog post yesterday. The blog post also contains samples and indicators of compromise (IOCs) for the PDF files the company discovered. Until a patch is out, EdgeSpot is recommending that users either use a desktop app to view PDF files or disable their internet connection while they open PDF documents in Chrome. In unrelated research, but also connected to the world of PDF documents, earlier this week security researchers revealed vulnerabilities that allowed them to fake signatures on 21 of 22 desktop PDF viewer apps and 5 out of 7 online PDF digital signing services. Article updated with Wardle's analysis. Source
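EdgeSpot's finding, as described above, is that a PDF can make Chrome's built-in viewer contact a remote server when the document is opened. As a hedged illustration (a generic heuristic, not EdgeSpot's tooling), the sketch below scans a PDF's raw bytes for auto-run actions and embedded URLs of the kind such trackers rely on; compressed or obfuscated object streams can hide these markers, so a clean result is not a guarantee of safety.

# Heuristic sketch: flag PDF features commonly used to "phone home" on open,
# by scanning the raw bytes for auto-run actions and embedded URLs.
# This is an illustrative check, not the method EdgeSpot used.
import re
import sys

SUSPICIOUS = [b"/OpenAction", b"/AA", b"/JavaScript", b"/JS",
              b"/SubmitForm", b"/URI", b"/Launch"]

def scan_pdf(path: str) -> None:
    data = open(path, "rb").read()
    for marker in SUSPICIOUS:
        count = data.count(marker)
        if count:
            print(f"{marker.decode():<12} x{count}")
    # Show any plain-text URLs embedded in the file.
    for url in set(re.findall(rb"https?://[^\s<>)'\"]+", data)):
        print("URL:", url.decode(errors="replace"))

if __name__ == "__main__":
    scan_pdf(sys.argv[1])  # e.g. python scan_pdf.py suspicious.pdf

Until Chrome's patch ships, a check like this, or simply following EdgeSpot's advice to open PDFs in a desktop viewer or while offline, is a reasonable precaution.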
  5. The biggest and perhaps best source of data about what people like to watch on the internet and what they would pay for doesn’t come from streaming giants like Netflix, Amazon Prime Video, or Hulu. It comes from porn. While consuming porn is typically a private and personal affair, porn sites still track your every move: what content you choose, which moments you pause, which parts you repeat. By mining this data to a deeper degree than other streaming services, many porn sites are able to give internet users exactly what they want—and they want a lot of it. There are 125 million daily visits to the Pornhub Network of sites, including YouPorn and Redtube, and 100 million of those are to Pornhub alone. (It’s widely acknowledged that Pornhub is the most popular porn site in the world, although exact statistics on the industry are few and far between.) To put into perspective how much content that is: in 2017, Pornhub transmitted more data than the entire contents of the New York Public Library’s 50 million books combined. In money terms, Quartz has previously found that the porn industry could have a bigger economic influence on the US than Netflix. Revenue estimates are as high as $97 billion (Netflix brings in about $11.7 billion), and even trade publications can’t determine an accurate number because of privately held companies, rampant piracy, and discrepancies in porn studies. However, there is one company that clearly dominates the industry: MindGeek. MindGeek is the world’s biggest porn company—more specifically, it’s a holding company that owns numerous adult entertainment sites and production companies, including the Pornhub Network. Like other streaming giants, MindGeek’s sites analyze user data, but the company has an edge when it comes to producing tailor-made content in-house. With at least 125 million daily visits, MindGeek has a massive range of users to draw data from and create content for. What’s more, producing many short porn videos—people spend an average of less than 10 minutes on Pornhub—is much cheaper than producing TV shows and movies with A-list stars that often require multi-million-dollar budgets. MindGeek’s adult sites can also host user-generated content, like homemade videos, expanding its range of offerings even further and at virtually no cost. Lynn Comella, associate professor of Gender and Sexuality Studies at the University of Nevada, told Quartz that it’s “helpful to take a step back and think about a company like MindGeek like we would any other monopolized streaming media entity.” It is, after all, controlling the flow of information to one of the most captive audiences on the internet, if not the most captive. And in a landscape peppered with subscription models, MindGeek and other porn sites have got it right. The average user can watch as much porn as they’d like without so much as making an account, let alone paying, but in exchange for meeting desires that can’t always be met elsewhere, companies like MindGeek access user data because the user more willingly lets them. And it eventually pays off when users decide to pay for premium content and the habits of paying subscribers become even clearer. What’s more, Pornhub, in particular, operates one of the most sophisticated digital data analysis operations that caters primarily to users and not advertisers.
Pornhub Insights provides transparency into its data collection—on the most intimate of subjects—by making research and analysis from billions of data points about viewership patterns, often tied to events from politics to pop culture, available to the public. It offers more than many other tech giants do.
One company to rule them all
It’s hard to calculate MindGeek’s share of the digital economy. As Ross Benes, author of The Sex Effect, put it, “You’re not going to get a regulator or politician who is going to say, ‘we’re going to break up the porn monopoly.’” There are myriad reasons for this: other monopolistic companies exist, such as Facebook and Google with their dominance of the online ad industry, and speaking about human sexuality is still taboo, even if increasingly more people find porn morally acceptable. And while there have been attempts to regulate the industry (the UK is trying to enforce age-verification for porn sites, for instance), the adult entertainment sector has still flourished further from the limelight than other massive digital companies. MindGeek, whose bandwidth use exceeds that of Facebook or Amazon, began as a company named Mansef, founded by Stephane Manos and Ouissam Youssef in 2004. It was bought by tech entrepreneur Fabian Thylmann in 2010, re-named Manwin, then MindGeek, and now runs a near-monopoly of streaming porn sites. “Streaming is not just about distribution of content, it’s also about communication,” Kal Raustiala, professor at the UCLA School of Law and International Institute, told Quartz. “When you stream a video or listen to a song, you are sending back information that can be measured,” he says. According to a recent study by Raustiala and Christopher Sprigman, a professor at New York University Law School, MindGeek is at the “leading edge” of analyzing this kind of communication. While Netflix and Spotify might be better-known household names, they know slightly less about their users than MindGeek does. Raustiala said that you can think of it as a spectrum, and MindGeek is furthest along in terms of using big data in a feedback loop. This is because MindGeek relies heavily on “data-driven authorship,” or tailor-making content for viewers, to encourage more paying users. MindGeek owns several production companies of its own, which can draw on data analysis from MindGeek’s porn sites, meaning customized porn videos can be created efficiently. This production side has proved profitable for MindGeek. “We still believe very strongly in the paid subscription model. So much so, that we have invested heavily in the production side of the business and built several of the top names in Paysites from scratch,” Catherine Dunn, vice president of MindGeek, told Quartz. Sites that charge viewers, such as Brazzers, offer paid-for premium content. Dunn adds that over half of the company’s revenue is generated from such sites, and through “appealing to a demographic that prefers a premium experience with exclusive content.” In their study, Raustiala and Sprigman demonstrate just how meticulously MindGeek can appeal to its audience using a script of a porn video MindGeek actually produced. The script begins by specifying the exact clothing the actors will wear, in terms of both color and style (like “white thong and camisole”). Later, it includes especially important details in bold. This direction, along with other combinations of dialogue and sex positions, is derived from A/B testing across the company’s massive number of videos.
(A/B testing compares similar items to one another, with one variable switched; a minimal numerical sketch of such a comparison appears at the end of this item.) Raustiala and Sprigman point out that everything in MindGeek’s script “reflected data mining of millions of views,” and that “over many thousands of runs, one can determine what variable produced the highest viewership.” All these preferences are tracked by porn sites, and if they’re shared by enough people, videos can be produced that are attuned to these exact tastes—even down to the color of the furniture. Netflix does something similar by using “taste clusters” to recommend titles. More recently, Cary Fukunaga, director of the original Netflix series Maniac, admitted that important film decisions were dictated by the streaming service’s algorithm, which analyzes viewer behavior on a granular level. Like MindGeek, Netflix knows what content attracts and repels the most users, and Maniac was made accordingly. But the huge number and variety of porn videos yield more specific data points. (Netflix declined to comment.) Raustiala and Sprigman write: The study also notes that because it costs less to produce adult content, MindGeek can quickly adapt to emerging trends. And search trends in porn evolve as rapidly as search trends in the news. In 2017, the year of #MeToo, the top search term on Pornhub was “Porn for Women,” followed by “Rick and Morty” and “Fidget Spinners,” two other phenomena that took off that year. In 2018, the top two searches were “Stormy Daniels” and “Fortnite.” According to Pornhub Insights, Bigfoot erotica searches spiked 8,000% during the Virginia congressional race, as did searches for Marvel’s Avengers characters after Avengers: Infinity War came out.
The porn kingdom, and I
MindGeek isn’t selling its valuable user data or data analytics to any third parties, Dunn says, which makes sense considering the massive blackmail potential of porn-related user data. MindGeek puts that data to good economic use in-house. But beyond serving up the exact content that users want and profiting from it, there are other ways that porn sites wield power. In a story for The Cut last year titled “Pornhub is the Kinsey Report of our time,” Maureen O’Connor wrote: “The streaming sex empire may have done more to expand the sexual dreamscape than Helen Gurley Brown, Masters and Johnson, or Sigmund Freud.” In other words, internet users’ sexual lives are likely to be more impacted by the explicit short videos we trawl through, sometimes even at work, than by the theories of the world’s most respected minds. How would a user get hooked on Fortnite-themed porn without expressly searching for it? Imaginations are limitless, but they can be fed. O’Connor writes that there is such a thing as a sexual meme, “fantasies that replicate and spread like wildfire,” and typing it into search engines alters the sexual universe both in terms of what it provides and what people can sexualize. Fetishes have existed throughout history, but a Fortnite fantasy could only exist now. But like many industries, there is a dark side to porn’s outsized influence. “People having access to various forms of sexual expression is a good thing. People having access to various forms of sexual expression without context and/or accurate, relevant sex education complicates that though,” Chauntelle Tibbals, sociologist and author of Exposure: A Sociologist Explores Sex, Society, and Sex Entertainment, told Quartz.
But as experts continue advocating that parents talk to their children about porn (Pornhub has developed its own sex education portal), it’s the creation of ever more content to draw in and retain the most users that still has the most momentum.
The future of user data from sex tech
Porn companies regularly adopt different technologies to analyze user data and prepare for future customer demands. In March of this year, YouPorn used artificial intelligence to predict the top searches of 2018. (“T’Challa & Shuri,” a pair of siblings and two of the main characters from Marvel’s comic and blockbuster film Black Panther, took the number one spot.) And in October, the site introduced a “search by emoji” function to cater to its increasingly large proportion of viewers watching on mobile. Spotify is also using AI to find out what its users want, and is continually analyzing user data. Netflix, meanwhile, committed to investing upwards of $8 billion in original content in 2018 alone, and has algorithms to ensure that audiences react positively. But if the porn industry’s track record is anything to go by, adult entertainment companies will find ways to stay in the user data lead. Porn has historically remained ahead of the curve in terms of reaching its market because sex is a demand that doesn’t go away. The industry was an early adopter of new technologies and, consequently, propelled these technologies into widespread use. Prominent examples include the use of VHS, instant messaging, e-commerce, and video streaming. While the porn industry was not responsible for these innovations, “porn helped those things take off quicker,” Benes says. “If you get a bunch of people watching VHS in the ‘70s, or buying off the internet in the ‘90s, it convinces the capitalists that there is something worth investing in.” Shira Tarrant, professor of Women’s, Gender, and Sexuality Studies at California State University, Long Beach, and author of The Pornography Industry, says, “technology and sexually explicit material have always gone hand in hand.” If it wasn’t for the “titillation factor” that arises at any mention of porn, Tarrant suggests, it would be easier to view the industry for what it is: a thriving business that should be part of “media literacy,” in the same way that social media or fake news is. Bryony Cole, founder of the Future of Sex podcast, points out that porn and adult entertainment are a subset of “sex tech,” which can be defined as “any technology designed to enhance sexuality.” (VR, sex-bots, and sex toys are other examples.) In Cole’s experience, just adding the word “tech” onto the end of “sex” has already made the industry seem more credible, like one that can build a better online profile of its users than better-known tech giants. In many ways, emerging sex tech—like VR to help with intimacy—is another avenue for porn companies to get ahead (the porn and gaming industries already use VR the most). New technologies might better help companies collect, and create from, more data. “Imagine tech that’s like a Fitbit for your sexual health,” Cole says. It doesn’t seem like a big jump from A/B testing that determines everything from the sex position to the color of the carpet, to A/B testing that discovers how quickly users switch between vibrator modes, or which characters in a virtual simulation are the most appealing. It’s a little scary, but your data, after all, is used by porn companies to feed your desire in a way that other industries haven’t got the hang of yet. Source
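As flagged in the A/B testing aside earlier in this item, here is a minimal numerical sketch of that kind of comparison: two versions of a video differ in one variable, and we ask whether the difference in viewer response is larger than chance. The counts are invented for illustration, and the standard two-proportion z-test shown is only a stand-in for whatever MindGeek's actual analysis pipeline does.

# Minimal A/B comparison sketch with invented numbers (not MindGeek's pipeline):
# variant A and variant B differ in a single variable; we compare how often
# viewers "respond" (e.g. watch to the end) using a two-proportion z-test.
from math import sqrt, erf

def ab_test(success_a, total_a, success_b, total_b):
    p_a, p_b = success_a / total_a, success_b / total_b
    pooled = (success_a + success_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, normal CDF
    return p_a, p_b, z, p_value

# Hypothetical counts: variant A (one wardrobe choice) vs. variant B (another).
rate_a, rate_b, z, p = ab_test(success_a=5_300, total_a=100_000,
                               success_b=4_900, total_b=100_000)
print(f"A: {rate_a:.2%}  B: {rate_b:.2%}  z = {z:.2f}  p = {p:.4f}")

Repeated over many such paired comparisons, this is how "over many thousands of runs, one can determine what variable produced the highest viewership," as Raustiala and Sprigman put it.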
  6. Staff penned an open letter in an effort to be transparent A security bug that hit Tumblr’s recommended blogs module may have exposed users’ private information, according to an open letter. Information like email addresses, passwords, IP addresses, and self-reported locations may have become exposed due to the bug if individual accounts were hit. It’s unclear if the bug affected individual accounts, according to the open letter, but an investigation concluded that the bug “was rarely present.” “We’ve also thoroughly investigated any way in which our community could have been affected,” the letter reads. “We found no evidence that this bug was abused, and there is nothing to suggest that unprotected account information was accessed.” The bug was brought to Tumblr’s attention through a bug bounty program run by Oath, Tumblr’s parent company. A security researcher discovered that if a blog appeared in the recommended section of a user’s dashboard, “it was possible, using debugging software in a certain way, to view certain account information associated with the blog.” Tumblr’s desire to be transparent with users about security bugs and potentially compromised information comes at a time when other social media platforms are being hit with criticism. Facebook has encountered several major security flaws this year, leading to widespread concern among users as millions of accounts were affected. “It’s our mission to provide a safe space for people to express themselves freely and form communities around things they love,” Tumblr’s open letter reads. “We feel that this bug could have affected that experience. We want to be transparent with you about it. In our view, it’s simply the right thing to do.” Source
  7. News broke last week that Adware Doctor, an ad-blocker sold in the Mac App Store, quietly stole its users' browser histories and sent them to a server in China. This malicious data collection was independently confirmed by two researchers and promptly disclosed to Apple, but the app remained in the virtual store, seemingly until it started making headlines. Over the weekend, the saga continued with revelations that several other apps in the Mac App Store were doing the same thing. A report said to be published by cybersecurity vendor Trend Micro says people had been complaining since at least December 2017 that Dr. Unarchiver, Dr. Cleaner, and other utilities sold in the Mac App Store were exfiltrating their browser history. Nobody seemed to pay much attention to those reports until Adware Doctor's scandal. Every app in this group (or should it be a "practice," since they're all doctors?) appeared to steal data in the same way. They looked like legitimate applications, with several of them making best-selling apps lists, then they'd work their way around the sandboxing Apple uses to prevent apps from accessing data they shouldn't. From there, all they had to do was gather and send the browser history. Of course, this isn't supposed to happen. The whole point of the Mac App Store, much like the iOS App Store, Google's Play Store or their equivalents, is to assure consumers that software downloaded from those marketplaces is safe. Store owners are supposed to vet every app they sell to make sure they aren't secretly gathering personal information. These problems raise serious questions about the security of software downloaded from the Mac App Store. Apple missed all of these problems while it was vetting these utilities, and the apps' malicious activities largely went unnoticed despite their popularity. Even after security researchers investigated these claims, confirmed their validity and reported the issues to Apple, it took more than a month for Adware Doctor to be removed. There is some good news: Apple has already removed the other apps named this weekend. The question is whether significant media attention will be required to reveal other bad actors in the Mac App Store too, or whether this series of events will change the vetting process. Source
  8. ICO probe: No legal basis for Facebook slurps WhatsApp has agreed not to share users' data with parent biz Facebook after failing to demonstrate a legal basis for the ad-fuelling data slurp in the EU. The move comes after a years-long battle between the biz and European data protection agencies, which argued that changes to WhatsApp's small print hadn't been properly communicated and didn't comply with EU law. An investigation by the UK's Information Commissioner's Office, which reported today, confirmed the biz has failed to identify a legal basis for sharing personal data in a way that would benefit Facebook's business. Moreover, any such sharing would have been in breach of the Data Protection Act. In response, WhatsApp has agreed to sign an undertaking (PDF) in which it commits not to share any EU user data with any other Facebook-owned company until it can comply with the incoming General Data Protection Regulation. The ICO celebrated the deal as a "win for the data protection of UK customers" – a statement that Paul Bernal, IP and internet law expert at the University of East Anglia, said he agreed with only up to a point. "This is indeed a 'win', but a limited one," he told The Register. "It's only a commitment until they believe they've worked out how to comply with the GDPR – and I suspect they'll be working hard to find a way to do that to the letter rather than to the spirit of the GDPR."
Using consent as the lawful basis? No dice
At the heart of the issue is consent. In summer 2016, a privacy policy update said that, although it would continue to operate as a separate service, WhatsApp planned to share some account information, including phone numbers, with Facebook for targeted advertising, business analysis and system security. Although users could withhold consent for targeted advertising, they could not for the other two purposes – any users that didn't like the terms would have to stop using WhatsApp. The EU data protection bodies have previously said that this "like it or lump it" approach to service use doesn't constitute freely given consent – as required by EU rules. Similarly, they felt that WhatsApp's use of pre-ticked boxes was not "unambiguous" and that the information provided to users was "insufficiently specific". The ICO has also noted that matching account data might lead to "privacy policy creep", with further uses of data slipping into the Ts&Cs unnoticed by users. The investigation – which looked only at situations where WhatsApp wanted to share information with Facebook for business interests, not service support – confirmed concerns that the policy wasn't up to scratch. Information commissioner Elizabeth Denham said WhatsApp had not identified a lawful basis for processing, or given users "adequate fair processing information" about any such sharing. "In relation to existing users, such sharing would involve the processing of personal data for a purpose that is incompatible with the purpose for which such data was obtained," she said. She added that if the data had been shared, the firm "would have been in contravention of the first and second data protection principles" of the UK's Data Protection Act. WhatsApp has maintained that it hasn't shared any personal data with Facebook in the EU, but in a letter to the biz's general counsel Anne Hoge, Denham indicated that this had not been made clear at the outset. Denham wrote that the initial letter from WhatsApp had only stated data sharing was paused for targeted ads.
It was, she said, "a fair assumption for me to make" that WhatsApp may have shared data for the other two purposes, "but have at some point since that letter decided to pause" this too. However, she said that since WhatsApp has "assured" the ICO that "no UK user data has ever been shared with Facebook", she could not issue the biz with a civil monetary penalty and had to ask WhatsApp to sign the undertaking instead.
Next up: Legitimate interests
Denham's letter makes it clear that the companies will be working to make sure that data sharing can go ahead in a lawful way, particularly for system security purposes, for which it may consider using the "legitimate interests" processing condition. She noted that there would be "a range" of legitimate interests – such as fighting spam or business analytics – but that in all cases it would need to show that processing was necessary to achieve it, and balance it against individuals' rights. Bernal said that if the biz had any plans to use the consent condition for processing, it "will need huge scrutiny". "It's almost impossible for most users to understand what they're really consenting to," he said. "And if ordinary users can't understand, how can they consent?" Jon Baines, data protection adviser at Mishcon de Reya, also noted that the fact WhatsApp had held its ground on what he described as a "key point" could put the ICO in a difficult position down the line. "It's very interesting that the ICO is classing this as a 'win', because – although on the surface it seems like a success – it's notable that WhatsApp have reserved their position on a key point, which is whether the processing in question falls under the UK's remit by virtue of the fact that it takes place in the UK on users' devices," he said. "Normally the effect of an informal undertaking will be to encourage a data controller voluntarily to take or cease action, to avoid the need for legal enforcement which would otherwise be available. "Here, should WhatsApp subsequently fail to perform the undertaking, the ICO might be compromised if there is no clear basis on which it can follow up with enforcement action." In a statement sent to The Register, WhatsApp emphasised the pause it had put on data sharing. "As we've repeatedly made clear for the last year we are not sharing data in the ways that the UK Information Commissioner has said she is concerned about anywhere in Europe." It added that it "cares deeply" about users' privacy and that "every message is end-to-end encrypted". Source
  9. Facebook is one social media platform where people from all walks of life share pretty much everything about their life, from work and school to events and adventures. It’s a giant database constantly feeding and growing on personal information. By the end of the first quarter of 2018, Facebook had more than 1.9 billion active users around the world. It should therefore come as no surprise that requests for Facebook data from government agencies have also skyrocketed with time.
Law Enforcement Agency Requests for Facebook Data Continue to Rise
According to the Facebook biannual report, which provides a good idea of how interested US law enforcement agencies really are in the data that Facebook users create on a daily basis, that interest is increasing. In fact, from the first half of 2013 to the end of 2016, the total data requests and accounts targeted by law enforcement agencies more than doubled. What’s perhaps more alarming is that around 56 percent of all government requests were accompanied by a non-disclosure order that legally restrains Facebook from notifying the affected user. So there is no way for Facebook users to know whether US law enforcement agencies have requested their data or whether it has been compromised. According to the Facebook report, by the second half of 2016, Facebook received 14,736 search warrant requests, 6,536 subpoenas, 738 court orders (18 USC 2703(d)), 236 court orders (non-18 USC 2703(d)), 1,948 pen register/trap and trace requests, 1,695 emergency disclosures and 125 real-time wiretap requests.
ACLU Action Against These Alarming Stats
The American Civil Liberties Union (ACLU) has taken notice of these stats. It has especially voiced concerns over the complete absence of disclosures that play an integral role in the transparency of the entire process. What is more disturbing, however, is that businesses have realized how big a gold mine social media platforms like Facebook and Twitter really are, since they store everything they could possibly want to know about a potential consumer. By knowing the personal information, geolocation, browsing habits, and likes and dislikes of Facebook users, businesses would be in a better position to tailor their ads to users’ needs, tastes and preferences. So your personal data is literally up for sale to the highest bidder. Nicole Ozer, Director of Technology and Civil Liberties Policy at the ACLU of California, stated in a post on govtech.com that the legal framework of California has seen consistent progressive updates over the years, but federal communications privacy law is one area that has remained unchanged for more than three decades. Ozer is quoted as saying, “The federal law, the Electronic Communications Privacy Act, is supposed to … make sure there are proper safeguards in place for when the government can demand electronic information, including things like data from Facebook. That law has not been updated since 1986. In 1986, cellphones were the size of bricks, Mark Zuckerberg was still in diapers, the World Wide Web did not even exist.” She further states that owing to large loopholes in outdated privacy laws, many US law enforcement agencies continue to aggressively pursue myriad forms of digital communications with complete impunity and absolute disregard for user privacy. Of the various types of information available to be collected by law enforcement, Ozer believes there is one in particular that should concern Facebook users the most.
There is information out there on the back end of platforms and services that is not easily visible to the public. “This data isn’t publicly available where you can just go onto Facebook; this is actually data held by the back end of the company and you are compelling it with a warrant or another type of legal process,” explained Ozer in the govtech.com post. “That third piece, the kind of legal process that is required for sort of accessing this very sensitive back-end data, that law has not been updated and it leaves a lot of gray areas, which can make users quite vulnerable.” Seeing how government requests for Facebook data have more than doubled in the last four years, this should leave a lot of questions on the minds of social media users, the most important of which is, “How safe is our data on social media?” And the brutal irony in all this is that the personal data of Facebook users is being collected and scrutinized by the very people sworn to protect them!
In a Nutshell
As digital communication connects the far corners of the globe, we are bound to see more and more people connecting to Facebook and other social media. And with this increase in users, a consequent increase in government requests for intelligence data seems inevitable. So, unless US states solidify their legal framework around digital privacy, bipartisan support at the federal level will continue to encourage law enforcement agencies to exploit loopholes. Source
  10. Berlin-based online music distribution platform SoundCloud has announced a change to its privacy policy, a change aimed at its users in the United States. The firm has a number of local equivalents in different countries, collectively referred to as the 'SoundCloud Group'. This is important because of the new section called 'Data Controller' that will become part of its policy starting next month. In essence, all user data from the site was until now controlled by its European entity, SoundCloud Limited. From May onwards, data from users in the US will be transferred to the United States subsidiary, SoundCloud Inc. The full change is below: This expansion of the policy comes on the back of what SoundCloud views as its continued growth, the reasoning being that it wants users' data from a given country to be controlled by the appropriate local entity. The firm also goes on to state that this move will allow it to "provide those users with a more personalized SoundCloud experience". The new privacy policy takes effect May 17, 2017 and can be read in full here. Source
  11. Facebook Bans Devs From Creating Surveillance Tools With User Data Without a hint of irony, Facebook has told developers that they may not use data from Instagram and Facebook in surveillance tools. The social network says that the practice has long been a contravention of its policies, but it is now tidying up and clarifying the wording of its developer policies. The American Civil Liberties Union, Color of Change, and the Center for Media Justice put pressure on Facebook after it transpired that data from users' feeds was being gathered and sold to law enforcement agencies. The re-written developer policy now explicitly states that developers are not allowed to "use data obtained from us to provide tools that are used for surveillance." It remains to be seen just how much of a difference this will make to the gathering and use of data, and there is nothing to say that Facebook's own developers will not continue to engage in the same practices. Deputy chief privacy officer at Facebook, Rob Sherman, says: Transparency reports published by Facebook show that the company has complied with government requests for data. The secrecy that shrouds such requests and dealings means there is no way of knowing whether Facebook itself engages in precisely the sort of activity it is banning others from performing. Source