Showing results for tags 'Google'.



Found 887 results

  1. We welcome the EU cracking down on Google's anti-competitive search behavior. We have felt its effects first hand for many years, and it has led directly to us having less market share on Android vs. iOS, and on mobile vs. desktop in general. A couple of examples: Up until just last year, it was impossible to add DuckDuckGo to Chrome on Android, and it is still impossible in Chrome on iOS. We are also not included in the default list of search options as we are in Safari, even though we are among the top search engines in many countries. The Google search widget is featured prominently on most Android builds, and it is impossible to change its search provider. For a long time it was also impossible to even remove this widget without installing a launcher that effectively changed the whole way the OS works. Their anti-competitive search behavior isn't limited to Android. Every time we update our Chrome browser extension, all of our users are faced with an official-looking dialog asking them if they'd like to revert their search settings and disable the entire extension. Google also owns http://duck.com and points it directly at Google search, which consistently confuses DuckDuckGo users. Source: DuckDuckGo
  2. An open-source collaboration for ‘the future of portability’

Today, Google, Facebook, Microsoft, and Twitter jointly announced a new standards initiative called the Data Transfer Project, designed as a new way to move data between platforms. In a blog post, Google described the project as letting users “transfer data directly from one service to another, without needing to download and re-upload it.” The current version of the system supports data transfer for photos, mail, contacts, calendars, and tasks, drawing from publicly available APIs from Google, Microsoft, Twitter, Flickr, Instagram, Remember the Milk, and SmugMug. Many of those transfers could already be accomplished through other means, but participants hope the project will grow into a more robust and flexible alternative to conventional APIs. In its own blog post, Microsoft called for more companies to sign on to the effort, adding that “portability and interoperability are central to cloud innovation and competition.” The existing code for the project is available as open source on GitHub, along with a white paper describing its scope. Much of the codebase consists of “adapters” that can translate proprietary APIs into an interoperable transfer, making Instagram data workable for Flickr and vice versa. Between those adapters, engineers have also built a system to encrypt the data in transit, issuing forward-secret keys for each transaction. Notably, that system is focused on one-time transfers rather than the continuous interoperability enabled by many APIs. “The future of portability will need to be more inclusive, flexible, and open,” reads the white paper. “Our hope for this project is that it will enable a connection between any two public-facing product interfaces for importing and exporting data directly.” The bulk of the coding so far has been done by Google and Microsoft engineers who have long been tinkering with the idea of a more robust data transfer system.
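The adapter idea described above is easy to sketch. The Python toy below is not the project's actual codebase (which is Java), and every class and field name in it is hypothetical; it only shows how one adapter can export a service-specific payload into a neutral model that a second adapter then turns into an import request for another service:

```python
from dataclasses import dataclass

# Hypothetical service-neutral record, loosely in the spirit of the
# Data Transfer Project's common data models (names are illustrative).
@dataclass
class PhotoModel:
    title: str
    url: str
    album: str

class FlickrExportAdapter:
    """Translate a (hypothetical) Flickr-style API payload into the common model."""
    def export(self, payload: dict) -> PhotoModel:
        return PhotoModel(
            title=payload["title"]["_content"],
            url=payload["url_o"],
            album=payload.get("photoset", "default"),
        )

class SmugMugImportAdapter:
    """Translate the common model into a (hypothetical) SmugMug-style upload request."""
    def to_request(self, photo: PhotoModel) -> dict:
        return {"Title": photo.title, "SourceUrl": photo.url, "AlbumName": photo.album}

# One-time transfer: export from one service, import into the other.
flickr_payload = {"title": {"_content": "Sunset"}, "url_o": "https://example.com/p.jpg"}
photo = FlickrExportAdapter().export(flickr_payload)
request = SmugMugImportAdapter().to_request(photo)
```

Pairs of adapters like these are what would let any two participating services exchange data without a bespoke one-off integration for every pair.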
According to Greg Fair, product manager for Google Takeout, the idea arose from a frustration with the available options for managing data after it’s downloaded. Without a clear way to import that same data to a different service, tools like Takeout were only solving half the problem. “When people have data, they want to be able to move it from one product to another, and they can’t,” says Fair. “It’s a problem that we can’t really solve alone.” Most platforms already offer some kind of data-download tool, but those tools rarely connect with other services. Europe’s new GDPR legislation requires such tools to provide all available data on a given user, which makes the export far more comprehensive than what you’d get from an API. Along with emails or photos, you’ll find thornier data like location history and facial recognition profiles that many users don’t even realize are being collected. There are a few projects trying to make use of that data — most notably Digi.me, which is building an entire app ecosystem around it — but for the most part, it ends up sitting on users’ hard drives. Download tools are presented as proof that users really do own their data, but owning your data and using it have turned into completely different things. The project was envisioned as an open-source standard, and many of the engineers involved say a broader shift in governance will be necessary if the standard is successful. “In the long term, we want there to be a consortium of industry leaders, consumer groups, government groups,” says Fair. “But until we have a reasonable critical mass, it’s not an interesting conversation.” This is a delicate time for a data-sharing project. Facebook’s API was at the center of the Cambridge Analytica scandal, and the industry is still feeling out exactly how much users should be trusted with their own data. Google has struggled with its own API scandal, facing outcry over third-party email apps mishandling Gmail users’ data.
In some ways, the proposed consortium would be a way to manage that risk, spreading the responsibility out among more groups. Still, the specter of Cambridge Analytica puts a real limit on how much data companies are willing to share. When I asked about the data privacy implications of the new project, Facebook emphasized the importance of maintaining API-level controls. “We always want to think about user data protection first,” says David Baser, who works on Facebook’s data download product. “One of the things that’s nice about an API is that, as the data provider, we have the ability to turn off the pipeline or impose conditions on how they can use it. With a data download tool, the data leaves our hands, and it’s truly out there in the wild. If someone wants to use that data for bad purposes, Facebook truly cannot do anything about it.” At the same time, tech companies are facing more aggressive antitrust concerns than ever before, many of them centering on data access. The biggest tech companies have few competitors. And as they face new questions about federal regulation and monopoly power, sharing data could be one of the least painful ways to rein themselves in. It’s an unlikely remedy for companies that are reeling from data privacy scandals, but it’s one that outsiders like Open Technology Institute director Kevin Bankston have been pushing as more important than ever, particularly for Facebook. “My primary goal has been to make sure that the value of openness doesn’t get forgotten,” Bankston says. “If you’re concerned about the power of these platforms, portability is a way to balance that out.” Update 7/20/2018 12:00PM EST: This piece was updated to include reference to Microsoft’s announcement of the Data Transfer Project. Source
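The white paper's "forward-secret keys for each transaction" can be illustrated with a toy ephemeral Diffie-Hellman exchange. This is not the project's actual scheme, and the group parameters below are far too weak for real use; the sketch only shows why generating and discarding a fresh key pair per transfer means a later compromise reveals nothing about past transfers:

```python
import hashlib
import secrets

# Demo-only DH parameters: a Mersenne prime and a small generator.
# Real systems use vetted groups or elliptic curves.
P = (1 << 127) - 1
G = 3

def ephemeral_keypair():
    """Generate a fresh private/public pair for a single transfer."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def session_key(own_priv: int, peer_pub: int) -> str:
    """Derive a one-time symmetric key from the DH shared secret."""
    shared = pow(peer_pub, own_priv, P)
    return hashlib.sha256(shared.to_bytes(16, "big")).hexdigest()

# The exporting and importing services each create ephemeral keys per transfer...
exp_priv, exp_pub = ephemeral_keypair()
imp_priv, imp_pub = ephemeral_keypair()

# ...and independently derive the same one-time key, which both then discard.
assert session_key(exp_priv, imp_pub) == session_key(imp_priv, exp_pub)
```

Because the private halves are thrown away after the transfer, there is no long-lived secret whose theft would decrypt previously recorded traffic.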
  3. Google Chrome Software Removal Tool is an easy-to-use program which tries to get a broken Chrome installation working again. Launch it and the tool scans your PC for programs which Google considers "suspicious" or "known to cause problems with Chrome", and offers to remove them. Bizarrely, the CSRT won't give you the names of these suspicious programs, so you'll have to trust it. Or you can just run the program to see if it thinks there are any, then click "Cancel" instead of "Remove" when the report appears. Whatever you do, once the scan is complete, CSRT launches Chrome with the chrome://settings/resetProfileSettings command, prompting you to reset your Chrome settings. Click "Reset" and Chrome will be reset to its default settings; otherwise just close the window to continue as usual. There are no other settings or options, nothing else to do at all. Google provides few details of what the Chrome Software Removal Tool actually does, but they do claim:

Find programs and components that affect Chrome

If you notice changes in the settings of your Chrome browser, there is a small utility that can help you identify the issue and correct it. Created by Google itself, it goes by the name of Chrome Cleanup Tool (Google Chrome Software Removal Tool), enabling you to detect programs that interfere with Google Chrome and remove them. Since toolbars, browser add-ons and pop-up ads are not typical malware, your antivirus solution might fail to detect their presence. Chrome Cleanup Tool is specifically designed to find programs and components whose installation resulted in modifications of Chrome's settings, providing you with a simple means to reset them.

Remove interfering components with a click

The application does not require installation and starts looking for suspicious programs as soon as you launch it.
The number of findings is displayed within a small window, along with an option to remove them all, but their names are not revealed, so as to prevent name modifications that might cause Chrome Cleanup Tool not to work as it should. In some cases, a system reboot might be required in order for the changes to take effect. Once the issue is fixed, Chrome restarts and prompts you to reset the browser settings.

Scan for malicious programs that cause issues with Chrome

Chrome Cleanup Tool is an attempt to enhance the browsing experience of Chrome users, providing them with a simple method to factory-reset the settings and remove programs that cause trouble for the browser. More aggressive malware might be impossible to remove or detect, so you might need a reputable antivirus solution to clean the system. Note that this application is not designed to search for all types of viruses and malware components, but only those that cause issues with Google Chrome.

Homepage or here

Download: Link 1 - New | Link 2 - New
  4. Google required to stop ‘illegally tying’ Chrome and search apps to Android

Google has been hit with a record-breaking €4.3 billion ($5 billion) fine by EU regulators for breaking antitrust laws. The European Commission says Google has abused its Android market dominance in three key areas. Google has been bundling its search engine and Chrome apps into the operating system. Google has also allegedly blocked phone makers from creating devices that run forked versions of Android, and it “made payments to certain large manufacturers and mobile network operators” to exclusively bundle the Google search app on handsets. The European Commission now wants Google to bring its “illegal conduct to an end in an effective manner within 90 days of the decision.” That means Google will need to stop forcing manufacturers to preinstall Chrome and Google search in order to offer the Google Play Store on handsets. Google will also need to stop preventing phone makers from using forked versions of Android, as the commission says Google “did not provide any credible evidence that Android forks would be affected by technical failures or fail to support apps.” Google’s illegal payments for app bundling ceased in 2014 after the EU started to look into the issue. Android has long been considered open-source software, but Google has slowly been adding key components into its Google Play Services software and associated agreements. Alongside anti-fragmentation agreements to keep manufacturers on Google’s version of Android, most Android handsets (outside of China) now ship with Google’s software and services bundled on them. The EU has now ordered Google to adhere to its judgment within 90 days and unbundle search and Chrome from its Android offering. With Google appealing the decision, the legal process is likely to run for years to come.
While many had expected Google to face its own “Microsoft moment,” the EU doesn’t seem to be forcing any strong future oversight on Android or asking Google to modify its software to include a ballot for alternative browsers or search engines. This decision seems to be more about preventing Google from bundling its services with Android than about forcing the company to change Android significantly. Phone manufacturers will still be free to bundle Chrome and Google search apps if they wish, but they won’t be forced to do so, and they’ll be free to offer devices with forked versions of Android. Source
  5. Hidden Android Secret Codes

    How well do you know your Android device? Here are some of the hidden Android secret codes. Since most hidden menus are manufacturer specific, there’s no guarantee that they’ll work across all Android smartphones, but you can try them out nevertheless on your Samsung, HTC, Motorola, Sony and other devices. Be advised, though, that some of these can cause serious changes to your device’s configuration, so don’t play with something that you don’t fully understand. You can find more of these spread across the internet, and they’re usually very handy to have, even if just to show off your geekiness to your social circle. Update x1: More codes! Source: Redmond Pie
  6. Google is set to face a record-busting EU antitrust fine this week over its Android mobile operating system, but rivals hoping that an order to halt unfair business practices will help them may be disappointed. The European Commission's decision, delayed by a week by U.S. President Donald Trump's visit to a NATO summit in Brussels last week, is expected on Wednesday. It comes just over a year after the Commission slapped a landmark 2.4-billion-euro ($2.8 billion) penalty on Google, a unit of Alphabet Inc, for favouring its shopping service over those of competitors. The EU penalty is likely to exceed the 2017 fine because of the broader scope of the Android case, sources familiar with the matter have told Reuters. The EU sanction comes in the midst of a trade conflict between the United States and the EU, which has hit back against U.S. tariffs on European steel and aluminium by targeting $3.2 billion in American exports with higher duties. European Commission President Jean-Claude Juncker will meet Trump in Washington D.C. on the trade issue next week. The Android decision is the most important of a trio of antitrust cases against Google. With the company able to make its ads show up in more smartphone apps than any other tech rival, Google's app network has quietly become a huge growth engine. The company's high payouts to app developers, coupled with its entrenched relationship with millions of advertisers, have turned Google into the main revenue source for many apps. Its Play Store accounts for more than 90 percent of apps downloaded on Android devices in Europe. Its popularity in turn could mean an uphill battle for EU antitrust regulators seeking to level the playing field for Google's rivals by ensuring that users can download from competing app stores and that smartphone makers are free to choose pre-installed apps.
Regulators say Google has tilted the field in its favour by forcing smartphone makers to pre-install Google Search together with its Play Store and Chrome browser, making them sign agreements not to sell devices running rival Android versions, and paying smartphone makers to pre-install only Google Search on devices. Google has denied the charges, saying that bundling search with its Google Play allows it to offer the entire package for free, and that smartphone makers and users have a wide choice. Regulatory action is probably too late because of Google's entrenched position, said analyst Richard Windsor at research company Radio Free Mobile. "Users in the EU are now completely accustomed to using Google services and have come to prefer them," he said. "Hence, I think separating Google Play from the rest of Google’s Digital Life services would have very little impact as users would simply download and install them from the store," Windsor said. The Android case was triggered by a 2013 complaint by lobbying group FairSearch, whose members at the time included competitors such as Oracle, Nokia and Microsoft. < Here >
  7. Google parent Alphabet announced Tuesday it was raising the profile of two "moonshot" projects—one for drone delivery and the other for global internet connectivity with balloons. The announcement means that balloon project Loon and drone project Wing will be independent companies within Alphabet—and in theory could be spun off entirely in the future by the California technology giant. Wing and Loon have been part of the Alphabet "moonshot factory" known as X, creating projects with potential to disrupt new sectors. "X's job is to create radical new technologies and build a bridge from an idea to a proven concept," said moonshots "captain" Astro Teller in a blog post. "Now that the foundational technology for these projects is built, Loon and Wing are ready to take their products into the world." Alphabet has previously "graduated" its Waymo self-driving car division, along with the cybersecurity unit Chronicle and the life sciences project Verily. Another moonshot project, the geothermal energy unit called Dandelion, has been spun off as a fully independent company. Wing is building an autonomous delivery drone service which aims to reduce fossil fuel use and urban congestion, and facilitate disaster relief transport. James Ryan Burgess was named chief executive. Loon is building a network of balloons, traveling along the edge of space, to expand internet connectivity to underserved areas and disaster zones. Its CEO will be Alastair Westgarth. While Alphabet has kept some of its projects under wraps, Teller said the latest moves will allow the company to concentrate on "new moonshot adventures," and ongoing projects including Google Glass, robotics and wireless optical communications. < Here >
  8. Google has secretly enabled a security feature called Site Isolation for 99% of its desktop users on Windows, Mac, Linux, and Chrome OS. This happened in Chrome 67, released at the end of May. Site Isolation isn't a new feature per se, having first been added in Chrome 63, in December 2017. Back then, it was only available if users changed a Chrome flag and manually enabled it in each of their browsers. The feature is an architectural shift in Chrome's modus operandi because when Site Isolation is enabled, Chrome runs a different browser process for each Internet domain.

Site Isolation put on the fast track after Meltdown and Spectre

Initially, Google described Site Isolation as an "additional security boundary between websites," and as a way to prevent malicious sites from messing with the code of legitimate sites. Google's slow-moving plans for Site Isolation's rollout changed a month after its launch, in January 2018, when the Meltdown and Spectre vulnerabilities were disclosed to the public. From an experimental project that had been in the works for several months, Site Isolation became Chrome's primary defense against Meltdown and Spectre attacks. Ever since January, Google has been slowly enabling Site Isolation by default for more and more users, testing how it affected the browser's performance. Splitting the code of each domain into a separate process takes a heavy toll on Chrome and the underlying OS. According to Google, this impact is "about a 10-13% total memory overhead in real workloads due to the larger number of processes." But Google engineers seem to be OK with this performance overhead as a trade-off for the improved security. The feature is now enabled by default for 99% of Chrome's desktop userbase, and Android will follow soon. Site Isolation is supported in Chrome for Android, but it's still disabled by default and hidden underneath the chrome://flags/#enable-site-per-process flag.
Google rolling back other Meltdown and Spectre mitigations

Google says that by enabling Site Isolation for the vast majority of Chrome users in v67, its engineers can now roll back some of the other Meltdown and Spectre mitigations they added to Chrome, which also had a negative performance impact and are no longer needed. "We are planning to re-enable precise timers and features like SharedArrayBuffer (which can be used as a precise timer) for desktop," Google said. Reducing timer precision and disabling the SharedArrayBuffer function is also how Firefox, Edge, and Safari dealt with Meltdown and Spectre, the CPU hardware bugs that allowed attackers to use JavaScript to retrieve information from a browser's process, such as passwords or encryption keys. With Site Isolation enabled, such attacks aren't possible because each site domain runs in a separate browser process which contains data from one domain alone, and not multiple sites at once, as Chrome used to do before Site Isolation. Furthermore, Site Isolation also destroys a site's process and creates a new one if a user navigates to a different site inside the same tab, keeping isolation at the site level and not at the tab level, as its name implies. Re-enabling precise timers and access to the SharedArrayBuffer function isn't expected to have a negative impact on Chrome users' security, but will give web developers the ability to create more accurate web apps that handle real-time data once again. Source
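The timer-precision trade-off is easy to see in miniature. The Python sketch below is an illustration only, not Chrome's implementation: it rounds nanosecond timestamps down to a coarser resolution, the way browsers clamped performance.now(), which erases the microsecond-scale differences a Spectre-style timing attack depends on:

```python
def coarsen_ns(t_ns: int, resolution_ns: int = 100_000) -> int:
    """Round a nanosecond timestamp down to a coarser resolution (100 µs here),
    mimicking how browsers degraded their timers after Meltdown/Spectre."""
    return (t_ns // resolution_ns) * resolution_ns

# Two events 5 µs apart are distinguishable with a precise timer...
t0, t1 = 1_000_000_000, 1_000_005_000
assert t1 - t0 == 5_000

# ...but indistinguishable once both readings are coarsened, starving a
# speculative-execution side channel of the timing signal it needs.
assert coarsen_ns(t0) == coarsen_ns(t1)
```

With Site Isolation keeping each site's data in its own process, that signal no longer points at another site's secrets, which is why Google considers restoring precise timers safe.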
  9. So you’ve just bought the best Windows laptop, you’ve gritted your teeth through Cortana’s obnoxiously cheery setup narration, and the above screenshot is the Start menu you’re presented with. Exactly how special do you feel as you watch the tiles animating and blinking at you like a slot machine? I’ll tell you how I felt as I was getting to grips with the Huawei MateBook X Pro for the first time: perplexed. Perplexed that this level of bloatware infestation is still a thing in 2018, especially on a computer costing $1,499 and running an OS called Windows 10 “Pro.” Why are we still tolerating this? Before anyone assumes that this is just a rant against and about Windows, I’ll happily include Apple’s iOS and some varieties of Google’s Android in my scorn. The blight of undesired software and prompts is all around us. If I buy an iPhone, Apple pins the Apple Watch app on my home screen, whether I have the compatible watch or not. Or if I go to Apple’s nemesis, Samsung forces its Bixby assistant into everything I do with a Galaxy S. Cloud storage is a common cause of unnecessary nags. I had to decline Microsoft’s offers to activate auto-backup to OneDrive three times before Windows 10 got the message. If I use the majority of Apple’s free 5GB of iCloud storage, I’ll get a daily reminder that I can pay to get more. And Google’s Photos app is so thirsty for any images I generate with my phone that it will ask me if I want to automatically back up new folders I create (such as ones for screenshots or downloaded photos). I understand that for some people and in some instances those prompts will be helpful, but those are exceptions to the rule. And the rule seems to be that companies are trying to beat us into submission through mental attrition. To my mind, bloatware is any piece of software within an operating system that receives a disproportionate amount of prominence or system resources relative to its functionality.
We can have debates about whether pushy notifications about ancillary services from the OS maker necessarily constitute bloat, but there’s little room for disagreement when it comes to third-party additions. Candy Crush Saga on Windows, the News Republic app on HTC phones, and the Oath bundle that Samsung preloads on Verizon Galaxy S9s all serve corporate interests before those of the user. Oath CEO Tim Armstrong, speaking to Reuters, leaves no doubt about it: “This gets ads one step closer to being direct to consumer. You can’t be more direct than being on the mobile phone home screen and app environment.” Again I ask, why do we tolerate this? The best answer I can come up with is a lack of genuine alternatives. The most horrific examples of carrier bloatware that I know of come from Korea. I’ve reviewed flagship LG phones with as many as 54 (fifty-four!) carrier-imposed bloatware apps, accompanied by carrier branding on the box, a carrier splash screen integrated into the boot-up sequence, and even a carrier-specific home screen theme. Presenting such a device to people more familiar with the American or European smartphone markets draws gasps of surprised revulsion. But that’s the thing: if everyone in Korea is used to seeing a thicket of carrier apps and nonsense preloaded on their phone, if no one is showing consumers a better option, they just accept it as an unhappy status quo and get on with life. I get the feeling that Windows users are now in a similar class. They know they’ll get the occasional Start menu ad popping up, they know that they’ll have to spend half an hour just disabling, uninstalling, and unpinning the superfluous crap their PC comes with, and they’ve grown to accept that situation. But the truth is that we don’t all have to continue to tolerate the crappy status quo. Google’s Pixelbook and the family of Chromebook devices from traditional PC makers give a great counterexample to the overbearing Windows experience. 
I can have a Pixelbook up and running in roughly the same time as it would take to brew a good cup of tea. Google’s Pixel phones are hard to find in stores, but they also present a version of Android that is far worthier of a user’s trust than the typical, overly inquisitive Android OEM variation. Trust is an especially important theme in consumer tech this year. Revelations about Facebook’s negligence with user data, Android OEMs outright lying to their users about software updates, and the recent bizarre example of Samsung phones spontaneously texting photos to random contacts have raised the requirement for trustworthiness as well as high specs from a device maker. Few things erode that trust quite as quickly as a user interface designed to bait you into clicking on some ad that’s useless to you. Source
  10. At a developer conference in May, Google CEO Sundar Pichai demonstrated how a cutting-edge, computer-generated voice assistant called Duplex could call up a restaurant or hair salon and make an appointment without the person on the other end ever realizing they were talking with a robot. The technology was as controversial as it was impressive, drawing sharp criticism from people concerned about its ethical implications. What Mr. Pichai didn’t mention is that the technology could be more than just a nifty trick to help users save a bit of time on reservations. Some big companies are in the very early stages of testing Google’s technology for use in other applications, such as call centers, where it might be able to replace some of the work currently done by humans, according to a person familiar with the plans. [...] If interested, please read the entire article < here >.
  11. Just how far is Google willing to go to avoid paying artists what’s fair? Several weeks ago, the European Union Legal Affairs Committee ruled on a controversial measure. Under the measure, dubbed Article 13, internet tech giants – including Facebook, Google, and Microsoft – would have to install “effective technologies” to ensure content creators, artists, authors, and journalists receive fair pay for their work online. The committee approved the initiative, all but guaranteeing the EU Parliament will eventually sign the measure into law. Now, the search giant wants to do everything in its power to bury the initiative. And it has the funds to do so. According to UK Music, Google has spent over €31 million ($36 million) on lobbying European Union members against Article 13. The European Union’s Lobbying Transparency Register has confirmed that in 2016 alone, the search giant spent €5.5 million ($6.5 million) to “try and influence policy decisions.” Google paid eight consultancy firms, including McLarty Associates, MUST & Partners, and MKC Communications. Fourteen Google staff members have also worked on EU policies. Google’s lobbying initiative hasn’t stopped there. The company has also lobbied the European Parliament through 24 other organizations, including OpenForum Europe. According to Michael Dugher, head of UK Music, the 24 organizations have spent €25 million ($29 million) to lobby EU member countries. But why does Google fear Article 13? Simple. Under the Directive on Copyright in the Digital Single Market (Article 13), platforms that host user-generated content (UGC) would have to obtain music licenses. It would also eliminate further ‘safe harbor’ protections. Basically, the Copyright Directive would finally place websites like YouTube and Vimeo on par with streaming music platforms. They would have to pay royalties when hosting videos featuring copyrighted music. Article 13 would also force the platforms to introduce content recognition systems.
These would block UGC that infringes on existing copyrights. Social media websites, including Facebook and Twitter, would likely also have to install the systems. Simply put, the measure would end YouTube’s historic exploitation of ‘safe harbor’ loopholes in the European Union. Google would now have to pay the music industry royalties for user-generated content featuring copyrighted content.

Mass Hysteria – “Article 13 means the end of memes, remixes, and ‘Internet Freedom.’”

Critics have argued that Article 13 would ‘censor the internet.’ Comedian Stephen Fry, for example, has argued that the vote would outright ban meme sharing on social media, and that users could no longer create and share remixes and other unique content online. Of course, he – along with other critics – didn’t provide proof to support these claims. Crispin Hunt, Chair of the British Academy of Songwriters, Composers & Authors, has fought back against the ‘censorship’ claims on Twitter, and has slammed Fry for spreading false information. Dugher also refuted the claims, adding that the Copyright Directive will “protect rightsholders in the digital age.” Tech firms would finally have to pay artists the true value of their works. English singer/songwriter Billy Bragg has also voiced his support for Article 13. With the measure’s passing, YouTube’s ‘value gap’ would finally disappear. European Union members will meet on July 5th to hold a vote on the Copyright Directive. Source
  12. Wladimir Palant (Adblock Plus creator) writes on his blog, “Musings about extensions, security and some more”:

Today, I found this email from Google in my inbox: We routinely review items in the Chrome Web Store for compliance with our Program policies to ensure a safe and trusted experience for our users. We recently found that your item, “Google search link fix,” with ID: cekfddagaicikmgoheekchngpadahmlf, did not comply with our Developer Program Policies. Your item did not comply with the following section of our policy: We may remove your item if it has a blank description field, or missing icons or screenshots, and appears to be suspicious. Your item is still published, but is at risk of being removed from the Web Store. Please make the above changes within 7 days in order to avoid removal. Not sure why Google chose the wrong email address to contact me about this (the account is associated with another email address) but luckily this email found me. I opened the extension listing and the description is there, as is the icon. What’s missing is a screenshot, simply because creating one for an extension without a user interface isn’t trivial. No problem, I spent a bit of time making something that will do to illustrate the principle. And then I got another mail from Google, exactly 2 hours 30 minutes after the first one: We have not received an update from you on your Google Chrome item, “Google search link fix,” with ID: cekfddagaicikmgoheekchngpadahmlf, before the expiry of the warning period specified in our earlier email. Because your item continues to not comply with our policies stated in the previous email, it has now been removed from the Google Chrome Web Store. I guess Mountain View must be moving at extreme speeds, which is why time goes by way faster over there — relativity theory in action. Unfortunately, communication at near-light speeds is also problematic, which is likely why there is no way to ask questions about their reasoning.
The only option is resubmitting, but:

"Important Note: Repeated or egregious policy violations in the Chrome Web Store may result in your developer account being suspended or could lead to a ban from using the Chrome Web Store platform."

In other words: if I don’t understand what’s wrong with my extension, then I had better stay away from the resubmission button. Or maybe my update with the new screenshot simply didn’t reach them yet, and all I have to do is wait?

Anyway, dear users of my Google search link fix extension: if you happen to use Google Chrome, I sincerely recommend switching to Mozilla Firefox. No, not only because of this simple extension of course. But Addons.Mozilla.Org policies happen to be enforced in a transparent way, and appealing is always possible. Mozilla also has a good track record of keeping out malicious extensions, something that cannot be said about the Chrome Web Store (a recent example).

Update (2018-07-04): The Hacker News thread lists a bunch of other cases where extensions were removed for unclear reasons without a possibility to appeal. It seems that having a contact within Google is the only way of resolving this.

< Here >
  13. Google promised a year ago to provide more privacy to Gmail users, but The Wall Street Journal reports that hundreds of app makers have access to millions of inboxes belonging to Gmail users.

The outside app companies receive access to messages from Gmail users who signed up for things like price-comparison services or automated travel-itinerary planners, according to The Journal. Some of these companies train software to scan the email, while others enable their workers to pore over private messages, the report says. What isn't clear from The Journal's story is whether Google is doing anything differently than Microsoft or other rival email services.

Employees working for hundreds of software developers are reading the private messages of Gmail users, The Wall Street Journal reported on Monday. A year ago, Google promised to stop scanning the inboxes of Gmail users, but the company has not done much to protect Gmail inboxes obtained by outside software developers, according to the newspaper.

Gmail users who signed up for "email-based services" like "shopping price comparisons" and "automated travel-itinerary planners" are most at risk of having their private messages read, The Journal reported. Hundreds of app developers electronically "scan" the inboxes of the people who signed up for some of these programs, and in some cases, employees do the reading, the paper reported. Google declined to comment.

The revelation comes at a bad time for Google and Gmail, the world's largest email service, with 1.4 billion users. Top tech companies are under pressure in the United States and Europe to do more to protect user privacy and be more transparent about any parties with access to people's data. The increased scrutiny follows the Cambridge Analytica scandal, in which a data firm was accused of misusing the personal information of more than 80 million Facebook users in an attempt to sway elections. 
It's not news that Google and many top email providers enable outside developers to access users' inboxes. In most cases, the people who signed up for the price-comparison deals or other programs agreed to provide access to their inboxes as part of the opt-in process.

Gmail's opt-in alert spells out generally what a user is agreeing to. (Source: Google)

In Google's case, outside developers must pass a vetting process, and as part of that, Google ensures they have an acceptable privacy agreement, The Journal reported, citing a Google representative. What is unclear is how closely these outside developers adhere to their agreements and whether Google does anything to ensure they do, as well as whether Gmail users are fully aware that individual employees may be reading their emails, as opposed to an automated system, the report says.

Mikael Berner, the CEO of Edison Software, a Gmail developer that offers a mobile app for organizing email, told The Journal that its employees had read emails from hundreds of Gmail users as part of an effort to build a new feature. An executive at another company said employees' reading of emails had become "common practice."

Companies that spoke to The Journal confirmed that the practice was specified in their user agreements and said they had implemented strict rules for employees regarding the handling of email. It's interesting to note that, judging from The Journal's story, very little indicates that Google is doing anything different from Microsoft or other top email providers. According to the newspaper, nothing in Microsoft's or Yahoo's policy agreements explicitly allows people to read others' emails.

Source
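The opt-in described above is a standard OAuth consent request: the app sends the user to Google's consent screen, naming the inbox access it wants. As a rough sketch of how such a consent URL is built, where the client ID and redirect URI are placeholders invented for the example (the endpoint and the gmail.readonly scope are Google's published ones):

```python
# Sketch of the consent URL a third-party app sends a user to when it
# asks for inbox access. Client ID and redirect URI are placeholders.
from urllib.parse import urlencode

params = {
    "client_id": "EXAMPLE_ID.apps.googleusercontent.com",  # placeholder
    "redirect_uri": "https://example.com/oauth/callback",  # placeholder
    "response_type": "code",
    # Read-only Gmail scope: this is what grants the app inbox access.
    "scope": "https://www.googleapis.com/auth/gmail.readonly",
    "access_type": "offline",
}
consent_url = "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)
print(consent_url)
```

The consent dialog the user clicks through is generated from these parameters, which is why the scope wording in that dialog is the only place most users ever see exactly what they are granting.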
  14. Cloud support busters? If you can find a human at the end of a number

[Pictured: the Stay Puft Marshmallow Man, a big, puffy, cloudy "ghost" from the film Ghostbusters.]

A sysadmin given just three days to respond to the threatened deletion of a mission-critical system has prompted a vigorous debate about the quality of cloud support.

Writing on Medium, the user describes how the Google Cloud-hosted asset monitoring system for his firm's wind and solar energy plants was suspended on the recommendation of Google's bots, and his company was then given 72 hours to respond appropriately. The appropriate response here was sending photo ID verification of the account's cardholder, his CFO. Failure to provide this verification would have resulted in the destruction of the business's code and data.

In this instance – the second such incident, the admin revealed – downtime was just one hour. But what if the CFO had been unreachable? "What if the card holder is on leave and is unreachable for three days? We would have lost everything – years of work – millions of dollars in lost revenue," he mused.

The casual violence of Google's support response led to a fairly heated discussion on Reddit and Hacker News. Three factors appear to be in play here: robots detected a problem; robots handled the communications with the system operator; and said operator appears not to have had a professional support contract. Google Cloud offers three levels of SLAs – the contracts which should set minimum levels of uptime and communication between vendor and buyer. "This guy is running important multimillion dollar production on a consumer plan. This is on him, not Google," observed one Reddit member.

Nevertheless, two complaints seem to strike a chord. One is that the entire data set and code base disappears within three days – turning a suspension into a punishment. The other is that under certain circumstances Google support lacks human judgement – something echoed by other Google Cloud users. 
"Simply ceding control to algorithmic judgement just won't work in the short term, if ever at all," wrote one on Hacker News. "In our experience AWS handles billing issues in a much more humane way. They warn you about suspicious activity and give you time to explain and sort things out. They don't kick you down the stairs," added another.

A third commentator suggested it was a cultural issue at Google: "Customer support isn't and never has been in their DNA. It's often rage-inducing how hard it is to contact a human at Google. They seem to think they can engineer products that don't need humans behind them."

El Reg asked Google to comment but had not received a response at publication time. We'd welcome your experiences of support from all the major cloud providers. In addition, we're keen to hear of any insurance policies you may have devised for such an eventuality.

Source
  15. Right to view, delete personal info is here – and you'll be amazed to hear why the privacy law passed so fast

Analysis: California has become the first state in the US to pass a data privacy law, with governor Jerry Brown signing the California Consumer Privacy Act of 2018 into law on Thursday.

The legislation will give new rights to the state's 40 million inhabitants, including the ability to view the data that companies hold on them and, critically, to request that it be deleted and not sold to third parties. It's not too far off Europe's GDPR. Any company that holds data on more than 50,000 people is subject to the law, and each violation carries a hefty $7,500 fine.

Needless to say, the corporations that make a big chunk of their profits from selling their users' information are not overly excited about the new law. "We think there's a set of ramifications that's really difficult to understand," said a Google spokesperson, adding: "User privacy needs to be thoughtfully balanced against legitimate business needs." Likewise, tech industry association the Internet Association complained that "policymakers [must] work to correct the inevitable, negative policy and compliance ramifications this last-minute deal will create." So far there has been no word from Facebook, which put 1.5 billion users on a boat to California back in April in order to avoid Europe's similar data privacy regulations.

Don't worry if you are surprised by the sudden news that California, the home of Silicon Valley, has passed a new information privacy law – because everyone else is too. And this being the US political system, there is, of course, an entirely depressing reason for that. Another part of the statement by the Internet Association sheds some light on the issue: "Data regulation policy is complex and impacts every sector of the economy, including the internet industry," it argues. "That makes the lack of public discussion and process surrounding this far-reaching bill even more concerning. 
The circumstances of this bill are specific to California."

I see... So this bill was rushed through?

Yes, it was. And what's more, it was signed into law on Thursday by Governor Brown just hours after it was passed, unanimously, by both houses in Sacramento. What led lawmakers to push through privacy legislation at almost unheard-of speed? A ballot measure.

That’s right. Since early 2016, a number of dedicated individuals with the funds and legislative know-how to make data privacy a reality have worked together on a ballot initiative, to give Californians the opportunity to grant themselves their own privacy rights after every other effort in Sacramento and Washington DC had been shot down by the extremely well-funded lobbyists of Big Tech and Big Cable.

Real estate developer Alastair Mactaggart put about $2m of his own money into the initiative following a chance conversation with a Google engineer in his home town of Oakland, in which the engineer told him: "If people just understood how much we knew about them, they’d be really worried."

Mactaggart then spoke about it with a fellow dad at his kid's school, a finance guy called Rick Arney who had previously worked in the California State Senate. Arney walked him through California's unusual ballot measure system, where anyone in the state can put forward an initiative and, if it gets sufficient support, it will be put on the ballot paper at the next election. If a ballot initiative gets enough votes, it becomes law.

There have been some good and some bad outcomes from this exercise in direct democracy over the years, but given that both Mactaggart and Arney felt there was no way a data privacy law would make its way through the corridors of power in Sacramento in the normal way, given the enormous influence of Silicon Valley, they decided a ballot measure was the way to go. 
Beware the policy wonk

One other individual is worth mentioning: Mary Stone Ross, a former CIA employee and one-time legal counsel for the House of Representatives Intelligence Committee, who also lives in Oakland. Mactaggart persuaded her to join the team to craft the actual policy and make sure it could make it through the system. Together, the three of them then spent the next year talking to relevant people – lawyers, tech experts, academics, ordinary citizens – to arrive at their overall approach and draft the initiative.

And it is at that point that, to put it bluntly, the shit hit the fan. Because the truth is that consumers – and especially Californians, who tend to be more tech-savvy than the rest of the country given the concentration of tech companies in the state – understand the issues around data privacy rules and want more rights over their data.

With the initiative well structured and the policy process run professionally, the ballot measure gained the required number of supporters to get it on the ballot. And thanks to the focus groups and polls the group carried out, they were confident that come November it would pass, and data privacy would become law through direct democracy.

At which point, it is fair to say, Big Internet freaked out and made lots of visits to lawmakers in Sacramento, who also freaked out. The following months saw a scurry of activity, but if you want to know why the bill became law in almost record time and was signed by Governor Brown on Thursday, all you need to know is this single fact: the deadline for pulling the initiative from November's ballot was last night – Thursday evening – and Mactaggart had said publicly that if the bill was signed, he would do exactly that and pull his ballot measure.

Privy see

You may be wondering why Sacramento was able to get it through unanimously, without dozens of Google- and Facebook-funded lawmakers continually derailing the effort, especially since it was still a ballot measure. 
After all, the tech giants could have spent millions campaigning against the measure in a bid to make sure people didn’t vote for it. And the truth is that they had already lined up millions of dollars to do exactly that. Except they were going to lose, because, thanks to massively increased public awareness of data privacy – given the recent Facebook Russian election fake news scandal and the European GDPR legislation – it was going to be very hard to push back against the issue. And the initiative had been structured extremely well – it was, frankly, good law.

There is another critical component: laws passed through the ballot initiative process are much, much harder for lawmakers to change, especially if they are well structured. So suddenly Big Tech and Sacramento were faced with a choice: pass data privacy legislation at record speed and persuade Mactaggart to pull his ballot initiative, with the chance to change the law later through normal legislative procedures; or play politics as usual and be faced with the same law, but one that would be much harder to change in future.

And, of course, they went with the law. And Mactaggart, to his eternal credit, agreed to pull his ballot measure in order to allow the "normal" legislative approach to achieve the same goal. And so the California Consumer Privacy Act of 2018 is now law, and today is the first day that most Californians will have heard of it. Sausage making at its finest.

Of course, Google, Facebook et al are going to spend the next decade doing everything they can to unravel it. And as we saw just last week, lawmakers are only too willing to do the bidding of large corporate donors. But it is much harder to put a genie back in the bottle than it is to stop it getting out.

Source
  16. What is this?

I've been writing about Google's efforts to deprecate HTTP, the protocol of the web. This is a summary of why I am opposed to it. DW

Their pitch

Advocates of deprecating HTTP make three main points:

1. Something bad could happen to my pages in transit from a server to the user's web browser.
2. It's not hard to convert to HTTPS and it doesn't cost a lot.
3. Google is going to warn people about my site being "not secure." So if I don't want people to be scared away, I should just do what they want me to do.

Why this is bad

The web is an open platform, not a corporate platform. It is defined by its stability. 25-plus years and it's still going strong. Google is a guest on the web, as we all are. Guests don't make the rules.

Why this is bad, practically speaking

A lot of the web consists of archives. Files put in places that no one maintains. They just work. There's no one there to do the work that Google wants all sites to do. And some people have large numbers of domains and sub-domains hosted on all kinds of software Google never thought about. Places where the work required to convert wouldn't be justified by the possible benefit. The reason there's so much diversity is that the web is an open thing; it was never owned.

The web is a miracle

Google has spent a lot of effort to convince you that HTTP is not good. Let me have the floor for a moment to tell you why HTTP is the best thing ever. Its simplicity is what made the web work. It created an explosion of new applications. It may be hard to believe that there was a time when Amazon, Netflix, Facebook, Gmail, Twitter etc didn't exist. That was because the networking standards prior to the web were complicated and not well documented. The explosion happened because the web is simple. Where earlier protocols were hard to build on, the web is easy. I don't think the explosion is over. I want to make it easier and easier for people to run their own web servers. 
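That simplicity can be shown rather than argued. Here is a complete HTTP exchange, server and client, in a few lines of standard-library Python; a sketch, not production code:

```python
# A whole HTTP exchange using nothing but the standard library.
# The ease of doing this is the "simplicity" argument in a nutshell.
import threading
from http.server import HTTPServer, BaseHTTPRequestHandler
from urllib.request import urlopen

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"Hello, open web!"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo output quiet

server = HTTPServer(("127.0.0.1", 0), Hello)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urlopen("http://127.0.0.1:%d/" % server.server_port) as response:
    page = response.read().decode()

server.shutdown()
print(page)
```

Every byte of that exchange is human-readable text over a socket, which is a large part of why the protocol was so easy to build on.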
Google is doing what the programming priesthood always does: building the barrier to entry higher, making things more complicated, giving themselves an exclusive. This means only super nerds will be able to put up sites. And we will lose a lot of sites that were quickly posted on a whim, over the 25 years the web has existed, by people that didn't fully understand what they were doing. That's also the glory of the web. Fumbling around in the dark actually gets you somewhere. In worlds created by corporate programmers, it's often impossible to find your way around, by design.

The web is a social agreement not to break things. It's served us for 25 years. I don't want to give it up because a bunch of nerds at Google think they know best.

The web is like the Grand Canyon. It's a big natural thing, a resource, an inspiration, and like the canyon it deserves our protection. It's a place of experimentation and learning. It's also useful for big corporate websites like Google. All views of the web are important, especially ones that big companies don't understand or respect. It's how progress happens in technology. Keeping the web simple is as important as net neutrality.

They believe they have the power

Google makes a popular browser and is a tech industry leader. They can, they believe, encircle the web, and at first warn users as they access HTTP content. Very likely they will do more, requiring the user to consent to open a page, and then blocking the pages outright.

It's dishonest

Many of the sites they will label as "not secure" don't ask the user for any information. Of course users won't understand that. Many will take the warning seriously and hit the Back button, having no idea why they're doing it. Of course Google knows this. It's the kind of nasty political tactic we expect from corrupt political leaders, not leading tech companies. 
Sleight of hand

They tell us to worry about man-in-the-middle attacks that might modify content, but fail to mention that they can do it in the browser, even if you use a "secure" protocol. They are the one entity you must trust above all. No way around it.

They cite the wrong stats

When they say some percentage of web traffic is HTTPS, that doesn't measure the scope of the problem. A lot of HTTP-served sites get very few hits, yet still have valuable ideas and must be preserved.

It will destroy the web's history

If Google succeeds, it will make a lot of the web's history inaccessible. People put stuff on the web precisely so it would be preserved over time. That's why it's important that no one has the power to change what the web is. It's like a massive book burning, at a much bigger scale than ever done before.

If HTTPS is such a great idea...

Why force people to do it? This suggests that the main benefit is for Google, not for the people who own the content. If it were such a pressing problem, we'd do it because we want to, not because we're being forced to.

The web isn't safe

Finally, what is the value in being safe? Twitter and Facebook are like AOL in the old pre-web days. They are run by companies who are committed to providing a safe experience. They make tradeoffs for that. Limited linking. No styles. Character limits. Blocking, muting, reporting, norms. Etc etc. Think of them as Disney-like experiences.

The web is not safe. That is correct. We don't want every place to be safe, so people can be wild and experiment and try out new ideas. It's why the web has been the proving ground for so much incredible stuff over its history. Lots of things aren't safe. Crossing the street. Bike riding in Manhattan. Falling in love. We do them anyway. You can't be safe all the time. Life itself isn't safe. If Google succeeds in making the web controlled and bland, we'll just have to reinvent the web outside of Google's sphere. 
Let's save some time, and create the new web out of the web itself.

PS: Of course we want parts of the web to be safe. Banking websites, for example. But my blog archive from 2001? Really, there's no need for special provisions there.

Update: A threatening email from Google

On June 20, 2018, I got an email from Google, as the owner of scripting.com. According to the email, I should "migrate to HTTPS to avoid triggering the new warning on your site and to help protect users' data." It's a blog. I don't ask for any user data.

Google’s "not secure" message means this: “Google tried to take control of the open web and this site said no.”

Last update: Tuesday June 26, 2018; 2:11 PM GMT+0200.

Source
  17. Don't worry, it seems to be just for security. For now

Google Play is the backbone of Android and the part most likely to be abused

GOOGLE HAS started adding a string of metadata to all packages downloaded from the Google Play Store. The announcement was quietly dropped last week, but before everyone starts panicking, let's take a step back.

Yes, it's a form of DRM. But DRM isn't all bad. In fact, the main problem with DRM to date has been the completely dastardly way it has been used. Remember 15 years ago when putting a CD into the drive would bring up an embedded music player to try and prevent you ripping it? They never really worked anyway.

Anyway, the point is that DRM has a practical purpose, and in this case it's to ensure that all APKs coming from the store are digitally signed. That means that even if you don't get your apps from Google Play, they're still safe, and still work with the Google Play Services framework for things like cloud saving and cross-device play. Plus they're a heck of a lot safer. Not to mention it makes it a lot easier to track down anyone who has planted a malicious payload.

Perhaps the best thing about this is that if you got an app from elsewhere (say, Amazon) and installed it, as long as it has been certified, it will be added to your library and get all the updates it will ever need.

Now let's be very clear here - this is now an open Pandora's box. Google is using its DRM for good right now, but this is Google, lest we forget, and there could be a bit more data collection than we expected at any time. By not out-and-out calling it DRM, Google is avoiding a lot of the instant backlash the very mention of the term causes. But the fact is that, if you're the type of person who trusts big corporations as far as you can throw them, then this is one to keep an eye on - it would take very little extra code to turn the new safety measures into a giant surveillance device powered by your phone. 
Let's just hope that doesn't happen, eh?

Source
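The tamper-evidence idea behind signed package metadata can be sketched in a few lines. To be clear about assumptions: real APK signing uses public-key certificates, so anyone can verify without holding a secret; an HMAC stands in here only because it ships in Python's standard library, and the key and package bytes are invented:

```python
# Illustration of tamper-evident package metadata (NOT Google's actual
# scheme): the store appends an authentication tag, and any change to
# the package bytes makes verification fail.
import hashlib
import hmac

STORE_KEY = b"hypothetical-store-signing-key"  # invented for the sketch

def attach_tag(apk_bytes: bytes) -> bytes:
    """Append a 32-byte SHA-256 HMAC tag to the package."""
    tag = hmac.new(STORE_KEY, apk_bytes, hashlib.sha256).digest()
    return apk_bytes + tag

def verify(package: bytes) -> bool:
    """Recompute the tag over the body and compare in constant time."""
    body, tag = package[:-32], package[-32:]
    expected = hmac.new(STORE_KEY, body, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

signed = attach_tag(b"fake-apk-contents")
print(verify(signed))                               # untouched: verifies
print(verify(bytes([signed[0] ^ 1]) + signed[1:]))  # tampered: fails
```

With a public-key scheme, the verification half needs no secret at all, which is what would let a package certified by the store be checked on any device, however it was sideloaded.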
  18. A work boycott from the Group of Nine is yet another hurdle to the company’s efforts to compete for sensitive government work.

Earlier this year, a group of influential software engineers in Google’s cloud division surprised their superiors by refusing to work on a cutting-edge security feature. Known as “air gap,” the technology would have helped Google win sensitive military contracts. The coders weren’t persuaded their employer should be using its technological might to help the government wage war, according to four current and former employees. After hearing the engineers’ objections, Urs Hölzle, Google’s top technical executive, said the air gap feature would be postponed, one of the people said. Another person familiar with the situation said the group was able to reduce the scope of the feature.

The act of rebellion ricocheted around the company, fueling a growing resistance among employees with a dim view of Google’s yen for multi-million-dollar government contracts. The engineers became known as the “Group of Nine” and were lionized by like-minded staff. The current and former employees say the engineers’ work boycott was a catalyst for larger protests that convulsed the company’s Mountain View, California, campus and ultimately forced executives to let a lucrative Pentagon contract called Project Maven expire without renewal. They declined to name the engineers and requested anonymity to discuss a private matter.

Internal disputes are common at Alphabet Inc.’s Google, which gives employees ample space to air grievances. But dissent is on the rise (as it is at other tech companies). Last month, in a highly unusual move, a Google employee proposed that executive compensation be tied to efforts to make the company more diverse and inclusive. That proposal was easily voted down by shareholders, but the engineers’ boycott could actually hamper Google’s ability to compete. 
Big federal contracts often require certification to handle sensitive data—authorizations that rivals Amazon.com Inc. and Microsoft Corp. have, but Google doesn't. Without certain measures including air gap technology, Google may struggle to win portions of the Joint Enterprise Defense Infrastructure, or JEDI, a Pentagon deal worth upwards of $10 billion.

It's unclear if Google has abandoned air gap technology or is still planning to build it over employees’ objections. The feature is not technically very difficult, so Google could easily find other engineers to do the work. While over 4,000 people at the company signed a petition against Project Maven, that’s roughly 5 percent of total full-time staff. A company spokesperson declined to comment.

Google cloud chief Diane Greene has expressed continued interest in working with government. Federal agencies are among the largest spenders on corporate computing and are starting to gravitate toward cloud services. In March, Greene and her deputies proudly touted Google's new approvals under FedRAMP, the federal compliance standards for information technology. Google was approved FedRAMP “Moderate,” a designation required for almost 80 percent of government cloud contracts. Google cloud staff said internally that the Project Maven deal was “fast-tracking” higher FedRAMP authorization, according to a Gizmodo report.

For now, Google falls short of rivals. Both Microsoft's Azure and Amazon Web Services (AWS) have “High” certificates that authorize them to hold sensitive or classified data and sell to bodies like the Central Intelligence Agency. To do so, both companies had to set up a separate service called a government cloud. A critical component of that service is the air gap. Put simply, it physically separates computers from others on a network. 
So rather than store data from multiple companies on a single server or system, as the commercial cloud providers typically do, a company or agency can place its data and computing processes in isolation on a single piece of hardware. That separation is particularly desirable for agencies in national security, says Michael Carter, vice president of Coalfire, a cybersecurity firm. “Amazon and Azure can literally say, ‘This is your rack,’” he says. “In the government, they want to know where their data is. So if you want to wipe it, go wipe it.”

In sales pitches, Google touts the security features of its cloud service. In a March press briefing, company executives noted how their artificial intelligence software could spot cybersecurity attacks early. “We think that Google cloud is today the most secure cloud out there,” Hölzle said during the briefing.

The entities most likely to require air-gap security systems are government agencies or financial firms. While experts debate the technology’s merits, it does give customers “psychological” comfort, according to Jim Reavis, who runs Cloud Security Alliance, an industry group. “They’re used to having their own computer that they look at, their own blinking light,” he says. “I do question whether or not that’s useful security.”

Greene and other Google executives will have to persuade employees it’s possible to bid for government contracts without violating Google’s new ethical standards. After pledging not to renew the Project Maven contract, which involves using artificial intelligence to analyze drone footage, the company issued a set of AI principles this month that prohibit weapons work. But they don’t rule out selling to the military, and Google continues to pursue other Defense Department cloud contracts. “We are still doing everything we can within these guidelines to support our government, the military and our veterans,” Greene wrote in a blog on June 7. 
“For example, we will continue to work with government organizations on cybersecurity, productivity tools, healthcare, and other forms of cloud initiatives.”

Several Googlers protesting Project Maven have complained about poor communication from senior leaders. Most staff outside the cloud unit were unaware of the contract until February—five months after it was signed—when questions about the deal began circulating more widely on internal message boards. At one point, Greene told staff the deal was worth a meager $9 million. Subsequent reports revealed Google expected the contract to rake in $15 million and grow to as much as $250 million. Google has yet to address these reports publicly. But on June 8, a day after the company issued its ethical charter, Greene addressed the discrepancy in an internal note. “In speaking about Maven, I did not always have accurate information,” she wrote in an email viewed by Bloomberg News. “For example, I said the contract was $9 [million] when it actually was a different number.”

Google employees have a history of objecting on ethical grounds. After the Edward Snowden revelations in 2013, several engineers confronted Hölzle about allegations that the company had assisted the government in its surveillance program. They threatened to resign, telling Hölzle, “this isn't why we signed up for the company,” according to a former senior executive who attended the meeting. Hölzle voiced his support for the engineers, this person says.

The latest confrontation at Google coincides with growing concern about the entire industry’s relationship with the U.S. government. Civil rights groups have targeted Amazon for selling facial recognition tech to police departments. Microsoft faced similar heat for its work with U.S. Immigration and Customs Enforcement. Some Google employees resigned over the Project Maven deal. 
Tyler Breisacher, a software developer on Google infrastructure who left in April, cited the lack of clear communications about the contract and how Google's software was being used. Management, he says, appeared surprised at the response from employees once they shared more about the program. “It seems like they weren't expecting it to be as controversial as it was,” he says.

Breisacher, who joined Google in 2011, says the company has changed. Earlier, if employees felt a decision was bad for Google, its users or the broader world, they had their leaders’ ears. “It felt like you were really listened to,” he says.

Greene wrote in the internal email that she wanted to address the “trust issue that has developed” in the past five months. She said she regretted not emailing earlier to correct her misstatement about the size of the Project Maven deal. “In the past, I would have,” Greene wrote, “but in the current climate of leaks, the sense was that it would be a mistake to do because the correction would leak and start another ‘press cycle’ that would not be good for any of us.”

Source
  19. Google maintains a rapidly growing list of copyright-infringing URLs that it hasn't indexed yet. This blacklist ensures that these links are never added to the search engine. Thanks to a new update to the transparency report, we now know how many non-indexed links each takedown notice includes, and in some cases the number is surprisingly high. In recent years, Google has had to cope with a continuous increase in takedown requests that target pirate sites in search results. The total number of ‘removed’ URLs just reached 3.5 billion, and millions more are added every day. While that’s nothing new, Google just started sharing some additional insight into the nature of these requests. As it turns out, millions, if not hundreds of millions, of the links copyright holders target have never appeared in Google’s search index. Earlier this year, Google copyright counsel Caleb Donaldson revealed that the company had started to block non-indexed links ‘prophylactically.’ In other words, Google blocks URLs before they appear in the search results, as a sort of piracy vaccine. “Google has critically expanded notice and takedown in another important way: We accept notices for URLs that are not even in our index in the first place. That way, we can collect information even about pages and domains we have not yet crawled,” Donaldson noted. “We process these URLs as we do the others. Once one of these not-in-index URLs is approved for takedown, we prophylactically block it from appearing in our Search results,” he added. Unfortunately, Google provided no easy way to see how many links in a request were not indexed, but that has now changed. Over the past week or so, the search engine added a new signal to its DMCA transparency report listing how many of the submitted URLs in a notice are not yet indexed. In some cases, this is the vast majority. Take the Mexican branch of the anti-piracy group APDIF, for example.
This organization is one of the most active DMCA reporters and asked Google to remove over a million URLs in the last week alone. As can be seen below, the majority of the links appear to be non-indexed. We browsed through dozens of recent listings from APDIF, and these reveal a pattern where, in most cases, over 90% of the submitted URLs are not in Google’s search results. [Image: Google now reporting non-indexed takedown requests] These URLs are obviously not removed, since they were never listed. According to the company’s earlier statement, they are put on a separate blocklist instead, which prevents them from being added in the future. APDIF is not the only reporter that does this, though. Rivendell, the most active sender of all, also has a high rate of non-indexed links, often well over 50%. The tactic turns out to be rather common. Well-known players such as Fox, Walt Disney, NBC Universal, BPI, and the RIAA all report non-indexed links as well, to varying degrees. Not all reporting agencies have rates as high as APDIF’s. However, it is clear that millions of non-indexed pirate URLs are added to the preemptive blocklist every month. Technically, the DMCA takedown process is meant for links and content that actually exist on a service, but it appears that Google doesn’t mind going a step further. TorrentFreak reached out to the search giant several days ago, hoping to find out what percentage of the overall requests are not in Google’s search results, but at the time of writing, we have yet to hear back. Source
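The new per-notice figure boils down to simple arithmetic. As a rough illustration, here is how the non-indexed share of a notice can be computed; the counts below are made up for the example, not taken from any real transparency-report entry:

```python
def non_indexed_share(urls_submitted: int, urls_not_indexed: int) -> float:
    """Return the fraction of submitted URLs that were never in the index."""
    if urls_submitted <= 0:
        raise ValueError("a notice must contain at least one URL")
    return urls_not_indexed / urls_submitted

# A hypothetical APDIF-style notice: 10,000 URLs submitted, 9,200 never indexed.
share = non_indexed_share(10_000, 9_200)
print(f"{share:.0%} of the submitted URLs were not in the index")  # 92%
```

A notice like this would remove only the remaining 800 indexed links; the other 9,200 would go straight onto the preemptive blocklist described above.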
  20. A work boycott from the Group of Nine is yet another hurdle to the company’s efforts to compete for sensitive government work. Earlier this year, a group of influential software engineers in Google’s cloud division surprised their superiors by refusing to work on a cutting-edge security feature. Known as “air gap,” the technology would have helped Google win sensitive military contracts. The coders weren’t persuaded their employer should be using its technological might to help the government wage war, according to four current and former employees. After hearing the engineers’ objections, Urs Hölzle, Google’s top technical executive, said the air gap feature would be postponed, one of the people said. Another person familiar with the situation said the group was able to reduce the scope of the feature. The act of rebellion ricocheted around the company, fueling a growing resistance among employees with a dim view of Google’s yen for multi-million-dollar government contracts. The engineers became known as the “Group of Nine” and were lionized by like-minded staff. The current and former employees say the engineers’ work boycott was a catalyst for larger protests that convulsed the company’s Mountain View, California, campus and ultimately forced executives to let a lucrative Pentagon contract called Project Maven expire without renewal. They declined to name the engineers and requested anonymity to discuss a private matter. Internal disputes are common at Alphabet Inc.’s Google, which gives employees ample space to air grievances. But dissent is on the rise (as it is at other tech companies). Last month, in a highly unusual move, a Google employee proposed that executive compensation be tied to efforts to make the company more diverse and inclusive. That proposal was easily voted down by shareholders, but the engineers’ boycott could actually hamper Google’s ability to compete. 
Big federal contracts often require certification to handle sensitive data—authorizations that rivals Amazon.com Inc. and Microsoft Corp. have, but Google doesn't. Without certain measures including air gap technology, Google may struggle to win portions of the Joint Enterprise Defense Infrastructure, or JEDI, a Pentagon deal worth upwards of $10 billion. It's unclear if Google has abandoned air gap technology or is still planning to build it over employees’ objections. The feature is not technically very difficult, so Google could easily find other engineers to do the work. While over 4,000 people at the company signed a petition against Project Maven, that’s roughly 5 percent of total full-time staff. A company spokesperson declined to comment. Google cloud chief Diane Greene has expressed continued interest in working with government. Federal agencies are among the largest spenders on corporate computing and starting to gravitate toward cloud services. In March, Greene and her deputies proudly touted Google's new approvals under FedRAMP, federal compliance standards for information technology. Google was approved FedRAMP “Moderate,” a designation required for almost 80 percent of government cloud contracts. Google cloud staff said internally that the Project Maven deal was “fast-tracking” higher FedRAMP authorization, according to a Gizmodo report. For now, Google falls short of rivals. Both Microsoft's Azure and Amazon Web Services (AWS) have “High” certificates that authorize them to hold sensitive or classified data and sell to bodies like the Central Intelligence Agency. To do so, both companies had to set up a separate service called a government cloud. A critical component of that service is the air gap. Put simply, it physically separates computers from others on a network. 
So rather than store data from multiple companies on a single server or system, as the commercial cloud providers typically do, a company or agency can place its data and computing processes in isolation on a single piece of hardware. That separation is particularly desirable for agencies in national security, says Michael Carter, vice president of Coalfire, a cybersecurity firm. “Amazon and Azure can literally say, ‘This is your rack,’” he says. “In the government, they want to know where their data is. So if you want to wipe it, go wipe it.” In sales pitches, Google touts the security features of its cloud service. In a March press briefing, company executives noted how their artificial intelligence software could spot cybersecurity attacks early. “We think that Google cloud is today the most secure cloud out there,” Hölzle said during the briefing. The entities most likely to require air-gap security systems are government agencies or financial firms. While experts debate the technology’s merits, it does give customers “psychological” comfort, according to Jim Reavis, who runs Cloud Security Alliance, an industry group. “They’re used to having their own computer that they look at, their own blinking light,” he says. “I do question whether or not that’s useful security.” Greene and other Google executives will have to persuade employees it’s possible to bid for government contracts without violating Google’s new ethical standards. After pledging not to renew the Project Maven contract, which involves using artificial intelligence to analyze drone footage, the company issued a set of AI principles this month that prohibit weapons work. But they don’t rule out selling to the military, and Google continues to pursue other Defense Department cloud contracts. “We are still doing everything we can within these guidelines to support our government, the military and our veterans,” Greene wrote in a blog on June 7. 
“For example, we will continue to work with government organizations on cybersecurity, productivity tools, healthcare, and other forms of cloud initiatives." Several Googlers protesting Project Maven have complained about poor communication from senior leaders. Most staff outside the cloud unit were unaware of the contract until February—five months after it was signed—when questions about the deal began circulating more widely on internal message boards. At one point, Greene told staff the deal was worth a meager $9 million. Subsequent reports revealed Google expected the contract to rake in $15 million and grow to as much as $250 million. Google has yet to address these reports publicly. But on June 8, a day after the company issued its ethical charter, Greene addressed the discrepancy in an internal note. “In speaking about Maven, I did not always have accurate information,” she wrote in an email viewed by Bloomberg News. “For example, I said the contract was $9 [million] when it actually was a different number.” Google employees have a history of objecting on ethical grounds. After the Edward Snowden revelations in 2013, several engineers confronted Hölzle about allegations the company had assisted the government in its surveillance program. They threatened to resign, telling Hölzle, “this isn't why we signed up for the company,” according to a former senior executive who attended the meeting. Hölzle voiced his support for the engineers, this person says. The latest confrontation at Google coincides with growing concern about the entire industry’s relationship with the U.S. government. Civil rights groups have targeted Amazon for selling facial recognition tech to police departments. Microsoft faced similar heat for its work with the U.S. Immigration and Customs Enforcement. Some Google employees resigned over the Project Maven deal. 
Tyler Breisacher, a software developer on Google infrastructure who left in April, cited the lack of clear communications about the contract and how Google's software was being used. Management, he says, appeared surprised at the response from employees once they shared more about the program. “It seems like they weren't expecting it to be as controversial as it was,” he says. Breisacher, who joined Google in 2011, says the company has changed. Earlier, if employees felt a decision was bad for Google, its users or the broader world, they had their leaders’ ears. “It felt like you were really listened to,” he says. Greene wrote in the internal email that she wanted to address the “trust issue that has developed” in the past five months. She said she regretted not emailing earlier to correct her misstatement about the size of the Project Maven deal. “In the past, I would have,” Greene wrote, “but in the current climate of leaks, the sense was that it would be a mistake to do because the correction would leak and start another ‘press cycle’ that would not be good for any of us.” < Here >
  21. Google in the coming weeks is expected to fix a location privacy leak in two of its most popular consumer products. New research shows that Web sites can run a simple script in the background that collects precise location data on people who have a Google Home or Chromecast device installed anywhere on their local network. Craig Young, a researcher with security firm Tripwire, said he discovered an authentication weakness that leaks incredibly accurate location information about users of both the smart speaker and home assistant Google Home, and Chromecast, a small electronic device that makes it simple to stream TV shows, movies and games to a digital television or monitor. Young said the attack works by asking the Google device for a list of nearby wireless networks and then sending that list to Google’s geolocation lookup services. “An attacker can be completely remote as long as they can get the victim to open a link while connected to the same Wi-Fi or wired network as a Google Chromecast or Home device,” Young told KrebsOnSecurity. “The only real limitation is that the link needs to remain open for about a minute before the attacker has a location. The attack content could be contained within malicious advertisements or even a tweet.” It is common for Web sites to keep a record of the numeric Internet Protocol (IP) address of all visitors, and those addresses can be used in combination with online geolocation tools to glean information about each visitor’s hometown or region. But this type of location information is often quite imprecise. In many cases, IP geolocation offers only a general idea of where the IP address may be based geographically. This is typically not the case with Google’s geolocation data, which includes comprehensive maps of wireless network names around the world, linking each individual Wi-Fi network to a corresponding physical location. 
Armed with this data, Google can very often determine a user’s location to within a few feet (particularly in densely populated areas), by triangulating the user between several nearby mapped Wi-Fi access points. [Side note: Anyone who’d like to see this in action need only to turn off location data and remove the SIM card from a smart phone and see how well navigation apps like Google’s Waze can still figure out where you are]. “The difference between this and a basic IP geolocation is the level of precision,” Young said. “For example, if I geolocate my IP address right now, I get a location that is roughly 2 miles from my current location at work. For my home Internet connection, the IP geolocation is only accurate to about 3 miles. With my attack demo however, I’ve been consistently getting locations within about 10 meters of the device.” Young said a demo he created (a video of which is below) is accurate enough that he can tell roughly how far apart his device in the kitchen is from another device in the basement. “I’ve only tested this in three environments so far, but in each case the location corresponds to the right street address,” Young said. “The Wi-Fi based geolocation works by triangulating a position based on signal strengths to Wi-Fi access points with known locations based on reporting from people’s phones.” Beyond leaking a Chromecast or Google Home user’s precise geographic location, this bug could help scammers make phishing and extortion attacks appear more realistic. Common scams like fake FBI or IRS warnings or threats to release compromising photos or expose some secret to friends and family could abuse Google’s location data to lend credibility to the fake warnings, Young notes. “The implications of this are quite broad including the possibility for more effective blackmail or extortion campaigns,” he said. 
“Threats to release compromising photos or expose some secret to friends and family could use this to lend credibility to the warnings and increase their odds of success.” When Young first reached out to Google in May about his findings, the company replied by closing his bug report with a “Status: Won’t Fix (Intended Behavior)” message. But after being contacted by KrebsOnSecurity, Google changed its tune, saying it planned to ship an update to address the privacy leak in both devices. Currently, that update is slated to be released in mid-July 2018. According to Tripwire, the location data leak stems from poor authentication by Google Home and Chromecast devices, which rarely require authentication for connections received on a local network. “We must assume that any data accessible on the local network without credentials is also accessible to hostile adversaries,” Young wrote in a blog post about his findings. “This means that all requests must be authenticated and all unauthenticated responses should be as generic as possible. Until we reach that point, consumers should separate their devices as best as is possible and be mindful of what web sites or apps are loaded while on the same network as their connected gadgets.” Earlier this year, KrebsOnSecurity posted some basic rules for securing your various “Internet of Things” (IoT) devices. That primer lacked one piece of advice that is a bit more technical but which can help mitigate security or privacy issues that come with using IoT systems: Creating your own “Intranet of Things,” by segregating IoT devices from the rest of your local network so that they reside on a completely different network from the devices you use to browse the Internet and store files. “A much easier solution is to add another router on the network specifically for connected devices,” Young wrote. 
“By connecting the WAN port of the new router to an open LAN port on the existing router, attacker code running on the main network will not have a path to abuse those connected devices. Although this does not by default prevent attacks from the IoT devices to the main network, it is likely that most naïve attacks would fail to even recognize that there is another network to attack.” For more on setting up a multi-router solution to mitigating threats from IoT devices, check out this in-depth post on the subject from security researcher and blogger Steve Gibson. Source
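The two-step lookup Young describes can be sketched as follows: a script on the victim's network asks the device for its Wi-Fi scan results, then forwards the access points it sees to a geolocation service. The Chromecast port and endpoint path, the scan-entry field names, and the geolocation request shape below follow public write-ups of this research and Google's geolocation API documentation; treat them as assumptions, and note that Google's mid-2018 update closed the unauthenticated local request. The network function is defined but deliberately never called:

```python
import json
import urllib.request

def build_geolocation_payload(scan_results):
    """Convert device Wi-Fi scan entries into a geolocation API request body."""
    return {
        "considerIp": False,  # locate by access points, not by the caller's IP
        "wifiAccessPoints": [
            {"macAddress": ap["bssid"], "signalStrength": ap["signal_level"]}
            for ap in scan_results
        ],
    }

def locate_device(device_ip, api_key):
    """Fetch a device's Wi-Fi scan and resolve it to coordinates (illustrative only)."""
    # Unauthenticated local request for nearby networks (pre-patch behavior).
    with urllib.request.urlopen(f"http://{device_ip}:8008/setup/scan_results") as r:
        scan = json.load(r)
    body = json.dumps(build_geolocation_payload(scan)).encode()
    req = urllib.request.Request(
        f"https://www.googleapis.com/geolocation/v1/geolocate?key={api_key}",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as r:
        # Response shape: {"location": {"lat": ..., "lng": ...}, "accuracy": ...}
        return json.load(r)
```

In Young's demo the requests originate from a script running in the victim's browser rather than a standalone client, but the chain is the same; the precision comes entirely from Google's database of mapped access points, not from anything stored on the device.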
  22. Google announced on Monday that it will invest $550 million in Chinese e-commerce company JD.com. The all-cash investment, by which Google will buy 27 million new shares of NASDAQ-listed JD at $40.58 per share, is expected to help Google compete with Amazon.com in e-commerce sales and advertising. The deal would give Google about one percent of JD's shares, which are also owned by Walmart and Tencent Holdings. Google has been blocked in China since 2010 for refusing to censor content, and the deal could help strengthen its Chinese connections, the Wall Street Journal reported on Monday. Google has an artificial intelligence laboratory in Beijing, and recently introduced Files Go, a digital storage app, in China. JD is China's second-largest e-commerce website, with about 25 percent of China's business-to-consumer Internet market share. It remains far behind Alibaba's 60 percent market share. Google and JD said they are seeking a retail infrastructure that can better personalize the shopping experience in several markets, including Southeast Asia. JD also said it will make some items available to U.S. and European markets through Google Shopping. The partnership can open a retail channel for JD to sell products globally at a time when a trade war between China and the United States could be imminent, CNBC reported. < Here >
  23. Google Chrome Software Removal Tool is an easy-to-use program that tries to get a broken Chrome installation working again. Launch it and the tool scans your PC for programs which Google considers "suspicious" or "known to cause problems with Chrome", and offers to remove them. Bizarrely, the CSRT won't give you the names of these suspicious programs, so you'll have to trust it. Or you can just run the program to see if it thinks there are any, then click "Cancel" instead of "Remove" when the report appears. Whatever you do, once the scan is complete, CSRT launches Chrome with the chrome://settings/resetProfileSettings command, prompting you to reset your Chrome settings. Click "Reset" and Chrome will be reset to its default settings; otherwise, just close the window to continue as usual. There are no other settings or options, nothing else to do at all. Google provides few details of what the Chrome Software Removal Tool actually does, but describes it as follows. Find programs and components that affect Chrome: If you notice changes in the settings of your Chrome browser, there is a small utility that can help you identify the issue and correct it. Created by Google itself, it goes by the name of Chrome Cleanup Tool (Google Chrome Software Removal Tool), enabling you to detect programs that interfere with Google Chrome and remove them. Since toolbars, browser add-ons and pop-up ads are not typical malware, your antivirus solution might fail to detect their presence. Chrome Cleanup Tool is specifically designed to find programs and components whose installation resulted in modifications of Chrome's settings, providing you with a simple means to reset them. Remove interfering components with a click: The application does not require installation and starts looking for suspicious programs as soon as you launch it.
The number of findings is displayed within a small window, along with an option to remove them all, but their names are not revealed, so as to prevent name modifications that might cause Chrome Cleanup Tool not to work as it should. In some cases, a system reboot might be required in order for the changes to take effect. Once the issue is fixed, Chrome restarts and prompts you to reset the browser settings. Scan for malicious programs that cause issues with Chrome: Chrome Cleanup Tool is an attempt to enhance the browsing experience of Chrome users, providing them with a simple method to factory-reset the settings and remove programs that cause trouble for the browser. More aggressive malware might be impossible to remove or detect, so you might need a reputable antivirus solution to clean the system. Note that this application is not designed to search for all types of viruses and malware components, but only those that cause issues with Google Chrome. Homepage or here Download: Link 1 - New Link 2 - New
  24. Google is bracing itself for what will likely be a record-breaking EU fine in the coming weeks. The Financial Times reports that the EU antitrust investigation into Android is concluding, and a fine is expected to be announced in July. The European Commission has been investigating Android after rivals complained that Google has been abusing its market dominance in software that runs on smartphones. Google has been accused of limiting access to the Google Play Store unless phone makers also bundle Google search and Chrome apps, a practice that may have breached EU antitrust rules. Google has also reportedly blocked phone makers from creating devices that run forked versions of Android, as part of an anti-fragmentation agreement. A fine is reportedly expected next month, but it’s not clear how big it will be. The EU could fine Google up to $11 billion, or 10 percent of the annual turnover of its parent company, Alphabet. It’s unlikely that Google will be fined the full $11 billion, but anything over $2.7 billion would set a new record. Google was hit with a record-breaking $2.7 billion fine last year by the European Commission for breaking antitrust laws. The EU accused Google of demoting rivals and unfairly promoting its own services in search results related to shopping. Google’s previous fine didn’t lead to any significant changes to its business, but the Android case could be very different. Google has been accused of bundling its search engine with Android, and the European Commission could force the company to make changes to this practice. Google could face its own Microsoft moment, with years of monitoring to ensure the company is compliant with any changes imposed by the EU. Microsoft had a similar fight with the EU more than 10 years ago. Microsoft was accused of bundling its Windows Media Player with Windows, and the EU forced it to unbundle the app so that competitors could get a fair chance to compete.
Microsoft created a special version of Windows for Europe without the app, but it was the EU’s next ruling that really hurt the company. Microsoft was also accused of bundling its Internet Explorer browser with Windows, and the EU forced the company to include a browser ballot with non-Microsoft browsers in an effort to improve competition. The EU’s changes helped push browser alternatives like Firefox and Chrome directly inside Windows, and rival browsers benefited. If the EU forces Google to make similar changes inside Android, those will be a much bigger headache than a record-breaking fine. Microsoft was paralyzed by the EU oversight, and the company had to consider its business decisions wisely as a result. We’ll find out next month if Google will face similar action. Source
  25. Google as a window into our private thoughts. What are the weirdest questions you've ever Googled? Mine might be (for my latest book): “How many people have ever lived?” “What do people think about just before death?” and “How many bits would it take to resurrect in a virtual reality everyone who ever lived?” (It's 10 to the power of 10^123.) Using Google's autocomplete and Keyword Planner tools, U.K.-based Internet company Digitaloft generated a list of what it considers 20 of the craziest searches, including “Am I pregnant?” “Are aliens real?” “Why do men have nipples?” “Is the world flat?” and “Can a man get pregnant?” This is all very entertaining, but according to economist Seth Stephens-Davidowitz, who worked at Google as a data scientist (he is now an op-ed writer for the New York Times), such searches may act as a “digital truth serum” for deeper and darker thoughts. As he explains in his book Everybody Lies (Dey Street Books, 2017), “In the pre-digital age, people hid their embarrassing thoughts from other people. In the digital age, they still hide them from other people, but not from the internet and in particular sites such as Google and PornHub, which protect their anonymity.” Employing big data research tools “allows us to finally see what people really want and really do, not what they say they want and say they do.” People may tell pollsters that they are not racist, for example, and polling data do indicate that bigoted attitudes have been in steady decline for decades on such issues as interracial marriage, women's rights and gay marriage, suggesting that conservatives today are more socially liberal than liberals were in the 1950s. Using the Google Trends tool to analyze the 2008 U.S. presidential election, however, Stephens-Davidowitz concluded that Barack Obama received fewer votes than expected in Democratic strongholds because of still-latent racism.
For example, he found that 20 percent of searches that included the N-word (hereafter, “n***”) also included the word “jokes” and that on Obama's first election night about one in 100 Google searches with “Obama” in them included “kkk” or “n***(s).” “In some states, there were more searches for ‘[n***] president’ than ‘first black president,’” he reports—and the highest number of such searches were not predominantly from Southern Republican bastions as one might predict but included upstate New York, western Pennsylvania, eastern Ohio, industrial Michigan and rural Illinois. This difference between public polls and private thoughts, Stephens-Davidowitz observes, helps to explain Obama's underperformance in regions with a lot of racist searches and partially illuminates the surprise election of Donald Trump. But before we conclude that the arc of the moral universe is slouching toward Gomorrah, a Google Trends search for “n*** jokes,” “bitch jokes” and “fag jokes” between 2004 and 2017, conducted by Harvard University psychologist Steven Pinker and reported in his 2018 book Enlightenment Now, shows downward-plummeting lines of frequency of searches. “The curves,” he writes, “suggest that Americans are not just more abashed about confessing to prejudice than they used to be; they privately don't find it as amusing.” More optimistically, these declines in prejudice may be an underestimate, given that when Google began keeping records of searches in 2004 most Googlers were urban and young, who are known to be less prejudiced and bigoted than rural and older people, who adopted the search technology years later (when the bigoted search lines were in steep decline). 
Stephens-Davidowitz confirms that such intolerant searches are clustered in regions with older and less educated populations and that compared with national searches, those from retirement neighborhoods are seven times as likely to include “n*** jokes” and 30 times as likely to contain “fag jokes.” Additionally, he found that someone who searches for “n***” is also likely to search for older-generation topics such as “Social Security” and “Frank Sinatra.” What these data show is that the moral arc may not be bending toward justice as smoothly upward as we would like. But as members of the Silent Generation (born 1925–1945) and Baby Boomers (born 1946–1964) are displaced by Gen Xers (born 1965–1980) and Millennials (born 1981–1996), and as populations continue shifting from rural to urban living, and as postsecondary education levels keep climbing, such prejudices should be on the wane. And the moral sphere will expand toward greater inclusiveness. < Here > [Note: If the content is deemed inappropriate here, please close the thread ASAP. Thank you.]