Google’s latest flagship smartphone raises concerns about user privacy and security. It frequently transmits private user data to the tech giant before any app is installed. Moreover, the Cybernews research team has discovered that it may have remote management capabilities that operate without user awareness or approval.

Cybernews researchers analyzed the new Pixel 9 Pro XL smartphone’s web traffic, focusing on what a new smartphone sends to Google.

“Every 15 minutes, Google Pixel 9 Pro XL sends a data packet to Google. The device shares location, email address, phone number, network status, and other telemetry. Even more concerning, the phone periodically attempts to download and run new code, potentially opening up security risks,” said Aras Nazarovas, a security researcher at Cybernews…

… “The amount of data transmitted and the potential for remote management casts doubt on who truly owns the device. Users may have paid for it, but the deep integration of surveillance systems in the ecosystem may leave users vulnerable to privacy violations,” Nazarovas said…
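
For readers curious how this kind of traffic inspection is typically done, here is a minimal sketch using a mitmproxy addon. It assumes the phone is configured to route its traffic through a machine running mitmproxy and trusts the mitmproxy CA certificate; the domain list and output format are illustrative assumptions, not the Cybernews methodology, and certificate pinning will hide some of the traffic.

```python
# Minimal mitmproxy addon that logs requests a phone sends to Google-owned hosts.
# Run with: mitmdump -s log_google_traffic.py
# The phone must use this machine as its proxy and trust the mitmproxy CA cert.
from mitmproxy import http

# Illustrative, non-exhaustive list of Google-owned domains.
GOOGLE_SUFFIXES = (".google.com", ".googleapis.com", ".gstatic.com", ".app-measurement.com")

def request(flow: http.HTTPFlow) -> None:
    host = flow.request.pretty_host
    if host.endswith(GOOGLE_SUFFIXES):
        body = flow.request.raw_content or b""
        print(f"{flow.request.method} {flow.request.pretty_url} ({len(body)} bytes in body)")
```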

    • Buddahriffic@lemmy.world · 2 months ago

      I was just wondering earlier today if Google kept the bootloader open to allow custom OS installation only because they had other hardware on the phone that would send them their information anyways, possibly through covert side channels.

      For example, they could add listeners for cell signals that pick up data encoded in the low bits of timestamps attached to packets, which would be very difficult to detect (I’m having trouble thinking of a way to determine whether that’s happening even if you knew to look for it).
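
      To make that concrete, here is a toy sketch of a timestamp covert channel. It is purely illustrative and not evidence that anything like it exists on a Pixel; the bit width and timestamp source are arbitrary choices.

      ```python
      import time

      # Toy covert channel: smuggle 2 bits of payload per "packet" in the low bits
      # of a microsecond timestamp. Detecting this means telling it apart from
      # ordinary timing jitter, which is what makes it hard to spot.

      def embed_bits(payload_bits: int) -> int:
          ts = time.time_ns() // 1_000            # microsecond timestamp
          return (ts & ~0b11) | (payload_bits & 0b11)

      def extract_bits(stamped_ts: int) -> int:
          return stamped_ts & 0b11

      stamped = embed_bits(0b10)
      print(stamped, extract_bits(stamped))       # the low 2 bits carry the payload
      ```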

      Or maybe there’s a sleeper code that can be sent to “wake up” the phone’s secret circuitry and send bulk data when Google decides they want something specific (since encoding in timestamps would be pretty low bandwidth), which would make detection by traffic analysis more difficult, since most of the time it isn’t sending anything at all.

      This is just speculation, but I’ve noticed a pattern: I speculate that something is technically possible, assume there’s no way they’d actually be doing it, and later find out that I was actually underestimating what they were doing.

      • Andromxda 🇺🇦🇵🇸🇹🇼@lemmy.dbzer0.com · 2 months ago

        I don’t mean to discredit your opinion, but it is pure speculation and falls into the category of conspiracy theories. There are plenty of compelling arguments for why this is likely completely wrong:

        • Google Pixels have less than 1% of the global smartphone market share; in fact, they are currently only sold in 32 countries around the world (my bad, I originally wrote 12, I had an outdated number in mind). Do you really think that Google would spend all that money on research, custom manufacturing, software development and maintenance to extract this tiny bit of data from a relatively small number of users? I’d say more than 90% of Pixel owners use the stock OS anyway, so it really doesn’t matter. And Google has access to all the user data on around 70% of all the smartphones in the world through their rootkits (Google Play services and framework, which are installed as system apps and granted special privileges), which lets them collect far more data than they ever could from Pixel users.
        • Keeping this a secret would also be immensely difficult and require even more resources, making this even less profitable. Employees leave the company all the time, after which they might just leak the story to the press, or the company could get hacked and have internal records published on the internet. Since this would also require hardware modifications, it’s also likely that it would get discovered when the device is taken apart and analyzed. PCB schematics also get leaked all the time, including for popular devices like several generations of iPhones and MacBooks.
        • Lastly, the reputational damage would be insane if this ever got leaked to the public. No one would ever buy any Google devices again if it were proven that they actually contain hardware backdoors used to exfiltrate data.

        • Buddahriffic@lemmy.world · 2 months ago

          You’re right that it’s pure speculation based only on technical possibilities, and I hope you’re right that it can be dismissed.

          But given the way microchip design and manufacturing work (it wouldn’t be at the PCB level, it would be hidden inside the SoC), I think it’s possible for a small number of people to make this happen, maybe even a single technical actor on the right team. Chips are typically designed with a lot of diagnostic circuitry that could be used to access arbitrary data on the chip, where the only secret part is, say, a bridge from the cell signal to that diagnostic bus. The rest would be designed and validated by teams thinking it’s perfectly normal (and it is, other than leaving an open pathway to it).

          Then if you have access to arbitrary registers or memory on the chip, you can use that to write arbitrary firmware for one of the many microprocessors on the SoC (not the main CPU cores, where someone might notice that a core has woken up and is running code that came from nowhere), and then write to its program counter to make it run that code, which can then do whatever that processor is capable of.
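
          As a purely conceptual toy model of that idea (nothing here corresponds to real Pixel or SoC hardware; the debug window, addresses, and payload are made up for illustration):

          ```python
          # Toy model of "debug access -> inject code -> redirect the program counter".
          # Everything here is invented for illustration; it is not real SoC behavior.

          class ToyCoprocessor:
              def __init__(self) -> None:
                  self.memory = bytearray(256)   # pretend firmware RAM
                  self.pc = 0                    # program counter

              def debug_write(self, addr: int, data: bytes) -> None:
                  """Diagnostic-bus style window: write arbitrary bytes into memory."""
                  self.memory[addr:addr + len(data)] = data

              def debug_set_pc(self, addr: int) -> None:
                  """Point the program counter at the injected code."""
                  self.pc = addr

          mcu = ToyCoprocessor()
          payload = b"\x90\x90\xc3"              # stand-in for injected firmware
          mcu.debug_write(0x40, payload)         # step 1: write code via debug access
          mcu.debug_set_pc(0x40)                 # step 2: make the core execute it
          print(hex(mcu.pc), bytes(mcu.memory[0x40:0x43]))
          ```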

          I don’t think it would be feasible for mass surveillance, because that would take infrastructure that would require a team that understands what’s going on to build, run, and maintain.

          But it could be used for smaller scale surveillance, like targeted at specific individuals.

          But yeah, this is just speculation based on what’s technically possible, and the only reason I’m giving it serious thought is because I once thought it was technically possible for apps to listen in on your mic, feed it into a speech-to-text algorithm, and send it back home hidden among other normal packets, but assumed they probably weren’t doing it. Then I kept hearing stories about uncanny ads popping up about conversations held in the presence of a phone, and more recently it came out that FB was doing exactly that. So I wouldn’t put it past them to actually do something like this.

          • Andromxda 🇺🇦🇵🇸🇹🇼@lemmy.dbzer0.com · 2 months ago

            But it could be used for smaller scale surveillance, like targeted at specific individuals

            Why would this only be present in Pixels then? Google isn’t interested in specific people. Intelligence agencies are. This would mean that every phone in the world needs to be compromised using this sophisticated, stealthy technology, which is even more unlikely.

            • Buddahriffic@lemmy.world · 2 months ago

              If it is present there, it doesn’t imply it’s only present there.

              And we really have no idea how close a relationship Google, or any other corporation for that matter, has with various intelligence agencies. The same goes for infiltration of those companies by intelligence agencies.

              And no, it doesn’t mean that every phone in the world is compromised with this, which wouldn’t be that sophisticated, just stealthy. The sophisticated part would already be part of the normal design process: it’s called DFT, or design for test, if you want to read about it, and it’s used legitimately to determine which parts of a chip have manufacturing flaws, for chip binning.
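
              For anyone unfamiliar with DFT, a toy scan-chain model shows the legitimate idea: internal flip-flop state is stitched into a shift register so a tester can read it out serially through a couple of test pins. This is a textbook simplification, not any vendor’s implementation.

              ```python
              # Toy scan chain: in DFT, internal flip-flops are chained into a shift
              # register so a tester can shift state out through a scan-out pin.
              # Simplified textbook model; real scan insertion is done by EDA tools.

              class ScanChain:
                  def __init__(self, flops: list[int]) -> None:
                      self.flops = flops[:]        # internal 1-bit registers

                  def shift_out(self) -> list[int]:
                      """Clock the chain once per flop, capturing the serial output."""
                      out = []
                      for _ in range(len(self.flops)):
                          out.append(self.flops[-1])          # bit appears on the scan-out pin
                          self.flops = [0] + self.flops[:-1]  # shift everything toward the output
                      return out

              chain = ScanChain([1, 0, 1, 1])      # pretend internal state
              print(chain.shift_out())             # tester reads the state serially: [1, 1, 0, 1]
              ```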

              Most phones don’t have an unlockable bootloader, and this post is about the data Google is pulling from factory Pixels.

              Why would they do all the work on the software side and then themselves offer a device that allows you to remove their software entirely? And if it’s worth it just for the “make more money from people who only want unlocked phones” angle, why isn’t it more common?

              Mind you, my next phone might still be a Pixel. Even if this stuff is actually there, I wouldn’t expect to be targeted. I can’t help but wonder about it, though: just how deep does the surveillance, or the surveillance potential, go?

              • sleepyplacebo@rblind.com · 2 hours ago

                The Pixel is a good phone for testing the latest Android features for development purposes. I would imagine that to some degree they are trying to target developers interested in testing software by offering the ability to unlock and relock the bootloader. This fosters a vibrant developer community and encourages innovation. Certain things can be tested in an Android emulator, but it helps to have a real device to test on as well.

                Pixels often ship with hardware features that other phones later include. For example, the Pixel 8 was the first phone with hardware memory tagging extensions, and if developers wanted to test that feature they would buy a Pixel first and then apply that experience to the devices their own company is manufacturing. Pixels are also often released with new Android versions that implement Android features and APIs the way they were intended to work; there have been cases of OEMs releasing devices with broken implementations of standard Android features.

                The Pixel was the first phone with StrongBox as well. Additionally, it was the first Android phone with satellite connectivity.

                It also attracts the segment of the market that just enjoys modifying their phones. So basically they are targeting the power-user community and developers. Despite the Pixel supporting custom verified boot keys and custom OSes, Google knows that very few users use those features, so it doesn’t cut into their Play Store and Play Services market share very much.

    • Southern Boy@lemmy.ml · 2 months ago

      What is the advantage over Calyx/Lineage/iode OS on compatible devices? I just don’t want Google to have any of my money at all. Buying a privacy solution from them recoups their loss.

      • Tazerface@sh.itjust.works · 2 months ago

        I don’t know about Calyx or iodé, but Lineage doesn’t allow for a locked bootloader. That is a massive security hole, and without security, sooner or later your privacy will be violated.

        Currently, newer Pixels running GrapheneOS are the only phones that Cellebrite can’t breach. Cellebrite machines are cheap enough that the border guards and your local cops probably have one. In my country, the law allows a cop to examine a phone during a traffic stop.

        • sleepyplacebo@rblind.com · 9 hours ago

          Schools even have Cellebrite devices now; that is how widespread they have become. GrapheneOS has a duress password to wipe the phone, and you can block all data, or even power, to the USB port while the phone is running. If you block all power to the USB port while the phone is on, the only way to charge it is to fully turn it off, which puts your encrypted data at rest. If you don’t want to turn off the whole port, you can just disable data in the USB port options menu in GrapheneOS.

          You probably already know this stuff; I was just mentioning it for people reading this comment section. :)

      • RubberElectrons@lemmy.world · 2 months ago

        I like Calyx, and I might try Graphene some day. But I absolutely won’t run Google’s Play services à la Graphene. It’s sandboxed, supposedly, but why run it at all?

        Calyx uses microG, a much smaller, fully open source emulator of Google’s services.

        • Andromxda 🇺🇦🇵🇸🇹🇼@lemmy.dbzer0.com · 2 months ago

          but why run it at all?

          Because it is unfortunately required by some apps. microG is not a viable alternative, as it requires root access on the device, which drastically reduces security. It also has worse compatibility than Sandboxed Play services and doesn’t offer much of a benefit: it still downloads and executes proprietary Google blobs in the background in order to function, and apps that require Google services also bundle a proprietary Google library, making microG essentially useless. It’s an open source layer that sits between a proprietary library and a proprietary network service, using proprietary binaries and requiring root access. You gain absolutely nothing from using it, and it significantly increases the attack surface of your device.

          fully open source emulator

          This is simply false. As I explained, only a tiny bit of what microG requires to function is open source.

          You’re far better off using Sandboxed Play services on GrapheneOS.