The Google Pixel’s Assistant Squeeze Was A Button Without A Button

The Pixel 2 is over five years old, but it had a feature I miss more as time goes on. It was called Active Edge, and it let you summon Google Assistant by giving your phone a gentle squeeze. It was an odd idea in several respects, but it delivered something that current phones sorely lack: a way to physically interact with the phone to get one simple thing done.

Look at the sides of a Pixel 2 or 2 XL and you won’t see anything to suggest you’re holding something unusual; apart from the power button and volume rocker, the sides are bare. But squeeze the phone’s exposed edges firmly enough and Google Assistant pops up at the bottom of the screen, ready to listen. There’s no need to wake the phone, tap the screen, or long-press any physical or virtual button. You squeeze, and you start talking.

We’ll get to how useful this is in a moment, but I don’t want to undersell how neat it feels. Phones are rigid slabs of metal and plastic, yet the Pixel can tell when I’m gripping it harder than I would by simply holding it. According to an old iFixit teardown, this is made possible by a few strain gauges mounted inside the phone, which sense the minuscule flex in the casing when it’s squeezed. For the record, thanks to the limits of the human nervous system, I can’t feel the phone bend at all.
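Google has never published how Active Edge’s detection actually works, but the basic signal processing is easy to picture. Below is a minimal Kotlin sketch of the idea; the SqueezeDetector class, its thresholds, and the sample values are all invented for illustration, not Google’s real pipeline. The two tricks that matter are hysteresis (separate press and release thresholds) and a minimum hold time, so that ordinary handling of the phone doesn’t set it off.

```kotlin
// Hypothetical sketch of squeeze detection from strain-gauge samples.
// The class, thresholds, and units are invented for illustration;
// Google's actual Active Edge pipeline is not public.
class SqueezeDetector(
    private val pressThreshold: Float = 4.0f,   // reading that counts as a squeeze
    private val releaseThreshold: Float = 2.0f, // must drop below this to re-arm
    private val minHoldMs: Long = 150L,         // ignore brief bumps and jolts
    private val onSqueeze: () -> Unit,
) {
    private var pressedSince: Long? = null // when the current press started
    private var fired = false              // fire once per squeeze

    // Feed one sample: a timestamp and the combined strain-gauge reading.
    fun onSample(timeMs: Long, pressure: Float) {
        if (pressure >= pressThreshold) {
            val since = pressedSince ?: timeMs.also { pressedSince = it }
            if (!fired && timeMs - since >= minHoldMs) {
                fired = true
                onSqueeze() // held long enough: trigger exactly once
            }
        } else if (pressure <= releaseThreshold) {
            pressedSince = null // fully released: arm for the next squeeze
            fired = false
        }
        // Readings between the two thresholds change nothing: that
        // hysteresis band keeps a jittery grip from re-triggering.
    }
}

fun main() {
    val detector = SqueezeDetector(onSqueeze = { println("Squeeze! Launching the assistant.") })
    // Simulated samples: idle, a firm squeeze held about 200 ms, then release.
    val samples = listOf(0L to 0.5f, 50L to 4.5f, 150L to 4.8f, 250L to 4.6f, 350L to 0.3f)
    for ((t, p) in samples) detector.onSample(t, p)
}
```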

As this Reddit thread demonstrates, whether you found Active Edge useful probably depended on how much you like using Google Assistant. Speaking for myself, the only time I regularly used a voice assistant was when I had the Pixel 2, simply because it was always within reach. The squeeze worked essentially every time, which made it extremely convenient: Active Edge kept working even when an app hid the navigation buttons or when the screen was completely off.

That made it genuinely handy for digging up trivia or running quick calculations and conversions, but Active Edge could have been so much more useful if you’d been allowed to remap it. I liked having the Assistant a squeeze away, but if I could have toggled the flashlight instead, I’d have had instant access to one of my phone’s most crucial functions, as the sketch below imagines.
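For what it’s worth, the remapping itself would have been the easy part; once the gesture is detected, dispatching it to a user-chosen action is just a lookup. Here’s a hypothetical Kotlin sketch of the kind of setting the Pixel 2 never offered (on the real phone, the squeeze was hardwired to Assistant):

```kotlin
// Hypothetical remapping layer. Nothing like this was exposed on the
// Pixel 2, where the squeeze was hardwired to Google Assistant.
enum class SqueezeAction { ASSISTANT, FLASHLIGHT, CAMERA }

class ActionDispatcher(private var mapping: SqueezeAction = SqueezeAction.ASSISTANT) {
    // Let the user rebind the gesture to a different action.
    fun remap(action: SqueezeAction) { mapping = action }

    // Called by the gesture detector whenever a squeeze is recognized.
    fun onSqueeze() = when (mapping) {
        SqueezeAction.ASSISTANT  -> println("Launching assistant")
        SqueezeAction.FLASHLIGHT -> println("Toggling flashlight")
        SqueezeAction.CAMERA     -> println("Opening camera")
    }
}

fun main() {
    val dispatcher = ActionDispatcher()
    dispatcher.remap(SqueezeAction.FLASHLIGHT) // the remap I always wanted
    dispatcher.onSqueeze()                     // prints "Toggling flashlight"
}
```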

A feature like that actually existed. HTC’s U11, released a few months before the Pixel 2, had Edge Sense, a similar but more flexible squeeze gesture. It’s no coincidence the idea showed up on Google’s phones: the two companies collaborated on the Pixel and Pixel 2, and Google acquired a chunk of HTC’s mobile division that same year.

Active Edge wasn’t the industry’s first attempt at controlling a phone through something other than the touchscreen or physical buttons, either. A few years before the Pixel 2, Motorola was shipping phones that let you open the camera by twisting the phone and turn on the flashlight with a karate-chop motion, gestures in the same spirit as shaking a 2008 iPod Nano to shuffle songs. The camera shortcut dates from the relatively brief period when Google owned Motorola.

But as time went on, phone makers drifted further and further from letting you reach a few key functions with a physical gesture. Take the iPhone 12 Mini I use every day. Since Apple eliminated the home button, activating Siri means pressing and holding the power button for a few seconds, which is a nuisance. I use the flashlight frequently, yet turning it on requires waking the screen and long-pressing the button in the bottom-left corner of the lock screen. The camera is a little more practical, since a leftward swipe on the lock screen opens it, but the screen still has to be on. And if I’m actually using the phone, the quickest route to the flashlight or camera is Control Center, which means swiping down from the top-right corner and trying to pick one particular icon out of a grid.

None of this is hard, exactly, but a dedicated button or squeeze gesture would be so much easier. If I glance up from my phone and catch my cat doing something amusing, he may well have stopped by the time I actually have the camera ready. Apple has even tacitly acknowledged this: its battery case for the iPhone included a dedicated button just to launch the camera. Over the lifespan of a phone, a few seconds saved here and there add up.

To illustrate the point, let’s compare how quickly the camera launches on my iPhone with how quickly it launches on the Samsung Galaxy S22, where it takes just a double-press of the power button:

Both phones handle the screen recording and camera preview admirably, but the S22 has its camera app open before I’ve even managed to tap the iPhone’s camera icon.

Unfortunately, tactile shortcuts are disappearing even from Google’s own phones. Active Edge was discontinued with the Pixel 4A and Pixel 5 in 2020. Samsung, likewise, removed the dedicated button it once included for summoning a virtual assistant (which, tragically, happened to be Bixby).

There have been attempts at virtual buttons you trigger by physically interacting with the device. Apple’s Back Tap accessibility feature lets you tap the back of the phone to launch actions, or even your own mini-programs in the form of Shortcuts, and Google recently added a similar Quick Tap feature to Pixels. But frankly, I just don’t find them reliable enough. A virtual button that works only sporadically is not a good button. Active Edge, by contrast, worked almost every time, even with my phone wrapped in a sturdy OtterBox.
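Part of the problem is what these features have to work with: a deliberate tap has to be inferred from accelerometer spikes, and a table bump or a pocket jostle can look much the same. Here’s a rough Kotlin sketch of the heuristic, with invented thresholds; Apple’s and Google’s real implementations are not public, and this is only meant to show why the signal is so easy to miss or misread.

```kotlin
// Rough sketch of Back Tap / Quick Tap-style detection: two short
// acceleration spikes close together. Thresholds are invented; the
// real implementations are not public.
class DoubleTapDetector(
    private val spikeThreshold: Float = 12.0f, // magnitude that counts as a "tap"
    private val minGapMs: Long = 80L,          // closer than this is one tap ringing
    private val maxGapMs: Long = 400L,         // farther than this is two separate taps
    private val onDoubleTap: () -> Unit,
) {
    private var lastSpikeMs: Long? = null

    // Feed one sample: a timestamp and the acceleration magnitude.
    fun onSample(timeMs: Long, magnitude: Float) {
        if (magnitude < spikeThreshold) return
        val last = lastSpikeMs
        when {
            last == null || timeMs - last > maxGapMs -> lastSpikeMs = timeMs // first spike
            timeMs - last >= minGapMs -> {
                lastSpikeMs = null
                onDoubleTap() // second spike landed in the window: fire
            }
            // else: still within the first tap's vibration; ignore
        }
    }
}

fun main() {
    val detector = DoubleTapDetector(onDoubleTap = { println("Double tap detected") })
    // Two sharp spikes 200 ms apart, surrounded by ordinary motion.
    val samples = listOf(0L to 2f, 100L to 14f, 300L to 13f, 900L to 1f)
    for ((t, m) in samples) detector.onSample(t, m)
}
```

Every threshold in a sketch like this is a compromise: set them loose and the phone fires in your pocket, set them tight and your deliberate taps get ignored, which is exactly the unreliability I keep running into.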

Physical shortcuts haven’t vanished from phones entirely. Plenty of Android phones let you launch the camera or other apps by double-pressing the power button, and as I hinted earlier, Apple lets you trigger things like Apple Pay and Siri with a series of presses of the power button.

But I’d contend that one or two shortcuts bound to a single button can’t cover everything we ought to have quick access to. To be clear, I’m not asking for a phone plastered with buttons, but I do believe the big manufacturers should take a hint from older phones (and, yes, from smaller phone makers; I’m looking at you, Sony fans) and bring back at least a few tactile shortcuts. As Google demonstrated, that doesn’t necessarily mean adding another physical key that has to be waterproofed. Something as simple as a squeeze can give users fast access to the functions that they (or, in the Pixel’s case, Google) deem important.