Techfullnews

Google Continues to Improve Search on Android with New Gestures and Floating Search Bar in QPR1


Google is continuing to rethink how search works on phones and tablets, introducing a more immersive design and a new gesture in Android 14 QPR1.

Hold handle to search

A new toggle for “Hold handle to search” has been spotted in the Settings app, which would allow users to invoke search by holding down on the navigation handle at the bottom of the screen.

It is unclear exactly how this new screen search capability will work, but it is possible that it could be a quick shortcut to sending an image/screenshot to Google Lens, or even a revival of the once-beloved Google Now on Tap feature.

Floating search bar

Android Authority has also spotted that the Pixel Launcher is once again tweaking the way search works. Rather than sitting at the top of the screen, the search bar will now appear as a floating button when you open the app drawer. As you type a query, the search bar hovers just above the keyboard, making it easier to reach when using the device one-handed.

This is just a beta release, so it is possible that these features could change or be removed before the final release of Android 14 QPR1. However, it is clear that Google is committed to improving the search experience on Android.

Additional information

The new gesture search feature is particularly interesting, as it could be a way for Google to bring back some of the functionality of Google Now on Tap, which was a popular feature that allowed users to get quick information about what was visible on their screen.

The floating search bar feature is also a nice touch, as it makes it easier to search for apps and files without having to leave the app drawer.

Overall, it is encouraging to see Google continue to invest in improving the search experience on Android. These new features in Android 14 QPR1 could make it even easier and more convenient to find what you’re looking for on your phone or tablet.


In a surprising shift, Google has decided to remove its AI assistant Gemini from the main Google app for iOS devices. This move is aimed at encouraging users to download the standalone Gemini app, positioning it as a direct competitor to other popular AI chatbots like ChatGPT, Claude, and Perplexity. While this strategy could help Google streamline its AI offerings, it also risks alienating users who are accustomed to accessing Gemini within the familiar Google app.

In this article, we’ll explore the implications of Google’s decision, how it impacts iOS users, and what this means for the future of AI assistants. Whether you’re a tech enthusiast or a casual user, this guide will break down everything you need to know about this significant change.


Why is Google Removing Gemini from the iOS Google App?

A Strategic Shift

Google’s decision to pull Gemini from the main Google app is part of a broader strategy to position the Gemini app as a standalone product. By doing so, Google aims to:

  1. Compete Directly with AI Rivals: Standalone apps like ChatGPT and Claude have gained significant traction, and Google wants Gemini to be seen as a direct alternative.
  2. Enhance User Experience: A dedicated app allows Google to roll out new features and updates more efficiently.
  3. Monetize AI Services: The standalone app provides a clear pathway for users to upgrade to Gemini Advanced, Google’s paid AI subscription service.

The Risk of Reduced Reach

While this move has its advantages, it also comes with risks. The main Google app is already installed on millions of iOS devices, and many users may not be motivated to download a separate app. This could lead to a drop in Gemini’s overall usage, at least in the short term.


What Does This Mean for iOS Users?

How to Access Gemini Now

If you’re an iOS user who relies on Gemini, here’s what you need to know:

  • Download the Gemini App: The standalone Gemini app is available on the App Store and offers all the features previously accessible through the Google app.
  • Upgrade to Gemini Advanced: For advanced features, users can subscribe to Google One AI Premium, which is available as an in-app purchase.

Features of the Gemini App

The standalone Gemini app offers a wide range of capabilities, including:

  • Voice Conversations: Use Gemini Live to engage in voice-based interactions with the AI assistant.
  • Integration with Google Services: Connect Gemini to apps like Search, YouTube, Maps, and Gmail for a seamless experience.
  • AI-Powered Tools: Create images, plan trips, get summaries, and explore topics with ease.
  • Multiple Interaction Modes: Interact with Gemini via text, voice, or even your device’s camera.

The Challenges Ahead

User Adoption

One of the biggest challenges Google faces is convincing users to download the standalone Gemini app. While tech-savvy users may make the switch, casual users might find the extra step inconvenient.

Competition in the AI Space

The AI chatbot market is highly competitive, with players like OpenAI’s ChatGPT and Anthropic’s Claude already dominating the space. Google will need to differentiate Gemini to attract and retain users.

Potential Drop in Usage

By removing Gemini from the main Google app, Google risks losing users who don’t transition to the standalone app. This could impact Gemini’s overall reach and adoption rates.


Google’s Reminder: Gemini Isn’t Perfect

In its email to users, Google emphasized that Gemini can still make mistakes. The company advises users to double-check the AI’s responses, especially for critical tasks. This transparency is crucial for building trust, especially as AI becomes more integrated into daily life.


Expert Insights: What Analysts Are Saying

We reached out to Michael Chen, a tech analyst at FutureTech Insights, for his perspective on Google’s move.

“Google’s decision to pull Gemini from the main iOS app is a bold but calculated risk. While it may streamline their AI offerings and improve the user experience in the long run, the short-term challenge will be convincing users to adopt the standalone app. If Google can effectively communicate the value of Gemini, this move could pay off significantly.”


How to Make the Most of the Gemini App

Tips for iOS Users

  1. Explore All Features: Take advantage of Gemini’s voice, text, and camera-based interactions to get the most out of the app.
  2. Integrate Google Services: Connect Gemini to your Google apps for a more personalized experience.
  3. Consider Gemini Advanced: If you need advanced features, explore the Google One AI Premium subscription.

A New Chapter for Gemini

Google’s decision to remove Gemini from the main iOS Google app marks a significant shift in its AI strategy. While the move is aimed at strengthening Gemini’s position in the competitive AI landscape, it also comes with challenges, particularly around user adoption and reach.

For iOS users, the standalone Gemini app offers a wealth of features and capabilities, but the transition may require some adjustment. As Google continues to refine its AI offerings, one thing is clear: the future of AI assistants is evolving, and Gemini is poised to play a key role.


Key Takeaways

  • Google is removing Gemini from the main Google app for iOS to encourage users to download the standalone Gemini app.
  • The standalone app offers features like Gemini Live, integration with Google services, and advanced AI tools.
  • Users can upgrade to Gemini Advanced through the Google One AI Premium subscription.
  • The move is a strategic effort to compete with AI rivals like ChatGPT and Claude, but it risks reducing Gemini’s reach.
  • Google reminds users that Gemini can still make mistakes and advises double-checking its responses.

Apple continues to push the boundaries of smartphone innovation, and its latest move is no exception. The tech giant has confirmed that Visual Intelligence, a feature initially introduced with the iPhone 16 lineup, will soon be available on the iPhone 15 Pro via a future software update. This announcement has sparked excitement among Apple enthusiasts, as it brings advanced camera-based capabilities to older devices.

In this article, we’ll explore what Visual Intelligence is, how it works, and what this update means for iPhone 15 Pro users. Whether you’re a tech-savvy Apple fan or simply curious about the latest advancements in smartphone technology, this guide will provide all the details you need.


What is Visual Intelligence?

A Google Lens-Like Tool for iPhone

Visual Intelligence is Apple’s answer to Google Lens, a powerful tool that allows users to point their phone’s camera at objects, text, or scenes to gain more information. For example, you can use it to:

  • Identify plants, animals, or landmarks.
  • Translate text in real time.
  • Scan QR codes or barcodes.
  • Get details about products, artwork, or books.

Originally launched with the iPhone 16 lineup, Visual Intelligence was designed to be activated via the Camera Control button, a new hardware feature introduced with the latest models. However, Apple has now confirmed that the feature will also be available on the iPhone 15 Pro, despite the absence of the Camera Control button.


How Will Visual Intelligence Work on the iPhone 15 Pro?

Leveraging the Action Button

The iPhone 15 Pro doesn’t have the Camera Control button, but it does feature the Action Button, a versatile tool that replaces the traditional mute switch. Apple has confirmed that Visual Intelligence will be accessible through the Action Button, making it easy for users to activate the feature with a single press.

Control Center Integration

In addition to the Action Button, Visual Intelligence will also be available through the Control Center, providing users with multiple ways to access this powerful tool. This flexibility ensures that the feature is both intuitive and convenient to use.


When Will Visual Intelligence Arrive on the iPhone 15 Pro?

Expected Release Timeline

While Apple has confirmed that Visual Intelligence is coming to the iPhone 15 Pro, the company has not specified an exact release date. However, industry experts like John Gruber of Daring Fireball speculate that the feature could debut with iOS 18.4.

  • Developer Beta: The update could roll out to developers in the coming weeks.
  • Public Release: iOS 18.4 is expected to be available to all users by early April 2025.

Why This Update Matters

Extending the Lifespan of Older Devices

By bringing Visual Intelligence to the iPhone 15 Pro, Apple is demonstrating its commitment to supporting older devices with new features. This move not only enhances the user experience but also adds value to existing products, encouraging users to hold onto their devices for longer.

Bridging the Gap Between Generations

The iPhone 15 Pro and iPhone 16 series now share a key feature, bridging the gap between the two generations. This ensures that users of older models don’t feel left behind, even as Apple continues to innovate.


How Visual Intelligence Compares to Google Lens

Apple’s Unique Approach

While Visual Intelligence is similar to Google Lens, Apple’s implementation is deeply integrated into the iOS ecosystem. This means seamless compatibility with other Apple services and apps, such as Safari, Notes, and Photos.

Enhanced Privacy

Apple has a strong reputation for prioritizing user privacy, and Visual Intelligence is no exception. The feature is designed to process data on-device whenever possible, minimizing the need to send information to external servers.


What Users Can Expect

A Smarter Camera Experience

With Visual Intelligence, the iPhone 15 Pro’s camera becomes more than just a tool for capturing photos—it becomes a gateway to information. Whether you’re traveling, shopping, or simply exploring the world around you, this feature adds a new layer of functionality to your device.

Improved Accessibility

Visual Intelligence also has the potential to improve accessibility for users with visual impairments. By providing real-time information about the environment, it can help users navigate their surroundings more effectively.


Expert Insights: What Tech Analysts Are Saying

We reached out to Jessica Patel, a tech analyst at TechInsights, for her take on Apple’s latest move.

“Bringing Visual Intelligence to the iPhone 15 Pro is a smart strategy by Apple. It not only enhances the value of older devices but also reinforces the company’s commitment to innovation and user satisfaction. This feature could be a game-changer for how people interact with their smartphones.”


How to Prepare for the Update

Ensure Your Device is Ready

To make the most of Visual Intelligence, iPhone 15 Pro users should:

  1. Update to the Latest iOS Version: Keep your device updated to ensure compatibility with new features.
  2. Familiarize Yourself with the Action Button: Learn how to customize and use the Action Button for quick access to Visual Intelligence.
  3. Explore the Control Center: Get comfortable with accessing features through the Control Center for added convenience.

A Smarter Future for iPhone Users

Apple’s decision to bring Visual Intelligence to the iPhone 15 Pro is a testament to the company’s dedication to innovation and user experience. By extending this powerful feature to older devices, Apple is ensuring that more users can benefit from the latest advancements in smartphone technology.

As we await the official release of iOS 18.4, one thing is clear: the iPhone 15 Pro is about to become even more intelligent, versatile, and indispensable.
