Search results for: “Google chat”

  • Google says iOS 26 borrows key Android features

    Apple recently announced iOS 26, and Google has pointed out that some of its new features are very similar to ones Android has had for a while. Here are the three main features that Google says Apple has borrowed:

    1. Improved Call Recording

    Apple is adding call recording to iPhones with iOS 26. This feature lets users record phone calls and even get a summary of the conversation. Google notes that Android phones have offered call recording for years, and some models even provide automatic summaries.

    2. Smarter Messaging with RCS

    iOS 26 will support RCS (Rich Communication Services) in the Messages app. This means iPhone users can now send high-quality photos and videos, see typing indicators, and enjoy better group chats with Android users. Google has supported RCS on Android for a long time, making texting between different phones easier and more modern.

    3. Customizable Home Screen

    Apple is allowing users to place app icons and widgets anywhere on the home screen, not just in a fixed grid. Android users have enjoyed this flexibility for many years, letting them organize their home screens however they like.

    Google commented on social media, welcoming Apple to these features and playfully reminding everyone that Android has had them first. While Apple fans are excited about these changes, it’s clear that iOS 26 is catching up to some things Android users already know and love.

  • ChatGPT now works as your Safari search engine with a simple add-on

    OpenAI recently updated the ChatGPT app, bringing a handy new feature: a Safari Extension. This add-on lets you use ChatGPT as the main search tool for anything you type into the Safari search bar. After you get the latest ChatGPT app update, you can turn on the ChatGPT Search Extension. Just head to the Safari settings in your phone’s Settings app and switch it on. Once activated, every question or topic you enter in the Safari search bar will go straight to ChatGPT Search instead of your usual search engine, like Google.

    When you enable this extension, all your Safari searches will flow through ChatGPT Search, making it your go-to search tool in the browser. The same rules that apply to using ChatGPT still count here—no changes there. To make it work, the extension will ask for permission to connect with Google.com or whatever search engine you normally use. Once you allow it, any search you type will skip your regular engine and head to ChatGPT’s search system instead.

    While there’s no direct way to pick ChatGPT as your favorite search engine in Safari’s main options, this extension gives you a smart way to get around that. It’s an easy fix for anyone who wants ChatGPT to handle their searches. This update keeps things simple and smooth, letting you explore the web with ChatGPT’s help right from the Safari bar. Whether you’re looking up quick facts or digging into something bigger, this add-on makes it happen without extra steps.

  • Apple plans to add Google Gemini to Apple Intelligence

    Right now, Apple Intelligence lets Siri pass some questions to ChatGPT for smarter, more detailed answers than Siri can give on its own. During WWDC24, Apple’s software leader, Craig Federighi, hinted in a chat that they’re open to teaming up with other AI systems, like Google Gemini. A fresh leak suggests this teamwork might happen soon.

    A recent update tied to the iOS 18.4 beta shows “Google” and “OpenAI” listed as outside options for Apple Intelligence. This clue comes from code explorer Aaron Perris, who shared it on X. This doesn’t promise Gemini will pop up in iOS 18.4—especially since Apple Intelligence has faced some slowdowns already—but it strongly hints it’s coming eventually. It could land in a later iOS 18 tweak or roll out with iOS 19. Word is, Apple’s also cooking up its own chatty Siri upgrade for iOS 19.

    Google just dropped some shiny new Gemini 2.0 models, including one built for better reasoning. These might soon show up on iPhones, at least if you’ve got an iPhone 15 Pro, iPhone 16 or 16 Pro, or the upcoming iPhone 16e. In short, Apple’s gearing up to mix Google’s brainpower into its tech, giving users more ways to get sharp answers straight from their phones. Stay tuned—big things could be on the way!

  • Five cool features coming to Apple’s new HomePad

    Apple just revealed a new smart home device called HomePad, and it’s packed with exciting features designed to make your home smarter and more connected. Here are five key features you should know about:

    1. Easy setup with iPhone
      Setting up HomePad is super simple. Just bring your iPhone close to it, and it automatically syncs your Apple ID, Wi-Fi settings, and more—just like setting up AirPods or a HomePod.

    2. Works with Matter
      HomePad fully supports Matter, the new smart home standard that works with devices from different brands. This means you can control smart devices from Apple, Google, Amazon, and others—all from the same app.

    3. Multi-display support
      HomePad supports multiple displays at once. So, you can control smart lights from the kitchen while watching a camera feed in the living room. It makes multitasking across rooms easy and smooth.

    4. Hand off FaceTime and calls
      You can start a FaceTime call on your iPhone and then transfer it to the HomePad instantly. It even works with other video calling apps, making your video chats more flexible.

    5. Smart home automation with Siri
      Siri on HomePad helps automate tasks like locking doors at night or turning off lights when you leave home. You can also set up routines that adjust based on time or who’s in the house.

    Apple hasn’t given a release date yet, but these features show they’re aiming to make HomePad a powerful hub for your smart home.

  • iOS 19 beta set to launch with cool new features

    Apple is gearing up to unveil iOS 19, its next major iPhone update, with a beta release expected in June 2025, shortly after the Worldwide Developers Conference (WWDC) kicks off on June 9. The official version will likely drop in September 2025, alongside new iPhones, though some features may trickle out later, possibly into 2026.

    iOS 19 will sport a bold new style inspired by the Vision Pro’s visionOS. Picture a glossy, transparent interface with smoother, curvier app icons and a floating navigation bar in apps. This makeover, the most significant since iOS 7, will also refresh iPadOS 19 and macOS 16, creating a seamless look across Apple’s ecosystem.

    Siri’s getting a major boost in iOS 19, powered by enhanced Apple Intelligence. It’ll dive deeper into your emails, photos, and apps, making tasks feel more intuitive. Some of Siri’s advanced tricks might not show up until iOS 19.4 in spring 2026. There’s also buzz about Google Gemini joining ChatGPT as an optional Siri assistant.

    Expect other perks like upgraded Stage Manager for USB-C iPhones, secure RCS texting, real-time translations via AirPods, and a smarter Health app with AI-powered wellness tips. iOS 19 should support iPhone 11 and later models. Post-WWDC, developers will dive into the beta, with a public beta opening up in the summer for eager testers.

  • Apple joins Anthropic to craft AI-powered coding assistant

    Apple is collaborating with Anthropic to develop a new AI-driven coding tool named “CodeFlow,” as reported by Bloomberg. This innovative software is designed to assist developers by generating, refining, and testing code seamlessly. Integrated into an enhanced version of Apple’s Xcode platform, CodeFlow leverages Anthropic’s Claude Sonnet model, renowned for its exceptional coding capabilities.

    Currently, Apple intends to use CodeFlow internally to streamline its development process. There’s no confirmation yet on whether it will be released to the public. The tool features a conversational interface, enabling programmers to request code samples or troubleshoot errors effortlessly.

    It also supports testing app interfaces, speeding up the creation process significantly. Apple is actively partnering with multiple AI firms to advance its tech offerings. For instance, OpenAI’s ChatGPT enhances some of Apple’s AI functions, and there’s talk of integrating Google’s Gemini later.

    Anthropic’s Claude is a favorite among coders, widely used on platforms like Cursor and Windsurf for its reliability in programming tasks. This collaboration underscores Apple’s commitment to leading in AI innovation, as competitors increasingly adopt similar tools to boost efficiency.

    Through this partnership with Anthropic, Apple is set to revolutionize its coding workflow and may eventually extend CodeFlow to external developers. This effort reflects the growing role of AI in transforming software development, making it faster and more accessible for creators everywhere.

  • iOS 19 could bring new AI tools from outside companies

    Apple’s next big update, iOS 19, set to arrive in 2025, might shake things up by adding artificial intelligence tools from other companies to iPhones. Sources suggest Apple is exploring ways to include various AI systems, letting users tap into more than just Apple’s tech.

    Currently, iOS 18 lets users interact with OpenAI’s ChatGPT alongside Siri for answering questions or tackling tasks. With iOS 19, Apple could broaden this by adding AI models like Google’s Gemini or offerings from firms like Anthropic.

    This would give iPhone users the freedom to pick AI tools that best suit their needs, whether it’s for writing, problem-solving, or organizing their day. By blending these external AI systems with Apple’s apps and features, iPhones could become more versatile and tailored to individual preferences.

    While Apple continues to develop its own AI, known as Apple Intelligence, including outside tools could make the user experience more dynamic and powerful. These plans are still unconfirmed, and Apple might share more at its WWDC event in June 2025, with a likely release in September. If the rumors hold, iOS 19 could redefine how AI enhances iPhones, offering users smarter, more diverse features.

  • iOS 18.4 brings better messaging and app choices

    Apple’s iOS 18.4 update, released on April 2, 2025, makes texting and app use simpler for iPhone users. The Messages app now supports RCS (Rich Communication Services) for more people, especially those on smaller carriers that run on T-Mobile’s network, like Mint Mobile and Google Fi.

    This means you can send high-quality photos, see when someone’s typing, and enjoy smoother chats with Android friends. To check if it works for you, go to Settings > Apps > Messages > RCS Messaging. If your carrier supports it, you’ll see “RCS” in the text box when messaging Android users.

    Plus, iOS 18.4 lets you pick your favorite apps as defaults in new areas like messaging and calls. Before, you could only set defaults for things like email or browsers, but now you can choose apps like WhatsApp for texting or calling instead of Apple’s built-in options.

    This gives you more control over how your iPhone works. Both updates make everyday tasks easier and more personal, so you can chat and use apps your way. Have you tried these changes yet? They’re a big step forward!

  • Apple’s new AirPods with cameras: what’s coming?

    Apple is busy working on AirPods that come with cameras, according to Mark Gurman from Bloomberg. Don’t expect to see this in the AirPods Pro 3, which should launch this year. Instead, it’s a longer-term plan. Apple wants these earbuds to get smarter about the world around you—here’s why.

    Seeing the World with AirPods

    With the iPhone 16, Apple added a Camera Control button. It’s handy for snapping pictures or tweaking camera options, but it also brought something called Visual Intelligence. This feature helps you figure out what’s around you, like adding a flyer’s event to your calendar or asking Google or ChatGPT about something confusing.

    Gurman says Apple wants AirPods to do similar things. Imagine tiny cameras on your earbuds using artificial intelligence to “see” your surroundings and tell you about them. It’s like having smart glasses but without the glasses! You can ask Siri what’s nearby without even touching your iPhone.

    Better Sound Experience

    Ming-Chi Kuo, a supply chain expert, thinks these cameras could team up with other Apple gadgets, like the Vision Pro headset. They might improve how you hear sounds around you, especially with spatial audio. For example, if you’re watching a video with Vision Pro and turn your head, the sound could shift to match where you’re looking, making it feel more real. Kuo even suggests the cameras might let you control the AirPods with hand gestures in the air—pretty cool, though it sounds a bit unusual!

    When Can We Get Them?

    Bloomberg reports that these camera-equipped AirPods won’t arrive until at least 2027, possibly with the AirPods Pro 4. Apple might also launch smart glasses around then, similar to Meta’s Ray-Bans. The goal? To make use of the Visual Intelligence tech from Vision Pro, which scans your surroundings and gives helpful info.

    In short, Apple’s cooking up something exciting with AirPods. Cameras could make them smarter and more connected to your world, blending sound and sight in fresh ways. Stay tuned for more as 2027 gets closer!

  • Why Siri’s big upgrade needs to be amazing by 2027

    Apple used to be a top player in smart assistants, but in just 14 years, it’s fallen behind. Back in 2011, Siri felt like the future. Now, in 2025, Apple Intelligence feels weak compared to what’s out there.

    Siri’s 14-Year Journey

    I still remember the iPhone 4S launch when Siri stole the show. It wasn’t even Apple’s idea at first—it started as an app someone else made. Steve Jobs saw its potential, bought it, and put it into the iPhone. That move made smart assistants a must-have for phones. But after 14 years, Siri should be incredible by now.

    It’s not. In 2015, I wished Siri could work with my apps. It’s only starting to do that now, a whole decade later! Worse, in 2018, I listed simple things Siri couldn’t handle—and it still can’t do some of them. Today, Siri feels basic while tools like ChatGPT shine.

    Why Apple’s Behind

    Apple has reasons for lagging. One is reliability. Other companies like OpenAI raced ahead, even if their AI sometimes messed up big time—like ChatGPT inventing fake facts or Google’s Bard flopping in a demo. Siri’s spoken answers can’t afford those mistakes—it’d be risky to hear wrong info without a warning.

    Another reason is privacy. Siri sticks to two rules: process stuff on your phone when it can, and keep your identity hidden when it uses Apple’s servers. That’s safer but less powerful than rivals who use big data centers and know tons about you.

    The Privacy Win

    Last year, I said waiting for a smarter Siri would pay off because of privacy. Our phones hold so much—calendar, messages, health info—and soon, Siri can tap into apps we pick, all without leaving our device. That could make Siri as good as the competition, but safer. I want an assistant who knows me well but stays private. That’s what Apple’s aiming for.

    A Longer Wait

    We thought this new Siri would hit in 2026. Now, the word is it’s delayed to 2027—or later. That’s tough, but if it’s great, we’ll forget the wait. Still, by 2027, other AI like ChatGPT or Amazon’s Alexa will be miles ahead. Siri has to be spectacular to catch up. Apple’s got a big challenge, but I’m hopeful it’ll be worth it.