Search results for: “Dictation”

  • Why iPhones mix up ‘Racist’ with ‘Trump,’ According to Apple

    Apple has pointed to a sound mix-up as the reason behind a strange issue with the iPhone’s voice-to-text tool. In recent days, this odd glitch has been popping up on social media, especially TikTok. One video shows an iPhone owner speaking the word “racist” clearly into the dictation feature, only for the phone to briefly type “Trump” instead. Moments later, it fixes itself before finishing the transcription.

    An Apple spokesperson explained to The New York Times that the confusion comes from a “sound overlap” between the two words. They’re already working on a solution to stop it from happening. John Burkey, who used to work on Apple’s Siri team and still chats with them regularly, said the trouble started after a recent update to Apple’s servers.

    He’s not buying the “sound overlap” story, though. “It feels more like a big joke from someone inside Apple,” he said. Burkey, now the founder of Wonderrush.ai, an AI startup, thinks it’s unlikely this is just a random error tied to Apple’s AI data. The fact that the word corrects itself hints it’s not purely a tech slip-up. He suspects someone might have sneaked a bit of code into Apple’s system to swap “racist” with “Trump” on purpose.

    “Who’s behind it?” Burkey wondered. “Did they tweak the data or mess with the code?”

    Interestingly, The Wall Street Journal pointed out that other words starting with “r” — like “rampant” or “rampage” — have also briefly turned into “Trump” during dictation. For now, Apple says it’s a simple sound confusion they’re fixing. But with people like Burkey raising eyebrows, it’s hard not to wonder if there’s more to this quirky iPhone hiccup than meets the eye. Either way, it’s got folks online talking — and maybe laughing a little, too.

  • Apple tackles funny iPhone voice typing glitch

    Many iPhone users spotted a strange problem today with the phone’s voice typing feature. When they dictate the word “racist” into a message, “Trump” pops up for a second before the text fixes itself to “racist.”

    This quirky glitch happens when people use the iPhone’s dictation tool. Sometimes, as they speak “racist,” the phone types “Trump” in the Messages app. Then, it quickly changes to the right word after figuring out what was said.

    In our tests, saying “racist” didn’t always turn into “Trump” first, but it happened more often than other mix-ups. We also noticed “Rhett” or “Rouch” showing up briefly before the phone corrected it to “racist.”

    A video showing this odd bug has been making rounds on TikTok and other social media. An Apple spokesperson explained to The New York Times that the mix-up comes from the words sounding a bit alike. It’s unclear if this has been an issue for a while and only got noticed now, or if something changed recently to cause it. Apple assured us they’re working on a solution.

    John Burkey, who used to work on Apple’s Siri team, told The New York Times that there might be something in Apple’s system accidentally turning “racist” into “Trump.” He jokingly called it a “big prank” but wasn’t sure if it was added on purpose or slipped into the data Apple uses for its smart features.

    Note: Since this topic touches on political or social stuff, the chat about it is in our Political News forum. Everyone can read it, but only forum members with 100+ posts can join the conversation.

  • The Truth About Siri and Your Privacy: Debunking the myths

    The digital age has brought incredible convenience to our fingertips, but it has also sparked privacy concerns. One area that frequently comes under scrutiny is voice assistants like Apple’s Siri. Recently, a settlement reached by Apple regarding past practices related to Siri data has reignited these concerns, leading to a flurry of speculation and misinformation. Let’s delve into the facts and separate them from the fiction surrounding Siri and user privacy.  

    In 2019, reports surfaced alleging that Apple used contractors to review a small percentage of Siri interactions. This practice, intended to improve Siri’s accuracy and understanding, involved human evaluation of audio recordings. While Apple maintained that these recordings were anonymized and subject to strict confidentiality agreements, the reports raised legitimate questions about user privacy. 

    Apple responded swiftly, acknowledging the concerns and implementing significant changes to its Siri privacy protocols. One of the most important changes was making the retention of Siri audio recordings opt-in. This meant that, by default, Apple no longer stored recordings of user interactions. Instead, users could actively choose to contribute their data to help improve Siri. Furthermore, Apple committed to using only its own employees for this review process, eliminating the involvement of third-party contractors. Any recordings captured by accidental activations were promptly deleted.

    Fast forward to the present day, and Apple recently agreed to a settlement related to the 2019 concerns. This settlement, however, has been misinterpreted by some as an admission of wrongdoing or evidence of ongoing privacy violations. In reality, Apple explicitly stated that the settlement was intended to put to rest lingering concerns about past practices that had already been addressed. 

    A key point that Apple has consistently emphasized is that Siri data has never been used to build marketing profiles or sold to any third party for any purpose. This is a crucial distinction that often gets lost in the noise. The company maintains that the data collected from Siri interactions, when users opt-in, is solely used to improve the functionality and accuracy of the voice assistant. 

    To further understand Apple’s commitment to privacy, it’s essential to examine the technical safeguards they have in place. Siri interactions are associated with a random, rotating identifier during processing. This identifier is not linked to a user’s Apple ID, phone number, or any other personally identifiable information. After six months, even this temporary association is severed, ensuring further anonymity.  
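
    To make that design concrete, here is a minimal sketch of a random, rotating identifier whose association with a request simply lapses after roughly six months. It is purely illustrative: the type names, the rotation policy, and the Swift code itself are assumptions for the example, not Apple’s internal implementation.

      import Foundation

      // Conceptual sketch only: a random identifier stands in for the device
      // during processing, and the association is dropped after ~six months.
      // Names and the rotation policy here are illustrative, not Apple's code.
      struct ProcessingIdentifier {
          let value = UUID()       // random, not derived from an Apple ID or phone number
          let createdAt = Date()

          // Roughly six months, after which the association is severed.
          static let retentionInterval: TimeInterval = 60 * 60 * 24 * 182

          var isExpired: Bool {
              Date().timeIntervalSince(createdAt) > Self.retentionInterval
          }
      }

      final class RequestTagger {
          private var current = ProcessingIdentifier()

          // Tag a request with the current identifier, rotating it once it expires
          // so no long-lived link to a particular device or account accumulates.
          func tag(request: String) -> (identifier: UUID, request: String) {
              if current.isExpired {
                  current = ProcessingIdentifier()   // the old association is simply dropped
              }
              return (current.value, request)
          }
      }

      let tagger = RequestTagger()
      let tagged = tagger.tag(request: "What's the weather tomorrow?")
      print("Processed under identifier \(tagged.identifier)")

    The point of such a design is that the identifier carries no link back to an account in the first place, and once it rotates, even the temporary association disappears.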

    Apple also provides users with direct control over their Siri data. Within the Settings app, users can access and review their Siri and Dictation history, giving them the option to delete past interactions. This transparency and control are fundamental to Apple’s approach to privacy.  

    Moreover, certain Siri requests are processed entirely on-device, meaning the data never leaves the user’s iPhone. For example, when Siri reads unread messages, the content of those messages remains on the device and is not transmitted to Apple servers. This on-device processing further minimizes the potential for data exposure.  
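
    Developers can see the same idea in Apple’s public Speech framework, which lets an app insist that a dictation-style transcription never leave the device. The sketch below uses that documented API purely as an illustration of on-device processing; it is not Siri’s internal pipeline, and the audio file URL and locale are placeholder assumptions.

      import Speech

      // Illustration using the public Speech framework: request a transcription
      // that is performed entirely on-device, so the audio and text stay local.
      func transcribeOnDevice(audioFileURL: URL) {
          SFSpeechRecognizer.requestAuthorization { status in
              guard status == .authorized else {
                  print("Speech recognition not authorized")
                  return
              }

              guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
                    recognizer.supportsOnDeviceRecognition else {
                  print("On-device recognition is unavailable for this locale or device")
                  return
              }

              let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
              request.requiresOnDeviceRecognition = true   // never send the audio to a server

              _ = recognizer.recognitionTask(with: request) { result, error in
                  if let result = result, result.isFinal {
                      print("Transcript: \(result.bestTranscription.formattedString)")
                  } else if let error = error {
                      print("Recognition failed: \(error.localizedDescription)")
                  }
              }
          }
      }

    Whether on-device recognition is available depends on the device and locale, which is why the sketch checks supportsOnDeviceRecognition before forcing it.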

    For more complex tasks that require cloud processing, Apple utilizes what it calls Private Cloud Compute. This infrastructure, built on Apple’s own silicon and subject to independent security audits, provides a secure environment for processing data while maintaining user privacy. The architecture is also open to third-party scrutiny and research.  

    In conclusion, while concerns about data privacy in the digital age are valid, it’s important to base our understanding on facts rather than speculation. Apple has taken significant steps to protect user privacy in relation to Siri, from implementing opt-in data collection to employing robust technical safeguards.

    The recent settlement should be viewed as a resolution of past concerns, not as an admission of current privacy violations. Apple has repeatedly and unequivocally stated that Siri data is not used for marketing or sold to third parties. By understanding the facts and the measures Apple has put in place, we can have a more informed and nuanced conversation about the role of voice assistants in our lives.
