Search results for: “privacy”

  • Why Apple’s fight for data privacy matters more than you think

    Apple’s Advanced Data Protection (ADP) is an optional feature that secures your iCloud data with end-to-end encryption. Until recently it was obscure: most users didn’t know it existed, and only a small group of privacy-conscious people turned it on. But Apple’s standoff with the UK government over this feature matters far more than its low profile suggests. Here’s why.

    The UK’s Big Move Against Privacy

    ADP protects your iCloud data with end-to-end encryption: the decryption keys exist only on your own trusted devices, so neither Apple nor anyone else, including governments, can read that data. The UK ordered Apple to build a way around this encryption so authorities could access people’s data, not just in the UK but worldwide. That is an extraordinary overreach.
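
    To make “only your devices hold the keys” concrete, here is a minimal sketch of device-side encryption using Apple’s CryptoKit framework. It illustrates the general technique, not Apple’s actual ADP implementation: the key stays on the device, so the ciphertext stored on a server is unreadable there.

    ```swift
    import CryptoKit
    import Foundation

    // A symmetric key generated and held on-device. A real system would
    // protect it via the keychain/Secure Enclave; it never goes to the server.
    let deviceKey = SymmetricKey(size: .bits256)

    // Authenticated encryption; the server only ever stores this sealed blob.
    func encryptForCloud(_ plaintext: Data, using key: SymmetricKey) throws -> Data {
        try ChaChaPoly.seal(plaintext, using: key).combined
    }

    // Decryption succeeds only where the key lives: on the user's devices.
    func decryptFromCloud(_ blob: Data, using key: SymmetricKey) throws -> Data {
        try ChaChaPoly.open(ChaChaPoly.SealedBox(combined: blob), using: key)
    }

    let note = Data("private note".utf8)
    let blob = try encryptForCloud(note, using: deviceKey)      // safe to upload
    let restored = try decryptFromCloud(blob, using: deviceKey) // on-device only
    ```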

    Apple introduced ADP in late 2022, and it drew little attention. It’s off by default, so before this fight started, only the most security-conscious users had enabled it.

    Apple’s Sneaky Way of Fighting Back

    UK law bars Apple from even acknowledging that it has received an order to weaken ADP. So Apple found another route. Rather than disclose the order (which would be illegal), it simply announced it was withdrawing ADP from the UK, with no explanation, just a pointed signal. The message was unmistakable: Apple has never built a backdoor into its products and says it never will. Apple has also reportedly challenged the order before the Investigatory Powers Tribunal, the UK court that hears such cases.

    From Quiet Feature to Front-Page News

    The UK and US both belong to Five Eyes, an intelligence-sharing alliance. In 2018, the alliance declared that privacy is not absolute and pushed back against strong encryption. The UK, however, went further than its partners, demanding access to iCloud data worldwide. Apple’s pointed response got people talking, even in the US, where officials such as Director of National Intelligence Tulsi Gabbard criticized the UK for breaching trust between the allies. Suddenly, ADP isn’t a niche setting for enthusiasts; it’s front-page news.

    Why This Matters

    First, far more people now understand how end-to-end encryption keeps their data private. Second, governments have learned that secret demands won’t stay secret: Apple will find a lawful way to signal them. Third, the US can hardly attempt the same move after publicly criticizing the UK for it. Apple isn’t just resisting one country; it’s putting every government on notice that user data stays encrypted.

  • Do app privacy tags affect your download choices?

    In late 2020, Apple rolled out privacy labels on the App Store to show users what data an app may collect, such as data linked to your identity or data used to track you across other apps and websites. The labels were a real step forward, exposing apps that collect far more personal data than they need. Compare, for example, Signal’s nearly empty label with Facebook Messenger’s long list of data collected for ads and product improvement. The goal: help people make informed download choices.

    Lately, though, I’ve seen people question whether these self-reported labels, tucked well down an app’s App Store page, still sway anyone before they tap “Get.” Apple splits these “privacy nutrition labels” into three types (a sample declaration follows the list below):

    • Data Linked to You: information such as your name, address, email, precise location, or purchase history that is tied to your identity, often used for ads or for tailoring the app. Developers must disclose anything that can be connected back to you.
    • Data Not Linked to You: information that is collected but kept anonymous, usually to improve the app. It must not be traceable back to your identity.
    • Data Used to Track You: information that follows you across other companies’ apps and websites, such as Google or Meta using your device’s advertising identifier for targeted ads or sharing it with data brokers.
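
    These App Store labels are entered in App Store Connect, but since iOS 17 the self-reported declarations also have a machine-readable counterpart: a privacy manifest (PrivacyInfo.xcprivacy) bundled with the app. Here is a minimal sketch of declaring one collected data type; the values shown are illustrative, not a complete or authoritative manifest.

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
      "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
    <plist version="1.0">
    <dict>
        <!-- Each entry declares one collected data type. -->
        <key>NSPrivacyCollectedDataTypes</key>
        <array>
            <dict>
                <key>NSPrivacyCollectedDataType</key>
                <string>NSPrivacyCollectedDataTypeEmailAddress</string>
                <!-- true maps to "Data Linked to You" -->
                <key>NSPrivacyCollectedDataTypeLinked</key>
                <true/>
                <!-- true would map to "Data Used to Track You" -->
                <key>NSPrivacyCollectedDataTypeTracking</key>
                <false/>
                <key>NSPrivacyCollectedDataTypePurposes</key>
                <array>
                    <string>NSPrivacyCollectedDataTypePurposeAppFunctionality</string>
                </array>
            </dict>
        </array>
    </dict>
    </plist>
    ```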

    Privacy becomes a loud topic whenever a hot new app sparks concern. When Threads launched, users puzzled over why its label listed “Health and Fitness” data with no obvious justification. Yet Threads still shot to the top of the social networking charts. So do these labels really matter?

    Here’s the catch: developers report this information themselves. Apple largely takes them at their word, which keeps app review fast but means a label is only as honest as the developer behind it. For users, the labels are useful, if you dig down to find them and know how to read them. But being listed doesn’t guarantee being accurate. As Apple keeps marketing itself on privacy, the real challenge is working with developers to make labels clearer, explain data use better, and keep misreporting in check. So, do privacy labels sway your downloads? Maybe, if you’re paying attention.

  • Could Apple lose an important iPhone privacy tool in France?

    For almost two years, France’s competition regulator has been investigating an iPhone privacy feature called App Tracking Transparency (ATT). ATT lets people choose whether apps may track their activity across other apps and websites for advertising; you’ve seen it as the “Ask App Not to Track” prompt. Now, according to a recent Reuters story, the investigation is almost over, and it doesn’t look good for Apple.
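
    For context, that prompt is backed by a small public API. Here is a minimal sketch of how an app requests tracking permission under ATT; the app must also declare an NSUserTrackingUsageDescription string in its Info.plist for the prompt to appear.

    ```swift
    import AppTrackingTransparency
    import AdSupport

    func requestTrackingPermission() {
        // Presents the system "Ask App Not to Track" prompt on first call.
        ATTrackingManager.requestTrackingAuthorization { status in
            switch status {
            case .authorized:
                // Only now may the app read the advertising identifier (IDFA).
                let idfa = ASIdentifierManager.shared().advertisingIdentifier
                print("Tracking allowed, IDFA: \(idfa)")
            case .denied, .restricted, .notDetermined:
                // Without consent, the IDFA reads as all zeros.
                print("Tracking not allowed: \(status)")
            @unknown default:
                print("Unhandled authorization status")
            }
        }
    }
    ```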

    France Might Stop This Privacy Feature Soon

    The two-year French review of App Tracking Transparency is expected to conclude within the next month, and Apple is unlikely to be happy with the result.

    Reuters reporter Foo Yun Chee explains:

    French antitrust regulators are likely to order Apple next month to stop the practice, on the grounds that it is unfair to competition, and will probably impose a fine as well. It would be the first time a regulator has blocked the feature. In France, antitrust fines can reach up to 10% of a company’s annual worldwide revenue.

    So, two big things could happen when this case ends:

    • Apple may have to turn off this feature for people in France.
    • The company could also face a penalty.

    Back in 2023, when this probe began, we noted: “Apple’s being accused of playing favorites and setting unclear, unfair rules about how user info can be used for ads.”

    Apple shared this comment with 9to5Mac during an earlier debate about the feature:

    “We at Apple think your data is yours. You should decide if it’s shared and with whom. App Tracking Transparency just lets users pick whether apps can track them or pass their info to data collectors. These rules are the same for everyone, including us, and we’ve had lots of support from privacy fans and regulators.”

    Beyond France, regulators in Germany and Italy are also examining ATT. In most cases the concern isn’t the feature itself but the suspicion that Apple doesn’t hold its own apps to the same rules. In the U.S., the loudest complaints have come from companies like Meta rather than from regulators; Meta says the prompt has cut into its advertising revenue.

  • AI app banned in South Korea for privacy issues

    Users of the AI app DeepSeek in South Korea have hit a snag: the government has suspended the Chinese AI application, saying it does not comply with the country’s data protection rules. As a result, Apple and Google have been instructed to remove the app from their app stores in South Korea.

    Temporary Ban on DeepSeek in South Korea

    According to news reports, South Korea’s data protection watchdog, the Personal Information Protection Commission, has told Apple and Google to block new downloads of the app. For the time being, DeepSeek remains usable in South Korea through a web browser. The stated reason for the ban is that DeepSeek does not meet the commission’s data protection requirements.

    The company behind DeepSeek has acknowledged that it overlooked aspects of South Korea’s data protection law and has appointed lawyers in the country to handle the situation. A spokesperson for China’s Foreign Ministry has meanwhile stated that China takes data privacy seriously and protects it by law.

    Path to Reinstatement

    DeepSeek could start working again in South Korea if it makes changes to align with the local privacy laws.

    For background, DeepSeek is a Chinese tech startup that rose to fame with its AI model “R1,” which delivers strong results while using far fewer computing resources than rivals. The model drew praise from Apple CEO Tim Cook, but the app has also stirred controversy over its Chinese origins.

    Italy was the first to block DeepSeek for similar privacy concerns, and in the U.S., a senator has proposed a law to penalize the use of Chinese AI apps. Last month, DeepSeek climbed to the top of the U.S. App Store, beating out ChatGPT by OpenAI, and it currently sits at number 13 in app downloads.

  • Apple faces privacy double standards in Germany

    Apple has been under scrutiny in Germany for three years because of its App Tracking Transparency (ATT) feature. This feature lets iPhone users choose not to have their activities tracked across different apps.

    The German competition authority, the Bundeskartellamt, has now published its preliminary assessment, concluding that Apple’s ATT rules may be unfair: third-party apps must follow strict consent requirements from which Apple’s own apps are exempt.

    In their latest statement, the Bundeskartellamt pointed out that since ATT was introduced in April 2021, app makers in the iOS App Store need special permission from users to use their data for ads. However, these same strict rules don’t apply to Apple’s apps.

    According to the Bundeskartellamt, this could be against German competition laws for big tech companies and even broader European Union competition rules. They argue that Apple is applying different privacy standards to itself compared to other app developers.

    Apple now has the opportunity to respond to the concerns raised by the German authority.

    Interestingly, while ATT was initially a serious blow to big advertising businesses like Meta (formerly Facebook), Meta has since adapted. It rebuilt its ad targeting around AI models that work without broad third-party tracking, and its advertising business has recovered.

    This situation in Germany highlights ongoing debates about how tech giants manage user data and privacy, setting a precedent for how privacy policies might be enforced in the future.

  • WhatsApp fixes privacy issue in new iPhone update

    WhatsApp has released version 25.2.3 of its iPhone app to fix a significant privacy flaw in its “View Once” option. View Once lets you send photos or videos that are supposed to vanish after a single viewing, but a bug allowed recipients to see them again.

    The Problem

    The issue affected only the iPhone app. A recipient could re-open photos or videos they were meant to see just once by going into the app’s settings, opening the storage usage details, and finding the most recent files, where the supposedly expired media still lived. In short, the privacy feature wasn’t actually removing the files.
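
    The flaw comes down to cached media outliving its single permitted view. As a purely hypothetical sketch (not WhatsApp’s actual code), the intended behavior would delete the cached file the moment it has been consumed:

    ```swift
    import Foundation

    /// Hypothetical handler for "view once" media, for illustration only.
    final class ViewOnceMedia {
        private let fileURL: URL
        private var viewed = false

        init(fileURL: URL) { self.fileURL = fileURL }

        /// Returns the media exactly once, then removes it from disk so it
        /// cannot resurface later in storage listings.
        func consume() throws -> Data {
            guard !viewed else { throw CocoaError(.fileNoSuchFile) }
            viewed = true
            let data = try Data(contentsOf: fileURL)
            // The class of bug fixed in 25.2.3: skipping this cleanup leaves
            // the file discoverable via the app's storage management screen.
            try FileManager.default.removeItem(at: fileURL)
            return data
        }
    }
    ```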

    How It Was Found

    A security researcher named Ramshath documented the problem on Medium, which brought it to WhatsApp’s attention and led to a fix. Notably, this isn’t the first such lapse: last year WhatsApp patched a similar “View Once” bypass in WhatsApp Web.

    New Features

    Along with the fix, the update adds a few new capabilities: you can now call numbers you haven’t saved to your contacts, and group calling gets several improvements.

    What to Do

    If you use WhatsApp on an iPhone, update the app from the App Store right away to keep your private photos and videos safe. With the fix in place, media sent to be seen once really does disappear after one view.

  • Apple reaffirms commitment to user privacy amidst Siri lawsuit settlement and Apple Cash outage

    In a move aimed at reassuring users about data privacy, Apple has publicly reiterated its dedication to protecting user information collected through its voice assistant, Siri. This announcement comes on the heels of a $95 million settlement in a class-action lawsuit alleging privacy violations related to Siri recordings. Simultaneously, Apple is addressing an ongoing outage affecting its Apple Cash service, causing frustration for many users. 

    The recent lawsuit centered around claims that Siri inadvertently recorded user conversations following accidental activations. Plaintiffs in the case alleged that snippets of these conversations were then shared with third-party advertisers, resulting in targeted ads based on private discussions. Specific examples included individuals claiming to have seen ads for products they had discussed verbally near their Apple devices, such as specific brands of shoes or restaurants, and even ads related to medical treatments discussed with doctors. 

    Apple has consistently denied these allegations, maintaining that Siri data has never been used to create marketing profiles, shared with advertisers, or sold for any purpose. In a statement released earlier this week, Apple explained that the settlement was a pragmatic decision designed to avoid the prolonged and costly process of further litigation, rather than an admission of wrongdoing. 

    To further emphasize its commitment to privacy, Apple has provided a detailed overview of the privacy safeguards built into Siri. A core element of this approach is prioritizing on-device processing. By handling as much data processing as possible directly on the user’s device, Apple minimizes the amount of information that needs to be collected and transmitted to its servers. 

    Apple also emphasizes that Siri searches and requests are not linked to individual Apple accounts. Instead, a randomized identifier accompanies the data during processing, so Siri activity cannot be associated with a specific user or used to build an individual profile.

    Furthermore, Apple states that it does not retain audio recordings of Siri interactions unless users explicitly opt in to participate in a program designed to improve Siri’s performance. Even when users consent to this program, the recordings are used solely for the purpose of enhancing Siri’s functionality and are not used for any other purpose, such as advertising or marketing. 

    While addressing privacy concerns surrounding Siri, Apple is also currently dealing with a separate issue affecting its Apple Cash service. Users have reported widespread problems with sending and receiving money through the platform, experiencing difficulties such as infinite loading screens and error messages suggesting that Apple Cash needs to be set up even for established users. 

    This multi-hour outage has disrupted peer-to-peer transactions for many Apple users, sparking complaints on social media platforms. Apple has acknowledged the issue on its System Status webpage, confirming that Apple Cash has been experiencing problems since earlier today. The status update indicates that some users are affected and that Apple is working to resolve the issue. 

    It appears that the outage is specifically limited to Apple Cash, Apple’s peer-to-peer payment system similar to services like Venmo, Zelle, and Cash App. Apple Pay, the company’s contactless payment platform for in-store and online purchases, appears to function normally.

    This confluence of events (the Siri lawsuit settlement and the Apple Cash outage) highlights the twin challenges large technology companies face: maintaining user trust and keeping complex digital services running smoothly. Apple has addressed both publicly, explaining Siri’s privacy protections to rebuild confidence after the lawsuit and acknowledging the Apple Cash outage while working to restore service as quickly as possible.

  • The Truth About Siri and Your Privacy: Debunking the myths

    The digital age has brought incredible convenience to our fingertips, but it has also sparked privacy concerns. One area that frequently comes under scrutiny is voice assistants like Apple’s Siri. Recently, a settlement reached by Apple regarding past practices related to Siri data has reignited these concerns, leading to a flurry of speculation and misinformation. Let’s delve into the facts and separate them from the fiction surrounding Siri and user privacy.  

    In 2019, reports surfaced alleging that Apple used contractors to review a small percentage of Siri interactions. This practice, intended to improve Siri’s accuracy and understanding, involved human evaluation of audio recordings. While Apple maintained that these recordings were anonymized and subject to strict confidentiality agreements, the reports raised legitimate questions about user privacy. 

    Apple responded swiftly, acknowledging the concerns and implementing significant changes to its Siri privacy protocols. One of the most important changes was making the retention of Siri audio recordings opt-in. This meant that, by default, Apple no longer stored recordings of user interactions. Instead, users could actively choose to contribute their data to help improve Siri. Furthermore, Apple committed to using only its own employees for this review process, eliminating the involvement of third-party contractors. Any recordings accidentally triggered were promptly deleted.  

    Fast forward to the present day, and Apple recently agreed to a settlement related to the 2019 concerns. This settlement, however, has been misinterpreted by some as an admission of wrongdoing or evidence of ongoing privacy violations. In reality, Apple explicitly stated that the settlement was intended to put to rest lingering concerns about past practices that had already been addressed. 

    A key point that Apple has consistently emphasized is that Siri data has never been used to build marketing profiles or sold to any third party for any purpose. This is a crucial distinction that often gets lost in the noise. The company maintains that the data collected from Siri interactions, when users opt-in, is solely used to improve the functionality and accuracy of the voice assistant. 

    To further understand Apple’s commitment to privacy, it’s essential to examine the technical safeguards they have in place. Siri interactions are associated with a random, rotating identifier during processing. This identifier is not linked to a user’s Apple ID, phone number, or any other personally identifiable information. After six months, even this temporary association is severed, ensuring further anonymity.  
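
    As a rough illustration of that pattern (not Apple’s actual implementation, and with an arbitrary rotation interval), requests carry a random token with no link to the account, and the token itself is periodically replaced:

    ```swift
    import Foundation

    /// Illustrative sketch of a rotating, non-identifying request token.
    struct RotatingIdentifier {
        private(set) var token = UUID()       // random; derived from no account data
        private(set) var created = Date()
        let lifetime: TimeInterval = 15 * 60  // rotate every 15 minutes (arbitrary)

        /// Returns the current token, minting a fresh one once the old expires.
        mutating func current() -> UUID {
            if Date().timeIntervalSince(created) > lifetime {
                token = UUID()                // discarding the old token severs
                created = Date()              // any link to earlier requests
            }
            return token
        }
    }

    var siriRequestID = RotatingIdentifier()
    let id = siriRequestID.current()          // attach to the request, not the user
    ```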

    Apple also provides users with direct control over their Siri data. Within the Settings app, users can access and review their Siri and Dictation history, giving them the option to delete past interactions. This transparency and control are fundamental to Apple’s approach to privacy.  

    Moreover, certain Siri requests are processed entirely on-device, meaning the data never leaves the user’s iPhone. For example, when Siri reads unread messages, the content of those messages remains on the device and is not transmitted to Apple servers. This on-device processing further minimizes the potential for data exposure.  

    For more complex tasks that require cloud processing, Apple utilizes what it calls Private Cloud Compute. This infrastructure, built on Apple’s own silicon and subject to independent security audits, provides a secure environment for processing data while maintaining user privacy. The architecture is also open to third-party scrutiny and research.  

    In conclusion, while concerns about data privacy in the digital age are valid, it’s important to base our understanding on facts rather than speculation. Apple has taken significant steps to protect user privacy in relation to Siri, from implementing opt-in data collection to employing robust technical safeguards.

    The recent settlement should be viewed as a resolution of past concerns, not as an admission of current privacy violations. Apple has repeatedly and unequivocally stated that Siri data is not used for marketing or sold to third parties. By understanding the facts and the measures Apple has put in place, we can have a more informed and nuanced conversation about the role of voice assistants in our lives.

  • Siri’s Silent Listen: Apple’s $95 million privacy settlement and what it means for you

    For years, the quiet hum of “Hey Siri” has been a ubiquitous part of the Apple ecosystem. But behind the convenience of voice commands, a legal battle has raged over the privacy of those very interactions. Now, that battle is drawing to a close, with Apple agreeing to a $95 million settlement over allegations of unlawful recording and sharing of Siri conversations. This isn’t just about money; it’s a significant moment in the ongoing conversation about digital privacy in the age of voice assistants.

    The lawsuit, initially filed in 2019, accused Apple of intentionally recording user conversations without explicit consent. These recordings, the plaintiffs argued, were then shared with third-party contractors, potentially leading to targeted advertising and other privacy breaches. The core issue revolved around the “Hey Siri” activation feature, which constantly listens for its trigger phrase, raising concerns about what else it might be capturing in the process.

    This wasn’t a small, isolated incident. The lawsuit represented potentially tens of millions of users who owned Siri-enabled devices, from iPhones and iPads to Apple Watches and HomePods. The settlement, if approved by U.S. District Judge Jeffrey White in Oakland, California, could see individual users receiving up to $20 per affected device. While the exact distribution process is yet to be finalized, the sheer scale of the settlement underscores the seriousness of the allegations.

    Apple, while agreeing to the settlement, has consistently denied any wrongdoing. This is a common legal strategy in such cases, allowing companies to avoid lengthy and costly trials while mitigating potential reputational damage. However, the settlement doesn’t exist in a vacuum. The initial scrutiny surrounding Siri’s privacy practices back in 2019 prompted Apple to undertake significant internal changes.

    These changes included:

    • Internal Review of Siri Practices: Apple conducted a thorough review of its internal processes related to Siri, including how it used contractors for audio analysis and quality control. This suggests that the initial concerns raised by the lawsuit prompted a reassessment of existing procedures.
    • New Permission Prompts: Apple introduced clearer and more prominent permission prompts for Siri audio recording. This gave users greater control over whether their voice interactions were recorded and used for improving Siri’s performance.
    • “Ask App Not to Track”: This feature, a cornerstone of Apple’s privacy push, allows users to prevent apps from tracking their activity across other apps and websites. While not directly related to Siri, it reflects Apple’s broader focus on user privacy in the wake of these concerns.

    These changes, while positive steps towards greater user privacy, shouldn’t be interpreted as an admission of guilt. They represent an evolution in Apple’s approach to data handling, driven in part by the scrutiny brought on by the lawsuit.

    The $95 million figure is significant, but it’s important to put it in perspective. As reported by Reuters, this sum represents roughly nine hours of Apple’s profit. While a substantial amount of money, it’s a relatively small financial hit for a company of Apple’s size. The real impact lies in the message it sends about the importance of user privacy.

    This settlement isn’t just about Apple. It’s a landmark moment in the broader conversation about the privacy implications of voice assistants. As these technologies become increasingly integrated into our lives, questions about data collection, storage, and usage become ever more critical. This case highlights the need for transparency and user control in how our data is handled.

    The details of how users can claim their share of the settlement are still being finalized. Once the settlement receives final approval, information about the claims process will be made available. It’s advisable to stay updated on this development through reliable news sources and legal updates.

    In conclusion, the $95 million settlement between Apple and users over Siri’s privacy practices is more than just a financial transaction. It’s a reflection of the growing importance of digital privacy in the modern world. It underscores the responsibility of technology companies to be transparent and accountable in how they handle user data. And it serves as a reminder that users have a right to control their own information. While “Hey Siri” may continue to be a part of our daily lives, this settlement ensures that the conversation around its privacy implications will continue as well.

  • Questioning the privacy of iOS 18’s enhanced photo search

    For years, Apple has cultivated an image of unwavering commitment to user privacy, a cornerstone of its brand identity. This dedication has even influenced the integration of AI into its devices, sometimes at the cost of performance, as the company prioritized on-device processing. However, a recent discovery surrounding iOS 18’s “Enhanced Visual Search” feature within the Photos app raises serious questions about whether this commitment is as steadfast as we believe. 

    The “Visual Look Up” feature, introduced previously, allowed users to identify objects, plants, pets, and landmarks within their photos. This functionality enhanced search capabilities within the Photos app, allowing users to find specific pictures using keywords. iOS 18 brought an evolved version of this feature: “Enhanced Visual Search,” also present in macOS 15. While presented as an improvement, this new iteration has sparked a debate about data privacy.  

    A Deep Dive into Enhanced Visual Search: How it Works and What it Means

    The Enhanced Visual Search feature is controlled by a toggle within the Photos app settings. The description accompanying this toggle states that enabling it will “privately match places in your photos.” However, independent developer Jeff Johnson’s meticulous investigation reveals a more complex reality. 

    Enhanced Visual Search operates by generating a “vector embedding” of elements within a photograph. This embedding essentially captures the key characteristics of objects and landmarks within the image, creating a unique digital fingerprint. This metadata, according to Johnson’s findings, is then transmitted to Apple’s servers for analysis. These servers process the data and return a set of potential matches, from which the user’s device selects the most appropriate result based on their search query. 
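
    In outline, the pipeline Johnson describes could look like the following sketch. Everything here is a simplified stand-in rather than Apple’s code: the device reduces a photo to a fixed-length vector, a server compares that vector against a landmark index, and the client picks from the returned candidates.

    ```swift
    import Foundation

    /// Cosine similarity between two equal-length embedding vectors.
    func cosineSimilarity(_ a: [Double], _ b: [Double]) -> Double {
        let dot = zip(a, b).map { $0 * $1 }.reduce(0, +)
        let magA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
        let magB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
        return dot / (magA * magB)
    }

    /// Server side, conceptually: match a query embedding against an index
    /// of known landmarks and return the closest candidates.
    func matchCandidates(query: [Double],
                         index: [(name: String, embedding: [Double])],
                         topK: Int = 3) -> [String] {
        index
            .map { (name: $0.name, score: cosineSimilarity(query, $0.embedding)) }
            .sorted { $0.score > $1.score }
            .prefix(topK)
            .map { $0.name }
    }

    // Device side: the photo becomes a vector "fingerprint" before upload.
    // The raw image never leaves the device, but this vector does.
    let photoEmbedding: [Double] = [0.12, 0.80, 0.55]  // toy 3-dimensional embedding
    let landmarkIndex: [(name: String, embedding: [Double])] = [
        ("Eiffel Tower", [0.10, 0.82, 0.50]),
        ("Golden Gate Bridge", [0.90, 0.10, 0.30]),
    ]
    print(matchCandidates(query: photoEmbedding, index: landmarkIndex))
    // -> ["Eiffel Tower", "Golden Gate Bridge"]
    ```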

    While Apple likely employs robust security measures to protect this data, the fact remains that information is being sent off-device without explicit user consent. This default-enabled functionality in a major operating system update seems to contradict Apple’s historically stringent privacy practices.

    The Privacy Paradox: On-Device vs. Server-Side Processing

    The core of the privacy concern lies in the distinction between on-device and server-side processing. If the analysis were performed entirely on the user’s device, the data would remain within their control. However, by sending data to Apple’s servers, even with assurances of privacy, a degree of control is relinquished.

    Johnson argues that true privacy exists when processing occurs entirely on the user’s computer. Sending data to the manufacturer, even a trusted one like Apple, inherently compromises that privacy, at least to some extent. He further emphasizes the potential for vulnerabilities, stating, “A software bug would be sufficient to make users vulnerable, and Apple can’t guarantee that their software includes no bugs.” This highlights the inherent risk associated with transmitting sensitive data, regardless of the safeguards in place.

    A Shift in Practice? Examining the Implications

    The default enabling of Enhanced Visual Search without explicit user consent raises questions about a potential shift in Apple’s approach to privacy. While the company maintains its commitment to user data protection, this instance suggests a willingness to prioritize functionality and convenience, perhaps at the expense of absolute privacy.

    This situation underscores the importance of user awareness and control. Users should be fully informed about how their data is being used and given the choice to opt out of features that involve data transmission. While Apple’s assurances of private processing offer some comfort, the potential for vulnerabilities and the lack of explicit consent remain significant concerns.

    This discovery serves as a crucial reminder that constant vigilance is necessary in the digital age. Even with companies known for their privacy-centric approach, it is essential to scrutinize new features and understand how they handle our data. The case of iOS 18’s Enhanced Visual Search highlights the delicate balance between functionality, convenience, and the fundamental right to privacy in a connected world. It prompts us to ask: how much are we willing to share, and at what cost?