The research, presented at the European Symposium on Research in Computer Security 2025, combined technical, legal, and design-based approaches to understand how digital health tools process personal information. The team examined twenty health and fitness applications commonly used in Germany. These apps, which track activities such as exercise, medication intake, and menstrual cycles, often deal with highly personal medical data.
To measure how the apps actually behave, the researchers combined static and dynamic analysis of data flows with a review of privacy policies and consent screens. Their investigation revealed that many applications sent data such as advertising identifiers to external servers even before consent was granted. All tested apps shared information with destinations outside the European Union, mainly in the United States. Some data was also routed to servers in Ireland, China, Sweden, Singapore, and Australia, reflecting how dispersed health data transmission has become.
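To give a sense of what a dynamic check of this kind can look like, the sketch below flags intercepted requests that carry an advertising identifier before the user's consent timestamp. It is a minimal illustration under stated assumptions, not the study's actual tooling: the JSON capture format, the field names, and the AD_ID_KEYS list are all hypothetical choices made for the example.

```python
from datetime import datetime
import json

# Keys that commonly carry an advertising identifier in request payloads.
# This list is an assumption for the example, not the study's heuristics.
AD_ID_KEYS = {"advertising_id", "ad_id", "gaid", "idfa"}

def find_preconsent_leaks(capture_path: str, consent_time: datetime) -> list[dict]:
    """Return captured requests that sent an ad identifier before consent.

    Assumes a JSON export of intercepted traffic where each entry has
    'timestamp' (ISO 8601), 'host', and a 'params' dict of transmitted fields.
    """
    with open(capture_path, encoding="utf-8") as f:
        requests = json.load(f)

    leaks = []
    for req in requests:
        sent_at = datetime.fromisoformat(req["timestamp"])
        ad_fields = AD_ID_KEYS & set(req.get("params", {}))
        if sent_at < consent_time and ad_fields:
            leaks.append({
                "host": req["host"],
                "time": req["timestamp"],
                "fields": sorted(ad_fields),
            })
    return leaks

if __name__ == "__main__":
    # Hypothetical usage: consent was given at 12:00, so anything earlier is a leak.
    for leak in find_preconsent_leaks("capture.json", datetime(2025, 3, 1, 12, 0)):
        print(f"{leak['time']}  {leak['host']}  sent {', '.join(leak['fields'])} before consent")
```

A fuller pipeline would also resolve each destination host's location to spot transfers outside the European Union, as the study did; that step is omitted here for brevity.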
The study also exposed manipulative design practices, often called dark patterns, that can pressure users into accepting terms quickly. Every app in the sample contained at least one form of deceptive interface design that guided users toward broad consent. In many cases, these patterns made it difficult for people to understand what they were agreeing to or to locate options for limited data use.
Language barriers added another complication. Among the apps with German interfaces, more than half provided privacy information only in English. Even those that did include German versions often used vague wording, referring to recipients merely as “partners” or “service providers” rather than naming specific companies.
The Bremen team concluded that formal compliance with European data protection law does not always translate into meaningful user understanding. Many apps technically satisfy the General Data Protection Regulation’s legal conditions, yet they fail to communicate clearly how data moves or who receives it. The research highlights the need for practical transparency standards that go beyond legal text and ensure people can genuinely comprehend how their health data is handled.
Looking ahead, the researchers plan to develop automated methods for identifying dark patterns and mapping real-time data flows. Their goal is to help regulators and developers detect risks early and design apps that meet privacy requirements without manipulating users into unwanted consent.
The study reinforces a growing concern in digital health technology: user trust depends not only on compliance, but on whether privacy protections can be clearly seen and understood.

Notes: This post was edited/created using GenAI tools. Image: Sophia Stark/Unsplash

