Poor information privacy practices have been identified in health apps. Medical app accreditation programs offer a mechanism for assuring the quality of apps; however, little is known about their ability to control information privacy risks. We aimed to assess the extent to which already-certified apps complied with data protection principles mandated by the largest national accreditation program.

Cross-sectional, systematic, 6-month assessment of 79 apps certified as clinically safe and trustworthy by the UK NHS Health Apps Library. Protocol-based testing was used to characterize personal information collection, local-device storage and information transmission. Observed information handling practices were compared against privacy policy commitments.

Systematic gaps in compliance with data protection principles in accredited health apps question whether certification programs relying substantially on developer disclosures can provide a trusted resource for patients and clinicians. Accreditation programs should, as a minimum, provide consistent and reliable warnings about possible threats and, ideally, require publishers to rectify vulnerabilities before apps are released.

The purpose of the current study was to assess the extent to which accredited apps adhered to these principles. We reviewed all apps available from the NHS Health Apps Library at a particular point in time, and assessed compliance with recommended practice for information collection, transmission and mobile-device storage; confidentiality arrangements in apps and developer-provided online services; the availability and content of privacy policies; and the agreement between policies and observed behaviour.

Assessment involved a combination of manual testing and policy review. Testing was used to characterize app features, explore data collection and transmission behaviour, and identify adherence to data protection principles concerning information security. Policy review identified the extent to which app developers addressed data protection principles concerning disclosure of data uses and user rights. In a final step, policy commitments were compared and contrasted with behaviours actually observed in apps. These processes are described further below.

Apps were subject to a 6-month period of evaluation, from August 2013 to January 2014. Testing incorporated two strategies. To ensure coverage of features relating to information collection and transmission, sequential exploration of all user interface elements was performed for each app. After this, apps were subject to an extended period of testing which included periods of both routine daily use and less frequent, intermittent interaction. The aim of this extended process was to uncover app behaviours that might occur only infrequently but were relevant from a privacy point of view, for example time-delayed solicitation of feedback or transmission of aggregated analytics data.

Prior to the start of the evaluation, we conducted pilot testing using a range of system and user apps not included in the study to ensure that all data would be captured. We anticipated that some test apps might implement certificate pinning [29], a technical security measure designed to prevent man-in-the-middle attacks on encrypted communications. However, in practice, this was only observed for certain communications generated by the mobile operating system and did not affect interception of traffic generated by test apps.
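The pinning check mentioned above can be illustrated with a minimal sketch. This is not the mechanism used by any app in the study; it simply shows why an interception proxy's substitute certificate fails a pin check even when that certificate chains to a trusted authority. The function name and data are illustrative assumptions.

```python
import hashlib

def pin_matches(cert_der: bytes, pinned_sha256: set) -> bool:
    """Return True if the presented certificate's SHA-256 fingerprint
    is in the app's pinned set. A man-in-the-middle proxy presents a
    different certificate, so its fingerprint will not match the pin
    even if the certificate chains to a CA the device trusts."""
    fingerprint = hashlib.sha256(cert_der).hexdigest()
    return fingerprint in pinned_sha256

# Hypothetical example: the app ships with the genuine server's fingerprint.
server_cert = b"...DER-encoded certificate of the genuine server..."
pins = {hashlib.sha256(server_cert).hexdigest()}

assert pin_matches(server_cert, pins)                 # genuine server accepted
assert not pin_matches(b"proxy certificate", pins)    # interception proxy rejected
```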

Personal information sent by apps was categorized in a two-part process, using the same coding schema used to analyze data collection (Additional file 1: Table AF1). In the first step, an automated process was used to classify data according to destination and the mechanisms used to secure the content, if at all. Known instances of particular data types were also identified automatically by searching for user details generated during testing such as app-specific simulated email addresses. No data were discarded during automatic coding. In the second step, the content of captured traffic was displayed in a custom software tool for manual review (see Additional file 1: Figure AF3). Although all traffic was inspected, multiple transmissions with identical content (excluding timestamps) were automatically combined for review. The review process allowed study reviewers to check automatic tagging and manually code any personal information not already identified. Coding was performed by two researchers, working independently, and reconciled through discussion.
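The two-part automated step described above (classify by destination and transport security, tag known simulated identifiers, then collapse transmissions that differ only in timestamps) can be sketched as follows. All field names, hosts, and the timestamp pattern are illustrative assumptions, not the study's actual tooling.

```python
import hashlib
import re

# Hypothetical captured-traffic records; field names and hosts are assumptions.
captures = [
    {"host": "api.example-dev.com", "scheme": "https",
     "body": "email=test.user.app42@example.org&ts=1407155612"},
    {"host": "api.example-dev.com", "scheme": "https",
     "body": "email=test.user.app42@example.org&ts=1407155977"},
    {"host": "tracker.example-ads.net", "scheme": "http",
     "body": "event=open&ts=1407156001"},
]

# App-specific simulated details seeded during testing, searched for verbatim.
SIMULATED_IDS = {"test.user.app42@example.org"}
TS_PATTERN = re.compile(r"ts=\d+")  # strip timestamps before comparing payloads

def code_capture(c):
    """Classify one transmission by destination and transport security,
    and tag any known simulated identifiers found in the content."""
    return {"destination": c["host"],
            "secured": c["scheme"] == "https",
            "known_ids": [i for i in SIMULATED_IDS if i in c["body"]]}

def dedup_key(c):
    """Key transmissions by destination plus timestamp-stripped content,
    so repeats that differ only in timestamps collapse for manual review."""
    canonical = TS_PATTERN.sub("ts=", c["body"])
    return hashlib.sha256((c["host"] + canonical).encode()).hexdigest()

coded = {dedup_key(c): code_capture(c) for c in captures}
# The two identical transmissions collapse into one item; len(coded) == 2.
```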

Coding decisions, as well as any relevant policy text annotations, were captured using custom software (see Additional file 1: Figure AF5). All decisions were reviewed to reach a consensus agreement on policy coverage. The nature of information actually collected and transmitted by apps was then compared to specific commitments made in privacy policies. We also recorded the operating system permissions requested by each app at installation or during subsequent use, for example access to user contacts or geolocation service, as well as configuration options offered by each app to control the transmission of data to developer and third-party services.
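The policy-versus-behaviour comparison reduces to a set difference between the information categories observed in traffic and those affirmed in the policy. A minimal sketch, with hypothetical category names:

```python
# Hypothetical coded outputs for a single app; category names are assumptions.
observed = {"email", "device_id", "search_terms"}   # seen in captured traffic
disclosed = {"email", "device_id"}                  # affirmed in the privacy policy

undisclosed = observed - disclosed  # collected but never mentioned in the policy
unexercised = disclosed - observed  # disclosed handling not triggered in testing

# Here the app transmitted search terms without disclosing that use.
```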

Data were compiled into a single dataset for analysis (supplied as Additional file 2). We used simple descriptive statistics to summarize aspects of data collection, mobile-device storage and transmission. Unless otherwise stated, the unit of analysis is the platform-independent app. As expected, apps available on both iOS and Android substantially shared privacy-related characteristics; to avoid double counting, we therefore combined platform variants for analysis. Because of the potential risk to current users, we have chosen not to identify specific apps with confidentiality vulnerabilities. However, in November 2014, the NHS Health Apps Library was provided with details of the vulnerabilities we identified.
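Collapsing platform variants to the platform-independent app before computing counts can be sketched as below. The records and field names are illustrative assumptions, not study data.

```python
from collections import Counter

# Hypothetical per-release records; app names and fields are illustrative only.
releases = [
    {"app": "MoodTracker", "platform": "android", "has_policy": True},
    {"app": "MoodTracker", "platform": "ios",     "has_policy": True},
    {"app": "SymptomLog",  "platform": "android", "has_policy": False},
]

# Unit of analysis is the platform-independent app: keep one record per name.
apps = {}
for r in releases:
    apps.setdefault(r["app"], r["has_policy"])

# Descriptive summary over apps, not releases, to avoid double counting.
summary = Counter("policy" if v else "no_policy" for v in apps.values())
```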

Apps available through the NHS Health Apps Library exhibited substantial variation in compliance with data protection principles, demonstrated both by the availability and content of privacy policies, and by adherence to recommended practices for confidentiality enforcement. Over half included functions in which personal details, health-related information, or both, were transferred to online services, but a fifth of such apps, and two-thirds of apps overall, did not have a privacy policy. In this respect, health apps, whether accredited or not [10], appear to be little better than non-medical apps available through general app stores [39], despite greater potential sensitivities surrounding health-related information. While most, but not all, privacy policies explained how information would be used, coverage of other aspects that would enable a user to make an informed choice about which information to disclose was less consistent. For example, a sixth of apps sent information to advertisers and third-party analytics services but did not mention secondary uses of information in a policy. While there was no evidence of malicious intent, a fifth of apps shared limited information, including in some cases details of medical topics that users had viewed or searched for, with advertising and marketing companies. Procedures enabling user rights afforded by data protection law, such as the ability to view and amend personal data, were inconsistently documented in privacy policies. The observed variation prompts questions about the coverage, and consistency, achieved by the certification process. For example, it was not clear why differences in the likelihood of having a privacy policy by payment model or platform should exist in apps available through a common accreditation framework.

Two cloud-based apps had critical privacy vulnerabilities: weaknesses of design that could be intentionally exploited to obtain user information. As long as these vulnerabilities persist, the privacy of users of these services is in jeopardy. As recent data thefts from high-profile online services have shown, the risk is not simply theoretical [40, 41]. Many apps took inadequate steps to secure personal information, whether stored locally on devices or being transmitted to online services. Most concerning was the finding that some apps sent personal information without the use of encryption. Mobile communications may be particularly at risk of interception because, unlike fixed computers, information is sent using public networks over whose confidentiality enforcement arrangements users have little control. A small number of apps transmitted both unsecured personal and health information, for example research data pairing device and personal identifiers with details of substance use. However, the bigger potential risk to privacy is probably identity-related. Half of the apps transmitting user account details sent usernames and passwords unencrypted. Armed with such information, a malicious user might be able to access other resources, for example email or online bank accounts. We found examples of complete personal datasets, including name, date of birth and contact details, sent as plain text. No apps encrypted local data stores, despite the widespread use of PIN or password security within apps that might reasonably lead a user to believe their information was protected. While recent changes proposed by operating system manufacturers aim to ensure that information stored on devices is encrypted by default, responsibility for ensuring confidentiality during transmission will remain with developers.
A failure to implement appropriate technical safeguards for personal information not only implies a failure of accreditation; it may also represent a violation of data protection law in the UK [23].

By assessing all apps available through an accredited medical app store we were able to sample a wide range of app types including those from health providers and commercial organizations. The frequencies of identified issues reflect the specific population of apps available at the time of assessment. Interpretation should take account of the possibility that new and updated apps will exhibit different privacy-related characteristics. This does not affect the value of specific issues that need to be addressed, nor broader patterns existing within the data, for example inconsistencies in approaches to securing information. However, there is an ongoing requirement to ensure that new issues are identified and prioritized appropriately. The design of the study allowed us to examine local app behaviour and the content of transmissions originating from, and targeted towards, our test devices. However, we did not have access to information once received by either developer or third-party services, nor were we able to observe how data were handled at an organizational level by those services. Either or both of these may be sources of additional privacy risks not directly quantifiable by this study. These may arise as a result of technical and organizational challenges in ensuring the appropriate storage, handling and transfer of information held in online storage [55]. Our approach, instead, relied on the degree to which those practices were affirmed in a suitable privacy policy, which may be an imperfect proxy for actual behaviour. Recent work has illustrated the scope for threats arising from online storage of health information and identified privacy-preserving strategies that could inform future studies that assess compliance more directly [56].
