Abstract
Background Child interaction (including via parent proxy) with mobile apps is common, generating concern about children’s privacy and vulnerability to advertising and other commercial interests. Researchers have conducted numerous app content evaluations, but less attention has been paid to data sharing and commercial practices.
Objective This scoping review of commercial app evaluation studies describes the nature of such evaluations, including assessments of data privacy, data security and app-based advertising.
Methods We searched Scopus, PubMed, Embase and ACM Digital Library (2005–2020). We included studies that evaluated the properties of apps available through commercial app stores and targeted children, parents of a child (0–18 years) or expectant parents. Data extracted and synthesised were study and app user characteristics, and app privacy, data sharing, security, advertisement and in-app purchase elements.
Results We included 34 studies; less than half (n=15; 44.1%) evaluated data privacy and security elements and half (n=17; 50.0%) assessed app commercial features. Common issues included frequent data sharing and lax security measures, such as permission requests and third-party data transmissions. In-app purchase options and advertisements were common and involved manipulative delivery methods and content that is potentially harmful to child health.
Conclusions Research related to the data handling practices and commercial features of apps that may transmit children’s data is preliminary and has not kept pace with the rapid expansion and evolution of mobile app development. Critical examinations of these app aspects are needed to elucidate risks and inform regulations aimed at protecting children’s privacy and well-being.
- technology
- data collection
- paediatrics
This is an open access article distributed in accordance with the Creative Commons Attribution 4.0 International (CC BY 4.0) licence, which permits others to copy, redistribute, remix, transform and build upon this work for any purpose, provided the original work is properly cited, a link to the licence is given, and indication of whether changes were made. See: https://creativecommons.org/licenses/by/4.0/.
What is already known on this topic?
Mobile app developers encourage users to enter personal information and routinely share collected data with third parties to enhance the user experience or monetise the app.
Apps focused on children may be among the worst in terms of the number of associated third-party data trackers—posing privacy and safety concerns to children.
Child and parent app content analyses are increasingly conducted, but little is known about associated data privacy, data security and app-based advertising assessments.
What this study adds
Comprehensive evaluations of the data privacy and security elements and commercial features of apps that may transmit children’s data are rarely conducted.
When evaluated, child and parent apps show frequent data sharing and lax security measures, including permission requests and third-party data transmissions.
In-app purchase options and advertisements appear common in child and parent apps and involve manipulative delivery methods and content that is potentially harmful to children’s health.
Introduction
Today’s children are growing up in an immersive digital media era where frequent interaction with mobile applications (apps) is the norm. In addition to their own use of technology, children’s data including photographs, videos and personal information are shared via their parents’ online behaviours. Engagement with technology spans childhood, with 49% of parents using parenting apps,1 60% of children less than 3 years having used a mobile device2 and, in the UK, 53% of children aged 7 years and 90% of children aged 11 years reporting mobile phone ownership.3 Unfortunately, children and their parents are generally engaging with apps without a full understanding of the privacy implications of their actions or the commercial interests in monetising their app-based activities.4
Mobile app developers encourage users to enter personal information and routinely share collected data with third parties to enhance the user experience or commercialise the app.5 Adult apps are known to share personal and health information with an array of commercial entities, which are then capable of aggregating data across apps and re-identifying users.6 7 Recognising children’s particular vulnerabilities, regulations designed to protect child privacy include Europe’s General Data Protection Regulation (GDPR) and the United States’ Children’s Online Privacy Protection Act (COPPA). These regulations require operators of online services such as apps to give detailed notice of privacy practices and prohibit the processing of children’s personal information without consent.8 9 Still, evidence suggests that apps containing children’s data are among the worst in terms of the number of associated third-party trackers10—and developers may skirt privacy regulation by claiming their app is targeted at general audiences rather than children.11
This mobile ecosystem and current regulatory situation create serious risks to children. The ubiquitous online presence and purchasing power of young parents and children mean these groups are now at the centre of the e-commerce market. This is highly problematic as serious child privacy and safety issues may arise if information shared with apps is used for data-driven advertising. Furthermore, there is a real danger that data aggregators may create digital dossiers that follow young people into adulthood and impact their future education, employment and health insurance acquisition opportunities.12
In parallel with these data handling issues, research attention has increasingly turned to app stores and the content and quality of commercially available apps. Given the availability of such evaluations and that these apps may transmit child data to a host of third parties, the objective of this review was to understand the scope of such evaluations, including whether and how researchers are assessing data privacy, data security and app-based advertising and what results they are finding in these areas.
Methods
Design and reporting
We conducted a scoping review according to the framework developed by Levac et al13 using an internal protocol that was based on a previous, similar review by a member of our group.14 Review reporting is in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) extension for scoping reviews (PRISMA-ScR) checklist.15
Eligibility criteria
We included studies that evaluated apps available in commercial stores which collected data directly related to children; thus users would be children, a parent of a child (0–18 years) or an expectant parent. We excluded commentaries, topical or systematic literature reviews, protocols, book chapters and conference abstracts. No language restrictions were applied. The search was limited to studies published from 2005 onward, the period during which mobile apps have been publicly available.16
Information sources and evidence screening
On 18 November 2020, we conducted searches in the Scopus (Elsevier), PubMed, Embase (Ovid) and ACM Digital Library databases. Our search strategy was developed in consultation with a research librarian (online supplemental appendix 1) and piloted to validate applicability. We supplemented the search with searches of our own databases of mobile app literature. Using Covidence software, duplicates were removed and three authors independently screened titles and abstracts, and then full texts, in duplicate according to the eligibility criteria. Eligibility disagreements were resolved through discussion with a third reviewer.
Data charting
We developed, piloted and refined a data charting table with reference to those used in our previous research in this topic area14 17 and we charted data into this table. The data items charted are shown in online supplemental appendix 2.
Synthesis of results
Data abstraction fields were grouped according to key data features to enable synthesis. Quantitative data were summarised using descriptive statistics. Where appropriate, qualitative data items were categorised descriptively, and frequencies calculated. Charting and categorisation were conducted by one author and checked by a second author.
Results
Study selection
We identified 15 762 records across all databases (figure 1). After the removal of duplicates and screening of titles and abstracts, we assessed 140 full-text articles for inclusion. Following full-text screening, 34 articles were included in this review.
Study and general app characteristics
The number of published studies meeting our inclusion criteria has increased over time (figure 2). Study details are shown in table 1. Studies were conducted in the USA (n=18; 52.9%), Australia (n=9; 26.5%), Canada (n=2; 5.9%), Iran (n=1; 2.9%), India (n=1; 2.9%) and the UK (n=1; 2.9%). Two studies (5.9%) were conducted across multiple countries. Most commonly, study designs were reported as systematic reviews or evaluations (n=13; 38.2%), descriptive or content analyses (n=10; 29.4%), or reviews (n=5; 14.7%). Stated designs represented the authors’ own labelling, and we did not find meaningful correspondences between reported study designs and the methods used. Study funding was from government agencies (n=10; 29.4%), universities (n=3; 8.8%), not-for-profit organisations (n=1; 2.9%), for-profit organisations (n=1; 2.9%) or a combination of these sources (n=5; 14.7%). Nine studies (26.5%) did not identify the funding source and five (14.7%) received no funding.
The median app sample size across studies was 46 (range 4–67 778). Parents were the intended app users in 16 studies (47.1%), children in 12 studies (35.3%), and parents or children in 6 studies (17.6%). Apps were most commonly available through both iTunes (Apple) and Google Play stores (n=19; 55.9%)—followed by Google Play (n=6; 17.6%) or iTunes alone (n=4; 11.8%). To sample apps, authors most commonly used keyword searches in app stores (n=23; 67.6%), store-reported ranking lists (n=5; 14.7%) or software to support searching of app store contents (n=4; 11.8%).
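For illustration only, the sketch below shows one way keyword-based app sampling can be automated using Apple’s public iTunes Search API; the endpoint and response fields follow Apple’s public documentation, while the search terms and pooling logic are hypothetical and do not reproduce the protocol of any included study.

```python
# Illustrative sketch: keyword-based app sampling via Apple's public
# iTunes Search API. Endpoint and response fields are as publicly documented;
# the search terms and pooling logic are hypothetical examples, not the
# procedure of any included study.
import json
import urllib.parse
import urllib.request

SEARCH_URL = "https://itunes.apple.com/search"


def search_apps(term, country="us", limit=50):
    """Return raw app records matching a keyword search in the App Store."""
    query = urllib.parse.urlencode(
        {"term": term, "entity": "software", "country": country, "limit": limit}
    )
    with urllib.request.urlopen(f"{SEARCH_URL}?{query}") as resp:
        payload = json.load(resp)
    return payload.get("results", [])


def sample_apps(terms):
    """Pool results across search terms, deduplicating by app identifier."""
    unique_apps = {}
    for term in terms:
        for app in search_apps(term):
            unique_apps.setdefault(app["trackId"], app["trackName"])
    return unique_apps


if __name__ == "__main__":
    # Hypothetical search terms; a real protocol would predefine and report these.
    apps = sample_apps(["parenting", "baby tracker", "kids games"])
    print(f"Sampled {len(apps)} unique apps")
```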
App data privacy and security-related findings
Less than half (n=15; 44.1%) of the studies evaluated any data privacy or security features. Only two studies (5.9%) evaluated apps’ third-party data sharing practices.18 19 Both automated app execution using simulated data inputs and determined the number and destination domains of data transmissions. Results showed that 67%19 and 73%18 of apps transmitted children’s personal data to third parties, including those providing advertising-related services. Transmitted data included email addresses, information enabling user geolocation and advertising IDs that can be used to create behaviour profiles for advertising. Third-party transmission counts were not associated with child sex, parent age or marital status, or family income-to-needs ratio. However, transmissions were twofold to threefold higher for children whose parents did not have advanced degrees.19
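The transmission-tallying step of such dynamic analyses can be sketched as follows, assuming a simple CSV log of captured request hosts per app and a hypothetical list of advertising or analytics domains; the cited studies’ instrumentation and domain classification were more sophisticated and are not reproduced here.

```python
# Minimal sketch of the tallying step that follows dynamic analysis: given a
# log of network requests captured while apps ran with simulated inputs, count
# the distinct third-party domains each app contacted. The CSV format and the
# tracker-domain list are assumptions for illustration; the cited studies used
# their own instrumentation and domain classification.
import csv
from collections import defaultdict

# Hypothetical examples of advertising/analytics domains; real analyses rely
# on curated tracker lists.
THIRD_PARTY_DOMAINS = {"doubleclick.net", "facebook.com", "crashlytics.com"}


def third_party_counts(traffic_csv):
    """Expects rows of (app_id, destination_host), one row per captured request."""
    destinations = defaultdict(set)
    with open(traffic_csv, newline="") as f:
        for app_id, host in csv.reader(f):
            # Simplified suffix match against known third-party domains.
            if any(host == d or host.endswith("." + d) for d in THIRD_PARTY_DOMAINS):
                destinations[app_id].add(host)
    return {app_id: len(hosts) for app_id, hosts in destinations.items()}


if __name__ == "__main__":
    counts = third_party_counts("captured_traffic.csv")  # hypothetical file
    sharing_apps = [app for app, n in counts.items() if n > 0]
    print(f"{len(sharing_apps)} apps transmitted data to third-party domains")
```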
Table 2 shows other app privacy-related and security-related evaluation data from studies. Eight studies (23.5%) reported on apps’ capacity to share information via social media. These studies did not explicitly evaluate whether the nature of such sharing was active (ie, user-initiated data sharing for purposes including seeking peer support) or passive (ie, data transmission to social media networks unbeknownst to the app user). The potential to share data to social media platforms occurred in 14%–63% of apps (median 28%).
Additionally, three studies (8.8%) documented the presence of privacy policies, and one study each (2.9%) evaluated privacy policy content and readability. These studies showed that 5%–100% of apps (median 63%) had an associated privacy policy. Policy readability was poor20 and policies often failed to comply with international or federal regulations.21 Two studies (5.9%) documented actual or potential permission requests,22 23 showing that permission requests occurred in up to 100% of apps and may violate jurisdictional privacy regulations, for example through location data tracking.
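As an illustration of how policy readability can be quantified, the sketch below applies the standard Flesch-Kincaid grade-level formula with a crude syllable heuristic; the specific readability measure used by the cited study is not reported in this review, so this metric choice is an assumption.

```python
# Illustrative readability estimate for a privacy policy using the standard
# Flesch-Kincaid grade-level formula. The syllable counter is a crude heuristic
# and the metric choice is an assumption; the review does not state which
# readability measure the cited study applied.
import re


def count_syllables(word):
    """Rough heuristic: count vowel groups, dropping a trailing silent 'e'."""
    word = word.lower()
    groups = re.findall(r"[aeiouy]+", word)
    count = len(groups)
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)


def flesch_kincaid_grade(text):
    """Grade level = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(len(words), 1)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59


if __name__ == "__main__":
    sample_policy = (
        "We collect personal information, including device identifiers, and may "
        "share it with trusted third-party partners for analytics and advertising."
    )
    print(f"Estimated reading grade level: {flesch_kincaid_grade(sample_policy):.1f}")
```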
App data security features were evaluated in 10 studies (29.4%) and included the presence of a login or password protection element (n=7; 20.6%), a login/password and cloud storage option (n=1; 2.9%), data encryption (n=1; 2.9%) and the application of an investigator-developed security assessment scale (n=1; 2.9%). Security-related results showed login or password protection in 0%–100% of apps (median 31%), high proportions of apps not protecting data transmissions using standard methods18 and few apps with high security assessment scale ratings.24
App commercial feature-related findings
Commercial features were assessed in 17 studies (50.0%) (table 3) and included the proportion of apps with in-app purchase options (n=15; 44.1%), the proportion of apps with in-app advertisements (n=10; 29.4%) and the type of advertisements (n=3; 8.8%). In-app purchases and advertisements were present in 0%–46% (median 25%) and 9%–95% (median 51%) of apps, respectively. To evaluate advertisement content, all studies conducted manual content analysis using predefined, investigator-developed advertising coding schemes.23 25 26 Content analysis conducted by Meyer et al23 showed advertisements were presented using traditional methods (eg, product videos as shown on television) but also in insidious ways that might prompt further advertising consumption (eg, embedding advertising videos within gamified app features). In the two studies that assessed the relationship between advertisements and health outcomes, advertisements promoted formula-feeding for premature babies, toddlers or older children,25 26 which may contravene the WHO International Code of Marketing of Breast-milk Substitutes27 due to potentially harmful impacts on health.
Discussion
Evaluations of the content and quality of commercially available apps that may transmit child data have proliferated steadily over time. Rigorous, independent evaluations of the data sharing practices and commercial features of these apps remain rare. However, there is rapid methodological development in the field and strategies to evaluate these practices are being increasingly developed and used by interdisciplinary research groups.10 18 19
Study and app characteristics
Reviews of apps that collect, and potentially share, children’s data are conducted most often by investigators in high-income, predominantly English-speaking countries, and commonly include only apps available in English. Most studies focused on understanding the content of apps designed for specific health or education purposes and few examined game-based and other types of apps children commonly engage with. Surrogate measures are largely used to evaluate the privacy and security features of apps as only a handful of studies have examined app data sharing and security practices directly. Still, our data show that, when data privacy and security evaluations are conducted, issues with frequent data sharing or lax security measures are uncovered.
Data sharing practices
Most researchers included only proxy measures of actual data sharing practices, such as permission requests or the presence of a privacy policy. In the few studies that measured actual data sharing, children’s identifying data were provided to third parties.19 This is problematic as aggregation of these data can support the characterisation of parent and child users according to their app interaction patterns or demographics, and these characterisations may be commercially exploited to encourage impulse purchasing or suggest unhealthy products in ways that exacerbate health inequities.19 28
Data sharing policies
Privacy policies in child and parenting apps are variable in terms of both presence and readability. Thus, the data tracking and commercialisation practices of apps, and their associated risks, are generally unknown to children and adults alike,29–31 challenging the value of the dominant ‘notice and consent’ privacy framework of the information age. Digital literacy skills-building may mitigate some risks to users and, in the case of children, such programmes have been developed.29 However, lower socioeconomic status, as well as age and gender, may be associated with lower digital literacy,31 suggesting that equitable access to literacy training remains elusive. In addition, even when privacy policies are present, they often do not reflect actual app data sharing behaviours.32 33
In-app purchasing and advertisements
Half of our included studies evaluated apps’ commercial features, with results showing several areas of potential concern. In-app purchase options and advertisements are common, manipulative methods are used to deliver advertisements, and advertising information is potentially harmful to health.26 34 These issues pose a problem as research shows both parents and children may not always be able to distinguish app content from advertising.23 35 The content of advertisements within children’s apps is also often not age-appropriate, with advertisement content exceeding developer-stated app maturity levels.36 Finally, furthering digital disparities, free apps, which parents and children of lower socioeconomic status may be more likely to use, more frequently contain these in-app purchase options and advertisements.23
Implications
Our results have important implications for regulatory bodies, app developers and parents. Although regulations such as the GDPR and COPPA have been enacted to protect children’s online privacy, our results point to the limits of these efforts. For instance, COPPA is reported as underenforced in the USA11 and, as such, non-compliance with the regulation appears widespread.18 19 These privacy regulations also rely on the idea that an informed consumer can select apps with adequate privacy protections in place.37 However, we show that privacy policies are not always present in children’s apps and, when present, vary greatly in terms of readability. As such, the onus of responsibility for personal data protection is placed on those who may not be adequately equipped for privacy decision making by default (ie, the child or parent). Alongside more stringent regulatory enforcement, app developers, who may not be consistently aware of the destinations of data transmitted from their apps,18 can reduce personal identifier collection in the spirit of data minimisation19 and systematically evaluate app privacy behaviours before release.18 Ahead of these needed regulatory and industry shifts, parents and older children may install apps from trusted developers,19 disable advertisement identifiers, adjust app permissions and use advertisement blockers to reduce the likelihood of privacy breaches.38
Limitations
First, although our search strategy was sensitive, it was imprecise and identified many irrelevant studies. We used duplicate screening and team discussions to resolve discrepancies and systematically exclude such studies. Second, even though we developed a broad search strategy, the cross-disciplinary nature of our research question means we may not have located all studies accessible in disparate, discipline-focused databases. Third, consistent with scoping review methodology, we did not conduct a methodological quality assessment and instead included all identified studies.
Conclusion
Research related to the data handling behaviours and commercial aspects of apps that may transmit children’s data is emerging but has not kept pace with the rapid expansion and evolution of the mobile ecosystem. The lack of evaluations may be related to the technical difficulty of conducting them, an issue that may be solved by collaborative research efforts spanning the disciplines of computer science, child health and commercial regulatory policy. These collaborations may be fruitful in rooting out and acting on risks to children’s privacy and well-being within mobile ecosystems.19 Studies are needed to understand the intersection between transmitted data and advertisements within apps and how this commercial exposure affects children’s health and well-being. Ultimately, enforced and stricter regulation may be key to protecting children’s online privacy and dampening any impacts of data sharing.
Ethics statements
Patient consent for publication
Ethics approval
Not applicable.
Acknowledgments
We acknowledge the support of the Government of Canada’s New Frontiers in Research Fund (NFRF) (NFRFE-2019-00806).
References
Supplementary materials
Supplementary Data
Footnotes
Twitter @lindsayjibb
Correction notice This article has been corrected since it first published. The open access licence type has been changed to CC BY. 17th May 2023.
Contributors QG conceptualised and designed the study, designed the data collection instruments, and reviewed and revised the manuscript. LJ designed the study, designed the data collection instruments, collected and synthesised the data, drafted the initial manuscript, and revised the manuscript. LR designed the search strategy, conducted the search, and reviewed and revised the manuscript. EA and MH designed the data collection instruments and collected and synthesised the data, and reviewed and revised the manuscript. All authors approved the final manuscript as submitted and agree to be accountable for all aspects of the work. LJ acts as guarantor and accepts full responsibility for the finished work and the conduct of the study, had access to the data, and controlled the decision to publish.
Funding This study was funded by the Natural Sciences and Engineering Research Council of Canada, New Frontiers in Research Fund (NFRFE-2019-00806), Canadian Institutes of Health Research and Social Sciences and Humanities Research Council of Canada.
Disclaimer The Government of Canada had no role in the design or conduct of this study or decision to publish.
Competing interests None declared.
Provenance and peer review Not commissioned; externally peer reviewed.
Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.