Data Science for Context-based User Privacy


Since the 2008 launch of the App Store, third-party mobile applications have become constitutive of the smartphone user’s experience, creating significant market value for application developers.  Within this market space, Google Inc. and Apple Inc. have emerged as the dominant players in the mobile ecosystem with the mainstream deployment of their mobile operating systems – Android and iOS, respectively.  After the initial release of Apple’s iPhone, Google reconfigured its Android OS, built atop an open-source Linux kernel, as an open-source platform free for any mobile-device maker to use, which allowed it to become highly popular with third-party phone makers.  iOS, meanwhile, is a closed platform that exists only on Apple hardware.  Both mobile operating systems use a permission-based access control model to protect sensitive hardware resources from unauthorized use [8].

Background

Over the past decade, Android has rapidly expanded its market share to become the most popular mobile platform globally [8].  Together, Android’s open-source operating system and its relatively unrestricted application market, the Google Play Store, support its popularity as a platform for third-party applications [2].  As of March 2018, available Android applications outnumbered iOS applications by a ratio of roughly 1.6:1 (statista.com).  While applications bring users value in productivity and entertainment, unauthorized software (i.e., malware) remains a pervasive threat to mobile security.  Further, the permission systems that regulate application access to sensitive resources are not necessarily well-aligned with users’ privacy expectations [7].

Although built atop a Linux kernel, the Android operating system isolates its running processes by assigning each application a unique user identifier (UID).  Under this sandboxing model, each application can access only its own files by default, and each application runs in its own virtual machine [2].  While Android’s API provides applications with access to the device’s hardware, Wi-Fi and cellular networks, user data, and phone settings, its security model puts users in control of their devices through permission granting.  Likewise, Apple’s iOS sandboxes its applications and gives users control over an application’s access to system resources.  Unlike Android, however, Apple’s proprietary mobile OS is closed-source and sold only with Apple’s hardware.

By exercising tight control over both its hardware and its software, Apple maintains the ability to enforce strong security and protect its users’ data.  Both Google and Apple require that applications be validated before being made available in their official stores.  The ubiquity of the Android OS across a variety of devices, however, makes it particularly susceptible to attack.  In particular, the abundance of third-party app stores contributes to a vast bloom of Android-compatible applications that are neither controlled nor validated by Google.  Although malware does infiltrate Apple’s App Store from time to time, both attackers and researchers have focused principally on Android’s vulnerabilities.

Mobile Permissions

Historically, smartphone users have controlled both their privacy and the degree to which external entities may access sensitive resources through ask-on-install (AOI) permissions.  During installation, the user is informed of the permissions an application will receive; if the user does not want to grant a permission to an app, they can cancel the installation [2].  The newer ask-on-first-use (AOFU) permission model improves upon the AOI system by offering users a chance to deny a permission while still allowing them to use the application.  Once granted, however, both the AOI and AOFU models apply the user’s one-time decision to all subsequent cases of the same app accessing the same resource [7], as the sketch below illustrates.  While these permission systems give users theoretical control over their privacy, the problem of overprivileged applications exposes users to unnecessary warnings and increases the impact of potential vulnerabilities [2].
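As a rough illustration of the AOFU behavior described above, the following Python sketch (with hypothetical app names, permissions, and prompt callback) models a policy in which the user is prompted only the first time a given application requests a given resource, after which that decision is silently reused for every subsequent request.

```python
# Minimal sketch of an ask-on-first-use (AOFU) permission policy.
# App names, permissions, and the prompt callback are hypothetical,
# used only to show how a first-use decision is cached and reused.

class AOFUPolicy:
    def __init__(self, prompt_user):
        self.prompt_user = prompt_user   # callback: (app, permission) -> bool
        self.decisions = {}              # (app, permission) -> bool

    def check(self, app, permission):
        key = (app, permission)
        if key not in self.decisions:
            # First request for this (app, permission) pair: ask the user once.
            self.decisions[key] = self.prompt_user(app, permission)
        # Every later request silently reuses the first-use decision.
        return self.decisions[key]


if __name__ == "__main__":
    policy = AOFUPolicy(prompt_user=lambda app, perm: perm != "ACCESS_FINE_LOCATION")
    print(policy.check("com.example.weather", "ACCESS_FINE_LOCATION"))  # prompted -> False
    print(policy.check("com.example.weather", "ACCESS_FINE_LOCATION"))  # cached   -> False
    print(policy.check("com.example.weather", "READ_CONTACTS"))         # prompted -> True
```

The key limitation is visible in the cache: the policy has no notion of the context in which the second or third request occurs, so a decision made in one situation governs all future situations.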

Current research into the problems of Android malware and user-centric permission systems is increasingly turning to machine learning both to detect and to prevent the compromise of user data.  In Android Mobile Security by Detecting and Classification of Malware Based on Permissions Using Machine Learning Algorithms, researchers study the performance of various machine learning algorithms for detecting Android malware based on the permissions an application requests from the user [5].  More recently, several research teams have considered Android security from the perspective of the operating system itself, applying machine learning to contextually infer users’ permission preferences.  By associating each permission request with its application context, context-sensitive permission enforcement can apply fine-grained permission control aligned with the user’s changing preferences, without requiring the user to understand the policies and their implications at install time or on first use.  In the following section, we examine how machine learning is enabling the dynamic regulation of mobile application permissions, and what this means for the user’s experience.
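To make the permission-based detection approach concrete, the sketch below encodes each application as a binary vector over requested permissions and trains a classifier to separate malicious from benign samples.  The permission list, sample apps, and labels here are synthetic illustrations; studies such as [5] train on large corpora of labeled APKs and compare several learning algorithms.

```python
# Illustrative sketch of permission-based malware classification.
# The permission vocabulary, sample apps, and labels are synthetic.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

PERMISSIONS = ["INTERNET", "READ_CONTACTS", "SEND_SMS",
               "ACCESS_FINE_LOCATION", "READ_SMS", "CAMERA"]

def to_vector(requested):
    """Encode an app's requested permissions as a binary feature vector."""
    return [1 if p in requested else 0 for p in PERMISSIONS]

# Synthetic training data: (requested permissions, is_malware)
apps = [
    ({"INTERNET"}, 0),
    ({"INTERNET", "CAMERA"}, 0),
    ({"INTERNET", "ACCESS_FINE_LOCATION"}, 0),
    ({"INTERNET", "READ_CONTACTS"}, 0),
    ({"INTERNET", "SEND_SMS", "READ_SMS"}, 1),
    ({"SEND_SMS", "READ_SMS", "READ_CONTACTS"}, 1),
    ({"INTERNET", "SEND_SMS", "READ_CONTACTS"}, 1),
    ({"SEND_SMS", "ACCESS_FINE_LOCATION", "READ_SMS"}, 1),
]
X = [to_vector(perms) for perms, _ in apps]
y = [label for _, label in apps]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```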

Dynamic Permissions Regulation

Mobile applications abuse user permissions by requesting access to system resources that lie beyond the scope of the functionality the app provides.  In non-malicious cases, the request for unnecessary permissions is often simply developer error.  Developers may request unneeded permissions whose names sound related to their application’s functionality, or redundancies may exist between the operational permissions of an application and those of its deputy [2].  Further, the copy-paste culture of open-source coding communities may propagate over-privileging errors across multiple projects.  In user-centric permission systems, the final onus rests on the user to understand which permissions an application requests, and for what potential purposes.
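One simplified way to see how over-privileging can be detected, in the spirit of the permission-map analysis of [2] but with an entirely hypothetical API-to-permission map, is to compare the permissions an application declares against the permissions actually required by the API calls it makes:

```python
# Sketch of over-privilege detection: compare declared permissions against the
# permissions actually needed by the APIs an app calls. The API-to-permission
# map and the example app are hypothetical; tools in the spirit of [2] build
# such maps by analyzing the Android framework itself.

API_PERMISSION_MAP = {
    "LocationManager.getLastKnownLocation": {"ACCESS_FINE_LOCATION"},
    "SmsManager.sendTextMessage": {"SEND_SMS"},
    "Camera.open": {"CAMERA"},
}

def find_overprivilege(declared, api_calls):
    """Return declared permissions not justified by any observed API call."""
    needed = set()
    for call in api_calls:
        needed |= API_PERMISSION_MAP.get(call, set())
    return declared - needed

declared = {"ACCESS_FINE_LOCATION", "SEND_SMS", "READ_CONTACTS"}
api_calls = ["LocationManager.getLastKnownLocation"]
print(find_overprivilege(declared, api_calls))
# -> {'SEND_SMS', 'READ_CONTACTS'}: requested but never exercised by the app's code
```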

How can a user who lacks technical knowledge of the Android operating system make decisions that better reflect their risk tolerance and privacy preferences?  To answer this question, researchers from the University of British Columbia and the University of California, Berkeley investigated how the mobile operating system might better aid users in avoiding unexpected or unwanted uses of their data [6].  Operating from the assumption that permission models fail to protect user privacy because they fail to account for the context surrounding data flows, the research team proposed a permission model based on the notion of contextual integrity.  Dynamically regulating data access based on context requires a higher degree of user involvement; however, a high number of prompts might also encourage a habituated response that does not reflect a user’s true security preferences.  To balance these competing interests, the team designed a machine-learning classifier to automatically predict how users would respond to prompts based on their existing settings and to act on users’ behalf in alignment with their preferences, without over-burdening them with repeated requests.
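A minimal sketch may help make this concrete.  Below, each permission request is described by contextual features (the requesting app, the permission, and whether the app is visible in the foreground), and a model trained on the user’s past prompt responses predicts allow or deny, falling back to an actual prompt only when its confidence is low.  The feature set, threshold, and model are illustrative assumptions, not the authors’ implementation, which uses richer behavioral signals.

```python
# Illustrative sketch of a context-aware permission classifier.
# Features, threshold, and training data are assumptions for demonstration only.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

# Past prompt responses collected from the user: context -> allow (1) / deny (0)
history = [
    ({"app": "maps", "permission": "ACCESS_FINE_LOCATION", "app_visible": True}, 1),
    ({"app": "maps", "permission": "ACCESS_FINE_LOCATION", "app_visible": False}, 0),
    ({"app": "flashlight", "permission": "READ_CONTACTS", "app_visible": True}, 0),
    ({"app": "messenger", "permission": "READ_CONTACTS", "app_visible": True}, 1),
    ({"app": "messenger", "permission": "ACCESS_FINE_LOCATION", "app_visible": False}, 0),
    ({"app": "maps", "permission": "READ_CONTACTS", "app_visible": False}, 0),
]

vec = DictVectorizer(sparse=False)
X = vec.fit_transform([ctx for ctx, _ in history])
y = [label for _, label in history]
model = LogisticRegression().fit(X, y)

def decide(context, prompt_user, confidence=0.8):
    """Act on the user's behalf when confident; otherwise fall back to a prompt."""
    p_allow = model.predict_proba(vec.transform([context]))[0][1]
    if p_allow >= confidence:
        return True                      # silently allow
    if p_allow <= 1 - confidence:
        return False                     # silently deny
    return prompt_user(context)          # uncertain: ask the user (and retrain later)

# Example: a background location request from a messaging app.
print(decide({"app": "messenger", "permission": "ACCESS_FINE_LOCATION",
              "app_visible": False}, prompt_user=lambda ctx: False))
```

The design choice captured here is the trade-off the section describes: a confident model shields the user from habituating prompts, while low-confidence cases still surface a real decision to the user.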

Related Work

This study of dynamically granted permissions emerges from a growing body of research exploring recommendation systems based on users’ privacy preferences.  For instance, in Mobile App Recommendations with Security and Privacy Awareness (2014), Zhu et al. propose a mobile app recommender system with privacy and security awareness [9].  Other researchers have developed recommendation systems to detect privacy violations.  In ProtectMyPrivacy (2013), Agarwal and Hall present a user-driven, crowdsourced recommendation engine for iOS that not only notifies users when individual apps access sensitive information, but also provides a mechanism to allow such accesses or to deny them by substituting anonymized shadow data in their place [1].  Similarly, in Expectation and Purpose (2012), Lin et al. demonstrate crowdsourcing as a novel method for evaluating a mobile application’s privacy based on users’ mental models of mobile privacy and their expectations of resource usage [3].

In Follow My Recommendations (2016), Liu et al. present a personalized privacy-assistant solution aimed at modifying user behaviors to better align with their own privacy preferences [4].  This study, although closely related to Wijesekera et al., differs in several key ways.  While the clustering algorithms employed by Liu et al. require users to self-report privacy preferences, Wijesekera et al. sample specific user events to passively infer preferences from established behavior.  Wijesekera et al. further reduce sampling bias by extending their dataset beyond Liu et al.’s highly privacy-conscious users to reflect a wider variety of user preferences.  Finally, whereas Liu et al. treat user privacy preferences as static, Wijesekera et al. build upon the assumption that people want to vary their privacy decisions based on contextual circumstances.

Discussion

The unique strength of Wijesekera et al.’s research into the dynamic regulation of permissions is its application of Helen Nissenbaum’s conceptual framework of contextual integrity as a benchmark of privacy.  In a world of rapid technological change, both the miniaturization and the growing distribution of inexpensive sensing and computing technologies present a worrisome condition in which technologies disseminate user information with greater stealth, in greater volume, and often without the permission or awareness of the user.  The framework of contextual integrity responds with the perspective that technologies can be held to standards that respect user privacy preferences.  What is more, by applying machine learning algorithms to the recognition of the user’s external context, Wijesekera et al. position the device itself as an advocate of the user’s preferences, capable of taking the actions the user would most likely want, particularly when the implications of granting a permission cannot always be understood.  In this way, technology is leveraged as a partner that protects the user against the exploits of other technologies.

 

References

[1] Agarwal, Y., & Hall, M. (2013). ProtectMyPrivacy: Detecting and mitigating privacy leaks on iOS devices using crowdsourcing. Proceedings of the 11th Annual International Conference on Mobile Systems, Applications, and Services – MobiSys ’13. doi:10.1145/2462456.2464460

[2] Felt, A. P., Chin, E., Hanna, S., Song, D., & Wagner, D. (2011, October). Android permissions demystified. Proceedings of the 18th ACM Conference on Computer and Communications Security – CCS ’11, 627-637. doi:10.1145/2046707.2046779

[3] Lin, J., Sadeh, N., Amini, S., Lindqvist, J., Hong, J. I., & Zhang, J. (2012). Expectation and purpose: Understanding users’ mental models of mobile app privacy through crowdsourcing. Proceedings of the 2012 ACM Conference on Ubiquitous Computing – UbiComp ’12. doi:10.1145/2370216.2370290

[4] Liu, B., Andersen, M. S., Schaub, F., Almuhimedi, H., Zhang, S., Sadeh, N. M., Agarwal, Y., & Acquisti, A. (2016). Follow my recommendations: A personalized privacy assistant for mobile app permissions. Symposium on Usable Privacy and Security – SOUPS ’16.

[5] Varma, P. R., Raj, K. P., & Raju, K. V. (2017). Android mobile security by detecting and classification of malware based on permissions using machine learning algorithms. 2017 International Conference on I-SMAC (IoT in Social, Mobile, Analytics and Cloud) (I-SMAC). doi:10.1109/i-smac.2017.8058358

[6] Wijesekera, P., Baokar, A., Tsai, L., Reardon, J., Egelman, S., Wagner, D., & Beznosov, K. (2017, May). The Feasibility of Dynamically Granted Permissions: Aligning Mobile Privacy with User Preferences. 2017 IEEE Symposium on Security and Privacy (SP). doi:10.1109/sp.2017.51

[7] Wijesekera, P., Baokar, A., Tsai, L., Reardon, J., Egelman, S., Wagner, D., & Beznosov, K. (2018). Dynamically Regulating Mobile Application Permissions. IEEE Security & Privacy, 16(1), 64-71. doi:10.1109/msp.2018.1331031

[8] Zhang, Y., Yang, M., Gu, G., & Chen, H. (2016, October). Rethinking Permission Enforcement Mechanism on Mobile Systems. IEEE Transactions on Information Forensics and Security, 11(10), 2227-2240. doi:10.1109/tifs.2016.2581304

[9] Zhu, H., Xiong, H., Ge, Y., & Chen, E. (2014). Mobile app recommendations with security and privacy awareness. Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining – KDD ’14. doi:10.1145/2623330.2623705
