Human activity recognition

| Research | Object | Recognition methods | Data | Application domain | Privacy issue |
|---|---|---|---|---|---|
| Iwasawa et al. (2017) | Human activity | Deep learning | Sensor data from smart wearable devices | Daily activity investigation | Training data privacy |
| Chen et al. (2018) | Physical activities such as walking and running | Deep learning | Time series sensor data from smart wearable devices | Daily activity investigation | User data privacy |
| Phan et al. (2016) | Human activity | Deep learning | Physical activities, biomarkers, biometric measures | Health social network | Training data privacy |
Privacy concerns in vision-based machine learning
| Research | Application domain | Privacy concerns |
|---|---|---|
| Chattopadhyay and Boult (2007) | Intelligent surveillance system | Conflict between the purpose of intelligent surveillance systems and the privacy of individuals |
| Wu et al. (2018) | Smart camera application | Private information leakage during device-captured visual data upload to centralized cloud for analysis |
| Gomathisankaran et al. (2013) | Medical image analysis on the Cloud | Private information leakage of medical data transmitted in the network and processed in the cloud |
| Shokri et al. (2017) | ‘Machine learning as a service’ provided by Google and Amazon | Information leakage about training datasets |
| Speciale et al. (2019) | Augmented/Mixed reality (AR/MR) and autonomous robotic system | Confidential information disclosure about captured 3D scene |
Privacy-preserving approaches

| Research | Privacy issue | Privacy-preserving approach | Protected object |
|---|---|---|---|
| Garcia and Jacobs (2010) | Private information leakage (Public dataset privacy, User data privacy) | Cryptography | Private information (medical image, lifestyle, financial information, face, private location, biometric information, disease information), Human activity (daily life activity, movement) |
| Butler et al. (2015) | Private information leakage (Public dataset privacy) | Anonymized videos | |
| Garcia Lopez et al. (2015) | Private information leakage from database (Public dataset privacy) | Local processing | |
| Liu (2019) | Information leakage from large-scale database (Public dataset privacy) | Differential privacy | |
| Bian et al. (2020) | Information leakage in visual recognition | Secure inference by homomorphic encryption | |
| Iwasawa et al. (2017) | Information disclosure by unintentional discrimination of user information during deep learning (Training data privacy) | Adversarial training | |
| Zhang et al. (2019) | Adversarial training effective only on particular sensitive attributes (Training data privacy) | Image style transformation | |
| Phan et al. (2016) | Information leakage during deep learning (Training data privacy) | Differential privacy | |
| Tramèr et al. (2016) | Information leakage during deep learning (Model privacy) | Analysis of attacker's queries, defense against attacks | |
Human recognition

| Research | Object | Recognition methods | Data | Application domain | Privacy issue |
|---|---|---|---|---|---|
| Chattopadhyay and Boult (2007) | Human, object | Deep learning | Image, video | Video surveillance | Public dataset privacy |
| Song and Shmatikov (2020) | Human face | Deep learning | Image | Binary gender classification | Model privacy |
| Haris et al. (2014) | Human location | Deep learning | Sensor data | Location-based services, mobile health applications | Public dataset privacy |
| Gomathisankaran et al. (2013) | Human disease, human health | Deep learning | Clinical records, image | Medical care | Public dataset privacy |