“…To provide data security, differential privacy adds random noise, drawn from a distribution such as the Laplace distribution, to the functions running on sensitive data. There are three ways to provide a differential privacy guarantee: (a) input perturbation (Ji, Lipton, & Elkan, 2014; Mivule, Turner, & Ji, 2012; Sánchez, Domingo‐Ferrer, Martínez, & Soria‐Comas, 2016; Sarwate & Chaudhuri, 2013; Xu, Yang, & Bai, 2019), (b) objective perturbation (Chaudhuri & Monteleoni, 2008; Chaudhuri, Monteleoni, & Sarwate, 2011; Fukuchi, Tran, & Sakuma, 2017; Ji et al., 2014; Rubinstein, Bartlett, Huang, & Taft, 2009; Zhang, Zhang, Xiao, Yang, & Winslett, 2012), and (c) output perturbation (Bojarski, Choromanska, Choromanski, & LeCun, 2014; Fletcher & Islam, 2015, 2019; Friedman & Schuster, 2010; Gursoy, Inan, Nergiz, & Saygin, 2017; Xu et al., 2019). All three methods add random noise during the data analysis process to protect individuals' privacy.…”
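As an illustration of the idea described above, the following is a minimal sketch (not taken from the cited works) of output perturbation via the standard Laplace mechanism: noise with scale equal to the query's sensitivity divided by the privacy parameter ε is added to the result of a function computed on sensitive data. The function name, toy dataset, and parameter values are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Output perturbation: add Laplace noise with scale
    sensitivity / epsilon to a query result, giving an
    epsilon-differentially-private answer."""
    rng = np.random.default_rng() if rng is None else rng
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

# Toy example: a count query over a small (hypothetical) dataset.
# The sensitivity of a counting query is 1, since adding or removing
# one individual changes the count by at most 1.
ages = [23, 35, 31, 62, 44]
true_count = sum(1 for a in ages if a > 30)
private_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
```

A smaller ε yields a larger noise scale and hence stronger privacy at the cost of accuracy; input and objective perturbation instead inject noise into the data itself or into the optimization objective, respectively.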