Privacy-Preserving Machine Learning
2 researchers at 1 institution
Researchers investigate methods to train and deploy machine learning models while protecting sensitive data. This area explores techniques such as federated learning, differential privacy, and homomorphic encryption to enable collaborative model training and data analysis without direct exposure of individual information. Key questions involve developing algorithms that maintain model accuracy and utility while offering robust privacy guarantees, and understanding the trade-offs between privacy, performance, and computational cost. Work also addresses secure multi-party computation and the design of privacy-aware data sharing frameworks.
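Two of the techniques named above can be illustrated with a minimal sketch: federated averaging (FedAvg) combines model parameters from several clients without pooling raw data, and the Laplace mechanism adds calibrated noise to a released statistic to satisfy ε-differential privacy. The function names and parameters here are illustrative assumptions, not any specific system's API:

```python
import numpy as np


def federated_average(client_weights):
    """FedAvg-style aggregation: average parameter vectors contributed
    by clients, so raw training data never leaves each client."""
    return np.mean(np.stack(client_weights), axis=0)


def laplace_mechanism(value, sensitivity, epsilon, rng):
    """Release a statistic under epsilon-differential privacy by adding
    Laplace noise with scale = sensitivity / epsilon.  Larger epsilon
    means less noise (weaker privacy, higher utility)."""
    scale = sensitivity / epsilon
    return value + rng.laplace(loc=0.0, scale=scale)


# Example: two clients share locally trained weights; the server
# aggregates them and releases a count statistic with DP noise.
rng = np.random.default_rng(0)
global_weights = federated_average(
    [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
)
noisy_count = laplace_mechanism(value=42.0, sensitivity=1.0,
                                epsilon=1.0, rng=rng)
```

The sensitivity/ε trade-off in `laplace_mechanism` mirrors the accuracy-versus-privacy tension discussed above: tightening the privacy budget (smaller ε) directly increases the noise scale and degrades utility.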
This research holds particular relevance for Arkansas's growing technology sector and its established industries like agriculture and healthcare. Protecting proprietary data in agricultural technology or patient information in health analytics is crucial for innovation and public trust. Developing privacy-preserving tools can foster secure data-driven advancements in these areas, supporting economic development and improving services for Arkansans by enabling the use of sensitive data for beneficial insights.
The field draws upon and contributes to areas including advanced neural network applications, fairness in machine learning, and causal inference. Collaboration extends across institutions, involving faculty and staff researchers engaged in developing and applying these privacy-enhancing technologies.
Top Researchers
| Name | Institution | h-index | Citations | Career Stage | Badges |
|---|---|---|---|---|---|
| Xintao Wu | University of Arkansas | 39 | 5,823 | | Grant PI, High Impact |
| Lewis CL Brown | University of Arkansas | 0 | 0 | | |