Driven by the resource and compute scalability of cloud computing, the rapidly growing volume of data is increasingly outsourced to remote servers. As a result, secure and efficient data processing and mining over outsourced private databases has become a primary concern for users. Among secure data mining and machine learning algorithms, the gradient descent method is a widely used optimization paradigm that iteratively approximates a target function until it reaches a local minimum; the resulting minimizer is typically regarded as the decision model to be discovered.
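As a brief, generic illustration of the underlying optimization step (the symbols below follow standard notation and are not taken from this paper), one iteration of gradient descent moves the model parameters against the gradient of the objective:

$$\theta_{t+1} = \theta_t - \eta \, \nabla F(\theta_t),$$

where $\theta_t$ denotes the current parameters, $\eta > 0$ the learning rate, and $F$ the objective (loss) function; repeating this update drives $\theta_t$ toward a local minimum of $F$.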
Existing methods assume that users hold and process their own data, with all users following a secure protocol to execute the gradient descent algorithm. However, such methods are not applicable to a cloud platform, where the data is encrypted and then outsourced to a centralized server. To address this problem, we propose an Encrypted Gradient Descent Protocol (EGDP) in this paper. In EGDP, the users and the server perform collaborative operations to learn and approximate the target function without violating data privacy. We formally prove that EGDP is secure and returns the correct result.