CoCoA: A general framework for communication-efficient distributed optimization
V Smith, S Forte, C Ma, M Takáč, MI Jordan, M Jaggi
Journal of Machine Learning Research 18 (230), 1-49, 2018
Cited by 323 | 2018

Distributed optimization with arbitrary local solvers
C Ma, J Konečný, M Jaggi, V Smith, MI Jordan, P Richtárik, M Takáč
Optimization Methods and Software 32 (4), 813-848, 2017
Cited by 232 | 2017

Adding vs. averaging in distributed primal-dual optimization
C Ma, M Jaggi, MI Jordan, P Richtárik, M Takáč
Cited by 210 | 2015

Efficient distributed Hessian free algorithm for large-scale empirical risk minimization via accumulating sample strategy
M Jahani, X He, C Ma, A Mokhtari, D Mudigere, A Ribeiro, M Takáč
International Conference on Artificial Intelligence and Statistics, 2634-2644, 2020
Cited by 28 | 2020

Linear convergence of randomized feasible descent methods under the weak strong convexity assumption
C Ma, R Tappenden, M Takáč
Journal of Machine Learning Research 17 (228), 1-24, 2016
Cited by 19 | 2016

Partitioning data on features or samples in communication-efficient distributed optimization?
C Ma, M Takáč
arXiv preprint arXiv:1510.06688, 2015
Cited by 13 | 2015

An accelerated communication-efficient primal-dual optimization framework for structured machine learning
C Ma, M Jaggi, FE Curtis, N Srebro, M Takáč
Optimization Methods and Software 36 (1), 20-44, 2021
Cited by 12 | 2021

Underestimate sequences via quadratic averaging
C Ma, NVC Gudapati, M Jahani, R Tappenden, M Takáč
arXiv preprint arXiv:1710.03695, 2017
Cited by 11 | 2017

Distributed inexact damped Newton method: Data partitioning and load-balancing
C Ma, M Takáč
arXiv preprint arXiv:1603.05191, 2016
Cited by 10 | 2016

Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
M Jahani, NVC Gudapati, C Ma, R Tappenden, M Takáč
Computational Optimization and Applications 79, 369-404, 2021
Cited by 4 | 2021

Distributed inexact damped Newton method: Data partitioning and work-balancing
C Ma, M Takáč
Workshops at the Thirty-First AAAI Conference on Artificial Intelligence, 2017
Cited by 3 | 2017

Distributed Methods for Composite Optimization: Communication Efficiency, Load-Balancing and Local Solvers
C Ma
Ph.D. thesis, Lehigh University, 2018
Cited by 1 | 2018

Distributed Restarting NewtonCG Method for Large-Scale Empirical Risk Minimization
M Jahani, X He, C Ma, D Mudigere, A Mokhtari, A Ribeiro, M Takáč
2017

Grow Your Samples and Optimize Better via Distributed Newton CG and Accumulating Strategy
M Jahani, X He, C Ma, A Mokhtari, D Mudigere, A Ribeiro, M Takáč

CoCoA+: Adding vs. Averaging in Distributed Optimization
M Takáč, C Ma, V Smith, M Jaggi, MI Jordan, P Richtárik