Distributed Optimization and Machine Learning (NPTEL IIT Bombay)

Course Introduction - Distributed Optimization and Machine Learning (2:22)
Week 1: Lecture 1: Introduction to optimization (31:14)
Week 1: Lecture 2: Analyzing optimization algorithms in continuous time domain (14:15)
Week 1: Lecture 3: Course Outline (6:20)
Week 1: Lecture 4: Basics of optimization problems (43:28)
Week 1: Lecture 5: Convex sets and Convex functions (37:58)
Week 2: Lecture 6: Strictly and strongly convex functions (33:27)
Week 2: Lecture 7: Implications of strong convexity (43:54)
Week 2: Lecture 8: Primal and dual optimization problems (38:10)
Week 3: Lecture 9: Slater's condition (30:12)
Week 3: Lecture 10: Analysis of gradient descent algorithm (34:09)
Week 3: Lecture 11: KKT conditions (38:27)
Week 4: Lecture 12: Acceleration under strong convexity (31:13)
Week 4: Lecture 13: Accelerating the convergence even further (27:26)
Week 4: Lecture 14: Stability theory (42:36)
Week 4: Lecture 15: Connections to optimization problems (37:37)
Week 5: Lecture 16: Exponential stability (21:02)
Week 5: Lecture 17: Bregman Divergence (25:01)
Week 5: Lecture 18: Rescaled Gradient Flow (32:11)
Week 6: Lecture 19: Advanced Results on PL inequality: Part 1 (36:54)
Week 6: Lecture 20: Advanced Results on PL inequality: Part 2 (36:10)
Week 6: Lecture 21: Constrained Optimization Problem (38:35)
Week 6: Lecture 22: Augmented Lagrangian (31:59)
Week 7: Lecture 23: Method of Multipliers (43:34)
Week 7: Lecture 24: Dual Ascent and Dual Decomposition (27:12)
Week 7: Lecture 25: ADMM Algorithm (30:05)
Week 8: Lecture 26: Basics of Graph Theory (24:37)
Week 8: Lecture 27: Basics of Graph Theory-2 (19:09)
Week 8: Lecture 28: Consensus and Average Consensus (33:24)
Week 8: Lecture 29: Consensus and Average Consensus-2 (31:54)
Week 9: Lecture 30: Consensus Algorithms (35:46)
Week 9: Lecture 31: Consensus Algorithms - Fixed Time (34:20)
Week 9: Lecture 32: Distributed Economic Dispatch Problem (29:15)
Week 9: Lecture 33: Algorithm for Uncapacitated EDP (25:29)
Week 9: Lecture 34: Capacitated EDP (21:30)
Week 10: Lecture 35: Algorithms for Distributed Optimization (28:41)
Week 10: Lecture 36: Algorithms for Distributed Optimization-2 (18:26)
Week 10: Lecture 37: Continuous-time Distributed Optimization Algorithms (21:56)
Week 10: Lecture 38: Introduction to Neural Networks (30:20)
Week 10: Lecture 39: Large Scale Machine Learning (39:20)
Week 11: Lecture 40: Decentralized Stochastic Gradient Descent (30:38)
Week 11: Lecture 41: Decentralized Stochastic Gradient Descent-2 (40:22)
Week 11: Lecture 42: Introduction to Federated Learning (35:37)
Week 11: Lecture 43: FedAvg Algorithm (22:27)
Week 11: Lecture 44: Convergence Analysis of FL (18:27)
Week 12: Lecture 45: Sources of Computational Heterogeneity in FL (21:10)
Week 12: Lecture 46: Objective Inconsistency Problem (19:53)
Week 12: Lecture 47: General Update Rule (32:32)