CVXPYlayers

Differentiable convex optimization layers for deep learning

Embed convex optimization problems directly into your neural networks. CVXPYlayers solves parametrized problems in the forward pass and computes gradients via implicit differentiation in the backward pass.


Frameworks

PyTorch

Full torch.nn.Module integration with autograd support. The most popular choice for deep learning.

PyTorch API
JAX

Works with jax.grad, jax.vmap, and jax.jit (Moreau solver). A minimal usage sketch follows these framework cards.

JAX API
MLX

Optimized for Apple Silicon, taking advantage of the unified memory on M1/M2/M3 chips.

MLX API
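
For the JAX backend, here is a minimal sketch of the same non-negative least-squares problem used in the quickstart below, assuming the JAX layer mirrors the PyTorch call signature; the random keys and shapes are illustrative choices:

import cvxpy as cp
import jax
import jax.numpy as jnp
from cvxpylayers.jax import CvxpyLayer

# Same non-negative least-squares problem as the PyTorch quickstart
x = cp.Variable(2)
A = cp.Parameter((3, 2))
b = cp.Parameter(3)
problem = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)), [x >= 0])
layer = CvxpyLayer(problem, parameters=[A, b], variables=[x])

def loss(A_j, b_j):
    (solution,) = layer(A_j, b_j)
    return jnp.sum(solution)

k1, k2 = jax.random.split(jax.random.PRNGKey(0))
A_j = jax.random.normal(k1, (3, 2))
b_j = jax.random.normal(k2, (3,))
grads = jax.grad(loss, argnums=(0, 1))(A_j, b_j)  # gradients w.r.t. A and b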

Get Started in 30 Seconds

import cvxpy as cp
import torch
from cvxpylayers.torch import CvxpyLayer

# Define optimization problem
x = cp.Variable(2)
A = cp.Parameter((3, 2))
b = cp.Parameter(3)
problem = cp.Problem(
    cp.Minimize(cp.sum_squares(A @ x - b)),
    [x >= 0]
)

# Wrap as differentiable layer
layer = CvxpyLayer(problem,
    parameters=[A, b],
    variables=[x]
)

# Solve + backprop
A_t = torch.randn(3, 2, requires_grad=True)
b_t = torch.randn(3, requires_grad=True)
(solution,) = layer(A_t, b_t)
solution.sum().backward()  # Gradients flow!

Install

pip install cvxpylayers[torch]

What’s happening?

  1. Define a convex problem with CVXPY

  2. Wrap it as a CvxpyLayer

  3. Use it like any PyTorch layer

  4. Gradients computed automatically

Quickstart Guide


Why CVXPYlayers?

Encode Domain Knowledge

Inject constraints and structure into your models. Physics, fairness, safety — if you can write it as a convex program, you can differentiate through it.

GPU Accelerated

CuClarabel solver keeps everything on GPU. No CPU-GPU transfers for large-scale optimization.

Batched Solving

Solve thousands of problem instances in parallel. The first dimension is the batch dimension, just as in PyTorch; see the sketch after these cards.

Multiple Solvers

Clarabel, SCS, and CuClarabel. Pick the right solver for your problem structure.
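
To make the batching concrete, here is a minimal sketch that reuses the layer from the quickstart above; the batch size of 128 is an arbitrary illustrative choice:

import torch

# `layer` is the CvxpyLayer from the quickstart: parameters A (3 x 2) and b (3,)
batch_size = 128
A_batch = torch.randn(batch_size, 3, 2, requires_grad=True)
b_batch = torch.randn(batch_size, 3, requires_grad=True)

# One call solves all 128 instances; the solution keeps the leading
# batch dimension, so `solution` has shape (128, 2)
(solution,) = layer(A_batch, b_batch)
solution.sum().backward()  # A_batch.grad has shape (128, 3, 2)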


Used For

Control

MPC, LQR, path planning

Finance

Portfolio optimization (see the sketch below)

ML

Constrained learning

Robotics

Motion planning
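
As a concrete instance of the finance use case, here is a minimal sketch of a differentiable portfolio layer: a risk-adjusted Markowitz problem over the simplex, differentiable with respect to the forecast returns. The asset count, the risk-aversion weight, and the square-root parameterization of the covariance are illustrative choices, not a prescribed recipe:

import cvxpy as cp
import torch
from cvxpylayers.torch import CvxpyLayer

n = 5                              # number of assets (illustrative)
w = cp.Variable(n)                 # portfolio weights
mu = cp.Parameter(n)               # forecast returns
Sigma_sqrt = cp.Parameter((n, n))  # square root of the covariance matrix
gamma = 1.0                        # risk-aversion weight (illustrative)

problem = cp.Problem(
    cp.Maximize(mu @ w - gamma * cp.sum_squares(Sigma_sqrt @ w)),
    [cp.sum(w) == 1, w >= 0],
)
layer = CvxpyLayer(problem, parameters=[mu, Sigma_sqrt], variables=[w])

# Differentiate the optimal weights with respect to the inputs
mu_t = torch.randn(n, requires_grad=True)
Sigma_sqrt_t = torch.eye(n, requires_grad=True)
(w_star,) = layer(mu_t, Sigma_sqrt_t)
w_star[0].backward()  # gradients land in mu_t.grad and Sigma_sqrt_t.grad

Parameterizing the covariance through its square root keeps the problem DPP-compliant, which CvxpyLayer requires.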

Browse Examples


Research

This library accompanies our NeurIPS 2019 paper:

Agrawal, A., Amos, B., Barratt, S., Boyd, S., Diamond, S., & Kolter, Z. (2019). Differentiable Convex Optimization Layers. Advances in Neural Information Processing Systems.

For an introduction, see our blog post.

BibTeX Citation
@inproceedings{cvxpylayers2019,
  author={Agrawal, A. and Amos, B. and Barratt, S. and Boyd, S. and Diamond, S. and Kolter, Z.},
  title={Differentiable Convex Optimization Layers},
  booktitle={Advances in Neural Information Processing Systems},
  year={2019},
}