This talk will show how the first two steps of the ADMM (the two minimization steps) may be interpreted as the classical forward-backward (proximal gradient) method applied to a dual formulation of the standard augmented Lagrangian subproblem. This observation allows new classes of ADMM-like methods to be derived by substituting other variants of the forward-backward method -- for example, algorithms involving Nesterov-style acceleration -- for the classical one; the talk will give some examples of such methods. It is not yet clear whether they offer computational advantages, but they are still of some theoretical interest. Generically, we call this class of algorithms "DEFBAL", for "Dual Embedded Forward-Backward Augmented Lagrangian".
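As a concrete reference point for the two minimization steps and the multiplier update mentioned above, here is a minimal sketch of standard scaled-form ADMM applied to the lasso problem min (1/2)||Ax - b||^2 + lam*||z||_1 subject to x = z. The specific problem instance, step values (rho, lam), and variable names are illustrative choices, not taken from the talk; the DEFBAL variants themselves are not shown.

```python
import numpy as np

# Illustrative random lasso instance (not from the talk)
rng = np.random.default_rng(0)
m, n = 30, 10
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(m)

lam, rho = 0.1, 1.0          # regularization weight and penalty parameter
AtA, Atb = A.T @ A, A.T @ b

x = np.zeros(n)              # first primal block
z = np.zeros(n)              # second primal block
u = np.zeros(n)              # scaled dual (multiplier) variable

for _ in range(300):
    # Step 1: minimize the augmented Lagrangian over x (a linear solve here)
    x = np.linalg.solve(AtA + rho * np.eye(n), Atb + rho * (z - u))
    # Step 2: minimize over z = prox of (lam/rho)*||.||_1, i.e. soft-thresholding
    v = x + u
    z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
    # Step 3: multiplier (dual ascent) update
    u = u + x - z

# The consensus residual ||x - z|| should be small at convergence
print(np.linalg.norm(x - z))
```

Steps 1 and 2 taken together are what the talk reinterprets as one forward-backward (proximal gradient) step on a dual formulation of the augmented Lagrangian subproblem; replacing that pair with an accelerated forward-backward step is what yields the DEFBAL-type variants.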