I have studied and understand the mathematical operations in proximal backpropagation, which essentially consists of two steps:
1) a gradient descent step on the prediction error
2) backpropagation of that error
This differs from ordinary backpropagation, where the derivatives of the loss function with respect to every intermediate variable are computed explicitly in a single right-to-left pass. A minimal sketch of both follows.
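To make the two steps concrete, here is a minimal Octave sketch for a single linear layer z = W*a with a quadratic loss; the variable names (a, y, W, sigma, tau) are my own illustrative choices, not taken from the paper's code:

% Minimal sketch of the two steps for one linear layer z = W*a with
% quadratic loss L(z) = 0.5*||z - y||^2 (illustrative names throughout).
a = randn(3, 5);  y = randn(2, 5);      % toy activations and targets (batch of 5)
W = randn(2, 3);  sigma = 0.5;  tau = 0.1;
z = W * a;                              % forward pass of the layer

% Step 1: explicit gradient descent on the prediction error at the output;
% for the quadratic loss, the gradient of L w.r.t. z is (z - y)
z_target = z - sigma * (z - y);

% Step 2: implicit (proximal) gradient step on the weights, i.e. solve
%   W_new = argmin over V of 0.5*||V*a - z_target||^2 + 1/(2*tau)*||V - W||^2,
% whose optimality condition is a linear system, solved here by right-division:
W_new = (z_target * a' + W / tau) / (a * a' + eye(size(a, 1)) / tau);

% Ordinary backpropagation would instead take the purely explicit step
% W_new = W - alpha * (z - y) * a'.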
What is possible in a week?
You can have a .m file that performs this right-to-left pass using proximal backpropagation instead. That is, you should be able to feed in a neural network and, using this .m file, generate weight updates of the form
w := w - alpha * dvar
Thus, this backpropagation will output dvar via a right-to-left pass, with implicit gradient descent steps on the propagating error.
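As a rough sketch of how the deliverable could be driven (the function name proxbackprop and the net struct with its cell array net.W are placeholders I have made up for illustration, not an existing API):

% Hypothetical training loop around the deliverable .m file; proxbackprop
% and the fields of net are placeholder names only.
for epoch = 1:num_epochs
  dvar = proxbackprop(net, x, y);            % right-to-left proximal pass
  for l = 1:numel(net.W)
    net.W{l} = net.W{l} - alpha * dvar{l};   % w := w - alpha * dvar
  end
end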
What's not possible?
The original paper demonstrates many other results, as needed for publication. It is not possible to implement everything at such short notice.
Is anything else possible?
We can discuss this over chat.
How do I know?
I've coded each and every step and written them out line by line with pen and paper several times by now. I've applied this to designs such as adder and NAND logic circuits, and to computation graphs for logistic regression, which are not the applications backpropagation is usually known for, but it is still applicable there, beyond neural networks. A sketch of the logistic regression case follows.
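For illustration, a minimal Octave sketch of backpropagation through the computation graph of logistic regression for a single example (all variable names are my own):

% Forward pass: one graph node per line
x = [1.0; 2.0];  y = 1;                   % one input example and its label
w = [0.1; -0.2]; b = 0.0;                 % parameters
z = w' * x + b;                           % node 1: linear score
p = 1 / (1 + exp(-z));                    % node 2: sigmoid
L = -(y * log(p) + (1 - y) * log(1 - p)); % node 3: cross-entropy loss

% Right-to-left pass: one local derivative per node, chained together
dL_dp = -(y / p) + (1 - y) / (1 - p);
dL_dz = dL_dp * p * (1 - p);              % simplifies to (p - y)
dL_dw = x * dL_dz;                        % gradient w.r.t. the weights
dL_db = dL_dz;                            % gradient w.r.t. the bias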
Kindly message me if you're interested, and even if I'm not online, please leave your queries. Thank you!
About me
Just another research student in machine learning applications and optimization methods. I mostly use MATLAB/Octave and R, and also Python.