Following the examples given in a MathWorks document, I implemented my own neural network layer without defining the backward function, which results in the following error message:
"Abstract classes cannot be instantiated. Class 'weightAdditionLayer' inherits abstract methods or properties but does not implement them. See the list of methods and properties that 'weightAdditionLayer' must implement if you do not intend the class to be abstract."
The meaning of the error message itself is clear to me. What is unclear is that the MathWorks implementation also does not define the backward function and explicitly states that this is optional, since the software is able to compute the derivatives itself. Although I haven't run the MathWorks code, I assume it works. Why am I getting this error message even though I am following MathWorks' example?
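For reference, my class has roughly the following structure (a minimal sketch, not my exact code; the property name and the predict logic are illustrative placeholders):

```matlab
classdef weightAdditionLayer < nnet.layer.Layer
    % Custom layer without a backward method, relying on
    % automatic differentiation of the forward pass.

    properties (Learnable)
        % Learnable scalar weight (placeholder for my actual parameters)
        Weight
    end

    methods
        function layer = weightAdditionLayer(name)
            % Constructor: set the layer name and initialize the weight
            layer.Name = name;
            layer.Weight = rand(1);
        end

        function Z = predict(layer, X)
            % Forward pass: add the learnable weight to the input
            Z = X + layer.Weight;
        end
    end
end
```

As shown, I define only the constructor and predict, with no backward method, which is what triggers the error on my machine.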