This happens because we don't know the real metric underlying the data manifold. So we end up classifying manifolds like the one below as a circle. Looks kind of right, but we know better. 2/7
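A minimal sketch of this failure mode (my own toy example, not the figure from the thread; it assumes numpy and the third-party ripser package): the points sample an open arc, which is topologically an interval, yet persistent homology reports a circle because the ambient metric nearly closes the gap.

```python
import numpy as np
from ripser import ripser  # pip install ripser

# Sample an *open* arc: topologically an interval, so H1 = 0,
# but its two endpoints almost meet in the ambient metric.
theta = np.linspace(0, 2 * np.pi - 0.3, 200)
X = np.column_stack([np.cos(theta), np.sin(theta)])
X += 0.01 * np.random.default_rng(0).normal(size=X.shape)

# Persistent homology sees one long-lived H1 class: the small gap
# gets filled at modest scales and the arc is classified as a circle.
dgms = ripser(X)['dgms']
print(dgms[1])
```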
In the specific case of ReLU neural networks, we know they are equivalent to continuous piecewise-linear functions: the input space splits into convex polyhedra, and the network restricts to an affine function on each of them. This is called a polyhedral decomposition and looks really crazy (Cubism is the obvious reference here). 3/7
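One hedged way to poke at the decomposition (a plain-numpy sketch of my own, not code from the thread): every on/off pattern of the ReLUs picks out one convex polyhedron on which the network is a single affine map, so counting distinct patterns over a grid counts the cells the grid can see.

```python
import numpy as np

# A tiny two-layer ReLU net on R^2 with random weights.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 8)), rng.normal(size=8)

xs = np.linspace(-3, 3, 200)
grid = np.array([[u, v] for u in xs for v in xs])  # (N, 2) inputs
h1 = grid @ W1.T + b1                              # layer-1 pre-activations
h2 = np.maximum(h1, 0) @ W2.T + b2                 # layer-2 pre-activations

# The sign pattern of all pre-activations labels the convex cell an
# input lies in; distinct patterns = cells that meet the grid.
patterns = {tuple(s) for s in np.hstack([h1 > 0, h2 > 0])}
print(len(patterns), "linear regions meet the grid")
```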
It turns out that all topological changes a manifold undergoes through the layers of a ReLU network happen for two reasons. 1. The network is low-rank over a polyhedron. 2. The network maps different polyhedra to each other, gluing them. 4/7
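Both mechanisms already show up in one dimension; a hedged toy illustration of my own, not a construction from the thread:

```python
import numpy as np

relu = lambda x: np.maximum(x, 0)

# Reason 1: low rank over a polyhedron. On the cell x <= 0 the map
# ReLU is constant (rank 0): the whole polyhedron collapses to {0}.
print(relu(np.array([-3.0, -1.0, -0.1])))  # [0. 0. 0.]

# Reason 2: gluing polyhedra. |x| = ReLU(x) + ReLU(-x) is injective
# on each cell x <= 0 and x >= 0 separately, but folds one cell onto
# the other: x and -x are glued to the same point.
fold = lambda x: relu(x) + relu(-x)
print(fold(2.0), fold(-2.0))  # 2.0 2.0
```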
Even better, if the intersections between the polyhedra and the data manifold are convex, then only the second mechanism (the gluing) has an impact on the homology groups. This happens because we can contract each such convex region to a point and end up with a homotopy-equivalent space. 5/7
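The standard fact this contraction argument leans on, in my own notation (a gloss, not a statement from the thread):

```latex
% For a good pair (X, A) with A a contractible subcomplex, collapsing
% A is a homotopy equivalence:
\[ A \simeq \ast \;\Longrightarrow\; X \simeq X/A. \]
% Each convex intersection C_i = P_i \cap M is contractible, so
% collapsing every C_i to a point preserves the homotopy type of M,
% and hence its homology; only the gluing between cells can still
% change the groups.
```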
In other words, we are talking about quotient spaces. Therefore, we can define a relative homology theory for ReLU neural networks! Going back to the (non)circle example: using relative homology we get the right result, and we can even see which points are glued together by the network. 6/7
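To make that concrete, here is my back-of-envelope reading of the (non)circle example (the arc-with-glued-endpoints setup is my assumption; the computation uses the standard good-pair isomorphism):

```latex
% Let M = [0,1] be the arc and A = {0,1} the points the network glues,
% so the image is M/A \cong S^1. For a good pair, H_n(M, A) is the
% reduced homology of the quotient:
\[
H_1(M, A) \;\cong\; \tilde H_1(M/A) \;=\; \tilde H_1(S^1) \;\cong\; \mathbb{Z},
\qquad\text{while}\qquad H_1(M) = 0.
\]
% The relative groups detect the circle created by the gluing, and the
% pair (M, A) records exactly which points were identified.
```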