Hacker News

Algebraic Simplification Neural Nets

Hacker News - Sat, 04/06/2024 - 1:13pm

It seems that there should be a way to algebraically simplify neural nets.

Trivially a node that has a zero weight can be removed as can any links to / from that node.
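As a minimal numpy sketch of that zero-weight case (the two-layer ReLU setup and all names here are illustrative assumptions, not anything from the post): a hidden unit whose incoming weights and bias are all zero always outputs relu(0) = 0, so its row in the first weight matrix and its column in the second can be deleted without changing the network's output.

```python
import numpy as np

def prune_dead_units(W1, b1, W2):
    """Drop hidden units whose incoming weights and bias are all zero.

    Such a unit always emits relu(0) = 0, so its outgoing column
    in W2 contributes nothing and can be removed as well.
    """
    alive = ~(np.all(W1 == 0, axis=1) & (b1 == 0))
    return W1[alive], b1[alive], W2[:, alive]

# Toy network: 3 hidden units, the middle one is dead (all-zero weights).
W1 = np.array([[1.0, 2.0], [0.0, 0.0], [3.0, 4.0]])
b1 = np.array([0.5, 0.0, -0.5])
W2 = np.array([[1.0, 9.0, 2.0]])

W1p, b1p, W2p = prune_dead_units(W1, b1, W2)

x = np.array([1.0, -1.0])
full = W2 @ np.maximum(W1 @ x + b1, 0)
pruned = W2p @ np.maximum(W1p @ x + b1p, 0)
assert np.allclose(full, pruned)  # same function, one fewer node
```

The same mask-and-slice idea extends to links: a zero entry in a weight matrix is an absent edge, and a unit with no live incoming edges is a candidate for removal.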

It should also be possible to eliminate nodes whose weights are full-value ('1' on the 0-1 scale), since they pass their input through unchanged and can be folded into the next layer.

I have also seen work where the matrix multiplies during training can have columns "collapsed".
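One concrete version of "collapsing columns" that works purely algebraically (sketched here under assumed names and a two-layer ReLU setup): two hidden units with identical incoming weights and biases compute the same activation, so they can be merged by summing their outgoing weights.

```python
import numpy as np

def merge_duplicate_units(W1, b1, W2):
    """Merge hidden units with identical incoming weights and bias
    by summing their outgoing weights; the output is unchanged."""
    keep, out_cols = [], []
    for i in range(W1.shape[0]):
        for k, j in enumerate(keep):
            if np.array_equal(W1[i], W1[j]) and b1[i] == b1[j]:
                out_cols[k] = out_cols[k] + W2[:, i]
                break
        else:
            keep.append(i)
            out_cols.append(W2[:, i].copy())
    return W1[keep], b1[keep], np.stack(out_cols, axis=1)

# Units 0 and 1 are duplicates; merging sums their outgoing weights.
W1 = np.array([[1.0, 2.0], [1.0, 2.0], [3.0, 4.0]])
b1 = np.array([0.1, 0.1, 0.0])
W2 = np.array([[5.0, 6.0, 7.0]])

W1m, b1m, W2m = merge_duplicate_units(W1, b1, W2)

x = np.array([1.0, 0.5])
assert np.allclose(W2 @ np.maximum(W1 @ x + b1, 0),
                   W2m @ np.maximum(W1m @ x + b1m, 0))
```

In practice exact duplicates are rare; training-time methods relax this to near-duplicate columns, merging units whose weight vectors are close rather than identical.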

The ultimate question might be applying an "algebraic simplification" of the final network to simplify a post-trained network used for inference.

The idea is to take a path through the network, construct the equation for that path, and reduce it to a shorter path by combining nodes and weights.

A node will almost certainly participate in several (hundred?) paths. In that case it might be useful to "copy" the node so it can take part in a path reduction without affecting the other paths.

I believe that in theory some neural networks can be reduced to a single hidden layer[1]. The game would be to algebraically reduce network depth.

[1] Lee, et al. "On the ability of neural nets to express distributions" https://arxiv.org/pdf/1702.07028.pdf (2017)

Comments URL: https://news.ycombinator.com/item?id=39953801

Points: 1

# Comments: 0

Categories: Hacker News

Show HN: BrandMelon Customized Marketing Strategy

Hacker News - Sat, 04/06/2024 - 12:47pm

Article URL: https://www.brandmelon.ai/

Comments URL: https://news.ycombinator.com/item?id=39953614

Points: 1

# Comments: 0

Categories: Hacker News

Ore

Hacker News - Sat, 04/06/2024 - 12:41pm

Article URL: https://github.com/HardhatChad/ore

Comments URL: https://news.ycombinator.com/item?id=39953574

Points: 1

# Comments: 0

Categories: Hacker News

Cows and Carbon for Dummies

Hacker News - Sat, 04/06/2024 - 12:37pm
Categories: Hacker News
