Folding over Neural Networks

Minh H Nguyen, Nicolas G Wu

Research output: Chapter in Book/Report/Conference proceeding › Chapter in a book

1 Citation (Scopus)

Abstract

Neural networks are typically represented as data structures that are traversed either through iteration or by manual chaining of method calls. However, a deeper analysis reveals that structured recursion can be used instead, so that traversal is directed by the structure of the network itself. This paper shows how such an approach can be realised in Haskell, by encoding neural networks as recursive data types, and then their training as recursion scheme patterns. In turn, we promote a coherent implementation of neural networks that delineates between their structure and semantics, allowing for compositionality in both how they are built and how they are trained.
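The abstract's idea can be illustrated with a minimal sketch (not the paper's actual encoding — the data type, the fold, and all names here are illustrative assumptions): a network is a recursive data type of layers, and the forward pass is a fold whose traversal is directed by that structure.

```haskell
-- Illustrative sketch only: a network as a recursive type of dense layers.
-- Each layer carries its weight matrix (rows of weights) and bias vector.
data Network
  = Output                             -- end of the network
  | Layer [[Double]] [Double] Network  -- weights, biases, rest of the network

-- The fold (catamorphism) for Network: structured recursion replaces
-- manual iteration or chained method calls.
foldNet :: ([[Double]] -> [Double] -> r -> r) -> r -> Network -> r
foldNet _ z Output          = z
foldNet f z (Layer ws bs k) = f ws bs (foldNet f z k)

-- One dense layer with a ReLU activation, applied per neuron.
applyLayer :: [[Double]] -> [Double] -> [Double] -> [Double]
applyLayer ws bs xs =
  zipWith (\row b -> max 0 (b + sum (zipWith (*) row xs))) ws bs

-- The forward pass is a fold: each layer contributes a function,
-- and the fold composes them in network order.
forward :: Network -> [Double] -> [Double]
forward = foldNet (\ws bs rest -> rest . applyLayer ws bs) id

main :: IO ()
main = print (forward net [1, 2])
  where
    -- A single identity-like layer, for demonstration.
    net = Layer [[1, 0], [0, 1]] [0, 0] Output
```

Separating the structure (`Network`, `foldNet`) from the semantics (`applyLayer`, the algebra passed to the fold) is what makes both construction and interpretation compositional, in the spirit the abstract describes.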
Original language: English
Title of host publication: Lecture Notes in Computer Science
Publisher: Springer
Publication status: Published - 22 Sept 2022

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Berlin Heidelberg
ISSN (Print): 0302-9743
