[–] Gladaed@feddit.org 31 points 3 days ago (1 children)

The simplest neural network (simplified). You input a set of properties (first column). Then you take a weighted sum of all of them a number of times, each time with DIFFERENT weights (first set of lines). Then you apply a non-linearity to each result, e.g. set it to 0 if negative and keep it unchanged otherwise (not shown).

You repeat this, with potentially different numbers of outputs, any number of times.

Then you do this one last time, but so that the number of outputs is the dimension of your desired output, e.g. 2 if you want the sum of the inputs and their product computed (which is a fun exercise!). You may want to skip the non-linearity here or do something special™.
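
A minimal numpy sketch of that forward pass (the layer sizes, the random weights and the sum/product target are my own illustrative assumptions, not anything from the post):

```python
import numpy as np

def relu(x):
    # the non-linearity: 0 if negative, keep the value otherwise
    return np.maximum(0, x)

rng = np.random.default_rng(0)

# input: a set of properties (3 of them here)
x = np.array([1.0, 2.0, 3.0])

# hidden layer: several different weighted sums of the inputs (4 here),
# each with its own weights, followed by the non-linearity
W1 = rng.normal(size=(4, 3))
b1 = rng.normal(size=4)
h = relu(W1 @ x + b1)

# output layer: 2 numbers, e.g. if you wanted to train the net to
# output the sum and the product of the inputs; no non-linearity here
W2 = rng.normal(size=(2, 4))
b2 = rng.normal(size=2)
y = W2 @ h + b2   # nonsense until the weights are actually trained

print(y)
```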

[–] Poik@pawb.social 1 points 1 day ago (1 children)

Simplest multilayer perceptron*.

A neural network can be made with only one hidden layer and still (as mathematically proven, the universal approximation theorem) approximate essentially any continuous function arbitrarily well; it's just not as easily trained, and may need a much higher number of neurons.
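
As a toy illustration of that idea (not a proof!), here is a sketch of a single-hidden-layer ReLU network fitted to sin(x) with plain gradient descent; the hidden width, learning rate and step count are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

H = 64                                   # one (reasonably wide) hidden layer
W1 = rng.normal(scale=0.5, size=(1, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, 1)); b2 = np.zeros(1)
lr, n = 0.05, len(x)

for step in range(5000):
    # forward pass
    h = np.maximum(0, x @ W1 + b1)       # ReLU hidden layer
    pred = h @ W2 + b2
    err = pred - y

    # backward pass: gradients of the squared-error loss
    gW2, gb2 = h.T @ err / n, err.mean(0)
    dh = (err @ W2.T) * (h > 0)
    gW1, gb1 = x.T @ dh / n, dh.mean(0)

    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print("mean abs error:", float(np.abs(pred - y).mean()))
```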

[–] Gladaed@feddit.org 1 points 1 day ago* (last edited 1 day ago) (1 children)

The one shown is actually single layer: input, fully connected (FC) hidden layer, output. Edit: can't count to fucking two, can I now. You are right.

[–] Poik@pawb.social 2 points 1 day ago* (last edited 1 day ago)

It's good. Thanks for correcting yourself. :3

The graphs struck me as weird when I was learning, since I expected the input and output nodes to be neuron layers as well... which they are, but not in the same way. So I frequently miscounted myself while learning, sleep-deprived in the back of the classroom. ^^;;