• Kajika@lemmy.mlOP · 61 points · 9 months ago

    Took me 2 hours to find out why the final output of a neural network was a bunch of NaNs. This is always very annoying, but I can’t really complain; it makes sense. Just sucks.

      • Kajika@lemmy.mlOP · 23 points · 9 months ago

        That could be a nice way. Sadly it was in a C++ code base (using TensorFlow), so no such nice things (it would be slow too). I skill-issued myself by assuming a struct would be zero-initialized: MyStruct input; is not, while MyStruct input {}; is (that was the fix). Long story.

        • fkn@lemmy.world · 10 points · 9 months ago

          I too have forgotten to memset my structs in C++ TensorFlow after prototyping in Python.

        • TheFadingOne@feddit.de · 4 points · 9 months ago

          If you use GNU libc, the feenableexcept function, which lets you enable trapping on certain floating-point exceptions, can be useful for catching unexpected/unwanted NaNs.

  • affiliate@lemmy.world · 16 points · 9 months ago

    this is just like in regular math too. not being a number is just so fun that nobody wants to go back to being a number once they get a taste of it

  • Omega_Haxors@lemmy.ml · 4 points · 9 months ago

    The funniest thing about NaNs is that they’re actually coded, so you can see what caused one if you look at the binary. The only problem is that, due to the nature of NaNs, that code is almost always going to resolve to “tried to perform arithmetic on a NaN”.

    There are also other coded special values which are defined and sometimes useful, such as +/-INF, MAX, and MIN (epsilon), though strictly speaking those are separate encodings rather than NaNs.

    • daniyeg@lemmy.ml · 3 points · 9 months ago

      NaN stands for Not a Number. To simplify very briefly (and not at all accurately): when the standard for representing fractional values with binary digits was defined, ranges of bit patterns were systematically assigned to fractional numbers. Some of the possible bit patterns, for reasons not worth going into, were left unused, so they were designated as NaNs, and the value of the NaN itself is supposed to tell you what went wrong in your calculation. Obviously, if you use a NaN in an arithmetic operation, the result is also Not a Number, and that’s what the meme is referring to.

      • vrighter@discuss.tchncs.de · 1 point · 9 months ago

        I think the real explanation is simpler and more understandable.

        NaN is what you get when you do something illegal, like dividing zero by zero. There is no answer, but the operation has to produce something, so it gives you NaN, because the result is literally not a number.

  • dan@upvote.au · 2 points · 9 months ago

    Also applies to NULLs in SQL queries.

    It’s not fun tracing where NULLs are coming from when dealing with a 1500-line data warehouse pipeline query that aggregates 20 different tables.