GradIEEEnt half decent
(31 May 2023 at 22:12)
Well, if you are eagerly awaiting my video projects but somehow you think the first place to hear about them is this here monthly-updated blog-o-sphere, here it is:
GradIEEEnt Half Decent
As noted before, this one is definitely pretty esoteric; refreshing for some after the way-too-relatable Pac Tom video. As usual, I started loathing the project at the end, so it was heartwarming that people cared about the video at all, let alone enjoyed it. The video follows the paper pretty closely this time, which of course can be found in the epically long SIGBOVIK 2023 proceedings. I finally got my copy and have been working through it.
But, mostly this month has been a refractory period, compounded by the release of The Legend of Zelda: Tears of the Kingdom. I have been taking my time with this one, but as you have probably heard it is a large and good game and so that time has been considerable. Before that I finished off Lone Fungus; it was a good game, although I didn't get into it enough to want to complete all the optional astral fragments (the ones with the swinging spike balls are just irritating?). Ladybugs were enough!
I did work on new projects, of course. Not much to share about those, though. Both of the active ones are of the sort that "this might not work at all," which is kind of thrilling, but also kind of bad for me. So I need to balance it with some things that can definitely succeed but still scratch the project itch, like "make a nice CAD model of this thing even though a napkin sketch would suffice," or "do performance optimization on this library even though it doesn't need to be fast."
First!
Man, I know how you feel about scratching that project itch. I have done that with physical items and sorting so much that I am starting to go stir crazy with nothing to do. I am now planning to do my digital life before re-doing my physical life, etc. etc.
Also, just in case you, Tom7, are reading this, you should remove your Google+ link, or move it to an archive.
It occurs to me that if you represent true and false with the floating point numbers -0 and +0, then you can simulate any boolean operation with linear floating point operations. Add two numbers to simulate anding booleans (the sum is -0 only when both addends are -0; the default rounding mode makes (+0) + (-0) come out as +0), and multiply a number by the constant -1 to simulate negating a boolean. This won't help you in training neural nets, but it would be another way to make NaN gates.
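A minimal Python sketch of this comment's idea; note the convention here is true = -0.0 and false = +0.0, which makes IEEE 754 addition act as AND (with the opposite convention it acts as OR instead). The helper names are mine, not from the comment:

```python
import math

# Sign-of-zero booleans: true = -0.0, false = +0.0. Under IEEE 754
# round-to-nearest, (-0) + (-0) = -0 but any +0 operand gives +0,
# so addition is AND. Multiplying by -1 flips the sign bit, so it
# is NOT. Everything is a linear floating point operation.
TRUE, FALSE = -0.0, 0.0

def to_bool(z):
    # -0.0 == 0.0 compares equal, so inspect the sign bit instead.
    return math.copysign(1.0, z) < 0

def AND(a, b):
    return a + b          # -0 only when both operands are -0

def NOT(a):
    return -1.0 * a       # -0 <-> +0

def OR(a, b):
    # De Morgan's law, still built only from + and * by a constant.
    return NOT(AND(NOT(a), NOT(b)))

for a in (TRUE, FALSE):
    for b in (TRUE, FALSE):
        print(to_bool(a), to_bool(b),
              "AND:", to_bool(AND(a, b)),
              "OR:", to_bool(OR(a, b)))
```

Since AND and NOT together are functionally complete, this really does cover any boolean operation.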