After doing some additional research, I decided to get the ATN THOR 4 640 2.5-25x instead of the THOR 5 2-20x. The biggest reason is that the 12nm process node may not have the same dynamic range as the older 17nm process node. Take a look at this video from Pulsar as to why the 12nm process node may have some drawbacks.
Not sure where you picked up the term "process node"; it's clear you are referring to the size of each individual receptor on the thermal core, which is called the pixel pitch of the core. I got halfway through the video you linked before I had to stop: too much marketing BS and a few outright half-truths (lies). Claiming a 17 micron pitch core will always have more sensitivity is a total lie, absolutely untrue. When 12 micron cores first came out that was true most of the time, but not always, and it's no longer true at all. There are 12 micron cores with sensitivity (NETD) of < 20 mK; there may be a 17 micron core that good, but I have never heard of one. Compare the NETD spec of the cores, not the pixel pitch, when you're judging sensitivity.

Research and improvement follow the money, and 12 micron cores (and the next thing on the horizon) are where that money goes. 17 micron cores aren't ever going to get much better, because the R&D dollars all go to 12 micron cores now. And because of the wavelengths of light these devices use, that pitch isn't going to get smaller either: 12 microns is already slightly smaller than some of the wavelengths uncooled thermals use, so it can't shrink further without making compromises.

As far as the image presented on the view screen, that depends on the entire system, from the lens to the core to the chip's processing power, the software, and the display. Everything affects the final image.
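If you want to see the comparison that actually matters laid out, here's a quick sketch. The pitch and NETD numbers in it are made up purely for illustration; pull the real NETD off each manufacturer's spec sheet. The only real figure is the rough 8-14 micron band that uncooled thermals work in.

```python
# Rough sketch: rank cores by NETD, not pixel pitch. Spec numbers below are hypothetical.
LWIR_BAND_UM = (8.0, 14.0)  # uncooled thermals work in roughly the 8-14 micron band

cores = [
    {"name": "17 micron core", "pitch_um": 17.0, "netd_mk": 40.0},  # hypothetical specs
    {"name": "12 micron core", "pitch_um": 12.0, "netd_mk": 18.0},  # hypothetical specs
]

for core in cores:
    # Pitch vs. wavelength: 12 microns is already under the long end of the LWIR band,
    # which is why pitch isn't going to shrink much more without compromises.
    print(f"{core['name']}: pitch {core['pitch_um']} um, NETD {core['netd_mk']} mK, "
          f"smaller than the {LWIR_BAND_UM[1]} um end of the band: "
          f"{core['pitch_um'] < LWIR_BAND_UM[1]}")

# Sensitivity comes from NETD (lower is better), not from pitch.
most_sensitive = min(cores, key=lambda c: c["netd_mk"])
print("More sensitive core by NETD:", most_sensitive["name"])
```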
And like everyone else, the presenter kept harping on humidity, which is a totally useless number when talking about the thermal capability of any device. Water is the enemy, and humidity does not tell you how much water is between the thermal device and the target. Dewpoint tells you how much water by mass is in a given volume of air. Some of the best thermal images I have with my Trijicon are in near 100% humidity (gasp): the dewpoint was in the teens F, and needless to say so was the temperature. That same exact air, with every single molecule of water still in it, warmed to nearly 80 degrees F would be at less than 10% humidity. Humidity is BS when talking about thermal devices; the mass of water in a given volume of air is what causes problems. Have a dewpoint of 65 and it wouldn't matter if the temp was 66 or 100: the thermal image would be just as bad at both temps, since there would be the same mass of water in the same volume of air, and at a 65 dewpoint that is a lot of water. (There are some rough numbers on this below.)

If everything in two thermals were identical except the cores, the one with the lowest NETD could look better, if the difference was significant for the conditions. But everything is not always equal: different lenses, different chipsets, different software. A core with a much worse NETD could look better by far than one with a super-low NETD. It's all relative.
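To put rough numbers on the dewpoint point above, here is a quick Python sketch using the standard Magnus approximation for saturation vapor pressure. The temperatures are just examples standing in for the conditions I described, not measurements from any particular night.

```python
import math

def sat_vapor_pressure_hpa(t_c):
    """Magnus approximation for saturation vapor pressure over water (hPa), t_c in deg C."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def f_to_c(t_f):
    return (t_f - 32.0) * 5.0 / 9.0

def relative_humidity(temp_f, dewpoint_f):
    """Relative humidity (%) = actual vapor pressure (set by dewpoint) / saturation at air temp."""
    return 100.0 * sat_vapor_pressure_hpa(f_to_c(dewpoint_f)) / sat_vapor_pressure_hpa(f_to_c(temp_f))

def absolute_humidity_g_m3(temp_f, dewpoint_f):
    """Mass of water vapor per volume of air (g/m^3), via the ideal gas law for water vapor."""
    e_pa = 100.0 * sat_vapor_pressure_hpa(f_to_c(dewpoint_f))  # actual vapor pressure, Pa
    t_k = f_to_c(temp_f) + 273.15
    return 1000.0 * e_pa / (461.5 * t_k)  # R_v = 461.5 J/(kg*K)

# Winter air: dewpoint in the teens F, temp right at the dewpoint -> "100% humidity"
# but very little actual water per cubic meter.
print(relative_humidity(16, 15))        # roughly 96% RH
print(absolute_humidity_g_m3(16, 15))   # roughly 2.5 g/m^3

# Same moisture, warmed to 80 F -> humidity collapses, water mass barely changes.
print(relative_humidity(80, 15))        # roughly 9% RH
print(absolute_humidity_g_m3(80, 15))   # roughly 2.2 g/m^3

# Dewpoint of 65 F: a lot of water, whether the air is 66 F or 100 F.
print(absolute_humidity_g_m3(66, 65))   # roughly 15.6 g/m^3
print(absolute_humidity_g_m3(100, 65))  # roughly 14.7 g/m^3
```

Same story the numbers tell: the dewpoint, not the humidity percentage, is what tracks how much water is sitting between you and the target.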