Awesome work. Just a few observations from a neutral standpoint on the matter. These tests/comparisons don't really demonstrate inaccuracy in either device; in fact, they show that both temperature devices are reading VERY accurately, and almost equally to each other. The differences are explained below:
In tests 1-5, you're going to get divergent readings by default because of the physics involved and the base purpose of each thermometer. Tests 6 and 7 actually prove this with their wide margin, and I believe those two are the most appropriate tests for comparison of the group.
The thermometer systems:
Most dial thermometers are designed to measure only air temperature, and only in the general area where they're located. They can't measure the physical surface temperature of a basking spot, but when placed as close to the spot as possible while still allowing airflow (about 1/4" to 1/2" from the base of the spot), they will very accurately measure the air temperature of that area. That's important to know when combined with humidity. With dial thermometers, it's important to allow at least 1/4" of space all around so air can rise and fall past the thermal element. You should never lay one face down or belly up, because that traps heat inside, letting the plastic absorb heat and magnify it against the element (or preventing the element from heating up properly, depending on the orientation).
Probe thermometers are designed to measure very specific, pinpoint locations, typically by being pushed into something such as the substrate. They can measure air temperature, but unless they're shielded from the lamp's radiant heat, you're going to get inaccurate readings, because the probes are encapsulated in ABS or polymer plastic. The black plastic absorbs radiant heat and begins to "cook" the probe's sensor. It's sort of like being in Arizona in the summer: the blacktop will be boiling, whereas the concrete will be really hot but won't dissolve your shoes. Both will have different surface temperatures because they have different thermal capacities and reflect/absorb differently.
Here's a simple proof of concept: when the probe is reading 150°F like it was, pick it up by the probe end with your bare finger. I'll bet you can do it with absolute ease, and not get the second-degree burn that 150°F would normally give you. This demonstrates that it's not actually 150°F; rather, the thermistor inside the probe is being driven up by radiant heat. I've picked up probes that were sitting within a few inches of 100-watt bulbs and never had to drop one because my finger was burning.
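If you like numbers, here's a rough back-of-the-envelope sketch of why a dark probe settles well above air temperature under a lamp. All the figures here (irradiance, absorptivity, convection coefficient) are illustrative assumptions I picked for the example, not measurements from the OP's setup:

```python
# Rough steady-state energy balance for a dark plastic probe under a heat lamp.
# At equilibrium, absorbed radiant power equals convective loss to the air:
#     absorptivity * irradiance = h * (T_sensor - T_air)
# so the sensor settles above air temperature by (absorptivity * irradiance) / h.

def sensor_offset_f(irradiance_w_m2, absorptivity, h_w_m2k):
    """Steady-state reading offset above the true air temperature, in deg F."""
    offset_c = absorptivity * irradiance_w_m2 / h_w_m2k  # offset in Celsius degrees
    return offset_c * 9.0 / 5.0                          # convert to Fahrenheit degrees

# Assumed values: a dark probe (absorptivity ~0.9) about a foot under a basking
# bulb (~200 W/m^2 reaching the probe), weak natural convection (h ~ 10 W/m^2K).
offset = sensor_offset_f(200.0, 0.9, 10.0)
print(f"Probe reads roughly {offset:.0f} F above the actual air temperature")
```

With those assumed numbers the probe would read on the order of 30°F high, which is exactly the kind of gap the tests show: the sensor isn't lying, it's just reporting its own (radiantly heated) temperature.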
Dissecting the tests and providing explanations:
In the test with the probe laying flat or upside down, you'll note the readings are almost exactly the same regardless of orientation. I would expect this: the thermal probe is entirely exposed to the atmosphere and is enclosed equally in plastic, so orientation won't matter. The rate of absorption/reflection and the thermal capacity are going to be equal, so your temperature is, for the most part, stable. Stable doesn't mean "more accurate".
On the dial test where it's face down or face up, the temperatures recorded also show exactly what I would expect given the design and physics. The gauge is being used inappropriately. The elements are designed to be used vertically, allowing air to flow up and down past them. By placing it face down, you've caused hotter air to become trapped near the element and allowed the plastic to absorb heat, radiating it toward the element. Of course it's going to show a higher temperature than your face-up test, where the element is furthest from the heat source and thus "insulated" against the elements, as it were.
To further demonstrate my point, the face-down test with the sticky pad on the back produced lower temperatures than when the face was exposed. Why? Because the sticky pad was insulating the plastic from radiant heat, preventing buildup. It was still hotter than face-up, because air was being trapped against the elements; there was no lateral flow to push air through them.
Those tests on the dial are pretty much worthless at this point; they're invalid because they use the device inappropriately. It would be like testing a meat thermometer's accuracy by not putting it in the meat, but simply resting it on the hot metal frying pan. The data is going to be way, way off.
When the dial was attached to the tank in the bottom row of the first set of data, you'll see the temperature shift isn't much more than about 5% of the probe's test on the floor of the tank. This is what I would deem accurate: the dial is being used properly. The higher temperatures recorded by the probe are easily accounted for. It's absorbing radiant energy, which is cooking the thermal sensor inside; it's also about an inch further from the dial. You could place the probe right under the dial, so the probe portion isn't in direct receipt of radiant energy but can still measure the ambient air flowing into the dial. That would give you the closest possible reading between the two.
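For anyone wanting to put a number on "about 5%", a quick percent-difference check looks like this. The readings below are hypothetical examples in the same ballpark as the tests, not the OP's actual figures:

```python
# Percent difference between two thermometer readings, relative to their mean.
# Example readings are assumed for illustration, not taken from the test data.

def pct_diff(a, b):
    """Percent difference between readings a and b, relative to their average."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

dial, probe = 95.0, 99.0  # assumed example readings in deg F
print(f"Readings are {pct_diff(dial, probe):.1f}% apart")
```

Two readings of 95°F and 99°F land around 4% apart, which is the sort of agreement I'd call "accurate" for two properly used devices.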
The final test goes on to further establish what I'm saying. Of course there's a significant difference in temperature between the two: you're comparing apples to oranges. It's the physical temperature of a heat-laden object (the probe) versus the air temperature flowing around the dial. Considering that most 60-watt unfiltered incandescent bulbs will produce 150-160°F of radiant heat at 12", the probe is doing well to absorb and trap most of that energy. The dial gauge is not, and for good reason: air doesn't trap heat that well unless it's enclosed, which this isn't.
Like I said before, it's like standing on hot pavement when it's 100°F outside. You're not combusting, because the ambient air temperature is only 100°F. You're breathing in the hot air, like a dial gauge would. However, your shoes are screaming bloody murder. The pavement, like the temperature probe, is absorbing and trapping the majority of the radiant heat, probably exceeding 200°F. Important observation: this is also why, within the first 60 seconds, both methods show the exact same reading. Neither is laden with thermal energy yet; they're merely reporting the ambient temperature at that very instant.
That "instant" reading is the same in all tests except the one with the sticky pad, which is naturally going to skew the data. What this shows is that a dial and a digital probe will read within 1-2% of each other, hands-down, every time, provided they're being used properly.
Perhaps a better test:
If you can block the IR and radiant heat from soaking into the probe, you'll get a more accurate reading. In fact, if you truly want to compare the two, find a dial gauge that's a bit larger and stick the probe inside the back of it, so the dial gauge's plastic shields the probe while still allowing ambient airflow over both. I've done this, and there was next to no difference between a $10 digital, a $100 digital, and a $10 analog dial.
Alternatively, you could create a hide or two, one on the substrate and one much closer to your heat source, both almost fully enclosed. Place the dial on the glass inside the hide, then stick the digital probe to the side of the dial gauge. I'd almost guarantee both stay within 1-5% of each other at all times, unless there's a horrible QA flaw in one (or both).
Probably the easiest test would be to take both outside onto a porch. Hang the probe and the dial gauge on the wall, next to each other. You'll see they're similar; you won't see one showing 180°F and another showing 90°F.
What's also missing is an initial calibration check, comparing both the digital and dial gauges against a device of known, verified accuracy. That would truly be awesome.
Important observation: there's a reason a lot of folks prefer digital thermometers in many areas of science: they provide accurate, INSTANTANEOUS readings. You don't have to let them sit and adjust. If you let one sit there and the temperature begins trending up or down, it's because the probe itself is absorbing or expelling thermal energy due to exposure to its environment. The only exception is when the probe's physical temperature is much different from the temperature of what's being measured (i.e., the probe was sitting on a shelf whose ambient temperature was 80°, and you want to measure substrate that's most likely 110°; you'll want to wait a few moments for the probe to catch up). Not so if the probe is already submerged. Analog gauges are the ones that have to "sit" for a few moments, because their metallic thermal coils have to adjust to the temperature around them. In the 21st century we've come a long way in manufacturing dial gauges with analog elements; they're much more accurate than in the past. 20-40° differences make no sense. 1-5 degrees, maybe.
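That "catching up" behavior is just a first-order lag, and you can sketch it with the classic exponential-approach formula. The time constant below (30 seconds) is my guess for a small encapsulated thermistor, not a spec from any datasheet:

```python
import math

# First-order lag model of a probe adjusting to its surroundings:
#     T(t) = T_env + (T_start - T_env) * exp(-t / tau)
# tau is the probe's time constant; 30 s is an assumed, plausible value.

def probe_reading(t_seconds, t_start=80.0, t_env=110.0, tau=30.0):
    """Probe reading (deg F) t seconds after moving it from shelf to substrate."""
    return t_env + (t_start - t_env) * math.exp(-t_seconds / tau)

for t in (0, 30, 60, 120):
    print(f"t={t:3d}s  reading={probe_reading(t):.1f} F")
```

With those assumed numbers, the probe covers most of the 80° to 110° gap within a minute or two, which matches the "wait a few moments" advice above. An analog coil behaves the same way, just typically with a longer time constant.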
In summary:
Dial gauges - ambient air temperature monitors. Place them RIGHT where you want the air temperature measured (for instance, where the animal's head is, where it breathes the most).
Probe gauges - meant for placing inside hides, cracks, or mediums such as substrate or water. Not for use where the heating device has a direct line of sight to the probe.
IR gauges - for surface temperatures; can also be used for water and air (if yours is so equipped).
As a side note, a probe gauge set on or in substrate should read almost the same as an IR measurement of that location, unless there's a disparity in the materials. Something else to try: use an IR device to check the surface temperature of your digital probe. If the IR reading differs from your digital probe by more than a few tenths of a degree, you're seeing the thermal entrapment in action on the probe.
For basking spots, use IR temperature-checking devices. Cheap ones can be had for $20 and are as accurate as any similarly-priced digital device; more expensive ones just add decimal places of precision. I've got a $15 one and have compared it to a four-figure one used for engine work. They both read the same to the first decimal place.
I hope I've come across as helpful in an analytical sense. I in no way mean any offense to the OP, or to take away from his efforts. I merely looked at the experiment and noted it was done improperly, without taking into account how the two systems work and what they're meant for.
Both the OP and myself are looking out for the health of the animals, and the further education of those in the hobby. I just want to help everyone understand the physics and science behind the situation.