According to my sources (which work in meters of path delay rather than nanoseconds; 1 ns of timing error corresponds to roughly 0.3 m), a consumer-grade single-frequency GPS receiver could see an instantaneous timing error of up to ~100 ns from atmospheric delay on a bad day at sea level. Averaging would reduce that substantially, as would simply waiting for a good day (~5 ns) or, of course, being well above sea level. A dual-frequency receiver would reduce the error dramatically (the ionospheric delay depends on frequency, so it can be estimated and largely removed), although it would not reduce it to zero. The GPS almanac (part of the broadcast GPS signal) includes some iono/tropo correction parameters, but I don't know how complete they are. Finally, working from memory, I believe there are services that publish measurement-based parameters for iono/tropo correction models, which can be used to post-process GPS data if you're not in a hurry.
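As a rough back-of-the-envelope sketch (the delay values below are representative guesses for illustration, not measurements), here is how the meters-to-nanoseconds conversion works out for those figures, and how the standard dual-frequency "ionosphere-free" pseudorange combination cancels the first-order ionospheric term:

```python
# Illustration only: converting atmospheric path delay (meters) into GPS
# timing error (nanoseconds), and removing first-order ionospheric delay
# with the standard dual-frequency ionosphere-free combination.
# All delay/range values below are assumed, representative numbers.

C = 299_792_458.0          # speed of light, m/s
F_L1 = 1575.42e6           # GPS L1 carrier frequency, Hz
F_L2 = 1227.60e6           # GPS L2 carrier frequency, Hz

def meters_to_ns(delay_m: float) -> float:
    """Convert a path delay in meters to a timing error in nanoseconds."""
    return delay_m / C * 1e9

# Representative one-way delays (assumed values):
iono_bad_day_m  = 30.0     # severe ionosphere, low elevation -> ~100 ns
iono_good_day_m = 1.5      # quiet ionosphere                 -> ~5 ns
tropo_zenith_m  = 2.5      # typical zenith tropospheric delay

print(f"bad-day iono : {meters_to_ns(iono_bad_day_m):6.1f} ns")
print(f"good-day iono: {meters_to_ns(iono_good_day_m):6.1f} ns")
print(f"troposphere  : {meters_to_ns(tropo_zenith_m):6.1f} ns")

# Dual-frequency trick: ionospheric delay scales as 1/f^2, so combining
# pseudoranges P1 (on L1) and P2 (on L2) cancels the first-order term:
#   P_IF = (f1^2 * P1 - f2^2 * P2) / (f1^2 - f2^2)
true_range_m = 21_000_000.0                              # assumed geometric range
p1 = true_range_m + iono_bad_day_m                       # L1 pseudorange
p2 = true_range_m + iono_bad_day_m * (F_L1 / F_L2) ** 2  # L2 sees more iono delay
p_if = (F_L1**2 * p1 - F_L2**2 * p2) / (F_L1**2 - F_L2**2)
print(f"residual after iono-free combination: {meters_to_ns(p_if - true_range_m):.3f} ns")
```

Note that this combination only addresses the ionosphere; the tropospheric delay is not frequency-dependent and has to be handled with a model or external data.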
So <1 ns of residual error due to atmospheric effects sounds very doable to me for a fixed site with a good budget and some time. Of course, that's <1 ns at the antenna of the GPS receiver; getting the time from there to your equipment is a separate problem.
ETA: The above was strictly for atmospheric effects. Of course, there are many sources of error and many mitigations.