krelnik
Graduate Poster
Did I miss a memo?
I've noticed that just in the last six months, several major websites have made the same change to the way their pages are structured. Each has moved its JavaScript files from the same domain as the page itself to a differently named domain.
Now, there is a well-known trick of using different hostnames for the subcomponents of a page. For instance, www.example.com hosts your HTML, images go on images.example.com, and JavaScript goes on scripts.example.com. This works around the limit of two simultaneous connections per server that many browsers enforce (following the HTTP/1.1 spec's recommendation); a sketch of that setup is below. That is not what I am talking about.
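To illustrate the subdomain trick, here is a minimal sketch (the example.com hostnames are placeholders, not taken from any real site). Each hostname gets its own pool of browser connections, so the assets can download in parallel:

    <!-- HTML served from www.example.com -->
    <img src="http://images.example.com/logo.png" alt="logo">
    <script type="text/javascript"
            src="http://scripts.example.com/site.js"></script>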
What I am talking about is hosting JavaScript on an entirely different domain. Let me be specific. Just in the last six months:
youtube.com has moved its .JS files to ytimg.com
cnn.com has moved its .JS files to cdn.turner.com
weather.com has moved its .JS files to j.imwx.com
What's odd about this is that each of them has its own specially registered domain for this purpose, and it's not the domain of a content delivery network or other infrastructure provider. In fact, if you try to load the home page off most of those domains, it usually just redirects back to the company's main domain.
Now you are probably wondering, why does krelnik care about this? Well, you have to know that I am a security nerd, having worked at several computer security companies in the past. As a result, I have a deep, abiding fear of the havoc that JavaScript can wreak on your system. It is the primary delivery vector for malware at this time.
Because of that, I follow the practice of whitelisting the sites on which I will allow JavaScript (and other active content such as Java) to run. So to make a site like cnn.com work, I have to manually put *.cnn.com into a list. It's a pain in the behind, but I prefer it to the alternative.
When sites used things like scripts.cnn.com to get around the two-connection limit, the wildcard covered it and all was well.
Now I have discovered that (as of 2008) this is not enough. I have to do View Source on each page I want to whitelist and figure out what "secret" domain that site is using to host its JavaScript (and sometimes CSS).
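For anyone else stuck doing the same digging, here is a rough console snippet that lists the distinct hostnames a page loads scripts from; paste it into a JavaScript console such as Firebug's on the page you want to whitelist. It is just a convenience sketch of my workflow, not anything the sites themselves provide:

    // List every distinct hostname this page pulls external scripts from.
    var hosts = {};
    var scripts = document.getElementsByTagName('script');
    for (var i = 0; i < scripts.length; i++) {
        if (scripts[i].src) {
            var a = document.createElement('a');
            a.href = scripts[i].src;    // let the browser parse the URL
            hosts[a.hostname] = true;   // object keys dedupe the hostnames
        }
    }
    for (var h in hosts) {
        console.log(h);                 // e.g. "ytimg.com" per the list above
    }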
Can anyone explain to me why all these sites started doing this?