When you visit a website, you can expose your computer to a lot more danger than you might think. Every site loads its own content; many also load ads served by an ad network, content pulled from other sites, or services hosted elsewhere. Often, you’re receiving a motley assortment of visible and invisible code.
Sounds like something you need to worry about only on shady or small sites, right? Wrong: A recent analysis by Menlo Security of the world’s most-visited websites shows nearly half still leave visitors open to vulnerable software, too much active content, and large amounts of code execution — in other words, a lot of potential danger. Ultimately, the researchers deemed 42% of the Alexa Top 100,000 “risky.”
Sites trusting other sites
The reasons for the “risky” rating also included things users can’t control at all: unpatched server software, known past malware infestations, a previous security breach, and the like. Beyond the visited site itself, the findings revealed that each site calls an average of 25 background sites to fetch various types of content.
That means that when you visit a website you presumably trust, you’re actually dealing with dozens of sites, most of which you’ve never even heard of.
A website serving content from other sources introduces a degree of risk, and that risk became much more significant once cybercriminals realized they could target those sources and use them to distribute malware. Your favorite news site might be upright and security-minded, but are all of its providers?
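Those dozens of background sites are easier to picture if you look at how a single page references outside origins. Here’s a rough sketch, using only Python’s standard library and a made-up example page (all domain names below are hypothetical), that lists the external hosts a page’s tags point at:

```python
# Sketch: listing the third-party origins a single page pulls content from.
# The HTML below is an invented example; a real check would fetch a live page.
from html.parser import HTMLParser
from urllib.parse import urlparse

class ThirdPartyFinder(HTMLParser):
    """Collects hostnames of external resources a page references."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.origins = set()

    def handle_starttag(self, tag, attrs):
        # "src" (scripts, images, iframes) and "href" (stylesheets) are
        # where most third-party content comes from.
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http"):
                host = urlparse(value).hostname
                if host and host != self.site_host:
                    self.origins.add(host)

SAMPLE_PAGE = """
<html><head>
  <link href="https://fonts.example-cdn.com/style.css" rel="stylesheet">
  <script src="https://ads.example-network.com/loader.js"></script>
</head><body>
  <img src="https://images.example-host.net/banner.png">
  <iframe src="https://widgets.example-partner.io/share.html"></iframe>
</body></html>
"""

finder = ThirdPartyFinder("news.example.com")
finder.feed(SAMPLE_PAGE)
print(sorted(finder.origins))
```

Even this tiny mock page touches four outside origins, and each of them runs code or serves content in your browser on the main site’s behalf.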
Vulnerable Web software
The report also notes that many of the world’s most popular websites don’t have to worry about their partners letting them down; they take care of that part just fine on their own, by running outdated server software. Some hadn’t been updated in years or even decades. Such sites are extremely vulnerable to malware and breaches, which in turn puts their visitors at risk.
If last year’s WannaCry outbreak taught the world anything, it’s that updating software promptly is important. Or did it?