Over the past few weeks, I've noticed a couple of interesting changes in the tactics used by those responsible for the BlackHat SEO campaigns. First, though, let's track back to June, when one of the campaigns was documented.
The June write-up lists a few characteristics that could be used to identify these pages and rip them out of the index. So what's changed? Well, first and foremost, the .htm files have been changed to .php files;
Second, 2.js is now only included in the page if the HTTP_REFERER server var is a Google search URL, for example;
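The real check runs server-side in PHP, but the logic boils down to something like the following sketch (the function names and the exact referer pattern here are my guesses, not the campaign's actual code):

```javascript
// Hypothetical model of the referer gate: the payload <script> tag is
// only emitted when the Referer looks like a Google search results URL.
function cameFromGoogleSearch(referer) {
  return /^https?:\/\/(www\.)?google\.[a-z.]+\/search\?/.test(referer || "");
}

function pageBody(referer) {
  const inject = cameFromGoogleSearch(referer)
    ? '<script src="2.js"></script>' // payload served to search traffic only
    : ""; // direct visitors (and casual analysts) see a clean page
  return "<html><body>...spam content...</body>" + inject + "</html>";
}
```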
There are a few obvious reasons for this, the first being to prevent direct analysis. However, since the referer can be faked, there's nothing they can do to stop us from finding this. Additionally, the .js file can be accessed directly, so the check doesn't really help them either.
There's also a new campaign, however, with similar characteristics. I say similar because;
1. There's the obligatory .js file (in this case Bsrajp.js) that's only included if the referer points to Google
2. The .js file doesn't use the same method as before to obfuscate the code
3. We can no longer just load the URL the decoded script gives us, as it now requires the SEOREF (which should point to Google) and HTTP_REFERER (which points back to the site that loaded our .js file) vars to be properly populated and pointing at the correct referers.
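That gate is just as easy to satisfy from the analyst's side. As a rough sketch, assuming SEOREF travels as a query-string parameter and HTTP_REFERER is simply the ordinary Referer request header (both assumptions on my part, as are all the hostnames):

```javascript
// Build a request that satisfies both checks before pulling the payload.
const googleRef = "http://www.google.com/search?q=anything"; // fake search referer
const loaderSite = "http://compromised-site.example/page.php"; // hypothetical loader page

// Hypothetical payload URL; SEOREF supplied as a query parameter.
const payloadUrl = new URL("http://malicious-host.example/payload.php");
payloadUrl.searchParams.set("SEOREF", googleRef);

// Referer forged to point back at the site that "loaded" the .js file.
const requestOptions = { headers: { Referer: loaderSite } };

// fetch(payloadUrl, requestOptions).then(r => r.text()).then(console.log);
```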
The new obfuscation is extremely poor;
Which decodes to;
This gives us;
You can guess where this leads ....
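Obfuscation this weak usually amounts to nothing more than character codes (or escaped strings) fed back through the matching decoder, so reversing it means running the same one line the malware runs. A minimal hypothetical example of the class:

```javascript
// Payload stored as bare character codes; "decoding" is trivially
// reversible by anyone who can read the script.
const encoded = [100, 111, 99, 117, 109, 101, 110, 116, 46, 119, 114, 105, 116, 101];
const decoded = String.fromCharCode.apply(null, encoded);
// decoded === "document.write"
```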
quickstatistic.com = 184.108.40.206
securityscanavailable.com = 220.127.116.11
What these idiots don't seem to realize is that if the victim has to load the file, we can load and analyze it too.