Why does Google Safe Browsing keep detecting malware on my website?

Using ClamAV, we scan every file for viruses after it is uploaded and before it can be downloaded. We also provide Windows and Mac clients. These files are all scanned and detected as virus-free. (If this is possible at all: even Dropbox and Google itself are listed there.) As Schroeder says, there may be other content on our site, not the files.

I've installed "Sophos Server Security" on the server in question and now see lots and lots of malicious uploads being deleted by Sophos. So the ClamAV detection rate seems to be a lot lower than that of Sophos. Hopefully, with the help of this new anti-virus solution, my server will never again serve as a virus source.

Probably because there is or was malware on this site. Unfortunately, these kinds of services get easily abused by anybody who likes to spread malware. Attackers are looking for services which are not (yet) on some kind of blacklist, so the chances are higher that their malware can reach the target. Services which allow larger files might be especially attractive, since lots of commercial firewall vendors severely limit the size of the files they scan and let everything else pass through.

While ClamAV is free, its detection rate is not very good compared to the better commercial antivirus solutions. One reason is that malware authors can easily tune their malware to bypass the detection algorithms ClamAV uses, since these algorithms are publicly known (open-source software). ClamAV also relies on the community to develop the product and keep it up to date, and it does not have the manpower and the access to new threats that commercial vendors have. But commercial vendors miss a lot of new malware too. To detect most of the malware, you are better off combining several engines and tuning them so that they prefer more false positives over letting some malware through.

You should also severely limit the types of files you allow for upload and disallow any types which typically contain malware. This includes any kind of executable file, but also office documents and PDF files. Since such limitations might easily make the service you provide unsuitable for several users, you might at least try to limit the impact of such files by requiring a manual action by the user before the file is downloaded, i.e. inform the user of the possible dangers and require some kind of captcha or a click on a checkbox to continue. This way your service cannot be used for drive-by downloads, because the content cannot be downloaded directly from the link but needs an explicit action by the user. And if this is done properly, robots will not see any remaining malware, since a manual action is required to get the file (but beware: some robots execute JavaScript and can click checkboxes).

Let us suppose your website is clean, as you say. There may be several reasons for this situation, and several scenarios you may think about. I am not going to be exhaustive, but I will briefly mention the milestones that come to mind.

Google has effectively found nothing wrong with your website by now, but it still does not trust you, because it cannot know your intentions: you may be someone who wants to attack your website's visitors. Google therefore needs time to be sure that you are not playing a game over its head, pretending for the moment that you are not attacking anybody, only to return to your nefarious habit(s) as soon as Google whitelists you. (That said, I am not accusing you of attacking people through your website; I am only explaining why Google may still not trust you even if your website is innocuous for the moment.) You need to scan more times, on separate days, using Google Webmaster Tools, until Google trusts that your change (clean-up) is permanent.
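The advice about combining several engines and preferring false positives can be sketched as follows. This is a minimal illustration, not a production scanner: the exact engine commands are assumptions (only `clamscan`'s interface and exit codes are real; 0 means clean, non-zero means a detection or an error), and treating any non-zero exit as a detection is the deliberately conservative choice the answer recommends.

```python
import subprocess

# Engine command lines are placeholders for whatever scanners you run.
# clamscan is real; any second engine's CLI here would be an assumption.
SCANNERS = [
    ["clamscan", "--no-summary"],
]

def scan_with(cmd, path):
    """Return True if this engine flags the file.

    clamscan exits 0 for clean, 1 for a detection, 2 on error.
    We treat *any* non-zero exit as a detection: better a false
    positive than letting malware through.
    """
    result = subprocess.run(cmd + [path], capture_output=True)
    return result.returncode != 0

def is_malicious(path, scanners=SCANNERS):
    # One engine flagging the file is enough to block the upload.
    return any(scan_with(cmd, path) for cmd in scanners)
```

In practice you would also log which engine fired, so that recurring false positives from one engine can be reviewed rather than silently blocking users.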
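Limiting the allowed upload types, as suggested above, usually means a whitelist rather than a blacklist, so that executables, office documents, and PDFs are rejected by default. A minimal sketch; the extension list is only an example policy, and a real service should also sniff the file content (e.g. with libmagic), since extensions are trivial to spoof.

```python
from pathlib import Path

# Example whitelist only; which types are acceptable is a policy decision.
ALLOWED_EXTENSIONS = {".txt", ".jpg", ".jpeg", ".png", ".gif", ".mp3", ".mp4"}

def upload_allowed(filename: str) -> bool:
    """Accept only explicitly whitelisted extensions.

    Anything not on the list (exe, docx, pdf, unknown, or no
    extension at all) is rejected by default.
    """
    return Path(filename).suffix.lower() in ALLOWED_EXTENSIONS
```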
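The "manual action before download" idea can be implemented with one-time tokens: the raw file URL never works on its own, and a token is only issued after the user confirms the warning (checkbox or captcha), so a bare link cannot be used for a drive-by download. A minimal sketch under my own naming; a real service would bind tokens to sessions and expire them.

```python
import secrets

# token -> file id; in production this would live in a session store.
_pending = {}

def issue_download_token(file_id: str) -> str:
    """Called only after the user ticks the warning checkbox
    or solves the captcha on the interstitial page."""
    token = secrets.token_urlsafe(16)
    _pending[token] = file_id
    return token

def redeem_token(token: str):
    """One-time redemption: the token is consumed on use, so a
    direct request without a freshly issued token returns None."""
    return _pending.pop(token, None)
```

Because the token is single-use, even a robot that harvests the final download URL cannot reuse it; only a robot that executes JavaScript and clicks the checkbox (the caveat noted above) would get through.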