A crawler is an application that implements a malware's communication protocol. We use crawlers to collect information on botnet infections.
Crawlers behave like regular bots: they connect to a botnet and collect data from its command-and-control (C&C) infrastructure or from other bots in the network.
Some botnets don't use a traditional C&C server; instead, every infected host acts as a server (a peer-to-peer topology). In these cases, the crawler attempts to connect to infected hosts directly using the malware protocol. Even when a particular host is unreachable, the crawler obtains lists of other known infected hosts from the peers it does reach. This means a botnet infection can sometimes be detected without any direct traffic between a customer IP and the crawler IP: multiple infected hosts report the customer IP as a peer in the botnet.
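The crawling process described above can be sketched as a breadth-first walk over peer lists. The sketch below is illustrative only: the peer-list responses are simulated in a dictionary (real crawlers speak the botnet's wire protocol), and all names and IPs are hypothetical, using reserved documentation address ranges.

```python
from collections import deque

# Simulated peer-exchange responses (hypothetical). In a real crawler,
# query_peers() would send a peer-list request over the malware protocol.
SIMULATED_PEER_LISTS = {
    "198.51.100.7": ["203.0.113.5", "192.0.2.44"],
    "203.0.113.5": ["198.51.100.7", "192.0.2.44", "203.0.113.99"],
    "192.0.2.44": [],  # unreachable host: never answers the crawler
    "203.0.113.99": ["198.51.100.7"],
}

def query_peers(ip):
    """Stand-in for a peer-exchange request; returns the peers a host reports."""
    return SIMULATED_PEER_LISTS.get(ip, [])

def crawl(seed_ips):
    """Breadth-first crawl of the peer-to-peer network.

    Every IP that appears in any peer list is recorded as infected,
    even if the crawler never gets a direct response from that host.
    """
    seen = set(seed_ips)
    queue = deque(seed_ips)
    while queue:
        ip = queue.popleft()
        for peer in query_peers(ip):
            if peer not in seen:
                seen.add(peer)
                queue.append(peer)
    return seen

infected = crawl(["198.51.100.7"])
print(sorted(infected))
```

Note that `192.0.2.44` never answers the crawler, yet it still ends up in the infected set because two other peers list it. That is exactly the indirect-detection case described above.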