Crawlers

Jessica

A crawler is an application built on a malware family's communication protocol. We use crawlers to collect information on botnet infections. Crawlers behave like regular bots: they connect to botnets and collect data from the command and control (C&C) infrastructure or from other bots in the network.

Some botnets do not use a traditional C&C server; instead, each infected host acts as a server. In these cases, a crawler attempts to connect to each infected host using the malware's protocol. Even when the crawler cannot reach a given host directly, it obtains information about that host from other known-infected peers. This means a botnet infection can sometimes be detected without any direct traffic between a customer IP and the crawler IP: multiple infected hosts independently confirm that the customer IP is also a peer in the botnet.

Associated Risk Vectors

Botnet Infections

March 28, 2022: Published.

Related articles

Data Collection Methods Overview
How is the Botnet Infections Risk Vector Observed?
Requesting a New Vulnerability Threat Research Process
Understanding and Troubleshooting Web Application Security Scanning
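The peer-discovery behavior described above can be sketched as a breadth-first crawl over peer lists. This is a minimal illustrative simulation, not our actual crawler: the network layer is replaced by a hypothetical `PEER_LISTS` table mapping each reachable peer to the peers it reports, and all IPs are invented. It shows how a host the crawler never reaches directly can still be identified as infected because multiple peers name it.

```python
from collections import deque

# Hypothetical peer-list responses. Hosts missing from this table never
# answer the crawler directly (e.g. offline or behind NAT), but other
# infected peers may still report them.
PEER_LISTS = {
    "10.0.0.1": ["10.0.0.2", "10.0.0.3"],
    "10.0.0.2": ["10.0.0.1", "10.0.0.9"],
    "10.0.0.3": ["10.0.0.9"],
}

def crawl(seed):
    """Breadth-first crawl of a P2P botnet from one seed peer.

    Returns (visited, reported_by): `visited` is every host the crawler
    attempted to contact; `reported_by` counts how many distinct peers
    named each host -- evidence of infection even for hosts that never
    answered the crawler themselves.
    """
    visited = set()
    reported_by = {}
    queue = deque([seed])
    while queue:
        host = queue.popleft()
        if host in visited:
            continue
        visited.add(host)
        # Unreachable hosts return no peer list; the crawl moves on.
        for peer in PEER_LISTS.get(host, []):
            reported_by[peer] = reported_by.get(peer, 0) + 1
            queue.append(peer)
    return visited, reported_by

visited, reported = crawl("10.0.0.1")
# 10.0.0.9 never responds to the crawler, yet two separate peers
# report it as a member of the botnet:
print(reported["10.0.0.9"])  # -> 2
```

In this sketch, corroboration from multiple peers (`reported_by` counts greater than one) stands in for the multi-host confirmation described above; a production crawler would additionally speak the malware's wire protocol and validate responses.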