I hope this is the right subforum for this topic.
The activity of @Andy Ful has prompted us to rethink how malware is downloaded to the test machines during our Advanced In The Wild Malware Test, so we are internally considering changes in this regard for future editions. This is only an idea for now, but I am curious about your opinions.
How do we now select malware for the test?
1. Download malware from honeypots and feeds.
- Scan with Yara rules.
- Scan with matching patterns.
- If neither matches, reject the sample and return to step 1 (a minimal sketch of this check follows after point B).
A. There is a problem with malware disappearing quickly - server status 404, 500, 503, etc. This is due to the short lifetime of URLs in the wild.
B. To solve this, we host the malware on our own server.
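To make the current selection step more concrete, here is a minimal sketch of the Yara/pattern check, assuming yara-python; the rule file, the byte pattern, and the function names are illustrative assumptions, not our production code:

```python
# A minimal sketch of the current selection check. The rule file, the byte
# pattern, and the function names are illustrative assumptions.
import hashlib
import re
from pathlib import Path

import yara  # yara-python

RULES = yara.compile(filepath="rules/malware_index.yar")  # assumed rule set
PATTERNS = [re.compile(rb"powershell\s+-enc", re.I)]      # assumed byte pattern


def is_malicious(data: bytes) -> bool:
    """Return True if the sample matches any Yara rule or any byte pattern."""
    if RULES.match(data=data):
        return True
    return any(p.search(data) for p in PATTERNS)


def select_sample(path: Path) -> str | None:
    """Accept a downloaded sample and return its SHA256, or reject it (None)."""
    data = path.read_bytes()
    if not is_malicious(data):
        return None  # reject the sample and return to step 1
    return hashlib.sha256(data).hexdigest()
```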
On the other hand, how can we download malware differently?
1. Prepare a list of URLs with malware in the wild.
- Download the malware from each URL to the Linux host and perform the Yara check and pattern matching. Save the SHA256, the original URL, and the server response code to the database.
- Reject the sample if the response code is 404, 500, 503, or otherwise inconsistent.
- Reject the sample if it is not harmful.
- Save the test result.
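For comparison, here is a minimal sketch of the proposed URL-based flow, assuming Python with requests, yara-python, and a simple SQLite table; the table layout, rule file, and library choices are assumptions for illustration, not the final implementation:

```python
# A minimal sketch of the proposed flow: fetch each in-the-wild URL, keep the
# sample only if the server still serves it and it looks harmful, then record
# the SHA256, original URL, and HTTP status. Schema and libraries are assumed.
import hashlib
import sqlite3

import requests
import yara  # yara-python

RULES = yara.compile(filepath="rules/malware_index.yar")  # assumed rule set

db = sqlite3.connect("samples.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS samples "
    "(sha256 TEXT PRIMARY KEY, original_source_url TEXT, http_status INTEGER)"
)


def fetch_and_store(url: str) -> bool:
    """Download one URL, validate the payload, and save it to the database."""
    try:
        resp = requests.get(url, timeout=30)
    except requests.RequestException:
        return False  # dead host: treated like a 404/500/503 rejection

    if resp.status_code != 200 or not resp.content:
        return False  # reject: 404, 500, 503 or other inconsistency

    if not RULES.match(data=resp.content):
        return False  # reject: the sample does not look harmful

    sha256 = hashlib.sha256(resp.content).hexdigest()
    db.execute(
        "INSERT OR IGNORE INTO samples VALUES (?, ?, ?)",
        (sha256, url, resp.status_code),
    )
    db.commit()
    return True
```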
Pros:
- Potential for even better replication of so-called real-world or in-the-wild tests.
- Additional information for each sample: original_source_URL, source_url_scraping
Cons:
- Fewer samples - I can't predict this in advance, because it depends on whether we can find good sources of malicious URLs. Let me remind you that our Dionaea honeypot is not suitable for this.
- Big changes to the backend, which means costs for implementation and performance testing.
Dear community, what do you think? Would this way of downloading malware be better? Does it matter to you? Please share additional ideas or tell us what we could still do better.
If implemented, the methodology will be updated accordingly and everything will be made public. For now, this is just an idea.