Threat Hunting Power Up | Enhance Campaign Discovery With Validin and Synapse
Accelerate adversary tracking and reveal hidden infrastructure with our open-source Synapse Rapid Power-Up for Validin.
Tracking threat actor infrastructure has become increasingly complex. Modern adversaries rotate domains, reuse hosting, and replicate infrastructure templates across operations, making it difficult to connect isolated indicators to broader activity. Checking an IP, a domain, or a certificate in isolation can often return little of value when adversaries hide behind short-lived domains and churned TLS certificates.
As a result, analysts can struggle to see how infrastructure evolves over time or to identify shared traits like favicon hashes, header patterns, or registration overlaps that can link related assets.
To help address this, SentinelLABS is sharing a Synapse Rapid Power-Up for Validin. Developed in-house by SentinelLABS engineers, the sentinelone-validin power-up provides commands to query for and model DNS records, HTTP crawl data, TLS certificates, and WHOIS information, enabling analysts to quickly search, pivot through, and investigate network infrastructure for time-aware, cross-source analysis.
In this post, we explore two real-world case studies to demonstrate how an analyst can use the power-up to discover and expand their knowledge of threats.
Case Study 1 | LaundryBear APT: Body Hash Pivots
When Microsoft published indicators for LaundryBear (aka Void Blizzard), a Russian APT targeting NATO and Ukraine, the threat report included just three domains. Using the power-up’s HTTP body hash pivots, we can expand this seed set to over 30 related domains, revealing the full scope of the campaign’s infrastructure.
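Conceptually, this expansion is a breadth-first walk over shared content hashes: seed hosts yield body hashes, each hash yields every other host observed serving the same content, and the loop repeats until no new hosts appear. A toy Python model of that loop (the mapping dictionaries are hypothetical stand-ins for Validin crawl data, not a real API):

```python
def expand_seeds(seeds, host_to_hashes, hash_to_hosts, max_rounds=3):
    """Breadth-first expansion of a seed host set over shared body hashes.

    Each round: collect the hashes seen on the current frontier,
    then pull in every host that served any of those hashes.
    Stops early when a round discovers nothing new.
    """
    known = set(seeds)
    frontier = set(seeds)
    for _ in range(max_rounds):
        hashes = {h for host in frontier for h in host_to_hashes.get(host, ())}
        new = {host for h in hashes for host in hash_to_hosts.get(h, ())} - known
        if not new:
            break
        known |= new
        frontier = new
    return known

# Hypothetical crawl data: b.com shares a body hash with the seed,
# and c.com shares a second hash with b.com, so both are pulled in.
host_to_hashes = {"a.com": ["h1"], "b.com": ["h1", "h2"], "c.com": ["h2"]}
hash_to_hosts = {"h1": ["a.com", "b.com"], "h2": ["b.com", "c.com"]}
found = expand_seeds({"a.com"}, host_to_hashes, hash_to_hosts)
```

In the power-up this loop is what the `s1.validin.http.pivot` command performs against Validin's crawl database, with the counts from `--dry-run` guarding against over-expansion.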
Initial Enrichment of Known Indicators
We begin with the s1.validin.enrich command, which serves as a unified entry point for all Validin data sources. Rather than running separate commands for DNS history, HTTP crawls, certificates, and WHOIS records, this single command executes comprehensive enrichment across all four datasets simultaneously.
The resulting node graph immediately reveals initial pivot opportunities—shared nameservers in DNS records, certificate SAN relationships, registration timing patterns, and HTTP fingerprint clusters—providing multiple investigative paths forward.
This rapid reconnaissance phase surfaces the most promising leads before committing to expensive deep pivots, helping analysts choose the optimal next step based on what patterns emerge from the enriched graph.
// Tag the published spear-phishing domain
[inet:fqdn=<phishing domain> +#research.laundrybear.seed]
// Enrich the initial domain
inet:fqdn#research.laundrybear.seed | s1.validin.enrich --wildcard
// display all unique fqdns related to this seed
inet:fqdn#research.laundrybear.seed -> inet:fqdn | uniq
[Some of the resulting inet:fqdn nodes after initial workflow in Optic (Synapse UI)]
Pivoting from Crawlr Data
The Validin crawler (Crawlr) is a purpose-built, large-scale web crawler operated by Validin that continuously scans internet infrastructure. Querying Validin through the sentinelone-validin power-up provides access to pre-existing crawl observations, allowing instant analysis without active scanning.
The crawler data for our seed domains was already downloaded during the initial s1.validin.enrich command. This created inet:http:request nodes in Synapse containing multiple HTTP fingerprints stored as custom properties: body hashes (SHA1), favicon hashes (MD5), certificate fingerprints, banner hashes, and CSS class hashes.
Each fingerprint type serves as a pivot point: body hashes reveal identical content, favicon hashes expose shared branding, certificate fingerprints uncover SSL infrastructure, and class hashes detect configuration patterns. Together, these pivots transform the initial three seed domains into a comprehensive infrastructure map.
The query starts with our tagged seed domains, pivots to any related FQDNs discovered during enrichment, follows URL relationships, and lands on the actual HTTP request nodes captured by Validin’s crawler. Each inet:http:request node serves as a rich pivot point connecting to multiple content fingerprints and infrastructure properties.
// List all http requests to all the subdomains
inet:fqdn#research.laundrybear.seed -> inet:fqdn -> inet:url -> inet:http:request
[Group pivot in Optic helps to quickly summarize hashes across lifted inet:http:request nodes]
[Collapsed list of nodes yielded from the pivot]
HTTP Pivot Discovery
Validin’s Laundry Bear Infrastructure analysis identified synchronized HTTP responses across threat actor infrastructure. We can reach the same discovery using Storm’s HTTP pivot with statistical output.
When executing inet:http:request | s1.validin.http.pivot --dry-run, the command prints detailed occurrence statistics to the Storm console: how many times each HTTP fingerprint from the input HTTP responses (body SHA1, favicon MD5, banner hashes, certificate fingerprints, header patterns) appears in Validin’s database. For example, a body hash might appear on 21 IPs and 55 hostnames, a favicon hash might match 18 IPs and 52 hostnames, while a certificate fingerprint might match 15 IPs and 48 hostnames.
The size of these counts is the critical indicator. High counts in the thousands indicate benign infrastructure like CDNs and can be dismissed from consideration. Very low counts (1-5) suggest isolated infrastructure. However, when a particular hash appears in the Validin crawler database with a moderate count (15-55 hosts with the same hash fingerprint in this case), it can indicate synchronized infrastructure provisioning: the exact pattern that characterized Laundry Bear’s coordinated buildout. In short, the --dry-run flag transforms expensive full-graph pivots into rapid statistical reconnaissance.
// Collect all the hash:sha1 indicators gathered in the previous step
// and perform a "dry run" with s1.validin.http.pivot to check statistics
inet:fqdn#research.laundrybear.seed
-> inet:fqdn -> inet:url
-> inet:http:request -> hash:sha1 | uniq
| s1.validin.http.pivot --dry-run
[Storm console output indicates interesting pivot hashes]
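The count-based triage described above can be captured in a small helper. This is an illustrative Python sketch, not part of the power-up; the band boundaries come from this specific case study and would need tuning per investigation:

```python
def triage_pivot_count(hosts: int) -> str:
    """Rough triage of a --dry-run occurrence count:
    - thousands of hosts: shared benign infrastructure (CDNs), dismiss
    - 1-5 hosts: isolated infrastructure, weak pivot
    - a moderate band in between: candidate for synchronized
      provisioning, worth materializing with --yield
    Thresholds are case-specific, per the LaundryBear analysis.
    """
    if hosts >= 1000:
        return "dismiss"
    if hosts <= 5:
        return "isolated"
    return "pivot-candidate"

# The LaundryBear body hash seen on 55 hostnames lands in the
# moderate band that signals coordinated infrastructure buildout.
verdict = triage_pivot_count(55)
```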
Materializing and Summarizing Pivots in Synapse
After identifying promising body hash pivots through --dry-run statistics, we need to materialize the actual infrastructure and summarize the results. Consider this command:
// Materialize and summarize apex domains from a single pivot
hash:sha1=38c47d338a9c5ab7ccef7413edb7b2112bdfc56f
| s1.validin.http.pivot --yield
// pivot to apex domains
| +inet:fqdn -> inet:fqdn +:iszone=true | uniq
[Resulting inet:fqdn nodes from the http pivot]
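For readers working outside Synapse, the apex-domain collapse performed by the :iszone filter can be approximated in plain Python. This is a naive sketch using a last-two-labels heuristic; a proper implementation would consult the Public Suffix List, and the hostnames below are hypothetical:

```python
def naive_apex(fqdn: str) -> str:
    """Collapse a hostname to an approximate registrable apex by
    keeping the last two labels. Breaks on multi-label suffixes
    like .co.uk, which is why real tooling uses the Public
    Suffix List instead of this shortcut."""
    labels = fqdn.rstrip(".").lower().split(".")
    return ".".join(labels[-2:]) if len(labels) >= 2 else fqdn

# Deduplicate a pivot result set down to its apex domains.
hosts = ["mail.login-example.com", "login-example.com", "cdn.assets.example.net"]
apexes = sorted({naive_apex(h) for h in hosts})
```

Summarizing at the apex level is what turns dozens of churned subdomains into a short, reviewable list of registrations attributable to the campaign.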
[...]