In the landscape of threat intelligence, there are many challenges. Data format should not be one of them. However, it is.
As many already know, proprietary data formats cause both integration and scalability issues. In today's market of seemingly endless cybersecurity solutions, data interoperability is paramount, and we as an industry are spending too much time, money, and effort reinventing the wheel just to communicate across systems and products.
Because threat intelligence is complex by nature, consuming a threat data format will never be as simple as, say, pairing a Bluetooth device. However, it is in the industry's best interest to move rigorously toward a data format that can help vendors, governments, and sharing groups achieve data interoperability.
Enter, STIX. More specifically, STIX 2.1.
From the rigid XML of its early versions, STIX 2.1 is the product of a full specification migration to a cleaner JSON data format, making it easier for machines and humans alike to understand and consume threat data.
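To make the JSON format concrete, here is a sketch of a minimal STIX 2.1 Indicator object built by hand in Python. The identifier, timestamps, and IP address below are illustrative placeholders, and a real workflow would typically generate objects through a library such as OASIS's python-stix2 rather than raw dictionaries:

```python
import json

# A minimal STIX 2.1 Indicator with its required properties
# (type, spec_version, id, created, modified, pattern,
# pattern_type, valid_from). All values here are examples.
indicator = {
    "type": "indicator",
    "spec_version": "2.1",
    # STIX IDs are "<type>--<UUID>"; this UUID is a placeholder.
    "id": "indicator--d81f86b9-975b-4c0b-875e-810c5ad45a4f",
    "created": "2023-01-01T00:00:00.000Z",
    "modified": "2023-01-01T00:00:00.000Z",
    "name": "Known C2 address",
    # STIX patterning expression matching a single IPv4 address.
    "pattern": "[ipv4-addr:value = '198.51.100.1']",
    "pattern_type": "stix",
    "valid_from": "2023-01-01T00:00:00Z",
}

# Serialize to the JSON that would actually be shared.
print(json.dumps(indicator, indent=2))
```

Even this small example hints at the learning curve discussed below: the pattern syntax, ID conventions, and timestamp formats all carry spec-defined rules a producer must get right.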
At CTA, we were early adopters of STIX 2.0 and understand firsthand that the evolution of STIX has come at a cost. The spec is large and dynamic, leaving intel orchestration open to interpretation. This creates friction in two areas: first, in communication between subject matter experts and software engineers; second, among sharing groups.
Despite this, it appears STIX is becoming, at a minimum, loosely adopted. Sharing communities are doing their best to implement it, but to achieve more widespread adoption, I think the industry needs more vendors to make STIX part of their organic workflow. That is, a company's internal interoperability would rely on STIX, eliminating today's excessive data format conversions. Each of those conversions introduces expensive custom coding as well as opportunities for both implementation and operational issues.
With all of this in mind, we find ourselves at a crossroads. We have a viable format to adopt, but it comes with a steeper-than-average learning curve. That said, for a challenge this difficult, a steep learning curve should be expected. In my estimation, to close the gap between learning and implementing, we need better visual training and prototyping tools. I recently open-sourced a STIX 2.1 data modeling user interface to encourage just that.
These kinds of tools will certainly help, but change will not come overnight. It is going to take a concerted effort from the entire industry to make standardization work, but if we want to reduce the cost of operation, increase software interoperability, and eventually get in front of the learning curve, I believe it is imperative we start adopting data format standards sooner rather than later.
Author: Jason Minnick