
Ed McCabe - Putting the Intelligence back in Threat Intelligence

Apr 18, 2016

  1. Putting the Intelligence back in Threat Intel What the eye sees, the mind believes
  2. Co-Founder @ Rendition InfoSec Iowa Farm Boy Home Schooled until High School Geek in High School Veteran US Navy InfoSec Professional Talk too much Wasn't always jaded Live Tweet: @edwardmccabe #COISSA_INFOSEC_SUMMIT_2016 #UNSWEET_ICED_TEA_ROCKS Puppies, tickle fights, and walks on the beach Might be a bit controversial at times. speaker.getbackground()
  3. In order to have Threat Intelligence, you need to have “Intelligent” Analysis. BLUF
  4. What is Threat Intel?
  5. • Early Warning • Strategic Planning • Competitive Operations • Security & Counter Intelligence Uses of Threat Intelligence
  6. so. much. this
  7. frustrations
  8. What is threat intelligence analysis like?
  9. Not understanding threat intelligence Not understanding the threat actors Jumping to conclusions Not having the basics of InfoSec to begin with How to fail at Threat Intelligence
  10. Like other security practices… People Processes Technology
  11. TI: People
  12. TI: Processes
  13. TI: Technology
  14. Understanding the threat actors Nation-state Criminal Syndicates Hacktivists Terrorists Lone Gunmen You have zero influence over their agenda and whether or not they target you.
  15. NOT ALL THREATS ARE EQUAL
  16. • Requirements • Focus • Support • Understanding • Objectivity For successful Threat Intelligence Analysis you need…
  17. Requirements Leadership must specifically define what the requirements are:  What kind of troops  What guards they have  Their strength  How many boats Requirements need to be defined from the top down!
  18. Focus
  19. The Threat Intelligence Focal Point Operating Environment Raw Data Processed Information ACTIONABLE INTELLIGENCE Operations ANALYSTS ANALYSIS
  20. • Staff • Training & Education • Budget • Resources support
  21. Understanding
  22. Threat Actors
  23. Hollywood "Romanticized" Hacker Script Kiddies Lone Gunmen Hacktivists Criminal Syndicates Nation States Motivations
  24. Script Kiddies Lone Gunmen Hacktivists Criminal Syndicates Nation States Hollywood "Romanticized" Hacker Capabilities
  25. • Stop jumping to conclusions • Look at the evidence (those are facts) • Recognize you have cognitive biases Objectivity
  26. Cognitive Biases
  27. "Tell me what you know. Tell me what you don't know. Then tell me what you think. Always distinguish which is which." When it comes to intelligence analysis General Colin Powell
  28. Analysis matters!!!
  29. some fundamental science
  30. • Analysis of Competing Hypotheses • Diamond Model of Intrusion Analysis • Social Network Analysis • Sentiment Analysis • Kill Chain Analysis • Advocatus Diaboli (aka 10th Man Rule) • Red Teaming Analysis Methods
  31. • Our perception is biased towards interpreting information to fit existing expectations – The IP Address is from China, must be China • Reasoning is subjective – Heuristic psychology – Cognitive Biases – Confirmation Biases • People typically fail to generate hypotheses – Jump to conclusions • Fail to consider all of the evidence • Fail to focus on discrediting an alternate hypothesis Analysis of Competing Hypotheses (ACH)
  32. • Developed by Sergio Caltagirone, Andrew Pendergast, and Christopher Betz • Model for Intrusion Analysis • The model establishes a formal method applying scientific principles to intrusion analysis – particularly those of measurement, testability, and repeatability – providing a comprehensive method of activity documentation, synthesis, and correlation. But pretty much means you've already been p0wned Diamond Model of Intrusion Analysis Source: http://www.activedefense.org/ Victim Capability Infrastructure Adversary Technology Social Political
  33. Diamond Model
  34. Diamond Model in Action Victim Capability Infrastructure Adversary (1) Victim discovers malware (2) Malware contains C2 domain (3) C2 Domain resolves to C2 IP address (4) Firewall logs reveal further victims contacting C2 IP address (5) IP ownership details reveal adversary
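To make the pivot sequence on slide 34 concrete, here is a minimal sketch in Python of how each step adds one more vertex to the diamond. Every hostname, hash, IP address, and owner below is an invented placeholder, not a real indicator from the talk.

```python
# Illustrative sketch of the pivot sequence on slide 34, using made-up data.
# Each step enriches the diamond with one more vertex (victim -> capability ->
# infrastructure -> adversary); none of these values are real indicators.

# (1) Victim discovers malware: the sample and the host it was found on.
event = {"victim": "workstation-17.example.internal",
         "capability": {"sha256": "aa11...deadbeef"}}           # hypothetical hash

# (2) Static analysis of the sample yields an embedded C2 domain.
event["infrastructure"] = {"c2_domain": "update-check.example-bad.net"}

# (3) Passive DNS / resolution ties the domain to a C2 IP address.
passive_dns = {"update-check.example-bad.net": "203.0.113.45"}  # TEST-NET address
event["infrastructure"]["c2_ip"] = passive_dns[event["infrastructure"]["c2_domain"]]

# (4) Firewall logs reveal further internal hosts talking to that C2 IP.
firewall_log = [("10.0.0.8", "203.0.113.45"), ("10.0.0.21", "198.51.100.7"),
                ("10.0.0.30", "203.0.113.45")]
event["additional_victims"] = sorted({src for src, dst in firewall_log
                                      if dst == event["infrastructure"]["c2_ip"]})

# (5) Registration / ownership details for the C2 IP point at the adversary.
whois = {"203.0.113.45": "Example Threat Group"}                # hypothetical owner
event["adversary"] = whois[event["infrastructure"]["c2_ip"]]

print(event)
```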
  35. The process of investigating social structures through the use of network and graph theories. Visually represents data through Nodes and Edges. Social Network Analysis (SNA)
  36. SRC-DST Analysis – the slide lists four columns of raw SOURCE,DESTINATION IP pairs (e.g., 192.168.2.10,108.95.55.180; 192.168.2.108,192.168.2.218; 192.168.2.14,215.126.166.191; 192.168.2.2,192.168.2.231; 192.168.2.231,176.212.22.60; …) illustrating unprocessed connection-log data before any analysis has been applied.
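As an illustration of turning such SOURCE,DESTINATION pairs into a graph of nodes and edges, here is a short sketch using the networkx library (an assumption; any graph library would do). The pairs are a small subset of the slide's sample data; in practice you would read the full log.

```python
# Minimal sketch of the SRC-DST analysis: build a directed graph from the
# SOURCE,DESTINATION pairs and rank nodes by degree (as in the editor's note,
# where nodes are sized by degree). Requires the networkx package.
import csv
import io
import networkx as nx

raw = """SOURCE,DESTINATION
192.168.2.10,108.95.55.180
192.168.2.108,192.168.2.218
192.168.2.108,192.168.2.231
192.168.2.2,192.168.2.231
192.168.2.231,192.168.2.2
192.168.2.14,176.212.22.60
192.168.2.29,176.212.22.60
192.168.2.231,176.212.22.60
"""

G = nx.DiGraph()
for row in csv.DictReader(io.StringIO(raw)):
    G.add_edge(row["SOURCE"], row["DESTINATION"])

# Hosts with many connections stand out as pivot candidates (e.g. possible C2).
for node, degree in sorted(G.degree(), key=lambda nd: nd[1], reverse=True)[:5]:
    print(node, degree)
```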
  37. Trained Staff Threat Intelligence Analysts 1. don't grow on trees 2. develop skills over a period of years, not days 3. focus on both technical and non-technical threats 4. can quickly become overwhelmed
  38. Overload Quality Relevancy Trust Focus Challenges
  39. Strategic • Executive Reports • Campaign Analysis • Threat Assessments Tactical • Network Block lists • Firewall/Router Ruleset updates • AV/IDS/IPS Signature updates • Malware Analysis & Reverse Engineering Operational • Indicators of Compromise • C2 infrastructure • Hash sets Threat Intel Products
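As a minimal sketch of consuming the operational products listed on slide 39 (C2 indicators and hash sets), the following Python snippet matches local telemetry against a blocklist and a hash set. All indicator values are fabricated for illustration.

```python
# Illustrative sketch of consuming an "operational" threat intel product:
# matching local telemetry against a blocklist of C2 IPs and a hash set.
# All values are made-up placeholders, not real indicators.
c2_blocklist = {"203.0.113.45", "198.51.100.7"}         # hypothetical C2 IPs
known_bad_hashes = {"aa11deadbeef"}                      # hypothetical hash values

connections = [("10.0.0.8", "203.0.113.45"), ("10.0.0.9", "192.0.2.10")]
file_hashes = {"10.0.0.8": ["aa11deadbeef", "0123456789ab"]}

# Connections to known C2 infrastructure.
hits = [(src, dst) for src, dst in connections if dst in c2_blocklist]

# Hosts holding files whose hashes appear in the hash set.
infected = [host for host, hashes in file_hashes.items()
            if known_bad_hashes.intersection(hashes)]

print("C2 contacts:", hits)
print("Hosts with known-bad files:", infected)
```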
  40. • Challenge key assumptions • Identify and overcome mental mindsets • Structure substantive uncertainties • Generate alternatives • Reduce the chance of surprise Threat Analysts must remain objective
  41. You can’t always blame…
  42. …just because it walks like a duck… …and it sounds like a duck… Because sometimes…
  43. • Define specific requirements! • Focus • Provide appropriate support • Understand the threat actors • Remain objective • Share!!! How to succeed at Threat Intelligence
  44. And most importantly, a $VENDOR who is pushing lists of “Bad” is really just the Honey Boo Boo of the Threat Intel Industry Take away…
  45. How to get in touch with me Edward McCabe, CISM | CGEIT | CRISC | ISO/IEC 27K1 ISMS LI website: www.renditioninfosec.com – RSEC.US e-mail: edward@renditioninfosec.com Phone: (910) 382-4884 Twitter: @edwardmccabe Google Hangouts: ebmccabe Skype: edwardmccabe LinkedIn: https://www.linkedin.com/in/edwardbmccabe Unofficial Rendition Blog: malwarejake.blogspot.com

Editor's Notes

  1. Intelligence does not come from a tool, it comes from those who analyze information. Threat Intelligence’s primary purpose is to provide information to the business so they can make the most well-informed decisions regarding the risks and effects associated with threats
  2. Threat Intelligence means a lot of different things to a lot of different people. Executives tend to look at it as a way to prioritize threats and address perceived risk. Security Operations look at it as a way to hunt down known IOCs, respond to alerts, assess anomalies, and attribute incidents and events. Risk Management looks at it as a way to help develop their strategic risk priorities and determine what the impact could be. Vulnerability Management looks to identify new vulnerability information and compare it to their latest scan reports. Business Units may have OPSEC-related issues and make requests like "hey, I'm traveling to the Ukraine in two weeks, what should I be concerned about?" So what is it exactly? All of these, and more!!!
  3. Early Warning - Business opportunities - Current and future threats • Strategic Decision Making & Plans - Alliances & Acquisitions - Major capital expenditures - New businesses, markets and technology • Competitive Strategies & Operations - Strategies directed at specific competitors - Technology sourcing for product development - Supporting marketing, sales, and manufacturing • Counterintelligence & Security - Knowing what the competitors know about us - Protecting our information & intellectual property
  4. 1 - Selling lists of "Bad": lists come from products they have already sold you (their sensors are your logs). 2 - Cookie-cutter approach: "we are aggregating lots of raw data and processing it into lists for you to put into your tools" – TI's not about TOOLS!!!! 3 - Businesses unwilling to understand, or not understanding, what TI is: it's a cyber thing, it's done by the NSA, it's not a business concern. TI, when leveraged properly, supports RM efforts. Also, understand the threats; not all threats are equal, but most vendors will provide you their "Lists of Bad" (for a cost) without the contextual data required to allow you to form a decision. You have to educate yourself. 4 - One size doesn't fit all: they offer up these lists of bad without context and relation to your organization; why is it a threat to me? 5 - Often overpriced for what you're getting: one vendor offered a "Generic Threat Intelligence feed" for $800,000 USD per year; again, their sensors are your logs.
  5. It's like having a bunch of puzzle pieces without the box. Intelligence is knowledge and foreknowledge of the world around us, a prelude to management decisions and actions. It is the process by which information is systematically collected, analyzed, and disseminated as intelligence to users who can act on it.
  6. There is no silver bullet, no unicorns, no turnkey solutions.
  7. Core Threat Intelligence Staff – Operators, Analysts, Managers. Support Staff – Firewall, IDS/IPS, Centralized Logging, Incident Responders, Forensic Examiners/Investigators, Reverse Engineers. Consumers – Business Unit Managers / Risk Managers / Executive Leadership Team / Information Technology Operations / CxO Executives. Executives, I challenge you to learn about threat intelligence; you need to understand the threats so you can make decisions on how to mitigate risks! This means understanding the motivations of your adversary. You don't have to be a tech wiz to understand that you have something of value that others would have no problem taking from you if given the opportunity.
  8. There are seven core processes associated with Threat Intelligence; they need care and feeding, they need to be managed. You can't just snap your fingers and "Angels sing, champagne rains from the heavens, and Pegasus arrives to whisk you on to your next meeting." You have to work at it.
  9. Left: Sexy cyborg used a 3D printer to create a pair of high heels that concealed various implements to gain access to and exfil information. Right: Won one pod race, destroyed a franchise with midi-chlorians, became friends with Jar-Jar Binks, and was ultimately responsible for the deaths of billions with the destruction of Alderaan.
  10. Last year's Ashley Madison breach was interesting for us at Rendition. As with most highly visible data breaches, syndicates and miscreants will take advantage of the headlines. We notified our customers and colleagues if we saw their domains associated with AM. AM did not validate email addresses against account owners, BTW; I could have registered as POTUS and they'd have let me register. Now, are these the types of threats you should be concerned about?
  11. TL;DR – YES. The implications of being associated with AM, factual or not, could put an organization at risk. Of the 17 organizations we worked with, we collected more than 80 names that were tied to their domains. In follow-on discussions with those organizations, 12 reported back to us that they saw an increased rise in phishing attempts, all of which they had been able to block and prevent from ever getting to the recipients; four additional reported back that they saw targeted attacks against executives and senior leadership members who were not listed in the AM database we had.
  12. Threat Intelligence is a combination of identifying threats to your organization and mining for data: public & open source, vendor bulletins & alerts, your own logs.
  13. Develop and define your requirements: What threat actors target you? What are they after? What are your priorities? Do you know the risks to your organization?
  14. Requirements will give you the needed context to start putting the puzzle pieces together.
  15. They have to be able to focus in on threats, both at high and granular levels Focus should be given to answering the Who, Where, Why, When, What, and How of the defined intelligence requirements.
  16. Operating Environment consists of not only your “cyberness” but your geographical locale, business relationships, and support elements. Raw data is just that – raw data, in a variety of forms; paper, digital, verbal Processed information is raw data that has been turned into usable form. Tactical is short range, emphasis on the current operations. Outlines what must be done to be successful to counter an immediate, observed threat. Operations define the general conditions for success; incorporates how and what must be changed as part of routine processes and procedures. Strategic provides the overall direction to the organization, the formulation requires examining where the organization is currently, deciding what actions it should take or where it should go, and determining the best way on how to get it there.
  17. Focus starts to bring things into view
  18. If you're new to Threat Intelligence, you're going to need a lot of support. Commercially, threat intelligence is just taking off; right now there are only a handful of organizations teaching Threat Intelligence courses: the SANS Institute, CREST, TrainACE, and Rendition InfoSec.
  19. CAMPAIGN: typically thought of as a combination of TTPs, incident/events, and threat actors that will result in a specific, desired outcome of the adversary. COA: A response to an incident or preventative measure taken to mitigate the impact or otherwise hinder the operational capabilities of the adversary. THREAT ACTOR: Information about the adversary; summarized in characteristics, motivations, sophistication, desired goals, and TTPs. TTPs: Tactics, Techniques, and Procedures. Behavioral patterns of the adversary. Who their targets tend to be, attack patterns leveraged, malware employed, resources, personas, and supporting infrastructure. EXPLOIT TARGET: a vulnerability, weakness, or misconfiguration in software, systems, networks, process, procedures, or personnel that may be targeted for exploitation by an adversary. INCIDENT: Information about what occurred, the impact on systems and information, the incident timeline, points of contact, and other descriptive information. Incidents can be related to the threat actors, campaigns, courses of action that were taken or were suggested, indicators used to detect the incident or which were learned during incident response investigation; additionally TTPs used to carry out the attack, and observables captured during the incident.
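A rough sketch of how the constructs defined in the note above can be linked as records is shown below. The field names are invented for illustration only and do not follow any particular sharing schema.

```python
# Illustrative linkage of the constructs defined in the note above
# (campaign, course of action, threat actor, TTPs, exploit target, incident).
threat_actor = {"id": "actor-1", "name": "Example Threat Group",
                "motivation": "financial gain", "sophistication": "moderate"}

ttp = {"id": "ttp-1", "description": "spear-phishing with malicious attachment",
       "malware": "ExampleRAT", "infrastructure": ["update-check.example-bad.net"]}

exploit_target = {"id": "et-1",
                  "weakness": "unpatched document viewer on finance workstations"}

campaign = {"id": "campaign-1", "actors": ["actor-1"], "ttps": ["ttp-1"],
            "objective": "harvest payment credentials"}

course_of_action = {"id": "coa-1",
                    "action": "block C2 domain at proxy; patch document viewer"}

# An incident ties the other constructs together, as described in the note.
incident = {"id": "incident-1", "campaign": "campaign-1", "actors": ["actor-1"],
            "ttps": ["ttp-1"], "exploit_targets": ["et-1"],
            "courses_of_action": ["coa-1"],
            "indicators": ["203.0.113.45", "update-check.example-bad.net"]}

print(incident["id"], "->", incident["campaign"], incident["actors"])
```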
  20. Script Kiddies – curiosity/ego. Lone Gunmen – financial gain. Hacktivists – recognition for a cause. Criminal Syndicates – financial gain. Nation States – geopolitical gain, economic dominance.
  21. Script Kiddies – homemade labs, YouTube, Google, and social media can be your friends. Lone Gunmen – abuse of trust; they have access to sensitive information given their longevity; may or may not be technically skilled. Hacktivists – sheer volume and numbers. Criminal Syndicates – motivated, access to resources, skilled, development; tend to operate just like a business (just with a really bad retirement and termination program). Nation States – motivated, developing resources internally, skilled staff. Hollywood's Romanticized Hacker – knowledge of Albert Einstein, Stephen Hawking, Sir Isaac Newton, Dr. Sheldon Cooper, Gandalf, and Yoda all rolled into one, able to complete any hack in under 60 minutes, with built-in macros where you push F5 to hack the FBI, F6 to hack the CIA, and F10 to hack $ADVERSARY_OF_CHOICE – these are the baddest of the bad; they have skills that are unimaginable to mere mortals like you and me. PRAY YOU NEVER ARE UP AGAINST THESE TYPES!
  22. 8 different pictures Horse or Frog? Skull or two girls playing? Mind in gutter? Lion or Monkey?
  23. Identification of the problem; literature review; specifying the purpose; determining specific questions; conceptual framework (hypotheses); choice of a methodology for data collection; data collection; verifying data; analyzing and interpreting the data; reporting and evaluating; communicating the research findings and, possibly, recommendations.
  24. Originally developed in the 1970s at the U.S. Central Intelligence Agency by Richards J. Heuer, Jr. Designed to address problems with intuitive intelligence analysis that arise from human psychology by helping analysts reduce the cognitive limitations which make intelligence analysis difficult to perform. Heuristics are simple, efficient rules, learned or hard-coded by evolutionary processes, that have been proposed to explain how people make decisions, come to judgments, and solve problems, typically when facing complex problems or incomplete information. These rules work well under most circumstances, but in certain cases lead to systematic errors or cognitive biases. A cognitive bias is a pattern of deviation in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion. - Everyone at school is doing this! - Everyone else was speeding, so it was okay for me to speed as well. - They will never miss these office supplies. Confirmation bias, also called confirmatory bias or myside bias, is the tendency to search for, interpret, favor, and recall information in a way that confirms one's beliefs or hypotheses, while giving disproportionately less consideration to alternative possibilities.
  25. The value of ACH is that it guarantees an appropriate process for analysis. Evidence consists of artifacts or arguments where credibility and relevance are considered across the spectrum of hypotheses. Based on the evidence, the diagnosticity of each is ranked between Very Inconsistent (II), Inconsistent (I), Not Applicable (NA), Consistent (C), and Very Consistent (CC) as applied to each hypothetical situation. When considering the effects of a network compromise, based on both evidence and artifacts gathered, in order to assess the intent of the threat actor, multiple hypotheses were generated. In this example we considered that: 1) Nation state was targeting a specific individual 2) An adversary was attempting to expand their command & control capabilities, (grow their real-estate and capabilities) 3) Nation state was attempting to steal intellectual property 4) Someone had “fat fingered” a command on their end and it was all just a coincidence 5) There was an insider who was going to need a stern talking to after this Hypothesis – The first step of the process is to identify all potential hypotheses, preferably using a group of analysts with different perspectives to brainstorm the possibilities. The process discourages the analyst from choosing one "likely" hypothesis and using evidence to prove its accuracy. Cognitive bias is minimized when all possible hypotheses are considered. Evidence – The analyst then lists evidence and arguments (including assumptions and logical deductions) for and against each hypothesis. Diagnostics – Using a matrix, the analyst applies evidence against each hypothesis in an attempt to disprove as many theories as possible. Some evidence will have greater "diagnosticity" than other evidence — that is, some will be more helpful in judging the relative likelihood of alternative hypotheses. This step is the most important, according to Heuer. Instead of looking at one hypothesis and all the evidence ("working down" the matrix), the analyst is encouraged to consider one piece of evidence at a time, and examine it against all possible hypotheses ("working across" the matrix). Refinement – The analyst reviews the findings, identifies any gaps, and collects any additional evidence needed to refute as many of the remaining hypotheses as possible. Inconsistency – The analyst then seeks to draw tentative conclusions about the relative likelihood of each hypothesis. Less consistency implies a lower likelihood. The least consistent hypotheses are eliminated. While the matrix generates a definitive mathematical total for each hypothesis, the analyst must use their judgment to make the final conclusion. The result of the ACH analysis itself must not overrule analysts' own judgments. Sensitivity – The analyst tests the conclusions using sensitivity analysis, which weighs how the conclusion would be affected if key evidence or arguments were wrong, misleading, or subject to different interpretations. The validity of key evidence and the consistency of important arguments are double-checked to assure the soundness of the conclusion's linchpins and drivers. Conclusions and evaluation – Finally, the analyst provides the decision maker with his or her conclusions, as well as a summary of alternatives that were considered and why they were rejected. The analyst also identifies milestones in the process that can serve as indicators in future analyses.[1]
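A minimal sketch of the ACH matrix described in the note above, using the II/I/NA/C/CC ratings and the five hypotheses from the example. The evidence items and ratings are invented for illustration, and, consistent with the note's emphasis on refutation, only inconsistency counts against a hypothesis.

```python
# Minimal ACH matrix sketch: rate invented evidence against the note's five
# hypotheses and rank hypotheses by how little evidence contradicts them.
RATING_WEIGHT = {"II": -2, "I": -1, "NA": 0, "C": 0, "CC": 0}

hypotheses = [
    "H1: nation state targeting a specific individual",
    "H2: adversary expanding C2 real estate",
    "H3: nation state stealing intellectual property",
    "H4: coincidence / fat-fingered command",
    "H5: insider activity",
]

# matrix[evidence] = one rating per hypothesis, in the same order as above.
matrix = {
    "C2 beaconing on a fixed interval":        ["C",  "CC", "C",  "II", "I"],
    "No sensitive data staged or exfiltrated": ["I",  "C",  "II", "C",  "C"],
    "Activity outside local business hours":   ["C",  "C",  "C",  "NA", "II"],
}

scores = {h: 0 for h in hypotheses}
for ratings in matrix.values():
    for hypothesis, rating in zip(hypotheses, ratings):
        scores[hypothesis] += RATING_WEIGHT[rating]

# Least-negative score = fewest inconsistencies = hardest hypothesis to reject.
for hypothesis, score in sorted(scores.items(), key=lambda hs: hs[1], reverse=True):
    print(score, hypothesis)
```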
  26. Diamond Model Benefits Enables contextual and relationship-rich indicators improving cyber threat intelligence sharing and increasing the range of applicability of indicators Integrates information assurance and cyber threat intelligence through activity-attack graphs Improves analytic efficiency and effectiveness through easier identification of pivot opportunities and a simple conceptual method to generate new analytic questions Enhances analytic accuracy by enabling hypothesis generation, documentation, and testing, thereby applying more rigor to the analytic process Supports course of action development, planning/gaming, and mitigation strategies by integrating easily with almost any planning framework Strengthens cyber analysis tradecraft development by formalizing first principles upon which new concepts can be explored Identifies intelligence gaps through a phase-based approach and the inclusion of external resource requirements as a fundamental meta-feature Supports real-time event characterization by mapping the analytic process to well-understood classification and intrusion detection research Establishes the basis of cyber activity ontologies, taxonomies, cyber threat intelligence sharing protocols, and knowledge management The Diamond Model of intrusion analysis, comprising the core features of an intrusion event: adversary, capability, infrastructure, and victim. The core features are linked via edges to represent the fundamental relationships between the features which can be exploited analytically to further discover and develop knowledge of malicious activity. Adversary: Adversary Operator This is the actual "hacker" or person(s) conducting the intrusion activity. Adversary Customer This entity stands to benefit from the activity conducted in the intrusion. It may be the same as the adversary operator, or it may be a separate person or group. For example, a well-resourced adversary customer could at different times or simultaneously direct different operators, each with their own capabilities and infrastructure, to a common victim carrying out common or separate goals. To contrast, a lone adversary operator may have access to fewer capabilities and infrastructure points to carry out their activities while also lacking the ability to bypass simple mitigation. Cognizance of the motivations and resourcing of an adversary operator and their customer, if it exists as a separate entity, will assist in measuring the true threat and risk to the victim, resulting in more effective mitigation. Capability: The capability feature describes the tools and/or techniques of the adversary used in the event. The flexibility of the model allows the capability to be described in sufficient fidelity. Capability Capacity All of the vulnerabilities and exposures that can be utilized by the individual capability regardless of victim are considered its capacity. Adversary Arsenal An adversary's complete set of capabilities, and therefore the combined capacities of their individual capabilities, is the adversary's arsenal. C2 - Command and control (C2) is the exercise of authority and direction over assets by a commander. While command and control can take many forms, it is ultimately determined by the capability in use. Infrastructure: The infrastructure feature describes the physical and/or logical communication structures the adversary uses to deliver a capability, maintain control of capabilities (e.g., command and control/C2), and effect results from the victim (e.g., exfiltrate data).
Type 1 Infrastructure Infrastructure which is fully controlled or owned by the adversary, or to which they may be in physical proximity. Type 2 Infrastructure Infrastructure which is controlled by a (witting or unwitting) intermediary. Service Providers Organizations which (wittingly or unwittingly) provide services critical for availability of adversary Type 1 and Type 2 infrastructure (e.g., Internet Service Providers, domain registrars, web-mail providers). Victim: A victim is the target of the adversary and against whom vulnerabilities and exposures are exploited and capabilities used. Victim Persona Victim Personae are the people and organizations being targeted whose assets are being exploited and attacked. These include organization names, people's names, industries, job roles, interests, etc. Victim Asset Victim Assets are the attack surface and consist of the set of networks, systems, hosts, email addresses, IP addresses, social networking accounts, etc. against which the adversary directs their capabilities. Victim assets often exist both inside and outside a persona's control and visibility but are still available for targeting by an adversary. Common examples of this include webmail accounts and cloud-based data storage. A victim asset can be the end target (e.g., victim) in one event and then leveraged as infrastructure in further events (likely Type 2 Infrastructure as described previously in §4.3). In this way, one must always beware that the apparent target of activity may not necessarily be the victim.
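A minimal sketch of how an intrusion event might be captured along the four core features described above; the structure and values are illustrative placeholders, not the model's formal notation.

```python
# Minimal sketch of a Diamond Model event record reflecting the four core
# features described above; field values are invented placeholders.
from dataclasses import dataclass, field

@dataclass
class DiamondEvent:
    adversary: dict        # operator and (optional) customer
    capability: dict       # tool/technique, e.g. malware family and its capacity
    infrastructure: dict   # type 1 / type 2 infrastructure, service providers
    victim: dict           # persona and assets
    meta: dict = field(default_factory=dict)   # timestamp, phase, confidence, etc.

event = DiamondEvent(
    adversary={"operator": "unknown", "customer": "Example Threat Group"},
    capability={"malware": "ExampleRAT", "capacity": ["placeholder CVE identifier"]},
    infrastructure={"type2": ["update-check.example-bad.net"],
                    "service_provider": "bulletproof-host.example"},
    victim={"persona": "finance department", "assets": ["workstation-17"]},
    meta={"phase": "C2", "confidence": "medium"},
)
print(event)
```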
  27. Developed by Sergio Caltagirone, Andrew Pendergast, and Christopher Betz. The model establishes a formal method applying scientific principles to intrusion analysis – particularly those of measurement, testability, and repeatability – providing a comprehensive method of activity documentation, synthesis, and correlation. The model's benefits and core-feature definitions are covered in the previous note.
  28. It characterizes networked structures in terms of nodes (individual actors, people, or things within the network) and the ties or edges (relationships or interactions) that connect them. Social network analysis is used extensively in a wide range of applications and disciplines. Some common network analysis applications include data aggregation and mining, network propagation modeling, network modeling and sampling, user attribute and behavior analysis, community-maintained resource support, location-based interaction analysis, social sharing and filtering, recommender systems development, and link prediction and entity resolution.[39] In the private sector, businesses use social network analysis to support activities such as customer interaction and analysis, information system development analysis,[40] marketing, and business intelligence needs. Some public sector uses include development of leader engagement strategies, analysis of individual and group engagement and media use, and community-based problem solving. Social network analysis is also used in intelligence, counter-intelligence and law enforcement activities. This technique allows analysts to map a clandestine or covert organization such as an espionage ring, an organized crime family or a street gang. The National Security Agency (NSA) uses its clandestine mass electronic surveillance programs to generate the data needed to perform this type of analysis on terrorist cells and other networks deemed relevant to national security. The NSA looks up to three nodes deep during this network analysis.[41] After the initial mapping of the social network is complete, analysis is performed to determine the structure of the network and determine, for example, the leaders within the network.[42] This allows military or law enforcement assets to launch capture-or-kill decapitation attacks on the high-value targets in leadership positions to disrupt the functioning of the network. SNA allows analysts a way to explore connected data which, on the surface, may not be easily distinguishable. Using a dataset from http://www.uvic.ca/engineering/ece/isot/datasets/index.php, this graph shows botnet traffic between 5000 computers at the University of San Diego. Different colors were used to indicate different protocols. Nodes represent computers and were sized by degree. Edges represent packets, weighted by packet size. Image generated using KeyLines.
  29. Has to be familiar with not only your industry, but your specific environment. You are a unique snowflake, just like everyone else.
  30. A trained threat intelligence analyst, with clearly defined requirements, who is focused and objective, can leverage their skills to identify the threats; they can put together the pieces to visualize the threat
  31. But as a threat analyst you must always remain objective, processing facts as they become available to ensure that you’re not missing the bigger picture
  32. Having defined requirements ensures that the proper data is collected and processed into the information needed for analysis. PIRs are the most critical and specific; IRs cover general threats. PIRs and IRs should: Be in the form of a question. Focus on a specific fact, event or activity. Provide resulting intelligence required to support a single decision. Necessary: Is it necessary to answer this question? By answering this question, the intelligence analyst can trend the threat landscape the organization faced in the fourth quarter of 2013 and recommend actions that can be taken to better protect against that threat in the future. Achievable: Can we collect this information? The analyst should have access to the organization's case and incident management system(s) to collect the required data. Specific: Is the requirement specific enough? The requirement is limited to a timeframe and defined subject. Timely: Is the intelligence requirement timely? The analyst will be evaluating the preceding fiscal quarter's data with the results being applicable to the current quarter.