Fishreel Lessons Learned H4D Stanford 2016
The NSA has neither reviewed nor confirmed these slides.

Hacking for Defense - National Security Agency
Many interviews later... (110)
Gabbi Fisher, B.S. Computer Science '17
Maya Israni, B.S. Computer Science '17
Chris Gomes, M.B.A. '17
Travis Noll, M.B.A. '17
Our new problem statement: Fishreel uses publicly available social media data to provide information about potentially anomalous personas to government and commercial entities.

Our first problem statement: Fishreel automates the detection of catfishing attempts to produce insights useful to catfishing targets and entities across the intelligence community (IC).
Who is team fishreel?

Gabbi Fisher - B.S. Computer Science 2017
- LinkedIn: www.linkedin.com/in/gabbifish
- Team SME: Yes (Engineering, Data Analysis)
- How the expertise fits the problem: experience with data scraping and analytics
- Solving a problem that seemed impossible: coordinated the rollout of an insider threat detection tool, still in minimum viable product (MVP) state, among international clients over a 13-week timeframe

Chris Gomes - M.B.A. 2017
- LinkedIn: https://www.linkedin.com/in/gomeschris
- Team SME: No (Customer Success, Product Strategy)
- How the expertise fits the problem: previous Executive Branch experience in national security policy/implementation; 3 years at DoJ; statistical programming; management consulting
- Solving a problem that seemed impossible: led 120 NGO staff members in 6-month workshops to identify pain points and solicit input for a 5-year strategic plan

Maya Israni - B.S. Computer Science 2017
- LinkedIn: https://www.linkedin.com/in/mayaisrani
- Team SME: No (Machine Learning, Artificial Intelligence, Product Design)
- How the expertise fits the problem: software engineering and ML/AI experience at Google; product/user testing experience at GoldieBlox
- Solving a problem that seemed impossible: at GoldieBlox, helped develop a prototype into a mass-produced toy on the shelves of Toys 'R' Us; saw the company grow from a one-person startup to a well-established company

Travis Noll - M.B.A. 2017
- LinkedIn: https://www.linkedin.com/in/travisalexandernoll
- Team SME: No (Market research, Simulation software)
- How the expertise fits the problem: conducted 100+ market research interviews; built simulation software as a consultant
- Solving a problem that seemed impossible: turned around a small retail store that was bleeding cash and achieved its most profitable quarter on record
What is the national security concern we're helping to solve?
Bad people make heavy use of social media. Sometimes they make themselves obvious. Sometimes not.
Our original hypothesis was that the NSA wanted to send us data on ISIS and help them mitigate recruitment. The original Mission Model Canvas heavily emphasized ISIS, recruitment, accessing NSA data, and mitigation.

Through 100+ interviews, we came to understand a broader problem for both the NSA and other beneficiaries: a need to better understand who or what is behind various social media accounts. The new Mission Model Canvas identifies multiple beneficiaries and NSA mission achievement around entity insight.
Tracing (1) problem understanding, (2) proposed solution, and (3) agency relationship over 10 weeks:

Checkpoint 1: Pushed agency for ISIS data, won our first award.
"If you haven't gotten thrown out yet, you're not trying hard enough": We came out swinging.
Week Zero (~Spring Break): Our initial hypothesis was that we would get a supervised training set of sensitive data on which to build a model. Instead, we would be pulling our own unsupervised data from publicly available sources (meaning if we wanted to train a model on recorded instances of catfishing, we'd have to code them ourselves...).

Steve dropped a major clue before day one of the class about where our product would go... but it would be a while before we came around to this point ourselves.
Checkpoint 2: Push to build solution before understanding problem, AKA the #1 way to get on Steve's hit list.
Week 1: We were holding out hope that NSA would still send us *some* data (spoiler alert: we're still waiting).

We had only heard that we were not getting *supervised* data (i.e., data that is pre-coded for instances of catfishing). We were still holding out hope that our sponsor would send us at least an unsupervised data set... (slide from week 1 presentation)

We hadn't yet learned the difference between hearing and listening... More importantly to Steve, we were worried about data and technical solutions before we had even understood the problem.

Steve Blank: "FULL STOP!"
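The supervised/unsupervised distinction that tripped us up can be made concrete with a toy sketch. This is purely illustrative; the field names, features, and labels here are invented for the example and are not the team's or the agency's actual data:

```python
# Illustrative only: the difference between the supervised data we hoped for
# and the unlabeled (unsupervised) data we could actually scrape.

# A *supervised* set arrives pre-coded: each account already carries a label.
supervised_sample = [
    {"user_id": "acct_001", "posts_per_day": 48.0, "label": "bot"},
    {"user_id": "acct_002", "posts_per_day": 2.5,  "label": "real"},
]

# Publicly scraped data is *unsupervised*: same features, no labels attached.
unsupervised_sample = [
    {"user_id": "acct_101", "posts_per_day": 51.0},
    {"user_id": "acct_102", "posts_per_day": 1.8},
]

def hand_code(account, label):
    """Manually attach a label, turning one unlabeled record into training data."""
    return {**account, "label": label}

# "We'd have to code them ourselves": a person labels each scraped account
# before any supervised model can be trained on it.
training_set = [
    hand_code(unsupervised_sample[0], "bot"),
    hand_code(unsupervised_sample[1], "real"),
]
```

The cost hiding in that last step is the whole point: hand-coding scales with analyst hours, which is why getting pre-labeled data mattered so much to us in week one.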
Checkpoint 3: Hit it off early with sponsors; they quickly lined up analysts to help us understand the problem. But it took a while to understand how to have productive convos...
Weeks 2-3: Sponsors were immediately helpful in setting up calls with analysts, though communication was hard at first.

The challenge:
"What is the setup of your work station?"
"Is it bigger than a breadbox?"
"Do you, maybe, like, have a computer?"
. . .
"I would describe that as a not unreasonable assumption."
Sponsors encouraged us to ask lots of questions, constantly improving our understanding.

The process:
- INTERVIEWS: Over 50 interviews with NSA employees and many more with members of the intel community.
- DRAWINGS: "Is this what it looks like?"
- TRIAL AND ERROR: Mapping sponsor org and roles, getting it wrong, going back to the drawing board.
- QUESTIONS: Sponsors joined our Slack channel, constantly available to help.
The results: A day in the life of an agency analyst. The major pain point is manually digging into "robot-human-liar."

- Receive docket on desk as part of a broader investigation.
- Part of the docket is the "robot-human-liar" test: is this identity tag tied to a bot, an individual telling the truth, or an individual manipulating you?
- Begin manually investigating userIDs.
- Search through one data set; print / read all data.
- Search through a second data set; print / read all data. (Repeat for all data sources, and for all userIDs with redundant searches: 2 days to 4 months.)
- Continue until confidence is extremely high: "If we get it wrong in the report, it's on us. It would be a big problem. Like a doctor diagnosing the wrong illness."
- Identity analysis takes the form of a section in a report going back to the inquiring organization.
- LARP on weekends / Magic: The Gathering at lunch.

Takeaways:
- Monotonous, manual, slow: a Sisyphean task.
- Machine learning can add insight in addition to speed / automation.
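To make the "robot-human-liar" triage concrete, here is a deliberately toy sketch of what automating one pass of it could look like. The two features and both thresholds are hypothetical, chosen only to illustrate the three-way decision; they are not the team's actual classifier:

```python
# Toy "robot-human-liar" triage over two hypothetical account signals.
# Not the team's real model or features; for illustration only.

def robot_human_liar(posts_per_hour: float, profile_consistency: float) -> str:
    """Label an account 'bot', 'liar', or 'human' from two toy signals.

    posts_per_hour: sustained posting rate (automation posts inhumanly fast).
    profile_consistency: 0-1 agreement of claimed identity details across
    sources (deceptive personas tend to be internally inconsistent).
    """
    if posts_per_hour > 20:          # inhuman sustained volume -> automation
        return "bot"
    if profile_consistency < 0.5:    # identity details don't line up -> deception
        return "liar"
    return "human"                   # otherwise, plausibly a truthful individual

# One account checked against each rule in turn:
verdict = robot_human_liar(0.4, 0.9)   # low volume, consistent profile
```

Even a crude first pass like this points at the value proposition: it replaces some of the print-and-read loop, leaving the analyst to spend time only on accounts the rules can't settle.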
Developed understanding of the value proposition for intelligence analysts after getting in their heads. (Weeks 2-3; specific agency orgs have been masked.)

Beneficiary: A/B Intel Analysts
Demographics: average 3-10 years at the Agency; majority male (70/30 split); technical background.

Customer Jobs: produce intelligence for reports (internally or on behalf of other agencies).
Pains: using publicly available data, work/confirmation tends to be manual; false positives (short-run bad); false negatives (long-run bad); data overload, with no real-time solution; information overload, can't understand model output; timeliness, as the manual nature makes "responsive" research more difficult.
Gains: target research; incremental progress on understanding entities; understanding adversary strategy; detecting false identities & bots.

Products & Services: account consistency tools; account linking tools; author classification.
Gain Creators: identity matching across platforms; "consistency" scoring on each platform and across the platforms; classify catfishing accurately (4 categories).
Pain Relievers: make efficient use of new data for the benefit of rapid updates; synthesize output into easily understood, non-static reports; weigh false positives and false negatives appropriately.
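The "consistency" scoring idea in the canvas above can be sketched in a few lines. The profile fields and the two example profiles are invented for illustration, and this naive field-agreement fraction is an assumption of ours, not the team's actual scoring method:

```python
# Toy cross-platform "consistency" score: the fraction of identity fields
# present on both profiles that hold the same value. Hypothetical fields;
# not the team's actual method.

def consistency_score(profile_a: dict, profile_b: dict) -> float:
    """Return the share of shared fields whose values agree (0.0 to 1.0)."""
    shared = set(profile_a) & set(profile_b)
    if not shared:
        return 0.0                       # nothing to compare
    matches = sum(profile_a[f] == profile_b[f] for f in shared)
    return matches / len(shared)

# Two profiles claiming to be the same persona; location disagrees.
twitter_profile  = {"name": "Sam Roe", "location": "Austin", "employer": "Acme"}
facebook_profile = {"name": "Sam Roe", "location": "Boston", "employer": "Acme"}
score = consistency_score(twitter_profile, facebook_profile)  # 2 of 3 fields agree
```

A low score on its own proves nothing, which is why the canvas pairs it with pain relievers about weighing false positives and false negatives: the score is a prioritization signal for the analyst, not a verdict.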
Began iterating on our wireframe MVP to test the hypotheses we were developing: our first MVP, then several iterations later... (Weeks 2-3)
Checkpoint 4: Realized we were just scratching the surface of necessary buy-in at the agency...
We had plateaued in our understanding of the problem - in part because "deployment" is part of the landscape. (Weeks 3-4. This information is based on student inferences and public sources, and has not been reviewed or endorsed by the NSA. Specific agency orgs have been masked.)

- A/B Analysts (where we were initially focused): Grades 13/14; technical and non-technical focuses; mostly male, late 20s-30s, technical degrees, 5-12 years @ Agency; want better tools to use at work.
- A/B Directorate Seniors: control funding; indirect knowledge of tech being used/desired; mostly male, 40s-50s, 15+ years @ agency.
- C Directorate Seniors: control funding; indirect knowledge of tech being used/built.
- C Directorate Developers: Grades 13/14; technical focus; mostly male (demographics TBD); advance the cutting-edge research that helps analysts do their job (algo focused).
- D Directorate: developers; build tools to make analyst life easier / more effective.
- Team E (exact location unknown): renders pre-existing external tools on the high side (e.g. Google).
- Integrated Office: mandate to bridge the gap between A/B/C (etc.) and innovators outside of the agency.
Several weeks of desk research and interviews to arrive at our current understanding of the org and key players... (Weeks 3-4; specific agency orgs have been masked.)
... And value proposition for all agency stakeholders involved. (Weeks 3-4)

Primary beneficiary (4 of 10): C Seniors @ Agency
Demographics: 15+ years at Agency; very technical background.

Customer Jobs: high-level guidance to the org; cutting-edge research; developing new algos.
Pains: trying to balance different organizational priorities; difficulty keeping pace with the technology industry and academia; convincing analysts to use the latest methods and algos.
Gains: keep the Agency closer to the cutting edge of social media technology; augmenting tools and processes with the breadth of cutting-edge research.

Products & Services: publicly available social media search engine and classifier.
Gain Creators: develop capabilities currently underserved by Agency technology (publicly available information, social media); create algos to directly fulfill the mandate.
Pain Relievers: algos that directly meet the demands of analysts will reduce time spent by C researching and developing in-house.
Checkpoint 5: Began exploring beneficiaries at similar agencies, branched out to beneficiaries outside gov't: first dual-use hypothesis.
Came back to Steve's early clue in weeks ~4-5, realizing the need was far greater than just the NSA, and a dual-use solution could be better for all customers & fundraising.

Mapped the landscape of all possible beneficiaries, comparing the needs and value proposition for each. The ultimate goal is to find customers (a) with high need and (b) whose data could make our classifier more powerful. Began a deep dive into several of these customers. (Weeks 4-5)
The tricky part is that "Mission Achievement" varies by beneficiary - still figuring out how to balance... (Weeks 4-5; specific agency orgs have been masked.)

Open-source agency tool:
- A/B Analysts: automation, insight, and timeliness in understanding entities
- A/B Seniors: minimize pain in bringing in new tools, increase robustness of reports
- C Dev: code base that can be adopted painlessly, pass security check
- C Seniors: bolster existing algos with publicly available data
- E: successful, painless rendering of low-end tool on high side
- Agency Director: improve understanding of signals intelligence, information assurance

Classified agency tool:
- v1.0 beneficiaries: in addition to the above, integrating analysis/insights with classified data
- D Directorate: developing an easily deployable high-end tool that builds off the success of the user experience in v1.0, managing the transition from 1.0 to 2.0

Dual-use tool:
- Consumers: notified when interacting with a potentially fake account, or blocked
- Commercial IA: no employees lose credentials to a catfishing attempt; notified of attempts
- Gov't IA teams: all fake account interactions with personnel are flagged and/or stopped
- Social media co: fake accounts are flagged and removed before harm is done
Checkpoint 6: Whoops: security concern. Big setback.
Checkpoint 7: Picked up momentum in developing solution for agency.
Weeks 8-9: Reviewed academic literature on anomaly detection in social media; conferred with sponsor and mentors.
The last several weeks have been spent sprinting on a non-PowerPoint MVP to use with the NSA. (Weeks 8-9)
Ten weeks of tracing (1) problem understanding, (2) proposed solution, and (3) agency relationship, in summary:
1. Pushed agency for ISIS data, won our first award.
2. Push to build solution before understanding problem, AKA the #1 way to get on Steve's hit list.
3. Hit it off early with sponsors; they quickly lined up analysts to help us understand the problem, but it took a while to learn how to have productive convos.
4. Realized we were just scratching the surface of necessary buy-in at the agency.
5. Began exploring beneficiaries at similar agencies, branched out to beneficiaries outside gov't: first dual-use hypothesis.
6. Whoops: security concern. Big setback.
7. Picked up momentum in developing solution for agency.

So where are we now, in week 10?
We've identified needs for both government and non-government, proactive and defensive use cases. (Week 10)

The hypothesized world of our addressable market:

Government, proactive needs:
- Entity enrichment: understanding online monikers, the authors behind them, and their behavior and identities across various social media platforms.
- Traditionally invested in classified data; need to better leverage open-source data.

Non-government, proactive needs:
- Offering a service to users that is unmuddied by fake accounts.
- Typically make effective use of proprietary data and some open source; unclear how well they take advantage of data off their respective platforms.

Defensive needs: anything to replace ineffective spear phishing training. "Currently orgs give their employees training so they can recognize incoming spear phishing attacks. Well I bet you can guess how well that works. I mean, how many people keep it in their pants after taking sex ed?" - interview with a VC investor
We believe the dual-use option can pick up significant momentum relative to a bespoke solution... but the major, immediate need is verifying hypotheses outside of the NSA. (Week 10)
Investment Readiness Level (IRL) ladder:
- IRL 1: First pass on MMC w/ problem sponsor
- IRL 2: Complete ecosystem analysis petal diagram
- IRL 3: Problem validated through initial interviews
- IRL 4: Prototype low-fidelity minimum viable product
- IRL 5: Value proposition / mission fit (Value Proposition Canvas)
- IRL 6: Validate mission achievement (right side of canvas)
- IRL 7: Prototype high-fidelity minimum viable product
- IRL 8: Validate resource strategy (left side of canvas)
- IRL 9: Establish mission achievement metrics that matter

Team assessment: IRL 5.

Post-H4D course actions: Team fishreel validated relevant needs for intel analysts at the NSA and mocked up a "Beta" solution - however, the next step is validating the dual-use case by continuing to test hypotheses with beneficiaries outside the NSA. (Week 10)
We've sketched out the minimum resource needs if we were to continue beyond Week 10.
Thank you!
Fishreel is four Stanford students standing on the shoulders of several giants:

- Sponsors: two really great public servants in particular, and 50+ others
- Brandon Johns: DIUX Liaison
- Leora Morgenstern: Mentor
- Tushar Tambay: Mentor
- Matt Johnson: Mentor
- Guy Mordecai: Mentor
- Eric Smyth: Mentor
- Aaron Sander: Mentor
- Wayne E Chen: Mentor
- Brad Dispensa: Technical SME
- Matt Jamieson: Technical SME
- Lieutenant Colonel Scott Maytan: Military Liaison
- Lieutenant Colonel Ed Sumangil: Military Liaison
- And many more...
Backup: MMCs (Mission Model Canvases)
Mission Model Canvas: Week 1

Key Partners:
- Continued sponsorship by defense beneficiary
- Government entities with established catfishing detection algorithms, databases, infrastructure
- Public streaming data sources (e.g. Twitter, Facebook, YouTube)
- Possibly, data sources for private/encrypted communications (encrypted consumer applications like Kik and Telegram, popular among ISIS recruiters)
- Predictive modeling/data analysis companies
- Technical partners w/ processing power
- Scholars in behavioral psychology and interdisciplinary fields relevant to catfishing

Key Activities:
- Software engineering
- Automated ML model building
- API integrations
- UI/UX design: information produced must be easily interpreted by analyst

Key Resources:
- Data (training and validation sets)
- Processing power
- Communication with customer (DoD analyst needs, existing infrastructure, etc)

Value Proposition:
- Improve speed and accuracy of identity authentication
- Reduce occurrences of catfishing
- Better communicate predictive power and logic of machine learning

Beneficiaries:
- Primary: intelligence analysts trying to find a catfishing attempt underway so they can intervene and prevent security breaches
- Secondary: catfishing targets pursued by non-state, insurgent actors; this includes intelligence analysts who may be subject to social engineering attacks by sophisticated/state actors, as well as young westerners being targeted by ISIS
- Secondary: users interacting with intelligence analysts, e.g., superiors, finance officers
- Secondary: private companies that provide a profile / social component, with incentive to bolster existing technology

Mission Achievement:
- Intelligence orgs: understand adversary behavior and strategy (gain knowledge about channels used by adversaries); provide predictive insight (based on target's historic activity, predict likely future activity); improve existing model for detecting catfishing and authenticating identity
- Catfishing victims: prevent recruiting under false identities (e.g. ISIS uses Twitter to recruit westerners); prevent transmission of sensitive data to bad actors

Buy-In/Support:
- Need access to DoD data sources
- Support from both public and encrypted social media and data service sites
- Need understanding of existing DoD catfishing prevention intelligence and infrastructure

Deployment:
- Pilot test: use models to predict likelihood of catfishing in historical data
- Initial deployment with specific focus on ISIS within the DoD
- Initial focus on publicly available data to make the case to DoD
- Broader deployment: more use cases within DoD
- Combine findings with private sector, in-house solutions

Mission Budget/Costs:
- Fixed: processing costs (AWS servers, virtual machine space for data processing)
- Variable: electricity to power servers
Mission Model Canvas: Week 2

Key Partners:
- Continued sponsorship by defense beneficiary
- Other government agencies with or without established catfishing detection / defense
- Public streaming data sources (e.g. Twitter, Facebook, YouTube)
- Possibly, data sources for private/encrypted communications (encrypted consumer applications like Kik and Telegram, popular among ISIS recruiters)
- Predictive modeling/data analysis companies
- Technical partners w/ processing power
- Scholars in CS, behavioral psychology, and interdisciplinary fields relevant to catfishing

Key Activities:
- Data scraping: pull publicly available catfishing data
- Software engineering: automated ML model building, API integrations
- UI/UX design: information produced must be easily interpreted by analyst; results should be able to be tweaked and further explored

Key Resources:
- Data (training and validation sets)
- Processing power
- Communication with customer (Agency analysts' needs, existing infrastructure, etc)

Value Proposition:
- Provide users investigating catfishing with better speed & accuracy, and flexibility in classifying online activity
- Better communicate underlying logic of machine learning
- Provide dynamic solution to analysts

Beneficiaries:
- Primary: intelligence analysts trying to proactively find catfishing attempts outside the Agency network so they can intervene and prevent security breaches
- Secondary: agencies who need to augment network defense against catfishing
- Secondary: catfishing targets pursued by non-state, insurgent actors; this includes intelligence analysts who may be subject to social engineering attacks by sophisticated/state actors, as well as young westerners
- Secondary: users interacting with intelligence analysts, e.g., superiors, finance officers
- Tertiary: private companies that provide a profile / social component, with incentive to bolster existing technology

Mission Achievement:
- Primary users: classify behavior as "catfishing" or normal; understand adversary behavior and strategy (gain knowledge about channels used by adversaries); improve existing model for detecting catfishing and authenticating identity; reduce burden on analysts by automating review of catfishing input data
- Secondary users: improve existing defenses against catfishing
- Tertiary users: augment private companies' ability to detect catfishing within walled gardens

Buy-In/Support:
- Agency sponsors/legal team
- Need access to DoD data sources
- Support from both public and encrypted social media and data service sites
- Understanding of existing DoD catfishing prevention intelligence and infrastructure

Deployment:
- Pilot test: use models to predict likelihood of catfishing in historical data
- Initial focus on publicly available data, make the case to DoD
- Broader deployment: more use cases within DoD w/ publicly available data
- Combine findings with other agencies and private sector, in-house solutions

Mission Budget/Costs:
- Fixed: processing costs (AWS servers, virtual machine space for data processing)
- Variable: electricity to power servers
47. dfdsf
- Data
- Processing power
- Communication with
customer (analyst needs,
existing infrastructure, etc)
- Data Scraping
- Pull publicly available
catfishing data
- Software Engineering
- Automated ML model building
- API integrations
- UI/UX Design
- Information produced must be
easily interpreted by analyst
- Results should be able to be
tweaked and further explored
Mission Model Canvas: Week 3
- Continued sponsorship by
defense beneficiary
- Other Government agencies
with or without established
catfishing detection / defense
- Public streaming data
sources (e.g. Twitter,
Facebook, YouTube)
- Predictive modeling/data
analysis companies
- Technical partners w/
processing power
- Scholars in CS, behavioral
psychology, and
interdisciplinary fields relevant
to catfishing
- Primary: Intelligence analysts
trying to proactively find catfishing
attempts outside the Agency
network so they can intervene and
prevent security breaches.
- Secondary: Agencies who need
to augment network defense
against spearphishing.
This includes the FBI, DoD, and
others within The Agency
- Tertiary: Private companies
(high priority: banks, utilities,
critical infrastructure)
- Users want to grab a *working model* from Fishreel’s hands
- Provide users investigating catfishing with better speed & accuracy,
and flexibility in classifying online activity
- Better communicate underlying logic of machine learning
- Provide dynamic solution to analysts
Primary users:
- Classify userIDs as
“catfishing” or normal: 1) Real,
2) Bot, 3) Impersonator, 4)
Spoof
- Understand adversary
identity / archetypes,
behavior, and strategy
- Reduce manual burden on
analysts by automating
review of catfishing input
data
Secondary users: Improve
existing defenses against
spearphishing
Tertiary users: Augment private
companies’ ability to detect
catfishing within walled gardens
- Pilot test: Use models to
predict likelihood of catfishing
in historical data
- Focus on publicly avail. data,
make case to DoD
- Broader deployment: use
cases w/ publicly avail. data
- Combine findings with other
agencies and private sector, in-
house solutions
Fixed:
- Processing costs (AWS servers, virtual machine space for data processing)
Variable:
- Electricity to power servers
- Understanding of existing
DoD catfishing prevention
intelligence and infrastructure
- Technical support for MVP
backend infrastructure
Beneficiaries
Mission Achievement
Mission Budget/Costs
Buy-In/Support
Deployment
Value Proposition
Key Activities
Key Resources
Key Partners
48.
- Data
- Processing power
- Communication with
customer (analyst needs,
existing infrastructure, etc)
- Data Scraping
- Pull publicly available
catfishing social media data
- Software Engineering
- Automated ML model building
- API integrations
- UI/UX Design
- Information produced must be
easily interpreted by analyst
- Results should be able to be
tweaked and further explored
Mission Model Canvas: Week 4
- Continued sponsorship by
defense beneficiary
- Other Government agencies
with or without established
catfishing detection / defense
- Public streaming data
sources (e.g. Twitter,
Facebook, YouTube)
- Predictive modeling/data
analysis companies
- Technical partners w/
processing power
- Scholars in CS, behavioral
psychology, and
interdisciplinary fields relevant
to catfishing
- Primary: Intelligence analysts
trying to proactively find catfishing
attempts outside the Agency
network so they can intervene and
prevent security breaches.
- Secondary: Agencies who need
to augment network defense
against spearphishing. This includes
the cybercrimes agents in the FBI,
DoD, and others within The
Agency
- Tertiary: Cybersecurity analysis
in private companies (high priority:
banks, utilities, critical
infrastructure)
- Users want to grab a *working model* from Fishreel’s hands
- Provide users investigating catfishing with better speed & accuracy,
and flexibility in classifying online activity
- Better communicate underlying logic of machine learning
- Provide dynamic solution to analysts
Primary users:
- Classify userIDs as
“catfishing” or normal: 1) Real,
2) Bot, 3) Impersonator, 4)
Spoof
- Understand adversary identity
/ archetypes, behavior, and
strategy
- Reduce manual burden on
analysts by automating review
of catfishing input data
Secondary users: Improve
existing defenses against
spearphishing
Tertiary users: Augment private
companies’ ability to detect
catfishing within walled gardens
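The four-way taxonomy above (Real, Bot, Impersonator, Spoof) can be sketched as a classifier. Everything below — the feature names, the thresholds, and `classify_account` itself — is an illustrative stand-in for the automated ML model the deck proposes, not Fishreel's actual implementation.

```python
# Hand-written rules as a stand-in for the proposed ML model: map simple
# (hypothetical) profile features to one of the four Fishreel classes.

def classify_account(features: dict) -> str:
    """Assign one of the four persona classes from simple profile features.

    Hypothetical feature keys:
      posts_per_hour   - posting rate
      name_similarity  - 0..1 similarity of display name to a known persona
      account_age_days - days since account creation
      reuses_photos    - True if the profile photos appear elsewhere online
    """
    if features["posts_per_hour"] > 10:          # inhumanly fast posting
        return "bot"
    if features["name_similarity"] > 0.9:        # mimics an existing identity
        return "impersonator"
    if features["account_age_days"] < 7 and features["reuses_photos"]:
        return "spoof"                           # fresh account, stolen media
    return "real"

print(classify_account({"posts_per_hour": 22, "name_similarity": 0.1,
                        "account_age_days": 400, "reuses_photos": False}))
# → bot
```

A real model would learn these boundaries from labeled training data rather than hard-coding them, but the input/output contract — account features in, one of four labels out — is the same.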
+ 2 routes: 1) mandated by
top leaders (rare)
2) Small teams pilot; if useful
& UI is good, word of mouth spreads
Both take ~2 years
+ Dist: Standalone tool,
purchased directly by small
team (hypothesis)
- Pilot test: Use models to predict likelihood
of catfishing in historical data
- Focus on publicly avail. data, make case to
DoD
- Broader deployment: use cases w/
publicly avail. data
- Combine findings with other
agencies and private sector, in-
house solutions
Fixed:
- Processing costs (AWS servers, virtual machine space for data processing)
Variable:
- Electricity to power servers
- Understanding of existing DoD catfishing
prevention intelligence and infrastructure
- Technical support for MVP
backend infrastructure: daily
maintenance of data scraping
- Approval of security/legal
depts and buy-in from team
lead / manager
Beneficiaries
Mission Achievement
Mission Budget/Costs
Buy-In/Support
Deployment
Value Proposition
Key Activities
Key Resources
Key Partners
49. Sponsor agency:
→ S2/S3 Analysts
→ S2/S3 Chiefs
→ R6 Developers
→ R6 Chiefs
→ DoDDIR
Other agencies:
→ FBI - SSA
→ DOD - ?
- Commercial:
→ Banks
→ Utilities
→ Critical infrastructure
→ Social Media
- Data
- Processing power
- Communication with
customer (analyst needs,
existing infrastructure, etc)
- Data Scraping
- Pull publicly available social
media data
- Software Engineering
- Automated ML model building
- API integrations
- UI/UX Design
- Information produced must be
easily interpreted by analyst
- Results should be able to be
tweaked and further explored
Mission Model Canvas: Week 5
- Continued sponsorship by
defense beneficiary
- Other Government agencies
with or without established
catfishing detection / defense
- Public streaming data
sources (e.g. Twitter,
Facebook, YouTube)
- Predictive modeling/data
analysis companies
- Technical partners w/
processing power
- Scholars in CS, behavioral
psychology, and
interdisciplinary fields relevant
to catfishing
- Users want to grab a *working model* from Fishreel’s hands
- MINIMIZE FALSE POSITIVES AT START, FALSE NEGATIVES LONG RUN
- Provide users investigating catfishing with better speed & accuracy,
and flexibility in classifying online activity
- Better communicate underlying logic of machine learning
- Provide dynamic solution to analysts
S2/S3 Analysts: Use ML
techniques to quickly,
effectively classify userIDs as
“catfishing” or normal: 1) Real,
2) Bot, 3) Impersonator, 4)
Spoof.
S2/S3 Chiefs: Improved speed
of reports emanating from
agency. Increased credibility
by relying on latest methods.
R6 Developers: “version 1.0” is
quick and painless to deploy.
R6 Chiefs: Fulfill mandate to
create / identify pain-relieving
tools
Other agencies: Improve
existing defenses against
spearphishing
Commercial: Augment private
companies’ ability to detect
catfishing within walled gardens
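The rollout heuristic above (minimize false positives at the start, false negatives in the long run) amounts to sweeping a decision threshold over a catfishing score: a high threshold at launch keeps false alarms rare so analysts trust early flags, and lowering it later catches more accounts at the price of more review work. A minimal sketch, with made-up account IDs and scores:

```python
# Threshold sweep over hypothetical per-account catfishing scores.

def flag(scores, threshold):
    """Return the account ids whose catfishing score clears the threshold."""
    return [acct for acct, p in scores if p >= threshold]

# (account id, model score) pairs - all illustrative
scores = [("a1", 0.97), ("a2", 0.81), ("a3", 0.55), ("a4", 0.12)]

early = flag(scores, threshold=0.95)   # pilot: only near-certain cases
later = flag(scores, threshold=0.50)   # mature: catch more, review more

print(early)  # ['a1']
print(later)  # ['a1', 'a2', 'a3']
```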
+ 2 routes: 1) mandated by
top leaders (rare)
2) Small teams pilot; if useful
& UI is good, word of mouth spreads
Both take ~2 years
+ Dist: Standalone tool,
purchased directly by small
team (hypothesis)
- Combine findings with other
agencies and private sector, in-
house solutions
Fixed:
- Processing costs (AWS servers, virtual machine space for data processing)
Variable:
- Electricity to power servers
- Technical support for MVP
backend infrastructure: daily
maintenance of data scraping
- Approval of security/legal
depts and buy-in from team
lead / manager
Beneficiaries
Mission Achievement
Mission Budget/Costs
Buy-In/Support
Deployment
Value Proposition
Key Activities
Key Resources
Key Partners
50. Sponsor agency:
→ S2/S3 Analysts
→ S2/S3 Chiefs
→ R6 Developers
→ R6 Chiefs
→ DoDDIR
Other agencies:
→ FBI - SSA
→ DOD - ?
- Commercial:
→ Banks
→ Utilities
→ Critical infrastructure
→ Social Media
→ Dating Sites
- Data
- Processing power
- Communication with
customer (analyst needs,
existing infrastructure, etc)
- Data Scraping
- Pull publicly available social
media data
- Software Engineering
- Automated ML model building
- API integrations
- UI/UX Design
- Information produced must be
easily interpreted by analyst
- Results should be able to be
tweaked and further explored
Mission Model Canvas: Week 6
- Continued sponsorship by
defense beneficiary
- Other Government agencies
with or without established
catfishing detection / defense
- Public streaming data
sources (e.g. Twitter,
Facebook, YouTube)
- Predictive modeling/data
analysis companies
- Technical partners w/
processing power
- Scholars in CS, behavioral
psychology, and
interdisciplinary fields relevant
to catfishing
- Users want to grab a *working model* from Fishreel’s hands
- ASSIGNING COSTS TO FALSE POSITIVES V. FALSE NEGATIVES
- Provide users investigating catfishing with better speed & accuracy,
and flexibility in classifying online activity
- Better communicate underlying logic of machine learning
- Provide dynamic solution to analysts
S2/S3 Analysts: Use ML
techniques to quickly,
effectively classify userIDs as
“catfishing” or normal: 1) Real,
2) Bot, 3) Impersonator, 4)
Spoof.
S2/S3 Chiefs: Improved speed
of reports emanating from
agency. Increased credibility
by relying on latest methods.
R6 Developers: “version 1.0” is
quick and painless to deploy.
R6 Chiefs: Fulfill mandate to
create / identify pain-relieving
tools
Other agencies: Improve
existing defenses against
spearphishing
Commercial: Augment private
companies’ ability to detect
catfishing within walled gardens
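Assigning explicit costs to false positives versus false negatives, as the value proposition above calls for, turns the threshold into a decision rule: flag an account when the expected cost of staying silent exceeds the expected cost of a false alarm. A sketch with hypothetical cost inputs:

```python
# Cost-sensitive flagging: cost_fp is the analyst time wasted on a false
# alarm; cost_fn is the damage of a missed catfishing attempt. Both costs
# (and the probability) are illustrative inputs.

def should_flag(p_catfish: float, cost_fp: float, cost_fn: float) -> bool:
    """Flag when the expected cost of staying silent exceeds that of flagging."""
    expected_cost_flag = (1 - p_catfish) * cost_fp   # risk: false positive
    expected_cost_silent = p_catfish * cost_fn       # risk: false negative
    return expected_cost_silent > expected_cost_flag

# Early on, false positives are priced high, so a 0.6 score stays silent:
print(should_flag(0.6, cost_fp=10.0, cost_fn=1.0))   # False
# Long run, missed attacks dominate the cost, so the same score flags:
print(should_flag(0.6, cost_fp=1.0, cost_fn=10.0))   # True
```

This is the same early-vs-late trade-off as the Week 5 threshold heuristic, just expressed through the cost ratio instead of a hand-picked threshold.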
+ 2 routes: 1) mandated by
top leaders (rare)
2) Small teams pilot; if useful
& UI is good, word of mouth spreads
Both take ~2 years
+ Dist: Standalone tool,
purchased directly by small
team (hypothesis)
- Combine findings with other
agencies and private sector, in-
house solutions
Fixed:
- Processing costs (AWS servers, virtual machine space for data processing)
Variable:
- Electricity to power servers
- Technical support for MVP
backend infrastructure: daily
maintenance of data scraping
- Approval of security/legal
depts and buy-in from team
lead / manager
Beneficiaries
Mission Achievement
Mission Budget/Costs
Buy-In/Support
Deployment
Value Proposition
Key Activities
Key Resources
Key Partners
51.
Version 1.0:
Sponsor agency:
1. A/B Analysts
2. A/B Seniors
3. C Developers
4. C Seniors
5. E
6. DoDDIR
Version 2.0:
→ All of v1.0
7. D
Version 3.0:
8. Consumers
9. Other agencies:
→ FBI - SSA
→ DOD - ?
10. Commercial:
→ Banks
→ Utilities
→ Critical infrastructure
→ Social Media
→ Dating Sites
- Data
- Processing power
- Communication with
customer (analyst needs,
existing infrastructure, etc)
- Data Scraping
- Pull publicly available social
media data
- Software Engineering
- Automated ML model building
- API integrations
- UI/UX Design
- Information produced must be
easily interpreted by analyst
- Results should be able to be
tweaked and further explored
Mission Model Canvas: Week 7
- Continued sponsorship by
defense beneficiary
- Other Government agencies
with or without established
catfishing detection / defense
- Public streaming data
sources (e.g. Twitter,
Facebook, YouTube)
- Predictive modeling/data
analysis companies
- Technical partners w/
processing power
- Scholars in CS, behavioral
psychology, and
interdisciplinary fields relevant
to catfishing
A/B Analysts: Use ML
techniques to better understand
online personas using publicly
available data.
A/B Seniors: Improved speed
of reports emanating from
agency. Increased credibility
by relying on latest methods.
C Developers: “version 1.0” is
quick and painless to deploy.
C Seniors: Mandate to create /
identify pain-relieving tools
E: Minimum technical difficulty
in rendering tool, traffic from
other users
DoDDIR: Enhance
understanding of signal
intelligence, better defense
against cyber attacks
D: Seamless transition to 2.0
Consumer: Protect self
Other agencies: Improve
existing defenses against
spearphishing
Commercial: Augment private
companies’ ability to detect
catfishing within walled gardens
+ 2 routes: 1) mandated by
top leaders (rare)
2) Small teams pilot; if useful
& UI is good, word of mouth spreads
Both take ~2 years
+ Dist: Standalone tool,
purchased directly by small
team (hypothesis)
- Combine findings with other
agencies and private sector, in-
house solutions
Fixed:
- Processing costs (AWS servers,
virtual machine space for data
processing)
Variable:
- Electricity to power servers
- Technical support for MVP
backend infrastructure: daily
maintenance of data scraping
- Approval of security/legal
depts and buy-in from team
lead / manager
Beneficiaries
Mission Achievement
Mission Budget/Costs
Buy-In/Support
Deployment
Value Proposition
Key Activities
Key Resources
Key Partners
Consumers: Notified when interacting with a
potentially fake account, or blocked
Gov’t IA teams: All fake account interactions with
personnel are flagged and/or stopped
Commercial IA: No employees lose credentials to
a catfishing attempt. Notified of attempts
Social media co: Fake accounts are flagged and
removed before harm is done
- A/B Analysts: Automation, insight, and timeliness in understanding entities
- A/B Seniors: Minimize pain in bringing in tools, up robustness of reports
- C Developers: Code base that can be adopted painlessly, clear security
- C Seniors: Bolster existing tools w/ tool oriented toward publicly available data
- E: Successful, painless rendering of low-end tool on high side
- DoDDIR: Improve understanding of signal intelligence
- D: Developing easily deployable high-end tool that builds off the success of user
experience in v1.0, managing transition from 1.0 to 2.0
52.
Version 1.0:
Sponsor agency:
1. A/B Analysts
2. A/B Seniors
3. C Developers
4. C Seniors
5. E
6. DoDDIR
Version 2.0:
→ All of v1.0
7. D
Version 3.0:
8. Consumers
9. Other agencies:
→ FBI - SSA
→ DOD - ?
10. Commercial:
→ Banks
→ Utilities
→ Critical infrastructure
→ Social Media
→ Dating Sites
→ Municipalities
+Time
+Connections
+Publicly available data
+Execution, Product Dev,
Research Talent
+Sponsor connection
+Killer val-prop
+Encryption / 3.0 security
+Money
+Onboarding/HR process
+Servers, architecture
+Dual-use pitch
+3.0 Beneficiaries/Value Prop
+Gather/process ortho data
+Build model on ortho data
+Push 1.0 backend code
+Dev, host, render 1.0
+ID early third-party adopters
+Dev SaaS version
+ID next wave of adopters
+Raise money, build 3.0 team
+Buy addt’l third-party data
+Process all new data
+Build model on new data
+Sell “2.0” as dual-use
Mission Model Canvas: Week 8
+ Channels to more orthogonal
data / creative solutions before
data partnership
● Product Dev XP, e.g., CTO
visionaries
● PhD resources, researchers,
Stanford U
+ Early 3rd-party adopters
● Smaller co’s that still have
useful data, e.g., schools,
munis, critical local infr.
+3.0 data security partners
+High-data-value adopters
+VCs
A/B Analysts: Use ML
techniques to better understand
online personas using publicly
available data.
A/B Seniors: Improved speed
of reports emanating from
agency. Increased credibility
by relying on latest methods.
C Developers: “version 1.0” is
quick and painless to deploy.
C Seniors: Mandate to create /
identify pain-relieving tools
E: Minimum technical difficulty
in rendering tool, traffic from
other users
DoDDIR: Enhance
understanding of signal
intelligence, better defense
against cyber attacks
D: Seamless transition to 2.0
Consumer: Protect self
Other agencies: Improve
existing defenses against
spearphishing
Commercial: Augment private
companies’ ability to detect
catfishing within walled gardens
+ 2 routes: 1) mandated by
top leaders (rare)
2) Small teams pilot; if useful
& UI is good, word of mouth spreads
Both take ~2 years
+ Dist: Standalone tool,
purchased directly by small
team (hypothesis)
- Combine findings with other
agencies and private sector, in-
house solutions
Fixed:
- Processing costs (AWS servers,
virtual machine space for data
processing)
Variable:
- Electricity to power servers
- Technical support for MVP
backend infrastructure: daily
maintenance of data scraping
- Approval of security/legal
depts and buy-in from team
lead / manager
Beneficiaries
Mission Achievement
Mission Budget/Costs
Buy-In/Support
Deployment
Value Proposition
Key Activities
Key Resources
Key Partners
Consumers: Notified when interacting with a
potentially fake account, or blocked
Gov’t IA teams: All fake account interactions with
personnel are flagged and/or stopped
Commercial IA: No employees lose credentials to
a catfishing attempt. Notified of attempts
Social media co: Fake accounts are flagged and
removed before harm is done
- A/B Analysts: Automation, insight, and timeliness in understanding entities
- A/B Seniors: Minimize pain in bringing in tools, up robustness of reports
- C Developers: Code base that can be adopted painlessly, clear security
- C Seniors: Bolster existing tools w/ tool oriented toward publicly available data
- E: Successful, painless rendering of low-end tool on high side
- DoDDIR: Improve understanding of signal intelligence
- D: Developing easily deployable high-end tool that builds off the success of user
experience in v1.0, managing transition from 1.0 to 2.0
53.
Version 1.0:
Sponsor agency:
1. A/B Analysts
2. A/B Seniors
3. C Developers
4. C Seniors
5. E
6. DoDDIR
Version 2.0:
→ All of v1.0
7. D
Version 3.0:
8. Consumers
9. Other agencies:
→ FBI - SSA
→ DOD - ?
10. Commercial:
→ Banks
→ Utilities
→ Critical infrastructure
→ Social Media
→ Dating Sites
→ Municipalities
+Time
+Connections
+Publicly available data
+Execution, Product Dev,
Research Talent
+Sponsor connection
+Killer val-prop
+Encryption / 3.0 security
+Money
+Onboarding/HR process
+Servers, architecture
+Dual-use pitch
+3.0 Beneficiaries/Value Prop
+Gather/process ortho data
+Build model on ortho data
+Push 1.0 backend code
+Dev, host, render 1.0
+ID early third-party adopters
+Dev SaaS version
+ID next wave of adopters
+Raise money, build 3.0 team
+Buy addt’l third-party data
+Process all new data
+Build model on new data
+Sell “2.0” as dual-use
Mission Model Canvas: Week 9
+ Channels to more orthogonal
data / creative solutions before
data partnership
● Product Dev XP, e.g., CTO
visionaries
● PhD resources, researchers,
Stanford U
+ Early 3rd-party adopters
● Smaller co’s that still have
useful data, e.g., schools,
munis, critical local infr.
+3.0 data security partners
+High-data-value adopters
+VCs
A/B Analysts: Use ML
techniques to better understand
online personas using publicly
available data.
A/B Seniors: Improved speed
of reports emanating from
agency. Increased credibility
by relying on latest methods.
C Developers: “version 1.0” is
quick and painless to deploy.
C Seniors: Mandate to create /
identify pain-relieving tools
E: Minimum technical difficulty
in rendering tool, traffic from
other users
DoDDIR: Enhance
understanding of signal
intelligence, better defense
against cyber attacks
D: Seamless transition to 2.0
Consumer: Protect self
Other agencies: Improve
existing defenses against
spearphishing
Commercial: Augment private
companies’ ability to detect
catfishing within walled gardens
Sources
+$100k in first seed funding to
cover basic continuation of work
beyond class
+$400k in winter to cover major
variable cost expansion
+??? in Series A at 12 months
+??? in Series B at 24 months
Uses (figures in $k)
+Almost entirely variable for first
12 months:
Founders’ salary: $42
Building model: $48
Develop SaaS: $48
Non-tech labor: $26
Paid data: $180
Servers: $40
Additional eng: $96
Total: $480 for first 12 months
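Assuming the line items are in $k (consistent with the $100k seed and $400k winter raise listed under Sources), the Uses figures can be sanity-checked:

```python
# Arithmetic check of the Sources/Uses slide (all figures assumed in $k).
uses = {
    "Founders' salary": 42,
    "Building model": 48,
    "Develop SaaS": 48,
    "Non-tech labor": 26,
    "Paid data": 180,
    "Servers": 40,
    "Additional eng": 96,
}
total = sum(uses.values())
print(total)             # 480, matching the slide's 12-month total
sources = 100 + 400      # seed + winter round, in $k
print(sources - total)   # 20 ($k) of headroom after year one
```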
+ 2 routes: 1) mandated by
top leaders (rare)
2) Small teams pilot; if useful
& UI is good, word of mouth spreads
Both take ~2 years
+ Dist: Standalone tool,
purchased directly by small
team (hypothesis)
- Combine findings with other
agencies and private sector, in-
house solutions
- Technical support for MVP
backend infrastructure: daily
maintenance of data scraping
- Approval of security/legal
depts and buy-in from team
lead / manager
Beneficiaries
Mission Achievement
Mission Budget/Costs
Buy-In/Support
Deployment
Value Proposition
Key Activities
Key Resources
Key Partners
Consumers: Notified when interacting with a
potentially fake account, or blocked
Gov’t IA teams: All fake account interactions with
personnel are flagged and/or stopped
Commercial IA: No employees lose credentials to
a catfishing attempt. Notified of attempts
Social media co: Fake accounts are flagged and
removed before harm is done
- A/B Analysts: Automation, insight, and timeliness in understanding entities
- A/B Seniors: Minimize pain in bringing in tools, up robustness of reports
- C Developers: Code base that can be adopted painlessly, clear security
- C Seniors: Bolster existing tools w/ tool oriented toward publicly available data
- E: Successful, painless rendering of low-end tool on high side
- DoDDIR: Improve understanding of signal intelligence
- D: Developing easily deployable high-end tool that builds off the success of user
experience in v1.0, managing transition from 1.0 to 2.0
54.
Version 1.0:
Sponsor agency:
1. A/B Analysts
2. A/B Seniors
3. C Developers
4. C Seniors
5. E
6. DoDDIR
Version 2.0:
→ All of v1.0
7. D
Version 3.0:
8. Consumers
9. Other agencies:
→ FBI - SSA
→ DOD - ?
10. Commercial:
→ Banks
→ Utilities
→ Critical infrastructure
→ Social Media
→ Dating Sites
→ Municipalities
+Time
+Connections
+Publicly available data
+Execution, Product Dev,
Research Talent
+Sponsor connection
+Killer val-prop
+Encryption / 3.0 security
+Money
+Onboarding/HR process
+Servers, architecture
+Dual-use pitch
+3.0 Beneficiaries/Value Prop
+Gather/process ortho data
+Build model on ortho data
+Push 1.0 backend code
+Dev, host, render 1.0
+ID early third-party adopters
+Dev SaaS version
+ID next wave of adopters
+Raise money, build 3.0 team
+Buy addt’l third-party data
+Process all new data
+Build model on new data
+Sell “2.0” as dual-use
Mission Model Canvas: Week 10
+ Channels to more orthogonal
data / creative solutions before
data partnership
● Product Dev XP, e.g., CTO
visionaries
● PhD resources, researchers,
Stanford U
+ Early 3rd-party adopters
● Smaller co’s that still have
useful data, e.g., schools,
munis, critical local infr.
+3.0 data security partners
+High-data-value adopters
+VCs
A/B Analysts: Use ML
techniques to better understand
online personas using publicly
available data.
A/B Seniors: Improved speed
of reports emanating from
agency. Increased credibility
by relying on latest methods.
C Developers: “version 1.0” is
quick and painless to deploy.
C Seniors: Mandate to create /
identify pain-relieving tools
E: Minimum technical difficulty
in rendering tool, traffic from
other users
DoDDIR: Enhance
understanding of signal
intelligence, better defense
against cyber attacks
D: Seamless transition to 2.0
Consumer: Protect self
Other agencies: Improve
existing defenses against
spearfishing
Commercial: Augment private
companies’ ability to detect
catfishing within walled gardens
Sources
+$100k in first seed funding to
cover basic continuation of work
beyond class
+$400k in winter to cover major
variable cost expansion
+??? in Series A at 12 months
+??? in Series B at 24 months
Uses (figures in $k)
+Almost entirely variable for first
12 months:
Founders’ salary: $42
Building model: $48
Develop SaaS: $48
Non-tech labor: $26
Paid data: $180
Servers: $40
Additional eng: $96
Total: $480 for first 12 months
+ 2 routes: 1) mandated by
top leaders (rare)
2) Small teams pilot; if useful
& UI is good, word of mouth spreads
Both take ~2 years
+ Dist: Standalone tool,
purchased directly by small
team (hypothesis)
- Combine findings with other
agencies and private sector, in-
house solutions
- Technical support for MVP
backend infrastructure: daily
maintenance of data scraping
- Approval of security/legal
depts and buy-in from team
lead / manager
Beneficiaries
Mission Achievement
Mission Budget/Costs
Buy-In/Support
Deployment
Value Proposition
Key Activities
Key Resources
Key Partners
Consumers: Notified when interacting with a
potentially fake account, or blocked
Gov’t IA teams: All fake account interactions with
personnel are flagged and/or stopped
Commercial IA: No employees lose credentials to
a catfishing attempt. Notified of attempts
Social media co: Fake accounts are flagged and
removed before harm is done
- A/B Analysts: Automation, insight, and timeliness in understanding entities
- A/B Seniors: Minimize pain in bringing in tools, up robustness of reports
- C Developers: Code base that can be adopted painlessly, clear security
- C Seniors: Bolster existing tools w/ tool oriented toward publicly available data
- E: Successful, painless rendering of low-end tool on high side
- DoDDIR: Improve understanding of signal intelligence
- D: Developing easily deployable high-end tool that builds off the success of user
experience in v1.0, managing transition from 1.0 to 2.0