November 2012
White Paper: Model-Enabled Analysis: Factors for Evaluation
CTOlabs.com
Inside:
Background on Model Driven Analysis
Nine Criteria You Can Use In Evaluations
Lessons learned for your enterprise deployment
A White Paper providing context and guidance you can use
Model-Enabled Analysis: Evaluating Potential Solutions

This paper provides a checklist you can use to evaluate and improve your enterprise analytic framework. It is written for enterprises with the most advanced analytical challenges -- those that require the best possible analysis in order to serve critically important missions. The lessons learned and conclusions outlined here result from a review of approaches and technology frameworks in place in both commercial and military settings.
Background: Enduring Challenge and Emerging Approach
Our greatest analytical challenges require model-enabled analysis. This kind of analysis lets humans
do what humans do best and computers do what computers do best. It lets humans analyze, think,
imagine and test scenarios, and it leverages technology strengths like the ability to rapidly process,
compare, work and iterate based on human guidance. This approach has a long history. Model-
enabled analysis is how humanity faced big challenges like breaking codes in World War II or putting a
man on the moon. More modern examples of model-enabled analysis include methods to detect fraud in fast-moving financial transactions or to assess adversary intentions during conflict. But whether the example is old or new, the objective is the same: model-enabled analysis lets humans create models, change those models and then evaluate the impact of the changes on the analysis. Model-enabled analysis helps us learn, assess, predict and iterate. It is the most advanced form of analysis, and it is required for our hardest challenges.
What Exactly Is Model-Enabled Analysis?
In the general sense of the term, a model is a reflection of reality. Such models were pioneered during Cold War military wargames as computer-based scenarios that tested elements of our most strategic war plans. These models helped (and still help), but they run too slowly to contribute to the urgent demands of operational and tactical analysis. Now there is new software designed from the ground up to facilitate model-enabled analysis, delivering the power of models directly to users via easy-to-use interfaces.
How Can You Evaluate Model-Enabled Analysis Solutions?
Model-enabled analysis tools can significantly strengthen your organization's ability to assess the truth of a situation and to rapidly model how changes affect that assessment. These solutions can also help you manage and understand ambiguity and enable you to build plans to reduce that ambiguity. Because of these capabilities, a model-enabled analysis solution is a very powerful tool to place in your infrastructure.
A White Paper for the Government IT Community
But how can you choose the right solution for your enterprise? We recommend considering nine mission-focused factors in evaluating model-enabled analysis solutions. These factors are:
Mission functionality/capability
Ease-of-use/interface
Architectural approach
Data architecture
Modeling capability
Licensing
Partner ecosystem
Deployment models
Health of the firm
We expand on these factors below.
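These nine factors can also be combined into a simple weighted scoring matrix when comparing candidate solutions. The sketch below is illustrative only: the weights, the 0-10 scores and the single candidate are invented assumptions for illustration, not recommendations.

```python
# Illustrative weighted scoring of a candidate solution against the nine
# evaluation factors. All weights and scores are hypothetical placeholders.
FACTORS = {
    "Mission functionality/capability": 0.25,
    "Ease-of-use/interface": 0.15,
    "Architectural approach": 0.10,
    "Data architecture": 0.10,
    "Modeling capability": 0.15,
    "Licensing": 0.05,
    "Partner ecosystem": 0.05,
    "Deployment models": 0.10,
    "Health of the firm": 0.05,
}

def weighted_score(scores: dict) -> float:
    """Combine per-factor scores (0-10) into a single weighted total."""
    return sum(FACTORS[f] * scores.get(f, 0) for f in FACTORS)

# Hypothetical evaluation of one candidate solution.
candidate = {f: 7 for f in FACTORS}
candidate["Licensing"] = 4  # e.g. per-core licensing scored lower
print(round(weighted_score(candidate), 2))
```

In a real evaluation the weights would come from your mission priorities and the scores from demos and proofs of concept, not from a spreadsheet exercise alone.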
Mission Functionality/Capability: You are in the driver's seat here, and the solution you choose must be able to meet your mission needs. If the analysis package you select does not have the capabilities that you need and expect, then the following considerations don't matter.
You will be able to evaluate how well a solution will meet your needs much more easily if your organization's mission and vision are well articulated from the beginning of the process. A flexible model-enabled analysis system should be able to support your mission, and you should request demos and perhaps proof-of-concept prototypes from the company offering the solution. Giving the capability provider a clear idea of your mission and vision will ensure that the demos and prototypes take your needs into account. It is also a good idea to provide your concepts of operations for analysis prior to a proof of concept. For example, do you want a system focused on collaboration between analysts, one focused on the individual analyst, or one flexible enough to serve anyone in your entire organization? Although a system should be able to do all of those things, when asking for a proof of concept or demo it is a good idea to narrow down what matters most to your mission.
Another thing to consider is your goals for how your model-enabled analysis system will work with data. Will your system focus on extracting knowledge from existing data stores? Will it extract data while it is flowing in, or do you want a capability that automatically correlates data and sends alerts? Do you want a system that does all of the above? Such systems can be found, but in most cases the larger and more all-encompassing your need, the more complex the solution will be; in that case you should be evaluating not just individual solutions but your entire framework.
Ease of Use/Interface: When choosing an analytical solution, one of the first questions you should ask is: who are your intended users? Do you want to enable your data scientists to dig more deeply into new questions? Do you want to increase the power of your analysts? Or are you hoping to push capabilities out to your entire workforce? If you are aiming to provide model-enabled analysis to a wide swath of your workforce, then you will definitely want to evaluate any solution for scalability. You will also want to evaluate it against as many representative use cases as you can.
The same tools that help your intelligence analysts assess threats can also help your collection
managers, trainers, senior managers and even your HR department with their unique challenges. But
these capabilities can only help if your workforce is willing and able to use them. A powerful tool that
takes several specialized degrees to learn and use will obviously have a limited impact; many non-IT professionals demand walk-up usability from their information management software. Often entire departments use only a small fraction of the capabilities that powerful analytics provide because the tools are intimidating or hard to access.
An intuitive user experience matters as much for specialists as it does for your less tech-savvy employees. Users of all backgrounds like to pose questions and get answers quickly, and they want to be able to iterate and ask new questions based on new information, fresh results and previous assessments. The capabilities you select for your enterprise should meet this need in ways that let your workforce conduct this rapid assessment-and-feedback process. This means a natural interface can be more important than any individual functionality. With smooth and efficient interactions between tools and users, analysts and decision-makers reach better decisions faster, which is the ultimate goal of a model-enabled analysis approach.
Architectural Approach: Some solutions require that you establish entire architectures just to support them. This is not a good approach. Other solutions are stand-alone islands, and it becomes your job to get all of your data into their closed system to perform analysis. This might be workable for some missions, but in most cases you will want systems that work with your existing enterprise architecture and that are able to securely move data in and out of the analytical tool. Your architecture should also help drive the interface into the capability. In most cases, every user in your organization will already have a browser on their device, so shouldn't that be the interface for accessing all of your new analysis capabilities as well?
Data Architecture: Common standards for data are key, and more than likely your enterprise already has a very active effort to ensure standards that are known and leveraged for your mission (many organizations call these standards and their implementation guidance their data layer). When considering new model-enabled capabilities, you should evaluate those capabilities in the context of your data architecture. Analytical tools of all types should work with your data layer and should not force you to change to meet their needs. You should also evaluate the degree to which they work with your data layer. The most basic interoperability might involve humans needing to export and import data to the system. This old and awkward approach is not optimal. Analytical capabilities should, to the greatest extent possible, work with all your data without the need to import and export data from one store to another.
Even solutions that automate access to data need to be evaluated for what they do after processing data. Many solutions have local content stores. Those that do should be designed to work with data from a wide range of sources without re-aggregating all of the data; in other words, they should be data-source agnostic (designed to work with any type of data). Systems that force data to be re-collected and imported into their local store in system-specific formats and indices will limit your ability to perform your mission with the needed flexibility.
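One practical way to test whether a candidate is data-source agnostic is to ask whether it, or your integration around it, can code against a thin adapter interface rather than against one proprietary store. The sketch below illustrates that pattern; every class and method name is a hypothetical illustration, not any vendor's API.

```python
# A minimal sketch of a data-source-agnostic adapter layer: the analysis
# logic depends on one uniform interface, and each store gets a thin
# adapter. All names here are hypothetical illustrations.
from abc import ABC, abstractmethod
from typing import Iterator

class DataSource(ABC):
    """Uniform read interface the analytical tool depends on."""
    @abstractmethod
    def records(self, query: str) -> Iterator[dict]: ...

class CsvSource(DataSource):
    """Adapter for flat files: reads in place, no re-collection."""
    def __init__(self, path: str):
        self.path = path
    def records(self, query: str) -> Iterator[dict]:
        import csv
        with open(self.path, newline="") as f:
            yield from csv.DictReader(f)

class InMemorySource(DataSource):
    """Adapter for data already held in memory."""
    def __init__(self, rows: list):
        self.rows = rows
    def records(self, query: str) -> Iterator[dict]:
        yield from self.rows

def analyze(source: DataSource, query: str) -> int:
    """Toy 'analysis': count records mentioning the query term."""
    return sum(1 for r in source.records(query) if query in str(r.values()))
```

The point of the pattern is that adding a new store means writing one small adapter, not re-importing all of your data into a system-specific index.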
Analytical Models: Analytical capabilities designed to perform a single function use models that are hard-coded. They might be tremendous at that single function, but they will never be flexible enough to solve your most complex analytical challenges. Your analysts must be able to easily produce their own models. Models help analysts deal with complex issues by reflecting associations and meanings and the relationships between concepts, objects and actions. Models should be easy to create, modify, analyze and share. Multi-model systems enable better discovery of new conclusions. The ability to create multiple models allows users to work multiple issues; this means that many organizations can make use of the same tool. This flexibility lowers overall cost and speeds your return on investment.
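As a rough illustration of what easy-to-create, easy-to-modify models can mean in practice, a model can be thought of as a small labeled graph of concepts, objects and actions that the analyst can revise and re-evaluate. The sketch below is a hypothetical illustration of that idea, not any product's model format.

```python
# A hypothetical sketch of an analyst-editable model: a labeled graph of
# concepts, objects and actions, with relationships the analyst can change.
from collections import defaultdict

class Model:
    def __init__(self, name: str):
        self.name = name
        self.relations = defaultdict(set)  # subject -> {(verb, object)}

    def relate(self, subject: str, verb: str, obj: str) -> None:
        """Create or extend the model by adding a relationship."""
        self.relations[subject].add((verb, obj))

    def unrelate(self, subject: str, verb: str, obj: str) -> None:
        """Modify the model by removing a relationship."""
        self.relations[subject].discard((verb, obj))

    def related_to(self, subject: str) -> set:
        """Query: everything the model currently links to a concept."""
        return self.relations[subject]

# An analyst builds a scenario, then changes it and re-evaluates.
m = Model("fraud-scenario")
m.relate("account-A", "transfers-to", "account-B")
m.relate("account-B", "owned-by", "shell-company")
m.unrelate("account-A", "transfers-to", "account-B")  # revise the model
```

The evaluation question is whether a tool lets analysts make edits like these themselves, in minutes, rather than waiting on engineers to change hard-coded logic.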
Licensing: User organizations should push for licensing that is as economical and predictable as possible. For most analytical tools, an enterprise license based on the number of users is a common approach. If you are forced into a license that is based on the number of processors, servers or cores, you will be stuck with a high cost even if no one is using the tool. You want systems from companies that are motivated to serve users, so the license-per-user model is generally the best for this type of tool. When you compare options, this choice can mean significantly lower startup and maintenance costs for you. You should also be careful about hidden licenses when you buy a tool. For example, are you also required to buy a new Oracle or Sybase license?
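A back-of-the-envelope comparison shows why per-user licensing is usually the more predictable choice. All prices and counts below are invented assumptions for illustration only.

```python
# Back-of-the-envelope comparison of per-user vs per-core licensing.
# Every number here is an invented assumption, not a real price.
def per_user_cost(active_users: int, price_per_user: float) -> float:
    """Per-user licensing scales with actual adoption."""
    return active_users * price_per_user

def per_core_cost(cores: int, price_per_core: float) -> float:
    """Per-core licensing is owed whether or not anyone uses the tool."""
    return cores * price_per_core

# Year one: only 50 analysts adopt the tool, but it runs on a 64-core cluster.
print(per_user_cost(50, 1_000))  # cost tracks the 50 real users
print(per_core_cost(64, 5_000))  # cost tracks the hardware, not the users
```

Under these assumed numbers, the per-core license costs more than six times as much in a year when only fifty people actually use the tool, which is exactly the exposure the paragraph above warns about.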
Partner and Legacy Ecosystem: Your legacy IT infrastructure comes from a wide range of firms.
Any organization of size will have software that operates over datastores from companies like Oracle, Microsoft, Sybase, MySQL, IBM, Cloudera and countless others. Analytical tools from a wide range of vendors are also in your ecosystem. This means that any model-enabled capability you pick should offer great flexibility in working with other tools in the ecosystem. For example, you might want the capability to use maps from ESRI, or maybe Google Earth. You might want to leverage datastores from the open source community and, of course, proprietary stores. You may decide to use search tools like Solr or discovery tools like Endeca. You might have your own Extract/Transform/Load (ETL) capabilities, or you might need a solution that comes with an open enrichment package like OpenNLP. You very likely already have a good link display/interaction capability like Analyst's Notebook, and you probably don't want to reinvent the wheel there. You may have a suite of good capabilities for working with foreign languages that could include Language Weaver and Basis.
Your model-enabled analytical solution must be able to work with all of these. So the capabilities you pick should be designed to enable customization and extension. This includes the ability to change ontologies, change interfaces, change data sources and change the other tools it interfaces with.
Deployment Models: The capabilities you acquire should be able to run without a large contractor staff. Specialists are frequently required to install a capability, and some level of services and support to your team can be expected, but if you must buy a large number of engineers to keep the COTS running, then you really have not bought COTS. You have bought COTS plus engineers, and the cost of that will eat you alive. Furthermore, the result of your efforts will not be an open system; you will end up with something so customized to your needs that it is not interoperable. If you are told that engineers are required to keep your analytical system functioning, that should send up alarms. Will there always have to be a wizard behind the curtain?
Your organization has likely already picked a favored deployment model that includes technical
guidance for deploying on browsers to desktops and ways to deploy to mobile devices via browsers
and applications. For example, in the national security community, a framework called the Ozone
Widget Framework provides guidance for developers. If the new solution you are evaluating requires
you to abandon your guidance to developers on topics like this, that is a huge black eye for the firm
you are evaluating. Well-designed solutions should be engineered to meet your development criteria.
Health of the Firm: It is important to know who is selling you the capability. Is it a user-focused organization that cares and will be with you long-term? What if the firm you are dealing with has the great reputation of a pre-crash Enron? How do you find out if the firm has the ethics and abilities that you require? Is this firm having trouble staying afloat? If you are relying on the company for support, you may end up losing your investment if its doors close. These factors can be hard to evaluate, but it is worth the effort to do some homework. This is why the Federal Acquisition Regulations mandate that market research be done. Never skip that step! Research the capability itself, along with the firm you are doing business with.
Concluding Thoughts
There are many other criteria you may want to consider when evaluating model-enabled analytical tools, but the nine above are key for ensuring long-term mission success. We also believe it is important to speak with others who have used the tools you are evaluating to get the benefit of the lessons they have learned. This is especially important in the current budget environment.
More Reading
For more on federal technology and policy issues, visit:
CTOvision.com - A blog for enterprise technologists with a special focus on Big Data.
CTOlabs.com - A reference for research and reporting on all IT issues.
J.mp/ctonews - Sign up for the Government Technology Newsletters.
About the Authors
Bob Gourley is CTO and founder of Crucial Point LLC and editor in chief of CTOvision.com. He is a former federal CTO. His career included service in operational intelligence centers around the globe, where his focus was operational all-source intelligence analysis. He was the first director of intelligence at DoD's Joint Task Force for Computer Network Defense, served as director of technology for a division of Northrop Grumman, and spent three years as the CTO of the Defense Intelligence Agency. Bob serves on numerous government and industry advisory boards. Contact Bob at [email protected]
Ryan Kamau is a technology research analyst at Crucial Point LLC, focusing on disruptive technologies of interest to enterprise technologists. He researches and writes on developments in technology and government best practices for CTOvision.com and CTOlabs.com, and has written numerous whitepapers on these subjects. Contact Ryan at [email protected]
For More Information
If you have questions or would like to discuss this report, please contact me. As an advocate for better
IT in government, I am committed to keeping the dialogue open on technologies, processes and best
practices that will keep us moving forward.
Contact: Bob Gourley
703-994-0549
All information/data © 2012 CTOlabs.com.