Security evaluation methodology
Coordinated by
UMU
Security evaluation methodology to evaluate the security of an ICT system. The methodology is based on standards such as ISO 31000 for risk management, ISO 29119 for security testing and the MUD standard. It defines a set of high-level steps to be followed by the security evaluator and is intended to serve as a basis for security certification. Moreover, the proposed methodology is intended to be generic enough to be instantiated through different techniques and tools.
It addresses several of the challenges identified in current certification and evaluation schemes. In particular, the combination of risk assessment and testing processes provides an objective and empirical measurement that also allows a partial automation of the process, facilitating subsequent recertification in case of a security change in the system. The methodology also considers the context variable (different security levels in different contexts), the definition of a visual label for non-expert consumers, and the creation of a behavioural profile of the system.
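As a purely illustrative aid, the sketch below shows one way an ISO 31000-style likelihood × impact rating could be adjusted with empirical test evidence to yield an objective, recomputable figure. The function names, scales and weights are assumptions made for this example and do not correspond to the methodology's actual definitions.

```python
# Hypothetical sketch of combining a risk rating with security-test evidence.
# Scales, weights and names are illustrative only, not the BIECO methodology.

def risk_score(likelihood: int, impact: int) -> int:
    """Classical risk matrix: likelihood and impact rated from 1 (low) to 5 (high)."""
    return likelihood * impact

def evidence_adjusted_risk(likelihood: int, impact: int,
                           tests_passed: int, tests_total: int) -> float:
    """Scale the a-priori risk by the share of failed security tests, so the
    result reflects measured evidence and can be recomputed after a change."""
    base = risk_score(likelihood, impact)
    failure_ratio = 1.0 if tests_total == 0 else 1 - tests_passed / tests_total
    return base * (0.5 + 0.5 * failure_ratio)  # illustrative weighting

# Example: threat rated likelihood 4 and impact 5, with 18 of 20 tests passing.
print(evidence_adjusted_risk(4, 5, tests_passed=18, tests_total=20))  # 11.0
```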
Description
Describe the innovation content of the result:
A security evaluation methodology for the objective security evaluation of systems, serving as a basis for certification.
Who will be the customer?
The customers are researchers, industry, consumers, public authorities, manufacturers, Conformity Assessment Bodies (Labs and Certification bodies), and National Schemes (NCCA).
What benefit will it bring to the customers?
An evidence-based security evaluation methodology generic enough to be applied through different techniques and tools that could be used as a basis for a certification scheme. Indeed, the methodology copes with some of the major challenges related to security evaluation and certification and is based on well-known standards to facilitate its adoption.
When is the expected date of achievement in the project (Mth/yr)?
Methodology concept in 02/2022 and an example instantiation at the end of the project.
When is the time to market (Mth/yr)?
At the end of the project.
What are the costs to be incurred after the project and before exploitation?
The methodology will be ready for use without further investment after BIECO finishes, but further research based on it will need to be framed within other innovation projects. These resources will be sought through research grants from actions such as the EU H2020 framework.
Year 1: around €40K–60K for final testing, documentation and validation.
Year 2: €100K for fundraising and initial customer traction for testing.
Year 3: scale up to 3–15 customers or a community.
What is the approximate price range of this result/price of licences?
Open source
What are the market size in Millions € for this result and relevant trend?
Year 2: €20K
Year 3: €300K
How will this result rank against competing products in terms of price/performance?
The methodology addresses some of the major challenges identified among the competitors' certification and evaluation schemes.
Who are the competitors for this result?
National and international public and proprietary certification and evaluation schemes.
How fast and in what ways will the competition respond to this result?
We estimate that creating a similar solution would take at least two years.
Who are the partners involved in the result?
For the methodology concept itself, no partnership has been foreseen. However, a particular instantiation of the methodology will be provided with the support of 7bulls, GRAD, UTC, RES and CNR.
Who are the industrial partners interested in the result (partners, sponsors, etc.)?
7Bulls, RESILTECH, TTTech, I-FEVS, UNI
Have you protected or will you protect this result? How? When?
Through research publications, at least after the methodology definition and at the end of the project.
Other results
BIECO Integrated Platform
The BIECO Integrated Platform will integrate the tools in a loosely coupled way.
Data Collection Tool
The Data Collection Tool (DCT) stores information from relevant vulnerability-related datasets, providing a single access point to the information required by the vulnerability detection and forecasting tools developed in T3.3, as well as by the failure prediction tools developed in T4.2.
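The following minimal sketch illustrates the single-access-point idea in the spirit of the DCT; the class, record fields and dataset names are assumptions for this example and do not reflect the tool's actual interfaces or schema.

```python
# Hypothetical aggregator offering one query API over several vulnerability datasets.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VulnerabilityRecord:
    cve_id: str
    description: str
    cvss_score: float
    source: str

class VulnerabilityStore:
    """Single access point over records ingested from multiple datasets."""

    def __init__(self) -> None:
        self._records: dict[str, VulnerabilityRecord] = {}

    def ingest(self, source: str, records: list) -> None:
        """Normalise and store raw records coming from one dataset."""
        for r in records:
            self._records[r["cve_id"]] = VulnerabilityRecord(
                cve_id=r["cve_id"],
                description=r.get("description", ""),
                cvss_score=float(r.get("cvss_score", 0.0)),
                source=source,
            )

    def lookup(self, cve_id: str) -> Optional[VulnerabilityRecord]:
        """Return the stored record for a CVE identifier, if any."""
        return self._records.get(cve_id)

store = VulnerabilityStore()
store.ingest("NVD", [{"cve_id": "CVE-2021-44228",
                      "description": "Log4Shell", "cvss_score": 10.0}])
print(store.lookup("CVE-2021-44228"))
```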
Vulnerability Detection Tool
The Vulnerability Detection Tool will detect existing vulnerabilities within the source code that may lead to the successful execution of an attack.
Vulnerability Exploitability Forecasting Tool
The Vulnerability Exploitability Forecasting Tool will estimate the probability that a vulnerability will be exploited within the next 3, 6 or 12 months.
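As a toy illustration of a time-horizon estimate of this kind, the sketch below scores the probability of exploitation with a simple logistic function; the features and coefficients are placeholders chosen for the example and are not the tool's actual model.

```python
import math

def exploit_probability(cvss: float, exploit_code_public: bool, horizon_months: int) -> float:
    """Toy logistic score over a few illustrative features; coefficients are placeholders."""
    x = -4.0 + 0.4 * cvss + 1.5 * float(exploit_code_public) + 0.1 * horizon_months
    return 1.0 / (1.0 + math.exp(-x))

# Example: a CVSS 9.8 vulnerability with public exploit code, over 3, 6 and 12 months.
for horizon in (3, 6, 12):
    print(horizon, round(exploit_probability(9.8, True, horizon), 2))
```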
Vulnerability Propagation Tool
The Vulnerability Propagation Tool will calculate and provide the paths in the source code affected by a vulnerability.
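As an informal illustration of the path concept, the sketch below enumerates call paths that reach a vulnerable function in a toy call graph; the graph and function names are hypothetical, and the tool's real analysis may differ.

```python
# Hypothetical call graph (caller -> callees) and path enumeration.
call_graph = {
    "main": ["parse_input", "render"],
    "parse_input": ["deserialize"],
    "render": ["deserialize"],
    "deserialize": [],
}

def affected_paths(graph, entry, vulnerable, path=None):
    """Yield every call path from the entry point that reaches the vulnerable function."""
    path = (path or []) + [entry]
    if entry == vulnerable:
        yield path
        return
    for callee in graph.get(entry, []):
        yield from affected_paths(graph, callee, vulnerable, path)

for p in affected_paths(call_graph, "main", "deserialize"):
    print(" -> ".join(p))
# main -> parse_input -> deserialize
# main -> render -> deserialize
```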
Fuzzing Tool
The Fuzzing Tool will test the System Under Test (SUT) for security vulnerabilities or unhandled inputs that could compromise the system. It works as a black-box process, using unintended or incorrect inputs and monitoring the corresponding outputs.
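The sketch below shows the general black-box loop described above, with a deliberately faulty placeholder standing in for the SUT; it is only a minimal example of the technique, not the Fuzzing Tool's implementation.

```python
import random
import string

def sut_parse(data: str) -> str:
    """Hypothetical system-under-test routine containing a hidden defect."""
    if data.startswith("\x00"):
        raise ValueError("unexpected control byte")
    return data.upper()

def random_input(max_len: int = 16) -> str:
    """Generate an unintended/incorrect input, including control characters."""
    alphabet = string.printable + "\x00\x01"
    return "".join(random.choice(alphabet) for _ in range(random.randint(0, max_len)))

findings = []
for _ in range(1000):
    payload = random_input()
    try:
        sut_parse(payload)                    # monitor the normal output
    except Exception as exc:                  # crash or unhandled input -> potential finding
        findings.append((repr(payload), type(exc).__name__))

print(f"{len(findings)} suspicious inputs found")
```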