Achieving Tool Qualification in DO-178C Certification Process using Model-Based Design
Overview
In this session, you will learn how to approach tool qualification while developing and verifying applications using Model-Based Design.
The DO Qualification Kit provides tool qualification plans, tool operational requirements documents, test cases and procedures, as well as references supporting the soundness of formal methods techniques used in DO-178C certification.
Tools used in Model-Based Design and formal methods analysis require qualification under DO-330, Software Tool Qualification Considerations.
Since DO-330 requires the user to verify the tool in its installed environment, the test procedures are automated so that the user can easily execute the tests and verify that the expected results are correct.
Highlights
- DO Qualification Kit
- Tool qualification
- Artifact and report generation
About the Presenters
Prashant Mathapati, MathWorks
Prashant Mathapati is an application engineer at MathWorks India specializing in signal processing and embedded code analysis and verification. He has more than 15 years of experience in this role. Prior to joining MathWorks, Prashant worked for Trident Infosol and Programming Research as a senior field application engineer handling products in the signal processing and verification tools domains. He holds a bachelor’s degree in electrical and electronics engineering from Visvesvaraya Technological University (VTU), Karnataka.
Satish Thokala, Industry Marketing Manager, MathWorks
Satish Thokala is the Aerospace and Defense Industry Manager at MathWorks. He has about 19 years of experience in teaching and in public and private aerospace establishments, including Hindustan Aeronautics Limited and Rockwell Collins. In his current role, he is responsible for analyzing technology adoption in the aerospace and defense industry and for developing strategies to increase the adoption of Model-Based Design with MATLAB® and Simulink®. His area of expertise is avionics systems for both military and civil aircraft. Early in his career, Satish contributed to the design and development of communication radios and their field trials for Jaguar and MiG fighters. He later led large engineering groups developing software for cockpit displays and engine control, and participated in DO-178 certification audits.
Recorded: 10 Nov 2021
Hi, everyone. I'm Satish Thokala, Aerospace and Defense Industry Manager at MathWorks. I'm joined by two of my colleagues for this talk today. We have Gaurav Dubey, Principal Application Engineer, with expertise in Model-Based Design and DO certification. And we have Prashant Mathapati, who is a Senior Team Lead, with expertise in Model-Based Design, formal verification methods, and DO certification aspects.
Prashant will be doing most of the talk today, and Gaurav will help us answer questions that you may have. If you have any specific questions, please post them in the Q&A window or chat window. We will try to address as many questions as we can today. So with that, let's get started.
So as we all know, this is a series of sessions. So far, we have completed the first three parts of this series: in the first session we talked about what DO-178C is, then we did a deep dive into the V&V aspects of the process, and then we talked about code generation and how we make sure the code we generate from Model-Based Design workflows is compliant with the DO-178C standard. In today's talk we are going to focus on the tool qualification aspects.
So just to recap quickly on DO-178C. We talked about this in our previous sessions as well: what the significance of DO-178C is and why there is so much buzz around it. Of course, commercial aerospace has been using it for decades, but why are the defense and space industries also talking about DO-178C these days? We discussed some of these points in our previous talks. We also briefly discussed the various supplements: DO-330, DO-331, DO-332, and DO-333. In our previous talks, we discussed DO-331 and DO-333 in more detail, and today we will talk a bit more about DO-330.
So before I hand it over to Prashant Mathapati to bring in the tool qualification aspects, I would like to spend a few minutes talking about the flexibility of Model-Based Design in achieving DO certification. I'm sure every one of us can connect with this diagram, the V-cycle diagram. The questions that we hear from customers fall broadly into two categories. One is: how do I adopt Model-Based Design across the complete lifecycle of my project, starting from capturing system requirements all the way to deploying my systems in the field? That's one kind of question.
The second kind of question is: my own in-house methods are already defined to achieve DO certification, but for certain sections or parts of this process I would like to use Model-Based Design. That's the flexibility built into Model-Based Design; it can address both of these situations. You can adopt MATLAB and Simulink tools in certain parts of your lifecycle, or wholesale across the lifecycle. We talked about some of these examples in the previous sessions, such as Embraer, Airbus, Rolls-Royce, Korean Air, and Bell Helicopter.
Today I would like to touch upon one specific use case coming from Leonardo. What is the problem here? What was their approach? What are they actually trying to solve? It's not a simple DO-certification problem as such. The story goes something like this for this particular project. They were delivering radar systems, mainly for navigation, for the AgustaWestland helicopter. When they did their initial systems engineering analysis, right before starting the project, some of the safety cases did require taking this particular program through DO-178C.
The reason being, as per their safety analysis, this particular system is expected to provide navigational cues to the helicopter crew, which means the development process has to go through the DO process. The development team was not experienced with DO-178C; in fact, this was their first project on DO.
So one of the major requirements was that there shouldn't be too much deviation from their existing process, but they still needed to achieve DO-178C. So it's more about being agile enough to adopt Model-Based Design into their regular projects.
That was the major challenge, or requirement, for this particular group. MathWorks and Leonardo then worked closely to set up the process and framework to achieve DO certification, and Model-Based Design was adopted across the lifecycle. I won't talk too much about the results they achieved; they are on the screen, and you can take a few seconds to read them.
But one specific result that I would like to highlight is the documentation they generated: 250,000 pages of documentation, which is, again, interactively linked documentation, with links to the requirements, links to the model, and links to the test cases. So everything is connected, as needed, for the traceability requirements of the process.
And in this particular program, around 10,000 lines of MISRA-compliant code was generated, which runs on the actual hardware. So that's the Leonardo story. Now what we will do is recap some of the highlights from the previous talks and set the pace towards tool qualification. I will pass it on to Prashant Mathapati to walk us through the rest of the webinar. Prashant, over to you.
Thanks, Satish. Just a minute, I will share my screen.
While Prashant's screen is coming up, please post if you have any questions.
Let me know, Satish, when you can see my desktop.
Yes, we can see it, Prashant. We can hear you and see your screen here.
Thanks. My name is Prashant, and today I will be talking about DO-178C compliance. What do I mean by compliance? Compliance is nothing but producing artifacts of the work we have done for the application development. DO-178C talks about airborne software, so the context of my topic will also relate to DO-178C.
When any program begins, there is a set of documents prepared during the planning stage, such as the PSAC, which stands for Plan for Software Aspects of Certification, then the development and verification process plans, and overall the Software Accomplishment Summary. These are the documents required to demonstrate that whatever is defined as an objective at the different phases of the V-cycle has been achieved.
This is the key to getting a product certified per DO-178C. You will see that different sets of objectives are defined, and the user has to demonstrate compliance with each of the objectives in the form of an artifact. You can do this either manually or by using an automated tool. When you use an automated tool, which automates the process, that's where DO-330, the tool qualification process, comes into the picture.
At a high level, it talks about how you guarantee the confidence level of the tools and solutions you have used for that automation. That is how you qualify that this is a tool which meets all the criteria, is qualifiable, and can be used in this process for meeting these objectives.
I'll go into DO-330 in the upcoming slides. But before that, I will recap what has been presented so far in this webinar series. As Satish mentioned on the second slide, we had three sessions before this, where my colleagues went deep into each of these phases and talked about the processes, the artifacts, and the overall workflow involved. So what I'm doing today, before I get to DO-330 on the qualification side, is to just touch upon what DO-331 and DO-333 are, in terms of DO-178C.
Let's start with model verification and testing. This is DO-331, which talks about the Model-Based Design approach for airborne software development. One part of it is based on model verification and testing, which covers high-level requirements (HLR) and low-level requirements (LLR).
What is a high-level requirement? A high-level requirement is nothing but a requirement in textual form. The low-level requirements, in this case, are the models, which contain the algorithms and behave as per the requirements. The supplement then says that you need to have traceability between the LLRs and the HLRs.
Also, when you are doing low-level requirement modeling, you need to follow certain modeling guidelines as per the supplement, and each of the HLR test cases should be linked to the corresponding LLR models. With all this activity, at a high level you are looking at capturing the requirements, model creation, simulation, validation, conformance checking, and overall model verification.
These are the activities which will be done at the model level. In order to carry out each of these activities, let's start with LLR traceability and modeling guidelines. There you have your textual requirements, and here are the models which represent the low-level requirements.
Then there is a toolbox which was demoed, called Simulink Requirements, which is used to map each of the textual requirements to the design models. Then we have Simulink Check to run compliance checks on the models in terms of modeling guidelines, and it also covers tool configuration and options.
For example, if I'm doing DO-178C, Simulink Check has dedicated checks for DO-331. Similarly, if I'm using the code generation products, there are configuration settings and modeling patterns required for a particular tool to be used, and those checks have been automated using Simulink Check. This typically helps the modeler stick to certain guidelines in order to achieve certain objectives.
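As a rough illustration only, and not the workflow demonstrated in this webinar, the following minimal MATLAB sketch shows how requirement-to-model linking and a Model Advisor check run might be scripted. The model name, requirement set file, requirement ID, block path, and check configuration file are all hypothetical placeholders.

```matlab
% Minimal sketch; 'fuel_controller', 'HLR_spec.slreqx', 'HLR-001', the block
% path, and 'do178_checks.json' are hypothetical names, not the demo's artifacts.
model = 'fuel_controller';
load_system(model);

% Simulink Requirements: link a textual HLR to the block that implements it
reqSet = slreq.load('HLR_spec.slreqx');
req = find(reqSet, 'Type', 'Requirement', 'Id', 'HLR-001');
blk = getSimulinkBlockHandle([model '/Controller']);
slreq.createLink(req, blk);

% Simulink Check: run a saved Model Advisor check configuration on the model
results = ModelAdvisor.run(model, 'Configuration', 'do178_checks.json');
```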
Once you do this, you have the requirements, which specify the inputs and outputs. Along with these, you simulate the models and run those tests. Simulink Test is the test management tool that provides a framework for this.
You can run all those tests in simulation to ensure that what is defined in the requirements specification is met, that your inputs produce the expected outputs. That's how you run the functional tests. Then comes coverage. Coverage is required to measure the effectiveness of your test cases and also to identify unreachable parts of your model.
For that, you would use Simulink Coverage. You will also notice that at the same point I'm highlighting which particular tables and objectives are used, for example objectives 1, 4, and 5 coming from the DO-331 tables A-4 and A-3, which can be satisfied using this product. When I say the product satisfies those objectives, it means it also generates an artifact.
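As an illustrative aside, not part of the recorded demo, a minimal sketch of running Test Manager tests and collecting model coverage programmatically might look like the following; the test file and model names are placeholders.

```matlab
% Minimal sketch; 'HLR_tests.mldatx' and 'fuel_controller' are placeholders.
% Run the requirements-based tests authored in the Test Manager.
sltest.testmanager.load('HLR_tests.mldatx');
testResults = sltest.testmanager.run;

% Collect model coverage for a simulation run with Simulink Coverage
covTest = cvtest('fuel_controller');
covTest.settings.decision  = 1;   % decision coverage
covTest.settings.condition = 1;   % condition coverage
covTest.settings.mcdc      = 1;   % MC/DC coverage
covData = cvsim(covTest);
cvhtml('model_coverage_report', covData);   % HTML coverage report as an artifact
```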
Then you can use the DO-333 formal-methods-based tools, namely Simulink Design Verifier. Say that, based on the requirements, you authored some test cases and got, let's say, around 60% coverage, so 40% of the coverage is still missing. Then you can use formal-methods-based test case generation: for that missing coverage, Design Verifier looks at whether it can generate a test case for those missing parts of the model.
If yes, it produces a test case; if not, that proves it is an unreachable part of the model, which can be eliminated. You can also use it for various other purposes. It uses formal methods such as abstract interpretation for design error detection and more. This is DO-333, which is a supplement to DO-178C and talks about formal-methods-based verification. This is how you can claim credit and also demonstrate the robustness of your test cases.
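As a hedged sketch of what coverage-driven test generation with Simulink Design Verifier can look like when scripted (assumed names only, not the webinar's demo script):

```matlab
% Minimal sketch; the model and coverage-data file names are placeholders.
opts = sldvoptions;
opts.Mode = 'TestGeneration';                 % generate tests, not prove properties
opts.ModelCoverageObjectives = 'MCDC';        % target MC/DC objectives
opts.IgnoreCovSatisfied = 'on';               % only target objectives not yet covered
opts.CoverageDataFile = 'hlr_tests_cov.cvt';  % coverage from the requirements tests
[status, files] = sldvrun('fuel_controller', opts);
```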
So, putting it together at the top: at the model level you can see tools like Simulink Requirements, Simulink Test, Simulink Check, Simulink Design Verifier, and Simulink Coverage covering the various aspects of model verification and validation. There is also Simulink Report Generator, which produces other artifacts, such as the SDD, the Software Design Description document, and these artifacts are produced as per the DO-331 and DO-333 supplements.
These are the tools used for model verification and validation. As I said earlier, it's very clear that we are replacing a human with these automation tools, and whenever you bring in these tools to do some of these tasks, you need to provide a qualification kit. We'll talk about what a qualification kit is. In short, the qualification kit proves that the tool is qualifiable for use in this process and does its intended functions correctly. That gives the authority, the person, or the audit committee the confidence that they can use these artifacts for validating compliance.
All of this workflow was demonstrated on 19 October in the session on verification and validation for DO-178C. If you did not attend, you can browse our website; the recording is available. You will get deeper insight into how you can use requirements, how the requirements can be linked to the models, how the requirements can also be mapped to the test cases, and how to achieve bi-directional traceability.
That session goes quite deep; our colleagues walked through it and demonstrated it, and it's available on our website. You can go to the website and find those details under verification and validation for DO-178C.
Next, moving on from the model side, let's begin with code generation, the software implementation side. Before I get there -- sorry, one last slide. This is typically highlighting the same thing: again, we use all these tools to produce the artifacts, which can be used for claiming certification under DO-178C.
Now the second part. The top part, the model V&V, is done. How do you use the same model for code generation and for getting an executable object code? Again, the process says that whatever code you generate, the source code should be traceable to the low-level requirements, which are nothing but the models.
Similarly, the generated code should also conform to coding standards, such as MISRA. There is also the structural verification of the source code against the LLR model -- it's about the source code and the LLR model. And you can also extend the use of the same high-level requirements: the test cases previously generated at the design level can be reused on the source code, to ensure that requirements-based testing of the source code is done and everything is in sync.
The same thing can be extended to the object code, as shown in the picture. First, we have the executable model, that is, the LLR. Then Embedded Coder is the product used to generate embeddable, standards-compliant source code; you can generate C or C++ code, which is then built into the executable.
The same Simulink Requirements, which is used to map requirements onto the models, can also extend the traceability from the model to the source code -- which part of the source code belongs to which requirement. Then there is Simulink Code Inspector, a tool used to ensure that all parts of the model are converted into source code. It verifies that Embedded Coder translated the model to the source code correctly.
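As a rough, hedged sketch of how code generation and a code inspection run might be scripted (placeholder model name; exact Code Inspector API details can differ by release):

```matlab
% Minimal sketch; the model name is a placeholder and Code Inspector API
% details may differ by release.
model = 'fuel_controller';
load_system(model);
set_param(model, 'SystemTargetFile', 'ert.tlc');   % Embedded Coder target
slbuild(model);                                    % generate embedded C code

slciCfg = slci.Configuration(model);               % Simulink Code Inspector setup
slciResult = run(slciCfg);                         % verify generated code vs. model
```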
This particular topic was covered on October 27th, last month. Again, if you missed that session, the recording is available on the website. Go through it and you will see the Embedded Coder and Simulink Code Inspector use cases and their artifacts.
When it comes to functional testing: at the model level, based on the requirements, we already had functional test cases. You can extend the Simulink Test framework to use the same test cases, running the source code in software-in-the-loop mode, to perform requirements-based verification at the source code level. The advantage is that you can reuse the same test cases which were produced at the model level.
And Simulink Coverage, which was used to measure model coverage, can also be extended for the same test cases to measure source code coverage, which gives you all the metrics, such as MC/DC, decision coverage, and statement coverage. This is all executed in software-in-the-loop mode.
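To illustrate the reuse idea only (placeholder names; the exact simulation-mode string should be confirmed against the documentation for your release), a minimal sketch of rerunning the same test file against the generated code in SIL mode:

```matlab
% Minimal sketch; model and test-file names are placeholders, and the exact
% SIL mode string should be checked against the release documentation.
model = 'fuel_controller';
load_system(model);
set_param(model, 'SimulationMode', 'software-in-the-loop (SIL)');  % top-model SIL
set_param(model, 'CovEnable', 'on');                 % record coverage during the run
sltest.testmanager.load('HLR_tests.mldatx');         % reuse the model-level test file
silResults = sltest.testmanager.run;
```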
When it comes to executable object code verification, you do it in processor-in-the-loop (PIL) mode on the target. You will have a compiler dedicated to the target; using that compiler, you build the code, put it on the hardware, and run it in PIL mode to validate the same thing. We also have the Polyspace Bug Finder tool, a static analysis tool which runs only on the source code; this is for enforcing the coding standard.
Then there is Code Prover, which is a formal-analysis-based tool used to prove the presence or absence of runtime errors. Code Prover comes with a formal-methods-based engine, so by default it falls under the DO-333 qualification process.
So what is Polyspace? You have two things: one is Code Prover and the other is Bug Finder. Bug Finder is mainly for enforcing coding rules, and Code Prover is for ensuring the absence of runtime errors and the robustness of your code. This was also demonstrated back on October 27th, so you will find more details on our website. If you have any questions, feel free to post them in the Q&A; Gaurav is happy to answer those there, or else keep them for the end and maybe we can pick up some of those questions.
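As a minimal, hedged sketch of driving Polyspace from MATLAB (the source file path and ruleset selection are placeholders, not the configuration shown in the webinar):

```matlab
% Minimal sketch; the source path and rule selection are placeholders.
proj = polyspace.Project;
proj.Configuration.Sources = {fullfile(pwd, 'generated_code', 'fuel_controller.c')};
proj.Configuration.CodingRulesCodeMetrics.EnableMisraC3 = true;  % MISRA C:2012 rules
proj.run('bugFinder');    % enforce coding rules and find defects
proj.run('codeProver');   % prove the absence of run-time errors
```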
Here you also have Simulink Report Generator; templates are available to generate the artifacts. And I'm summarizing, for each of the products and at each phase of the cycle, the tables and the objectives which can be satisfied.
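Purely as an illustration of report scripting (the report name, chapter title, and model are placeholders; the DO Qualification Kit's own report templates are not reproduced here):

```matlab
% Minimal sketch; report name, title, chapter, and model are placeholders.
load_system('fuel_controller');
rpt = slreportgen.report.Report('design_description', 'pdf');
add(rpt, mlreportgen.report.TitlePage('Title', 'Software Design Description'));
ch = mlreportgen.report.Chapter('Controller design');
add(ch, slreportgen.report.Diagram('fuel_controller'));   % snapshot of the model diagram
add(rpt, ch);
close(rpt);   % write the PDF artifact
```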
Again, just to summarize: you use the Simulink Code Inspector and Polyspace Bug Finder reports to verify the generated code, and these reports help qualify the code generated by Embedded Coder. The Polyspace Code Prover report, the PIL (processor-in-the-loop) report, and the Simulink Coverage report will be used to meet the criteria for object code verification.
This is the overall picture. I believe the slides will be shared, and you will have a link to download this overall workflow, which has the same details but laid out very clearly, along with the products, which table, and which part of the supplement each one falls under. Everything is captured. So far, at a brief level, I have covered the top: what the model-level activities are and what the source-level activities are. Together, you have the overall workflow along with the qualifiable objectives and the artifacts produced by the individual tools.
This slide summarizes each of these tools. The reason the tool links are shown in green is that these are the qualifiable tools. Satish has shown one user story, Leonardo; similarly, there are many more customers who have used these tools on different programs to qualify their products. You will find more of this information on our website, and also feel free to get in touch with us.
This brings us to the main topic of today: what is tool qualification per DO-330? So far, I have just touched on the workflow, the required tools, which categories of objectives they satisfy, and what artifacts they produce. Now let's dig into the tool qualification side.
What does it mean to qualify a tool? Again, if you are automating anything, that particular tool or process should be qualified. And why do you need to qualify? In order to increase confidence. That's mainly for safety-critical applications.
Some people might use these tools purely internally, but this mainly applies in industries with safety-critical applications, where you have independent authorities who audit the compliance. If you say that you are using an automation tool, then you need to produce a qualification kit, which gives them the confidence that the tool is qualified for use in these processes.
Whether it's airborne software or ground-based software, such as that covered by DO-278A, tool qualification is required. And this is based on criteria. Earlier, before DO-178C, we had DO-178B, and in that there were two criteria: one is development tools and the other is verification tools.
Development tools require more rigorous qualification -- for example, a compiler or code generator used to produce the software. That requires rigor because there is always a threat that it could introduce some form of unintended behavior or errors into the system, and that category is treated differently.
Verification tools, on the other hand, cannot introduce errors, but they may fail to identify them. A verification tool claims that it will identify different categories of bugs or mistakes in the code, but there is always the possibility that it might miss one. However, it does not add any unintended functionality to the software.
So that category is handled in a different way. What DO-178C has done is add a third criterion. The third criterion covers a tool that simply performs within its scope of intended use -- it neither adds anything to the software nor is relied upon to detect errors in it.
It is just used for some intended functionality -- for example, report generation, where the intention is just to generate a report of the artifacts. That falls under Criteria 3. Then, based on the software assurance level (A, B, C, or D) and on the criteria, development or verification, you determine the tool qualification level to be used.
From MathWorks, the tools mostly fall under TQL-5, except Polyspace Code Prover, which falls under TQL-4. For MathWorks, this qualification kit is nothing but a set of documents and templates, the test cases, and the expected results.
Before I get on to this slide, let me just open up MATLAB and show you what the qualification kit looks like. The Qual Kit is dependent on each version and release of MATLAB and Simulink, so whenever you upgrade, you should also upgrade the Qual Kit along with it.
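As a small aside, assuming the DO Qualification Kit is installed for the same release and that its qualkitdo command is available, opening the kit from MATLAB might look like this:

```matlab
% Minimal sketch; assumes the DO Qualification Kit for this MATLAB release is installed.
ver          % list installed products and versions, including the DO Qualification Kit
qualkitdo    % open the DO Qualification Kit artifacts browser
```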
You can see that these are the tools which I've shown you and which we've been showing in the overall process for DO-178C, and they have a set of documents. The first things shown are the supporting artifacts; overall, these documents describe the DO-178C workflow as per DO-331 and DO-330.
Then you have the PSAC document, the tool operational requirements document, and the overall user guide. Let's pick any of the tools -- for example, Polyspace Bug Finder. If you remember, this tool is used to enforce coding standards. Along with that, for each of these products you will see the documents: they cover tool operational requirements, tool installation, the tool user guide, and the reference guide.
Then the test cases. There is a test case for each of the features this tool is used for. Bug Finder is used for claiming the CERT C standard, code metrics, MISRA AC AGC, MISRA C:2012, MISRA C:2004, MISRA C++:2008, and some other rule sets coming from programming standards and code metrics. It also has its own set of defect checkers that fall under concurrency, data flow, numerical, good practice, and so on.
So these are the checks. When I expand it, you will see that each of these features has a test case associated with it, and you can see each of these test cases. That's how you prove the tool's claim -- for example, that it identifies cryptography-based programming errors under the cryptography category.
A large number of defects are categorized, and for each of those defects an equivalent test case is written and grouped together. Similarly, if I go into, let's say, Simulink Test or any other toolbox, you will see the tests based on the features of that particular tool.
Along with that, you will also see how to run those test cases. It's very easy -- all the automation is done and it is very tightly integrated. You can just go and click to copy them. For example, if I just click on this -- give me a second.
You just choose the location, and you can see that, for Bug Finder in particular, it copies the test cases along with the expected results, and also the tool qualification and tool operational data. In order to execute them, all you need to do is open the test folder.
Under the test folder, you see that we provide all the batch files. You can run them either in MATLAB or from a command prompt. With all this automation, for example, if I place this command in the MATLAB command window, it will start running each of these test cases, compare the results against the expected results, and finally produce the overall results.
It shows that all these test cases are passed. All of those test cases passing is proof that the tool does what it is intended to do, and that's how it qualifies to be used in the DO-178C process for certifying particular objectives.
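As a generic sketch only -- the kit ships its own batch files and commands, documented in each product's tool test procedures, and those are what you would actually run -- executing a folder of MATLAB-based test scripts could look roughly like this; the folder path is a placeholder for wherever the Bug Finder test cases were copied, and whether plain runtests applies depends on the product.

```matlab
% Generic sketch only; the kit's own batch files and commands from the tool test
% procedures are what you would actually run. The folder is a placeholder for
% wherever the Bug Finder test cases and expected results were copied.
cd(fullfile('C:', 'qualkit_work', 'bug_finder_tests'));
results = runtests(pwd, 'IncludeSubfolders', true);   % run MATLAB-based test scripts
disp(table(results));                                 % pass/fail summary per test case
```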
If you want to go further and deeper, we also have the tool test procedures, which describe how this testing is done. This is what the authority would ask you to provide. Everything is included and all the details are mentioned: how you would see the results, and the licensing (the licensing part varies, again). You can see here that each part of the expected results is highlighted in this document, which can be used to prove the tool's confidence level and its qualification level for usage under the DO-178C process.
The same goes for all products which fall under the DO Qualification Kit. So I'll go back to the presentation. We saw the set of documents and templates: under the supporting artifacts you have the reference workflow guidance, and then under each tool you have a reference guide and user guide, as well as the test cases along with the expected results.
Then you have the tool operational data and test procedures, as DO-178C mentions and as described in DO-330. I showed you the test cases for Polyspace, which is a static code verification tool; similarly, here the same is shown for Simulink Code Inspector. For each of the modeling patterns it shows the test case, there are the expected results, and it shows what the expected results should look like.
Just to give you a summary: you go from high-level requirements to low-level requirements, then to source code and the executable. We establish a lot of traceability, going from the high-level requirements down to the object code, and different sets of tools are used to do that traceability: to map the high-level requirements to the low-level requirements, then to provide traceability from the generated source code back to the models, and from the source code back up to the high-level requirements.
All of this can be achieved using the different tools, such as Simulink Requirements, Simulink Coverage, and Simulink Check, and similarly the other tools used at the bottom, at code generation time.
So this is also a great tool. Satish, can you hear me?
Yes, Prashant. We can hear you, Prashant. Please go ahead.
Oh sorry, I got some pop-ups on the screen. So this is it. I think this largely summarizes DO-330, and also the tools from MathWorks which are used, which are qualifiable, and which have been qualified and used by our users and other customers in various different programs.
We have a lot of details available on our website. If you browse around DO-178 and look for user stories, we have many, many customer cases and programs where they have successfully been able to use the Model-Based Design approach with the MATLAB and Simulink platform, use these tools, and successfully qualify them.
So this only talks about the program level. Now how about an organization that is running multiple programs and would like to use the model-based approach at an organization or enterprise level? I'm talking about Model-Based Design being adopted across all their programs by different teams. For that, Satish, do you have any stories you would like to bring up?
Yes, Prashant, great question. I was actually thinking about Airbus -- that was a very good adoption of Model-Based Design to achieve certification. Yeah, there you go, this is the one. One good thing here is that we have seen a lot of our customers using Model-Based Design enabled by MATLAB and Simulink, including the DO Qualification Kit and associated products, to achieve certification.
In this particular account, the same approach is used across multiple programs, leading to something like a common, standardized approach. I have mentioned some of the program names, starting from the A330 MRTT, A400M, and C-295, and even including some of the later programs like the A380. So a similar workflow is used across these large programs by multiple groups.
If you look at the products, it's a wide spread: starting with capturing the initial designs, all the way to code generation, also achieving tool qualification with the Qualification Kit, and doing formal verification at both the model and the code level -- at the model level with Simulink Design Verifier, and at the code level using Polyspace. I hope that answers the question you were asking, Prashant.
Yes, yes, that answers it. I take it that one set of tools and processes can be adopted at an organization level and used across many different programs, rather than for just one particular project. Thanks, Satish, for that. One more important point, for anybody who is new to the Model-Based Design approach.
Today we did only a brief session giving you an overall idea of what the process looks like, what documents are required, and what comes next. But if you want to use it in your development work and you are new to it, we do have training services available that get you started with it.
And if you are looking for something more specific, where you see that you need to adopt some of the best practices for process compliance or assessment, we also address that through our consulting offerings, where we provide consulting services to help you get started with the process, from planning through the execution cycles.
There are also workshops that we typically carry out. If you are interested in going much deeper, we have workshops based on ARP4754A, DO-178C, and DO-254, delivered by our application engineering team.
These give you an overall overview of the workflow. Today, in this short time, we just talked about the tools and their artifacts, but these workshops provide more hands-on depth. The participants can come from development teams, safety, quality, and validation teams, or even project managers -- anyone who wants to know what the process and artifacts are and how qualification is done.
It is an interactive demo where you will see our engineers carrying out a step-by-step execution of the overall workflow -- working through Model-Based Design, as well as doing code generation and then verification and validation. Along with that, you will see all the artifacts and how they can be managed.