Roles and Competencies

Introductory Paragraph(s)

Existing SE Competency Models

Several systems engineering competency frameworks have been developed. Ferris (2010, expected) provides a summary and evaluation of the existing frameworks for personnel evaluations and for defining systems engineering education. Table XX shows multiple SE competency definitions that have been developed to date, along with the characteristics of each. The table is provided as a summary and reference. Each model was developed for a unique purpose within a specific context and validated in a particular way. It is important to understand the unique environment surrounding each competency model in order to judge whether its findings can be generalized.

A few published SE competency models can be used for reference: the INCOSE UK Advisory Board model, published in 2005 (Cowper et al. 2005) and updated in 2010 (INCOSE 2010b); the MITRE SE Competency model, published in 2007 (MITRE 2007, 1-12); the SPRDE-SE/PSE model (DAU 2010b); and the APPEL model (Menrad and Lawson 29 September-3 October, 2008). Systems engineering competencies can also be inferred from standards such as ISO/IEC 15288 (ISO/IEC 2008) and from sources such as the INCOSE Systems Engineering Handbook (INCOSE 2010a), the INCOSE Systems Engineering Certification Program, and CMMI criteria (SEI 2007b; SEI 2007a; SEI 2010, 1).

For the INCOSE Certification Program, the implication is that achieving this certification is equivalent to having systems engineering competency. To a limited extent this is true. However, an accomplished systems engineer needs the competencies covered by the INCOSE certification and much more. The INCOSE certification can therefore be considered an essential subset of systems engineering competency.

Most competency models recognize that systems engineering requires certain behaviors, skills, and knowledge. These models also consider the need for discipline-specific capabilities related to systems engineering and domain-specific capabilities related to the industry or business in which the systems engineering discipline is employed. The typical model, while recognizing the need for domain knowledge, does not define the competencies or skills related to a specific domain.

Some domain- and industry-specific models have been created, such as the Aerospace Industry Competency Model, published in draft form on October 15, 2008 and now available online, developed by the Employment and Training Administration (ETA) in collaboration with the Aerospace Industries Association (AIA) and the National Defense Industrial Association (NDIA) (ETA 2010). This model is a comprehensive competency model for the aerospace industry and is designed to evolve along with changing skill requirements. The ETA also provides competency models for many other industries through the ETA web sites (ETA 2010).

The NASA Competency Management System (CMS) Dictionary is predominantly a dictionary of the domain-specific expertise required by NASA to accomplish its space exploration mission (NASA 2006).

Published systems engineering-related models generally agree that systems thinking, a holistic view of the system that includes the full life cycle, and specific knowledge of systems engineering methods, both technical and managerial, are required to be a fully capable systems engineer. It is also generally accepted that an accomplished systems engineer will have expertise in at least one domain of practice. Most organizations should, and will, create their own competency models for systems engineering that include the general aspects of the discipline as well as the domain expertise that applies to their business.

To elaborate on specific examples for illustration, three competency model examples follow. The reader is encouraged to explore in full all the models summarized in the previous table.

The INCOSE model, developed by a working group in the UK, was the first collaboratively developed and released systems engineering competency framework. As Table XX shows, the INCOSE framework is divided into three theme areas, each containing a number of competencies. The highest-level themes are systems thinking, holistic life cycle view, and systems management.

The model for U.S. Department of Defense systems engineering acquisition professionals (SPRDE) (DAU 2010a) includes 29 competency areas, shown in Table XX. Each is grouped according to a “Unit of Competence,” as listed in the left-hand column. For this model the three top-level groupings are analytical, technical management, and professional. The life cycle view used in the INCOSE model is evident in the SPRDE analytical grouping, but is not cited explicitly. Technical management is the equivalent of INCOSE systems engineering management, but additional competencies are added, including software engineering competencies. Some general professional skills have also been added to meet the need for the strong leadership required of the systems engineers and program managers who will be assessed against this model.

The APPEL web site provides a competency model that covers both project engineering and systems engineering (Academy of Program/Project & Engineering Leadership (APPEL) 2009). There are three parts to the model: one that is unique to project engineering, one that is unique to systems engineering, and a third that is common to both disciplines. Table XX below shows the systems engineering-unique aspects of the model. The project management-unique items include project conceptualization, resource management, project implementation, project closeout, and program control & evaluation. The common competency areas are NASA internal and external environments, human capital and management, security, safety and mission assurance, professional and leadership development, and knowledge management. This 2010 model is adapted from earlier versions. Squires, Larson, and Sauser (2010, 246-260) demonstrate a method that can be used to analyze the degree to which an organization’s systems engineering capabilities meet government-industry defined systems engineering needs, using an earlier version of this model.
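Because each of these frameworks organizes its competencies under different top-level groupings (themes, Units of Competence, model parts), it can help to capture them in a common, machine-readable structure before comparing them. The following is a minimal sketch of one such structure in Python; only the INCOSE theme names mentioned above are taken from the text, while the framework class and the placeholder competency entries are illustrative assumptions, not the actual contents of Table XX.

<pre>
# Illustrative sketch: a common structure for recording competency frameworks.
# The theme names come from the INCOSE framework described above; the
# competency entries are placeholders, NOT the real Table XX contents.
from dataclasses import dataclass, field


@dataclass
class CompetencyFramework:
    name: str
    # top-level grouping ("theme", "Unit of Competence", etc.) -> competencies
    groupings: dict[str, list[str]] = field(default_factory=dict)

    def all_competencies(self) -> list[str]:
        """Flatten the framework into a single list, e.g. for side-by-side comparison."""
        return [c for comps in self.groupings.values() for c in comps]


incose_uk = CompetencyFramework(
    name="INCOSE UK competency framework",
    groupings={
        "systems thinking": ["(placeholder; actual competencies are listed in Table XX)"],
        "holistic life cycle view": ["(placeholder; actual competencies are listed in Table XX)"],
        "systems management": ["(placeholder; actual competencies are listed in Table XX)"],
    },
)
</pre>

A similar instance could be populated for the SPRDE Units of Competence or the APPEL model parts, which makes comparing the frameworks, or mapping an organization's own model onto them, more straightforward.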

Relationship of SE Competencies to Other Competencies

Systems engineering is just one of many engineering disciplines. A competent systems engineer must possess knowledge, skills, abilities, and attitudes (KSAAs) that are unique to systems engineering, as well as many other KSAAs that are shared with other engineering and non-engineering disciplines. One approach to a complete engineering competency model framework has multiple dimensions, where each dimension has unique KSAAs that are independent of the other dimensions (Wells 2008). The number of dimensions depends on the engineering organization and the range of work performed within it. The concept of creating independent axes for the competencies was presented by Jansma and Derro (2007), using technical knowledge (domain/discipline specific), personal behaviors, and process as the three axes. An approach that uses process as a dimension is presented by Widmann et al. (2000), where the competencies are mapped to process and process maturity models. For a large engineering organization that creates complex systems solutions, there are typically four dimensions:

* Discipline (e.g. electrical, mechanical, chemical, systems, optical, etc.),
* Life Cycle (e.g. requirements, design, testing, etc.),
* Domain (e.g. aerospace, ships, health, transportation, etc.), and
* Mission (e.g. air defense, naval warfare, rail transportation, border control, environmental protection, etc.).

These four dimensions build on the concepts defined by Jansma and Derro (2007) and Widmann et al. (2000) by separating discipline from domain and by adding the mission and life cycle dimensions. Within many organizations, the mission is consistent across the organization, and this dimension would be unnecessary. A three-dimensional example is shown in Figure XX, where the organization works in only one mission area, so that dimension has been eliminated from the framework. The discipline, domain, and life cycle dimensions are included in this example, and some of the first-level areas in each of these dimensions are shown. At this level, an organization or an individual can indicate which areas are included in their existing or desired competencies. The sub-cubes are filled in by indicating the level of proficiency that exists or is required. In this example, blank indicates that the area is not applicable, and colors (shades of gray) indicate the levels of expertise. The example shows a radar electrical designer who is an expert at hardware verification, is skilled at writing radar electrical requirements, and has some knowledge of electrical hardware concepts and detailed design. The radar electrical designer would also assess his or her proficiency in the other areas, the foundation layer, and the leadership layer to provide a complete assessment.
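A profile like the radar electrical designer's can be represented as a sparse mapping from (discipline, domain, life cycle) cells to proficiency levels. The sketch below assumes a simple four-level proficiency scale and illustrative cell names; neither the scale nor the names are defined in the source, and the gap_analysis helper is added purely to show how a required profile might be compared to an actual one.

<pre>
# A minimal sketch of the three-dimensional competency "cube", assuming an
# illustrative four-level proficiency scale; all names are hypothetical.
from enum import IntEnum


class Proficiency(IntEnum):
    AWARENESS = 1
    KNOWLEDGEABLE = 2
    SKILLED = 3
    EXPERT = 4


# Each cell of the cube is keyed by (discipline, domain, life cycle phase).
# Cells that are not applicable are simply omitted (the "blank" sub-cubes).
CompetencyProfile = dict[tuple[str, str, str], Proficiency]

radar_electrical_designer: CompetencyProfile = {
    ("electrical", "radar", "verification"): Proficiency.EXPERT,
    ("electrical", "radar", "requirements"): Proficiency.SKILLED,
    ("electrical", "radar", "concept"): Proficiency.KNOWLEDGEABLE,
    ("electrical", "radar", "detailed design"): Proficiency.KNOWLEDGEABLE,
}


def gap_analysis(actual: CompetencyProfile, required: CompetencyProfile):
    """Return the cells where the required proficiency is not yet met."""
    return {
        cell: (actual.get(cell), level)
        for cell, level in required.items()
        if actual.get(cell, 0) < level
    }
</pre>

The same structure extends naturally to the fourth, mission dimension by adding one more element to the key tuple.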

Levels of Expertise

Competency levels can be based on “levels of cognition.” One example uses the revised model of Bloom’s Taxonomy (Bloom 1984), presented below in rank order, from least complex to most complex; a brief sketch of how this ordering can be used follows the list.

* Remember – Recall or recognize terms, definitions, facts, ideas, materials, patterns, sequences, methods, principles, etc.
* Understand – Read and understand descriptions, communications, reports, tables, diagrams, directions, regulations, etc.
* Apply – Know when and how to use ideas, procedures, methods, formulas, principles, theories, etc.
* Analyze – Break down information into its constituent parts and recognize their relationship to one another and how they are organized; identify sublevel factors or salient data from a complex scenario.
* Evaluate – Make judgments about the value of proposed ideas, solutions, etc., by comparing the proposal to specific criteria or standards.
* Create – Put parts or elements together in such a way as to reveal a pattern or structure not clearly there before; identify which data or information from a complex set is appropriate to examine further or from which supported conclusions can be drawn.
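Because these levels form a strict rank order, they can be treated as an ordered scale when checking whether a demonstrated level of cognition meets a target level. The following sketch assumes the common reading of the taxonomy in which a higher level subsumes the lower ones; the function name and usage are illustrative.

<pre>
# Illustrative only: the revised Bloom's Taxonomy levels as an ordered enum,
# with the assumption that a higher level subsumes the lower ones.
from enum import IntEnum


class Cognition(IntEnum):
    REMEMBER = 1
    UNDERSTAND = 2
    APPLY = 3
    ANALYZE = 4
    EVALUATE = 5
    CREATE = 6


def meets_target(demonstrated: Cognition, target: Cognition) -> bool:
    """True when the demonstrated level is at or above the target level."""
    return demonstrated >= target


assert meets_target(Cognition.EVALUATE, Cognition.APPLY)
assert not meets_target(Cognition.UNDERSTAND, Cognition.ANALYZE)
</pre>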

Competency levels can also be situationally based. The levels for the SPRDE competency model are based on the complexity of the situation in which the person can appropriately apply the competency (DAU 2010a):

1. No exposure to or awareness of this competency.
2. Awareness: Applies the competency in the simplest situations.
3. Basic: Applies the competency in somewhat complex situations.
4. Intermediate: Applies the competency in complex situations.
5. Advanced: Applies the competency in considerably complex situations.
6. Expert: Applies the competency in exceptionally complex situations.

Other examples of proficiency levels include the INCOSE competency model proficiency levels of Awareness, Supervised Practitioner, Practitioner, and Expert (INCOSE 2010b), and the APPEL competency model levels of participate, apply, manage, and guide, respectively (Menrad and Lawson 29 September-3 October, 2008). NASA, as part of APPEL, has also defined the proficiency levels of: I) Technical Engineer/Project Team Member, II) Subsystem Lead/Manager, III) Project Manager/Project Systems Engineer, and IV) Program Manager/Program Systems Engineer.

When using application as a measure of competency, it is important to have a measure of goodness. Just because someone is applying a competency in an exceptionally complex situation does not mean they are applying it well. Likewise, just because a person is managing and guiding does not mean they are doing so well. An individual might also be fully competent in an area but never be given an opportunity to use that competency. Competency and application should therefore be decoupled unless each application carries a measure of goodness and it can be assumed that resources are utilized to their full potential.
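One way to make this decoupling concrete is to record the assessed competency level separately from each application and its measured outcome, and to credit only applications whose outcome was actually good. The sketch below is a hypothetical illustration: the 1-6 complexity scale echoes the SPRDE-style levels above, while the goodness score, its 0.7 threshold, and all names are assumptions rather than anything defined in the source.

<pre>
# Hypothetical sketch: keep assessed competency separate from how well the
# competency has actually been applied. All names and thresholds are
# illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class ApplicationRecord:
    situation_complexity: int        # e.g. the 1-6 SPRDE-style scale above
    goodness: float | None = None    # measured outcome quality in [0, 1], if assessed


@dataclass
class CompetencyAssessment:
    competency: str
    assessed_level: int              # capability assessed independently of use
    applications: list[ApplicationRecord] = field(default_factory=list)

    def demonstrated_level(self, threshold: float = 0.7) -> int | None:
        """Credit only applications that also have an acceptable outcome."""
        credited = [
            a.situation_complexity
            for a in self.applications
            if a.goodness is not None and a.goodness >= threshold
        ]
        return max(credited, default=None)
</pre>

With this separation, an assessment with no credited applications reports no demonstrated level at all rather than silently inheriting the assessed level.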
