Assessing Individuals
A critical aspect of Enabling Individuals is the ability to fairly assess individuals. This article describes how to assess the systems engineering (SE) competencies an organization needs, the actual SE competencies individuals possess, and the SE performance of individuals.
Assessing Competency Needs
An initial decision for an organization that wants its own customized competency model is make vs. buy. If there is an existing SE competency model that fits the organization's context and purpose, the organization might want to use that model directly. If existing models must be tailored, or a new SE competency model developed, the organization should first understand its context.
Determining Context
Prior to understanding what SE competencies are needed, it is important for an organization to examine the situation in which it is embedded, including environment, history, and strategy. As Figure 1 shows, MITRE has developed a framework characterizing different levels of systems complexity (MITRE 2007, 1-12). This framework may help an organization identify which competencies are needed. An organization working primarily in the traditional program domain may need to emphasize a different set of competencies than an organization working primarily in the messy frontier. If an organization seeks to improve existing capabilities in one area, extensive technical knowledge in that specific area might be very important. For example, if stakeholder involvement is characterized by multiple equities and distrust, rather than collaboration and concurrence, a higher level of competency in balancing stakeholder requirements might be needed. If the organization's desired outcome is to build a fundamentally new capability, technical knowledge in a broader set of areas might be useful.
In addition, an organization might consider both its current situation and its forward strategy. For example, if an organization has previously worked in a traditional systems engineering context (MITRE 2007) but has a strategy to transition into enterprise systems engineering (ESE) work in the future, that organization might want to develop a competency model both for what was important in the traditional SE context and for what will be required for ESE work. This would also hold true for an organization moving to a different contracting environment where competencies, such as the ability to properly tailor the SE approach to right size the SE effort and balance cost and risk, might be more important.
Determining Roles and Competencies
Once an organization has characterized its context, the next step is to understand which specific SE roles are needed and how those roles will be allocated to teams and individuals. In order to assess the performance of individuals, it is essential to explicitly state the roles and competencies required of each individual. The references from the section on SE Roles and Competencies provide guides to existing SE standards and SE competency models that can be leveraged.
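A minimal sketch of how the results of this step might be recorded is shown below in Python. The role names, competency names, and required proficiency levels are hypothetical examples chosen for illustration, not drawn from any particular standard or competency model.

```python
# Hypothetical mapping of SE roles to the competencies and minimum
# proficiency levels they require; names and levels are illustrative only.
REQUIRED_COMPETENCIES = {
    "requirements_engineer": {
        "stakeholder_requirements_definition": "practitioner",
        "risk_management": "supervised_practitioner",
    },
    "lead_systems_engineer": {
        "stakeholder_requirements_definition": "expert",
        "risk_management": "practitioner",
        "architecture_definition": "practitioner",
    },
}

def competencies_for_role(role: str) -> dict:
    """Return the competencies and required proficiency levels for a role."""
    return REQUIRED_COMPETENCIES.get(role, {})

if __name__ == "__main__":
    for competency, level in competencies_for_role("lead_systems_engineer").items():
        print(f"{competency}: requires at least '{level}' proficiency")
```

Once roles are stated explicitly in this way, an individual's assessed competencies can be compared directly against the requirements of the role he or she fills.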
Assessing Individual SE Competency
In order to enable improvement or fulfillment of the required SE competencies identified by the organization, it must be possible to assess the existing level of competency for individuals. This assessment informs the interventions needed to further develop individual SE competency. Listed below are possible methods that may be used to assess an individual's current competency level; an organization should choose an appropriate approach based on its context, as identified previously.
Proficiency Levels
One approach to competency assessment is the use of proficiency levels. Proficiency levels are frameworks to describe the level of skill or ability of an individual on a specific task. One popular proficiency framework is based on the “levels of cognition” in Bloom’s taxonomy (Bloom 1984), presented below in order from least complex to most complex.
- Remember – Recall or recognize terms, definitions, facts, ideas, materials, patterns, sequences, methods, principles, etc.
- Understand – Read and understand descriptions, communications, reports, tables, diagrams, directions, regulations, etc.
- Apply – Know when and how to use ideas, procedures, methods, formulas, principles, theories, etc.
- Analyze – Break down information into its constituent parts and recognize their relationship to one another and how they are organized; identify sublevel factors or salient data from a complex scenario.
- Evaluate – Make judgments about the value of proposed ideas, solutions, etc., by comparing the proposal to specific criteria or standards.
- Create – Put parts or elements together in such a way as to reveal a pattern or structure not clearly there before; identify which data or information from a complex set is appropriate to examine further or from which supported conclusions can be drawn.
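One way such a proficiency scale can be made operational is sketched below in Python: the six Bloom-style levels are encoded as an ordered enumeration so that an individual's assessed level on a competency can be compared against a target level. The assessed and target levels shown are hypothetical; an organization would substitute its own chosen framework and data.

```python
from enum import IntEnum

# Bloom-style levels of cognition, ordered from least to most complex.
class Proficiency(IntEnum):
    REMEMBER = 1
    UNDERSTAND = 2
    APPLY = 3
    ANALYZE = 4
    EVALUATE = 5
    CREATE = 6

def meets_target(assessed: Proficiency, target: Proficiency) -> bool:
    """True if the assessed proficiency is at or above the target level."""
    return assessed >= target

# Hypothetical assessment of one individual on one competency.
assessed_level = Proficiency.APPLY
target_level = Proficiency.ANALYZE
print(meets_target(assessed_level, target_level))  # False: development is needed
```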
Other examples of proficiency levels include the INCOSE competency model, with proficiency levels of awareness, supervised practitioner, practitioner, and expert (INCOSE 2010). The Academy of Program/Project & Engineering Leadership (APPEL) competency model includes the levels participate, apply, manage, and guide (Menrad and Lawson 2008). The U.S. National Aeronautics and Space Administration (NASA), as part of APPEL (APPEL 2009), has also defined proficiency levels: technical engineer/project team member, subsystem lead/manager, project manager/project systems engineer, and program manager/program systems engineer.
Situational Complexity
Competency levels can also be based on the situation. The levels for the U.S. Department of Defense (DoD) Systems Planning, Research, Development, and Engineering (SPRDE) competency model are based on the complexity of the situation to which the person can appropriately apply the competency (DAU 2010):
- No exposure to or awareness of this competency
- Awareness: Applies the competency in the simplest situations
- Basic: Applies the competency in somewhat complex situations
- Intermediate: Applies the competency in complex situations
- Advanced: Applies the competency in considerably complex situations
- Expert: Applies the competency in exceptionally complex situations.
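One straightforward use of a situational scale like this is a gap analysis: compare each individual's assessed level against the level the organization requires for the role, and flag competencies needing development. The sketch below (Python) uses the SPRDE-style 0-5 scale; the competency names and scores are hypothetical.

```python
# SPRDE-style situational scale: 0 = no exposure ... 5 = expert.
REQUIRED = {"technical_planning": 3, "risk_management": 4}   # levels the role needs
ASSESSED = {"technical_planning": 4, "risk_management": 2}   # levels the individual demonstrates

def competency_gaps(required: dict, assessed: dict) -> dict:
    """Return competencies where the assessed level falls short, and by how much."""
    return {
        name: level - assessed.get(name, 0)
        for name, level in required.items()
        if assessed.get(name, 0) < level
    }

print(competency_gaps(REQUIRED, ASSESSED))  # {'risk_management': 2}
```

Gaps identified in this way are one input to the interventions, discussed above, for developing individual SE competency.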
Quality of Competency Assessment
When using application as a measure of competency, it is important to have a measure of "goodness": an individual applying a competency in an exceptionally complex situation may not necessarily be applying it successfully. An individual may be "managing and guiding," but this is only helpful to the organization if it is being done well. In addition, an individual might be fully proficient in a particular competency but not be given an opportunity to use that competency; for this reason, it is important to understand the context in which these competencies are being assessed.
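One possible way to capture this "goodness" dimension, sketched below in Python, is to record an independent judgment of how well a competency was applied alongside the complexity of the situation, and to count only well-executed applications as evidence of proficiency. The rating scheme, threshold, and data are assumptions made for illustration, not part of any cited model.

```python
from dataclasses import dataclass

@dataclass
class CompetencyEvidence:
    competency: str
    situation_level: int   # e.g. SPRDE-style 0-5 complexity of the situation
    outcome_quality: int   # independent judgment of how well it was applied, 1-5

def demonstrated_level(evidence: list, quality_threshold: int = 4) -> int:
    """Highest situation level at which the competency was applied well."""
    good = [e.situation_level for e in evidence if e.outcome_quality >= quality_threshold]
    return max(good, default=0)

history = [
    CompetencyEvidence("risk_management", situation_level=5, outcome_quality=2),  # applied, but poorly
    CompetencyEvidence("risk_management", situation_level=3, outcome_quality=5),
]
print(demonstrated_level(history))  # 3: the poorly executed level-5 application is not counted
```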
Individual SE Competency versus Performance
Even if an individual possesses exemplary proficiency in an SE competency, the specific context in which the individual is embedded may preclude exemplary performance of that competency. For example, an individual highly proficient in risk management may be embedded in a team that does not use that talent, or in an organization with flawed procedural policies that do not fully utilize the individual's proficiency. Developing individual competencies is not enough to ensure exemplary SE performance; the final execution and performance of SE is a function of competency, capability, and capacity. The sections on Enabling Teams and Enabling Businesses and Enterprises address this context. If the SE roles are clearly defined, performance assessment can be an objective evaluation of the individual's performance. However, most often a team of individuals is tasked with accomplishing the SE tasks on a project, and it is the team's performance that might be assessed (see Team Capability).
References
Works Cited
Academy of Program/Project & Engineering Leadership (APPEL). 2009. "NASA's Systems Engineering Competencies". Washington, DC, USA: US National Aeronautics and Space Administration. Available at: http://www.nasa.gov/offices/oce/appel/pm-development/pm_se_competency_framework.html.
Bloom, B. S. 1984. Taxonomy of Educational Objectives. New York, NY, USA: Longman.
DAU. 2010. SPRDE-SE/PSE Competency Assessment: Employee's User's Guide, 5/24/2010 version. In Defense Acquisition University (DAU)/U.S. Department of Defense [database online]. Available at: https://acc.dau.mil/adl/en-US/406177/file/54339/SPRDE-SE-PSE%20Competency%20Assessment%20Supervisors%20Users%20Guide_DAU.pdf.
INCOSE. 2010. Systems Engineering Competencies Framework 2010-0205. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2010-003.
Menrad, R. and H. Lawson. 2008. "Development of a NASA Integrated Technical Workforce Career Development Model Entitled: Requisite Occupation Competencies and Knowledge – The ROCK." Paper presented at the 59th International Astronautical Congress (IAC). 29 September-3 October 2008. Glasgow, Scotland.
MITRE. 2007. Enterprise Architecting for Enterprise Systems Engineering. Warrendale, PA, USA: SEPO Collaborations, SAE International. June 2007.
Primary References
Academy of Program/Project & Engineering Leadership (APPEL). 2009. "NASA's Systems Engineering Competencies". Washington, DC, USA: US National Aeronautics and Space Administration. Available at: http://www.nasa.gov/offices/oce/appel/pm-development/pm_se_competency_framework.html.
DAU. 2010. SPRDE-SE/PSE Competency Assessment: Employee's User's Guide, 5/24/2010 version. In Defense Acquisition University (DAU)/U.S. Department of Defense [database online]. Available at: https://acc.dau.mil/adl/en-US/406177/file/54339/SPRDE-SE-PSE%20Competency%20Assessment%20Supervisors%20Users%20Guide_DAU.pdf.
INCOSE. 2010. Systems Engineering Competencies Framework 2010-0205. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2010-003.
Additional References
None.