
A Conceptual Enterprise Framework for Managing Scientific Data Stewardship

Open Access | Jun 2018

Figures & Tables

Figure 1

Diagram of the pathway from raw material to informed decisions, adapted from Figure 1.1 of Mosley et al. (2009) with permission.

Table 1

Examples of NFRs on federally funded digital scientific data and our mapping to information quality dimensions defined by Lee et al. (2002) and Ramapriyan et al. (2017).

NFRs | Description | Dimension based on Lee et al. (2002) | Dimension based on Ramapriyan et al. (2017)
--- | --- | --- | ---
Accessibility | The quality or fact of being accessible | Accessibility | Stewardship; Service
Accuracy | The quality or fact of being correct | Intrinsic | Science
Availability | The quality or fact of being available | Accessibility | Product
Completeness | The quality or fact of being complete | Contextual | Product
Findability | The quality or fact of being findable | N/A | Stewardship; Service
Integrity | The quality or fact of being intact | Intrinsic | Product; Stewardship; Service
Interoperability | The quality or fact of being interoperable | Representational | Product; Stewardship; Service
Objectivity | The quality or fact of being objective | Intrinsic | Science
Preservability | The quality or fact of being preservable | N/A | Stewardship
Reproducibility | The quality or fact of being reproducible | N/A | Product; Stewardship
Representativeness | The quality or fact of being representative | Representational | Product; Stewardship
Security | The quality or fact of being secure | Accessibility | Stewardship; Service
Sustainability | The quality or fact of being sustainable | N/A | Product; Stewardship; Service
Timeliness | The quality or fact of being done at a useful time | Contextual | Product; Service
Traceability | The quality or fact of being traceable | N/A | Product; Stewardship; Service
Transparency | The quality or fact of being transparent | N/A | Product; Stewardship
Usability | The quality or fact of being easy to understand and use; being usable | Representational | Product; Stewardship; Service
Utility | The quality or fact of being utilized | Intrinsic | Product
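The mapping in Table 1 also lends itself to a machine-readable representation. The sketch below is a minimal illustration of that idea, not part of the paper: the dimension labels follow the table, while the names NFR_DIMENSIONS and nfrs_for_dimension and the field keys are hypothetical, and only a few rows are shown.

```python
# Hypothetical machine-readable encoding of part of Table 1 (illustration only).
# Each NFR maps to its dimension(s) under Lee et al. (2002) and Ramapriyan et al. (2017).
NFR_DIMENSIONS = {
    "Accessibility":  {"lee2002": ["Accessibility"], "ramapriyan2017": ["Stewardship", "Service"]},
    "Accuracy":       {"lee2002": ["Intrinsic"],     "ramapriyan2017": ["Science"]},
    "Findability":    {"lee2002": [],                "ramapriyan2017": ["Stewardship", "Service"]},
    "Preservability": {"lee2002": [],                "ramapriyan2017": ["Stewardship"]},
    # ... the remaining NFRs in Table 1 follow the same pattern ...
}

def nfrs_for_dimension(dimension, framework="ramapriyan2017"):
    """Return the NFRs that map onto a given quality dimension."""
    return [nfr for nfr, dims in NFR_DIMENSIONS.items() if dimension in dims[framework]]

print(nfrs_for_dimension("Stewardship"))
# ['Accessibility', 'Findability', 'Preservability']
```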
Figure 2

Conceptual diagram of proposed data-centric, enterprise scientific data stewardship framework. The staggered pyramid on the left represents interconnection between federal regulations, mandatory controls, recommendations, and instructions. The MM-tags beneath the pyramid represent quality assessments through the entire data product life cycle. The text on the right represents each step of the PDCA cycle and a summary of high-level outcomes.

Table 2

A summary of the PDCA cycle as defined by Nayab and Richter (2013) and adapted by the ESDSF.

Each stage of the PDCA cycle, as defined by Nayab and Richter (2013), is listed with its adaptation by the ESDSF:

Plan/Define (planning the required changes)
- Integrated non-functional requirements from federal directives, agency policies, organizational strategy, and user requirements (referred to as "the requirements") are defined and documented. (They may also be referred to as "mission parameters.")
- Functional areas, controls, and standards necessary for compliance with the requirements are defined and documented. (These are the required changes.)
- These are communicated within the organization across different entities.

Do/Create (making the changes)
- The guidelines, processes, procedures, and best practices that enable compliance with the requirements are created, documented, and implemented.
- They are communicated within the organization across different entities to ensure consistency and efficiency.

Check/Assess (checking whether the implemented changes have the desired effect)
- The results of implementing processes and procedures are checked using consistent assessment models that are based on community best practices, yielding quantifiable evaluation results.
- The results are captured and presented in ways suitable to both human and machine end users.
- Areas for improvement are identified, with a roadmap forward based on where they are and where they need to be.

Act/Improve (adjusting or institutionalizing the changes)
- Steps are taken based on the roadmap forward to improve current processes, procedures, and practices, circling back to the Do/Create stage if necessary.
- If the requirements need to be updated, circle back to the Plan/Define stage.
- The processes, procedures, and practices are standardized within the organization once the desired maturity for the requirements is achieved. Monitoring is in place to trigger a new PDCA improvement cycle if a new requirement or a new area for improvement is identified.
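As a rough illustration, not taken from the paper, the adapted cycle in Table 2 can be thought of as a loop that advances Plan/Define, Do/Create, Check/Assess, Act/Improve and circles back as described above. The Stage enum and next_stage function below are assumptions made purely for this sketch.

```python
from enum import Enum

class Stage(Enum):
    PLAN_DEFINE = "Plan/Define"
    DO_CREATE = "Do/Create"
    CHECK_ASSESS = "Check/Assess"
    ACT_IMPROVE = "Act/Improve"

_ORDER = [Stage.PLAN_DEFINE, Stage.DO_CREATE, Stage.CHECK_ASSESS, Stage.ACT_IMPROVE]

def next_stage(stage, requirements_changed=False, desired_maturity_reached=False):
    """Advance the adapted PDCA cycle of Table 2 (illustrative logic only)."""
    if stage is Stage.ACT_IMPROVE:
        if requirements_changed:
            return Stage.PLAN_DEFINE      # requirements updated: restart at Plan/Define
        if not desired_maturity_reached:
            return Stage.DO_CREATE        # improve processes: circle back to Do/Create
        return Stage.CHECK_ASSESS         # institutionalized: keep monitoring and assessing
    return _ORDER[_ORDER.index(stage) + 1]

# Example: one pass through the cycle.
stage = Stage.PLAN_DEFINE
for _ in range(3):
    stage = next_stage(stage)
print(stage)  # Stage.ACT_IMPROVE
```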
Figure 3

Diagram of an integrated team of stewards from multiple fields serving as a centralized knowledge and communication hub for effective long-term scientific data stewardship. SMEs denote domain subject matter experts. The concept of this diagram is based on Peng et al. (2016a).

Table 3

Roles, Knowledge, and Capability: Provided or Required (with input from Chisholm 2014).

Role | Minimum Knowledge Required | Minimum Responsibility or Capability Provided
--- | --- | ---
Point of Contact (POC) | Basic, very limited knowledge in a particular subject | Serving as a focal point of information concerning an activity or program; limited knowledge input
Specialist | Highly skilled with extensive knowledge in a particular subject | POC + good subject knowledge input
Subject Matter Expert (SME) | Extensive knowledge and expertise in a specific domain | POC + extensive subject or domain knowledge input
Steward | Extensive knowledge and expertise in a specific domain and general knowledge in other relevant domains, e.g., science/business and technology | SME + effective trans-disciplinary communication + a mindset of caring for and improving others' assets + prompting for good stewardship practices
Table 4

Maturity Assessment Categories and Descriptions.

Category Number | Description
--- | ---
Category 1 | No assessment done.
Category 2 | Self-assessment: a preliminary evaluation carried out by an individual for internal or personal use; abiding by a non-disclosure agreement.
Category 3 | Internal assessment: a complete evaluation carried out by an individual non-certified entity (person, group, or institution) and reviewed internally, with the assessment results (ratings and justifications) publicly available for transparency.
Category 4 | Independent assessment: Category 3 + reviewed by an independent entity that has expertise in the maturity model utilized for the evaluation.
Category 5 | Certified assessment: Category 4 + reviewed and certified by an established authoritative entity. Maturity update frequency is defined and implemented.
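A short sketch, under our own assumptions about how an assessment record might be summarized, of assigning the categories in Table 4 programmatically. The function name and boolean flags are hypothetical; only the category logic follows the table.

```python
def assessment_category(assessed=True, reviewed_internally=False, results_public=False,
                        independent_expert_review=False, certified_by_authority=False):
    """Assign a maturity assessment category per Table 4 (illustrative only)."""
    if not assessed:
        return 1  # Category 1: no assessment done
    if certified_by_authority:
        return 5  # Category 5: certified by an established authoritative entity
    if independent_expert_review:
        return 4  # Category 4: independent assessment (Category 3 + expert review)
    if reviewed_internally and results_public:
        return 3  # Category 3: internal assessment with results publicly available
    return 2      # Category 2: self-assessment for internal or personal use

print(assessment_category(reviewed_internally=True, results_public=True))  # 3
```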
Language: English
Submitted on: Sep 15, 2017 | Accepted on: Jun 11, 2018 | Published on: Jun 28, 2018
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2018 Ge Peng, Jeffrey L. Privette, Curt Tilmes, Sky Bristol, Tom Maycock, John J. Bates, Scott Hausman, Otis Brown, Edward J. Kearns, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.