
Figure 1
Diagram of the pathway from raw material to informed decisions, adapted with permission from Figure 1.1 of Mosely et al. (2009).
Table 1
Examples of non-functional requirements (NFRs) imposed on federally funded digital scientific data and our mapping to information quality dimensions defined by Lee et al. (2002) and Ramapriyan et al. (2017).
| NFRs | Description | Dimension based on Lee et al. (2002) | Dimension based on Ramapriyan et al. (2017) |
|---|---|---|---|
| Accessibility | The quality or fact of being accessible | Accessibility | Stewardship; Service |
| Accuracy | The quality or fact of being correct | Intrinsic | Science |
| Availability | The quality or fact of being available | Accessibility | Product |
| Completeness | The quality or fact of being complete | Contextual | Product |
| Findability | The quality or fact of being findable | N/A | Stewardship; Service |
| Integrity | The quality or fact of being intact | Intrinsic | Product; Stewardship; Service |
| Interoperability | The quality or fact of being interoperable | Representational | Product; Stewardship; Service |
| Objectivity | The quality or fact of being objective | Intrinsic | Science |
| Preservability | The quality or fact of being preservable | N/A | Stewardship |
| Reproducibility | The quality or fact of being reproducible | N/A | Product; Stewardship |
| Representativeness | The quality or fact of being representative | Representational | Product; Stewardship |
| Security | The quality or fact of being secure | Accessibility | Stewardship; Service |
| Sustainability | The quality or fact of being sustainable | N/A | Product; Stewardship; Service |
| Timeliness | The quality or fact of being done at a useful time | Contextual | Product; Service |
| Traceability | The quality or fact of being traceable | N/A | Product; Stewardship; Service |
| Transparency | The quality or fact of being transparent | N/A | Product; Stewardship |
| Usability | The quality or fact of being easy to understand and use; being usable | Representational | Product; Stewardship; Service |
| Utility | The quality or fact of being useful | Intrinsic | Product |
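
For teams that want to act on this mapping programmatically, for example to look up which dimensions cover a given NFR, the table can be encoded as a simple lookup structure. The Python sketch below is illustrative only; the names `NFR_DIMENSIONS` and `nfrs_covered_by` are ours and not part of either cited framework.

```python
# Illustrative encoding of a few rows of Table 1. The structure and all
# names here are ours, not from Lee et al. (2002) or Ramapriyan et al. (2017).
# Each NFR maps to (Lee et al. dimension or None, Ramapriyan et al. dimensions).
NFR_DIMENSIONS = {
    "Accessibility": ("Accessibility",    ("Stewardship", "Service")),
    "Accuracy":      ("Intrinsic",        ("Science",)),
    "Availability":  ("Accessibility",    ("Product",)),
    "Findability":   (None,               ("Stewardship", "Service")),
    "Usability":     ("Representational", ("Product", "Stewardship", "Service")),
    # ... the remaining NFRs from Table 1 follow the same pattern.
}

def nfrs_covered_by(dimension: str) -> list[str]:
    """Return the NFRs mapped to a given Ramapriyan et al. (2017) dimension."""
    return [nfr for nfr, (_, dims) in NFR_DIMENSIONS.items() if dimension in dims]

print(nfrs_covered_by("Service"))  # ['Accessibility', 'Findability', 'Usability']
```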

Figure 2
Conceptual diagram of the proposed data-centric enterprise scientific data stewardship framework (ESDSF). The staggered pyramid on the left represents the interconnection between federal regulations, mandatory controls, recommendations, and instructions. The MM-tags beneath the pyramid represent quality assessments through the entire data product life cycle. The text on the right represents each step of the PDCA cycle and a summary of high-level outcomes.
Table 2
A summary of the PDCA cycle as defined by Nayab and Richter (2013) and as adapted by the ESDSF.
| The PDCA Cycle based on Nayab and Richter (2013) | The PDCA Cycle adapted by ESDSF |
|---|---|
| Plan/Define (planning the required changes) | Integrated non-functional requirements from federal directives, agency policies, organizational strategy, and user requirements (referred to as the requirements) are defined and documented. (They may be referred to as “mission parameters”.) |
| | Functional areas, controls, and standards necessary for compliance with the requirements are defined and documented. (These are the required changes.) |
| | They are communicated within the organization across different entities. |
| Do/Create (making the changes) | The guidelines, processes, procedures, and best practices to enable compliance with the requirements are created, documented, and implemented. |
| | They are communicated within the organization across different entities to ensure consistency and efficiency. |
| Check/Assess (checking whether the implemented changes have the desired effect) | The results of implementing processes and procedures are checked using consistent assessment models that are based on community best practices, yielding quantifiable evaluation results. |
| | The results are captured and presented in ways suitable to both human and machine end-users. |
| | Areas for improvement are identified, with a roadmap forward based on where they are and where they need to be. |
| Act/Improve (adjusting or institutionalizing the changes) | Steps are taken, based on the roadmap forward, to improve current processes, procedures, and practices, circling back to the Do/Create stage if necessary. |
| | If the requirements need to be updated, the cycle circles back to the Plan/Define stage. |
| | The processes, procedures, and practices of implementation are standardized within the organization once a desired maturity for the requirements is achieved. Monitoring is in place to trigger a new PDCA improvement cycle if a new requirement or a new area of improvement is identified. |
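
To make the feedback paths of the adapted cycle concrete, the sketch below models the four stages and the transitions described in the Act/Improve rows (back to Do/Create for rework, back to Plan/Define when the requirements change). It is a minimal Python illustration; the stage names come from Table 2, while the function and flag names are hypothetical.

```python
from enum import Enum

class Stage(Enum):
    PLAN = "Plan/Define"
    DO = "Do/Create"
    CHECK = "Check/Assess"
    ACT = "Act/Improve"

def next_stage(stage: Stage, *, requirements_changed: bool = False,
               rework_needed: bool = False) -> Stage:
    """Advance one step through the adapted PDCA cycle of Table 2.

    Act/Improve circles back to Do/Create when implementations need rework,
    back to Plan/Define when the requirements themselves must be updated,
    and otherwise starts a new improvement cycle at Plan/Define (as when
    monitoring flags a new requirement or area of improvement).
    """
    if stage is Stage.PLAN:
        return Stage.DO
    if stage is Stage.DO:
        return Stage.CHECK
    if stage is Stage.CHECK:
        return Stage.ACT
    if requirements_changed:
        return Stage.PLAN
    return Stage.DO if rework_needed else Stage.PLAN

# One full pass with no flags set returns to Plan/Define for a new cycle.
stage = Stage.PLAN
for _ in range(4):
    stage = next_stage(stage)
assert stage is Stage.PLAN
```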

Figure 3
Diagram of an integrated team of stewards from multiple fields serving as a centralized knowledge and communication hub for effective long-term scientific data stewardship. SMEs denote domain subject matter experts. The concept of this diagram is based on Peng et al. (2016a).
Table 3
Roles, the minimum knowledge required, and the minimum capability provided (with input from Chisholm 2014).
| Role | Minimum Knowledge Required | Minimum Responsibility or Capability Provided |
|---|---|---|
| Point-Of-Contact (POC) | Basic, very limited knowledge in a particular subject | Serving as a focal point of information concerning an activity or program; limited knowledge input |
| Specialist | Highly skilled with extensive knowledge in a particular subject | POC + good subject knowledge input |
| Subject Matter Expert (SME) | Extensive knowledge and expertise in a specific domain | POC + extensive subject or domain knowledge input |
| Steward | Extensive knowledge and expertise in a specific domain and general knowledge in other relevant domains, e.g., science/business and technology | SME + effective trans-disciplinary communication + mindset of caring for and improving others’ assets + promoting good stewardship practices |
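
Since each role in the capability column builds on the one before it (Specialist and SME extend POC, and Steward extends SME), the accumulation can be shown as nested sets. The sketch below paraphrases the table's capability labels; the set-based structure is ours and purely illustrative.

```python
# Cumulative role capabilities, paraphrased from Table 3.
ROLE_CAPABILITIES = {}
ROLE_CAPABILITIES["POC"] = {
    "focal point of information for an activity or program",
    "limited knowledge input",
}
ROLE_CAPABILITIES["Specialist"] = ROLE_CAPABILITIES["POC"] | {
    "good subject knowledge input",
}
ROLE_CAPABILITIES["SME"] = ROLE_CAPABILITIES["POC"] | {
    "extensive subject or domain knowledge input",
}
ROLE_CAPABILITIES["Steward"] = ROLE_CAPABILITIES["SME"] | {
    "effective trans-disciplinary communication",
    "caring for and improving others' assets",
    "promoting good stewardship practices",
}

# A Steward provides everything an SME does, plus stewardship-specific items.
assert ROLE_CAPABILITIES["SME"] <= ROLE_CAPABILITIES["Steward"]
```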
Table 4
Maturity Assessment Categories and Descriptions.
| Category Number | Description |
|---|---|
| Category 1 | No assessment done. |
| Category 2 | Self-assessment—preliminary evaluation carried out by an individual for internal or personal use; abiding by a non-disclosure agreement. |
| Category 3 | Internal assessment—complete evaluation carried out by an individual non-certified entity (person, group, or institution) and reviewed internally with the assessment results (ratings and justifications) publicly available for transparency. |
| Category 4 | Independent assessment—Category 3 + reviewed by an independent entity that has expertise in the maturity model utilized for the evaluation. |
| Category 5 | Certified assessment—Category 4 + reviewed and certified by an established authoritative entity. Maturity update frequency is defined and implemented. |
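
Because each category in Table 4 adds conditions on top of the one before it, the categories form a strict ordering, which is convenient when screening assessments against a minimum acceptable category. A minimal sketch of that ordinal semantics follows; all names in it are ours, not part of the framework.

```python
from enum import IntEnum

class AssessmentCategory(IntEnum):
    """Maturity assessment categories of Table 4; higher values add rigor."""
    NO_ASSESSMENT = 1  # no assessment done
    SELF = 2           # preliminary, by an individual, for internal/personal use
    INTERNAL = 3       # complete, internally reviewed, results publicly available
    INDEPENDENT = 4    # Category 3 + review by an independent expert entity
    CERTIFIED = 5      # Category 4 + certified by an authoritative entity

def meets(category: AssessmentCategory, required: AssessmentCategory) -> bool:
    """True if an assessment at `category` satisfies a minimum `required` category."""
    return category >= required

# Example: an internal assessment does not satisfy a requirement for an
# independent assessment, but a certified one does.
assert not meets(AssessmentCategory.INTERNAL, AssessmentCategory.INDEPENDENT)
assert meets(AssessmentCategory.CERTIFIED, AssessmentCategory.INDEPENDENT)
```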
