4.0 25 October 2022: Evaluation

Agenda

Measurement and Evaluation discussion: Lizanne DeStefano, ACO Evaluation Lead

  • Some thoughts and background to seed the discussion:

    • Evaluation background: how best to measure progress against goals?

    • Scope of evaluation:

      • Do we want to measure only against what we have control over (or at least fund effort in), or do we want a more holistic assessment of progress in community building and engagement for the ACCESS program as a whole?

      • What communities are we considering, and what does it mean to build and engage each community?

      • How can we measure progress in improving diversity, equity, and inclusion?

    • State of plans (Background)

      • Overarching Community Building and Engagement plans, including plans for improving Diversity, Equity and Inclusion: just getting started.

      • Allocations:

        • Initial plan in development

        • Community building through community outreach (events, conferences, hackathons…)

        • DEI plan is a subset of continuous improvement for Allocations: initially survey the committee to gather demographics and the committee’s perspective on DEI

          • Deploy training for the allocations committee

        • Repeat the survey at each allocation meeting to track progress

      • Support

        • Community outreach and promotion (SC, PEARC, Tapia, etc.)

        • DEI: no plan within Support yet; want to collaborate with all service tracks and the ACO to develop a cross-track DEI plan

      • Operations

        • Internship:

          • Summer/immersive

          • Longer term/remote

          • Collaborate with allocations to disseminate

        • Building community:

          • Concierge integration experts to help bring new RPs online

          • Scalability of integrating new RPs via integration roadmaps

      • Measurement

        • No real DEI or community engagement plans yet

        • Interested in helping to build and implement a broader plan

    • Community Connectors (Background)

      • NSF CyberTraining CIP projects: effort funded to integrate with Support’s Computational Science Support Network (CSSN)

        • CSSN: a community knowledge base where members post documentation and answer questions

      • NSF CC* - regional computing projects

        • Integrate with metrics to help NSF measure the impact of its investments

        • Potential integration point with Operations to join the community of resource providers

    • Communities and Incentives (Background)

      • Overarching: communities can broadly be thought of as either consumers or providers of services and resources

      • Allocations

        • Providers:

          • Resource providers contributing allocable resources and proposal review capacity for lower-tier awards

          • Community of reviewers for largest-tier allocations

          • Potential Incentives for Providers: coordinated access to a broad community of researchers who could benefit from your resource

        • Consumers:

          • Research community needing access to resources to support their research

          • Potential Incentives for Consumers: access to the resources needed to conduct research and education

      • Support

        • Providers:

          • CI providers contributing to the CSSN. Participation is prescribed by solicitation for CyberTraining CIP awardees; others need to be incentivized to participate and could include CSSI awardees, novel resource providers (those not already integrating with ACCESS as a term of their cooperative agreement), campuses, …

          • Potential Incentives for Providers: a community grant pilot supporting conference participation in return for substantial contributions to the CSSN

        • Consumers:

          • Research community needing access to resources to support their research

          • Potential Incentives for Consumers: access to the support needed to conduct research and education, both self-serve access to the CSSN and documentation and higher levels of Match support (Match Plus and Match Premier)

      • Operations

        • Providers

          • Not sure we’ve discussed this yet

        • Consumers

          • Resource providers, including novel resource providers and CC* regional computing awardees

          • Interns: students wanting to learn how to be CI professionals

          • Potential Incentives for Consumers: 

            • Coordinated access to a broad community of researchers who could benefit from your resource

      • Measurement and Metrics

        • Providers

          • Resource providers

          • Potential Incentives for Providers:

            • Visibility into the impact of your resource on the research community

        • Consumers

          • Funding agencies

          • Resource providers

          • Potential Incentive for Consumers: access to uniform information about resource usage along many dimensions

Notes

  • What would help us manage our program?

  • Evaluation instruments and IRB: how best to proceed?

  • Coordination with evaluation leads for each track

  • What do we need to track to measure diversification and expansion of our community?

  • Evaluation framework

    • Awareness

    • Satisfaction

    • Impact

  • Formative data

    • How are things going?

    • Was this a good idea?

    • Is this working?

  • Baseline data: one possibility is the community survey data (at the bottom of this document)

  • How far should we track each effort?

  • What would we like to know that is out of our grasp (as best we can tell, given budgetary constraints)?