Attendees:

  • RAMPS: Stephen Deems (v), Dave Hart

  • MATCH: Shelley Knuth (v), Alana Romanella

  • CONECT: Tim Boerner (v), Leslie Froeschl

  • MMS: Tom Furlani (v), Joe White

  • OpenCI: John Towns (v), Shawn Strande, Lavanya Podila, Lisa Kaczmarczyk, Shannon Bradley

  • NSF: Tom Gulbransen, Sharon Geva

  • Guest Speaker: Paul Parsons, Claire Stirm, Dina Meek

(v) indicates a voting member

Attendance

Meeting dates: 9/5, 9/12, 9/19, 9/26, 10/3, 10/10, 10/24, 11/7, 11/21, 12/5, 12/12, 12/19

  • Stephen Deems: X, X, Cancelled, X, Cancelled, X, X, No Meeting, X

  • David Hart: X, X, Absent, Absent, X, X

  • Shelley Knuth: Absent, X, Absent, X, X, Absent

  • Alana Romanella: X, Absent, X, Absent, X

  • Tim Boerner: Absent, X, Absent, X, X

  • Tom Furlani: X, X, Absent, X, X, Absent

  • Joe White: X, X, X, X, X, X

  • James Griffioen: X, X, X, X, Absent

  • John Towns: X, X, X, X, X

  • Shawn Strande: X, X, X, X, X, X

  • Leslie Froeschl: X, X, Absent, X, X, X

  • Lavanya Podila: X, X, X, X, X

  • Lisa Kaczmarczyk: X, X, X, Absent, X, X

  • Shannon Bradley: X, X, Absent, Absent, X, X

  • Tom Gulbransen: Absent, X, Absent, Absent, X, X

  • Sharon Geva: X, X, Absent, Absent, X


Decisions made during the meeting:

  • Evaluation Plan approved by EC Vote - 12/19/2023

Use Decision Macro

Agenda/Notes

  1. SGX3 ACCESS website usability engagement @Paul and @Claire (30 mins)

    1. This is an educational and informative session to understand the work of SGX3 Usability Consulting.

    2. Paul is an associate professor at Purdue. He leads usability consulting for SGX3 and has completed over 60 projects across different domains and sciences.

    3. Projects vary depending on what the user is trying to do; the team identifies best practices and patterns and uses established methods to understand usability.

    4. The team evaluates how to improve the usability of the product.

    5. Student team at Purdue with oversight from Paul

    6. Engagement timeline 2-3 months

    7. Students set up meetings and do milestone check-ins.

    8. Generate reports with findings, a summary, and next steps. The reports then go into usability, using different methods to show where things can be confusing or misleading; the team identifies issues and then comes up with recommendations and explanations.

    9. Questions for the team: The process looks extensive and thorough; have you been able to measure the impact of the work you've done?

      1. Paul: Impact is a bit hard to assess; Claire may be able to help. One thing the team has done is go back and talk to people to understand what has been beneficial. He doesn't have any quantitative metrics.

      2. Claire: Each gateway has different ways to measure impact. The work has helped with cleaner navigation, and traction is higher; teams do not have to provide additional guidance on navigating the sites.

      3. They have also seen that the vocabulary has improved.

      4. For folks who don't have developers, it can be difficult to measure the impact.

    10. When you do assessments, have you had a request where you are trying to serve different types of users/personas?

      1. Paul: Very frequently. When we start an engagement, we ask about the core users and key stakeholders, and then try to optimize for those groups.

    11. ACCESS is a set of individual awards and individual websites. Is there a model where we can drill down on the personas?

      1. Paul: Not as common, but we do see that. It's a single website, but there's a lot of complexity and infrastructure hierarchy, and the challenge is building that into a cohesive vision.

      2. Claire: What's the launch point or landing page for each of the core users depending on their workflows? We want to know from you about the core users. Then we can start mapping from there.

    12. Can you please drill down on the process after you have identified the stakeholders?

      1. Paul: At a high level, the process is to hold a kick-off meeting to understand the direction and identify users/stakeholders. After that, the team refines its plan.

      2. Sometimes they reach out to these stakeholders, but it varies depending on the project.

      3. If we have the necessary information handy, then we might skip the interviewing part.

    13. Do you sit down with each of the users and do pre-testing? Maybe give them a task and see how they navigate the website?

      1. Paul: Yes, the team does pre-testing, but it can be time-consuming due to scheduling issues.

    14. Cost of these services is about $9K. (Correction from Shawn: a site engagement is no-cost; additional sites are about $9K. However, SGX3 would do an all-ACCESS site engagement for a sufficiently narrow scope.)

       

  2. Approval of summaries from prior EC meetings John Towns (5 mins)

    1. 2023-11-21 EC Meeting Summary (approved)

  3. Program Milestone Check John Towns (5 mins)

    1. Action item for the awardees to update their program milestones

      1. no additional comments or updates

    2. Reminder: Do not edit "Dashboard" tab, please edit "ACCESS program milestones" tab as needed

      1. Resource catalog enhancement covers many areas; the scope was to cut down resources and filter for RPs. Support has launched it, but there is more to go.

      2. Website: waiting on a go-live date; delayed by SC and other meetings; determining the best time for launch and getting stakeholder input.

      3. Allocations (meta selector) would be next quarter.

        1. Have all groups met to determine how this all fits together? Yes, they need to discuss.

      4. Program-wide goals: the evaluation team down-selected 3 as a starting point; there will be subsequent discussion, noted in the tracker.

        1. Dave sent out an email on it

        2. Discuss more at next meeting

    3. Need to fix the newsletter milestones.

  4. EC Review on Communication Plan (5 mins)

    1. https://docs.google.com/document/d/1BqLGEzE4ME4gj8Mp7NNGWbOAjkw7odShSX9XEFsJRLc/edit?pli=1#heading=h.e0ce5of25nir (updating link; previously https://docs.google.com/document/d/1cgA1RAHgPO9lF4NWz5OK3zTLyE0X52C5JNNablzkGoY/edit)

    2. EC is asked to review and make recommendations on the refreshed Communications Plan

    3. It's out there for review by the EC.

    4. This revised plan tries to leverage the working groups.

    5. Will have this item for next EC meeting for a vote.

  5. March Quarterly Meeting Lavanya Podila (15 mins)

    1. Potential location options

      1. Arizona, Chicago (Big Ten Center), Indianapolis, anywhere else?

      2. Location finalized: Chicago

    2. Dates

      1. 1st week of March - 4th, 5th, and 6th?

      2. 2nd week of March - 12th, 13th, and 14th?

      3. Dates finalized: February 27th, 28th, and 29th

    3. Audience: PIs/Co-PIs and leads, similar to the Boulder meeting

    4. There will be a full-day EAB meeting. This is the cadence that will be followed starting in February and will happen once a year in person.

    5. How many EAB members should attend? It might be overwhelming for all EAB members to attend the program meeting as well.

    6. Is there interest in holding a full or half-day DEI workshop as part of this meeting? If so, do teams have budget to contribute? Yes.

    7. DEI Working Group would like to schedule a workshop in person. We can have a half day or a full day, at least 3 hours preferably at the beginning. The intention is to have a consultant lead the session.

    8. Leslie, Ag and Stephen will take it back to the DEI working group and see how long the session might take.

       

  6. Evaluation Plan David Hart (15 mins)

    1.  Draft Evaluation Plan

    2. Assuming we need surveys, we need approval of the plan.

    3. Should the evaluation plan be broader than what was originally stated?

    4. Possible compromise: The plan includes an annual review of the evaluation working group. John believes that we arrived at the 3 surveys; this can be the plan for the coming year, but then reconsider whether it needs to be broader.

      Dave agrees that there are several things that need to be nailed down.

    5. Evaluation Plan approved by EC vote.

  7. NAIRR Pilot Update David Hart (5 mins)

    1. Deferred to next EC meeting

  8. Informational Items (10 mins)

    1. Allocations Stephen Deems

      1. Researcher Advisory Committee Agenda Topics (Requested from RAC)

        1. Next meeting will be: Monday, February 26, from 2:30 – 4:00 PM (Central time)

        2. Input is requested from EC on topics. See current list of topics here: ACCESS Researcher Advisory Committee

          1. Items recommended/suggested by the RAC are merged with the EAB tracker: https://docs.google.com/spreadsheets/d/1CtAMssCnj8RNfWk7M7K4RvZG-rBdNJsuFseFVQWuP5s/edit#gid=266099631

            1. Software

            2. Data transfer

      2. New Allocations Website

        1. Launching on Tuesday, January 9, 2024.

          1. Slow roll-out (will be announced via newsletters/announcements later in January)


          2. Submission Window is Now Open for March 2024 AARC Meeting

            1. Proposals accepted until January 31, 2024. Review meeting 2nd week of March. Awarded proposals start/renew April 1. More information.

    2. MATCH Shelley Knuth

    3. CONECT Tim Boerner

      1. Representatives from all teams met last week to share efforts to date/perspectives on a resource selector tool/filter. See notes from that meeting here. The group settled on the following next steps based on similarities in requirements:

        1. Reps from Allocations and Operations will meet to converge on a single solution for resource listings/filtering.

        2. Reps from Support, Operations, and Metrics will meet to talk about ways the two selector tool options might potentially be merged, or at least ensure continuity in results if they remain two separate tools (https://access-ara.ccs.uky.edu:8080/ and content available in the Operations chatbot).

        3. Metrics will be consulted throughout to determine where information they can provide can be made available via these various options.

    4. MMS Tom Furlani

    5. OpenCI John Towns

      1. Great experience at NeurIPS last week. CB&E and Comms should consider presence in the future.

      2. Reminder that the RP forum bylaws v1.0 were circulated for EC input. Eva could not attend today, but we will have her for discussion at our next meeting.

Misc Topics:

  • EC agrees to cancel meeting on 2nd January


Parking Lot

  • RP forum bylaws

  • NAIRR pilot update

  • Risk Management progress = change in likelihood due to mitigation efforts?

  • Horizontal Leadership progress = compromises?

Next EC Meeting: 16th January 2024


Action Tracker:

Use Action Item Macro to track assigned To Do Items - check the box when it is complete

Format:


Reference:

Risk Register: https://access-ci.atlassian.net/jira/core/projects/RR/board

Project Change Request: https://access-ci.atlassian.net/jira/core/projects/PCR/board