Overview of the Tracking and Evaluation Core
The mission of the Tracking and Evaluation Core is to assist DaCCoTA leadership in making decisions that improve efficiency by streamlining program implementation. We accomplish this through process evaluation, which measures outputs to determine whether the center is operating as planned, and impact evaluation, which measures outcomes to determine what impact the DaCCoTA is having beyond the center. To accomplish both, we adopt a theory-driven evaluation (TDE) approach:
- The first TDE approach will use system evaluation theory (SET). SET was developed by Dr. Renger, who will oversee the DaCCoTA evaluation core, and is the first, and to date the only, theory developed to guide the evaluation of systems.
- The second TDE approach will use Kirkpatrick’s Evaluation Model and Gagné’s learning taxonomy to ensure the design of DaCCoTA education and training activities is sufficient to achieve the intended outcomes and to evaluate whether those activities serve as a catalyst for change. Dr. Van Eck, an expert in instructional design and evaluation, will oversee these efforts.
- The third TDE approach uses the Matrix Mentoring Model (MMM) (Byington et al., 2016) to determine whether mentoring can be used as an effective change agent.
The intent of these TDE strategies is to ensure that DaCCoTA leadership receives timely information on ways to streamline DaCCoTA processes and that the prerequisites for achieving system effectiveness are being met.
Evaluating DaCCoTA efficiency using TDE approaches differs markedly from the strategies used to evaluate similar centers of excellence. We intend to show that when evaluation approaches are correctly matched to the evaluation challenge, they yield more meaningful data and more useful recommendations to support decision-making.
Finally, because one theory of change underlying the DaCCoTA’s mission relies on building a translational research culture, our training and evaluation methods employ innovative measures to capture the DaCCoTA’s success at creating that culture and developing skilled researchers.
Process Evaluation Outputs
- Develop network of reliance agreements
- Develop online portal system
- Maintain a comprehensive website/database of all services provided by the DaCCoTA
- 6-month and 12-month meetings to formulate written outline of milestones/benchmarks
- Quarterly meetings of core directors
- Develop clinical coordination and biobank facility
- Establish two central core facilities to support clinical research
- Number of community engagement programs
- Monthly interaction with individual mentors and internal/external advisory committees
- Create central database management system
- Develop Community Advisory Board (CAB)
- Develop the first Inventory of Previously Conducted Research by the end of year one
- Develop the Database of Tribal Research Projects and Outcomes by the end of year one
- Present an update on DaCCoTA activities annually beginning in year two
Evaluation Matrix
| Goal | Hypotheses/Research Questions | Measurement & Instruments | Data Analysis |
| --- | --- | --- | --- |
| 1. DaCCoTA stakeholders will participate in mandatory training and will request training needed to master necessary intellectual skills. | RQ1.1: Do stakeholders attend all mandatory training?<br>RQ1.2: To what extent do DaCCoTA stakeholders request training when needed? | RQ1.1: Counts of attendance and completion.<br>RQ1.2: Counts of requests for additional training; focus groups/interviews; surveys; self-report. | RQ1.1: Descriptive statistics.<br>RQ1.2: Descriptive statistics and thematic analysis of focus group transcripts. |
| 2. DaCCoTA stakeholders will choose to meaningfully participate in and initiate activities related to attitudinal change strategies. | RQ2.1: To what extent do DaCCoTA stakeholders engage with activities designed to foster attitudinal change?<br>RQ2.2: To what extent do DaCCoTA stakeholders initiate activities designed to foster desired attitudes? | RQ2.1: Surveys from formal training interventions; interviews/focus groups; counts of activities.<br>RQ2.2: Analysis of DaCCoTA operational reports of related activity (e.g., number of meetings between client and mentors; counts of meeting initiation by initiator); client/mentor self-report; observation. | RQ2.1: Descriptive statistics, frequencies, thematic analysis of transcripts.<br>RQ2.2: Descriptive statistics, frequencies. |
| 3. DaCCoTA stakeholders will demonstrate the intellectual skills necessary to support DaCCoTA goals at Kirkpatrick Levels 2 and 3 (K2/K3). | RQ3.1: Do stakeholder test and survey scores indicate mastery level?<br>RQ3.2: Do stakeholder behaviors reflect application of skill mastery outside the context of training environments? | RQ3.1: Test scores from formal training interventions.<br>RQ3.2: Observation; focus groups/interviews; counts of related activities; self-report. | RQ3.1: Descriptive statistics; paired t-tests.<br>RQ3.2: Descriptive statistics and thematic analysis of transcripts. |
| 4. DaCCoTA stakeholders will demonstrate attitudinal outcomes consistent with the theory of change at K2/K3. | RQ4.1: Do stakeholder surveys indicate respect/value for DaCCoTA processes and stakeholders?<br>RQ4.2: Are stakeholder behaviors regarding DaCCoTA processes and stakeholders consistent with desired attitudes? | RQ4.1: Attitudinal surveys from formal training interventions; interviews/focus groups.<br>RQ4.2: Analysis of DaCCoTA operational reports of related activity (number of meetings between client and DaCCoTA and among DaCCoTA units; counts of meeting initiation by initiator); self-report; observation. | RQ4.1: Descriptive statistics, frequencies, paired t-tests.<br>RQ4.2: Thematic analysis of transcripts, descriptive statistics, frequencies. |
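The analysis plan for Goals 3 and 4 specifies paired t-tests on scores from formal training interventions, i.e., comparing each stakeholder's score before and after training. The sketch below illustrates that calculation; the `pre` and `post` score lists are purely hypothetical placeholders, not actual DaCCoTA data, and a production analysis would likely use a statistics package rather than this hand-rolled computation.

```python
from statistics import mean, stdev

# Hypothetical pre/post training scores for the same eight stakeholders
# (illustrative values only; real scores would come from DaCCoTA
# training assessments).
pre = [62, 70, 58, 75, 66, 71, 60, 68]
post = [74, 78, 65, 82, 70, 80, 69, 77]

# Paired t-test: test whether the mean within-person change differs from 0.
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
d_bar = mean(diffs)                  # mean improvement
sd = stdev(diffs)                    # sample SD of the differences
t_stat = d_bar / (sd / n ** 0.5)     # paired t statistic, df = n - 1

print(f"mean difference = {d_bar:.2f}, t({n - 1}) = {t_stat:.2f}")
```

The resulting t statistic would be compared against the t distribution with n − 1 degrees of freedom to obtain a p-value; pairing each stakeholder with their own baseline removes between-person variability from the comparison.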