Compliance Certification Report

3.3.1
The institution identifies expected outcomes for its educational programs (including student learning outcomes for educational programs) and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results. (Institutional Effectiveness)
√ Compliant
Partially Compliant
Non-Compliant
Narrative:
The institution is compliant with this principle because it engages in an ongoing process of identifying, measuring, and assessing learning outcomes for each of its degree-granting programs, as well as expected outcomes for its administrative components and educational support services. Further, the institution uses the results of these processes to identify improvement opportunities and take action on them.

Background
The University of Texas Medical Branch at Galveston is a state-supported institution of higher education specializing in graduate and first-professional educational programs in the health sciences. The institution is primarily focused on graduate education and has a limited number of undergraduate programs. The university does not accept “first-time, full-time” undergraduate students and thus does not offer a general education curriculum.

Structure of Assessment
The institution uses a variety of processes to identify and assess student learning outcomes in its degree-granting programs and expected outcomes in the student support and administrative units that support those programs. Assessment takes place at three levels: the institution as a whole, the individual academic program, and the individual student.

University-Level Assessment
In 2003, the University of Texas System began developing compacts with each UT institution. The compacts are written agreements between the Chancellor of the University of Texas System and the presidents of each of the System's academic and health institutions that summarize each institution's major goals and priorities, strategic directions, and specific tactics to achieve those goals (1) (2) (3). The compact process entails a progress report on existing goals and the development of new short- and long-term goals. The attached table (4) contains selected outcomes from the Institutional Compact planning process. This process is discussed in further detail in the narrative for Core Requirement 2.5.

In addition to the compact process, the institution conducts oversight and performance review through its annual budget process, which entails the development of goals and objectives and a review of prior-year accomplishments. Each office and program within the institution develops its own budget, which is in turn aggregated into the budgets of the university's major organizational components. This provides a forum for weighing competing priorities, reviewing performance against prior-year objectives, and evaluating the effectiveness of overall operations. Budgetary decisions are thus made with direct information about their programmatic impacts, and resource allocation takes place in a collaborative process informed by the institutional and program-level impact of each choice (5).

Finally, each administrative and student support unit develops a set of metrics appropriate to its particular performance and service objectives. These measures are operational in nature and reflect the ongoing character of assessment in the administrative units (6).

Program-Level Assessment
Each of our schools, with the exception of the Graduate School of Biomedical Sciences, is accredited by its own specialty accrediting entity. Our School of Medicine offers a single educational program and follows academic standards established by its accrediting body. The School of Nursing and the School of Allied Health Sciences offer a variety of educational programs, each of which is regularly reviewed by its respective professional accrediting body. The Graduate School of Biomedical Sciences assesses its programs through peer review: each program is reviewed at least every five years by reviewers from outside the institution whose professional competence is in the discipline under review. These established peers review the resources, institutional support, academic offerings, and faculty of each program; meet with the Dean, departmental chairs, and program coordinators; and issue a report containing a set of recommendations for improvement (7). Examples of recommendations being actively addressed include distributing dissertation advising more evenly across the faculty, making broader use of basic science faculty in the education of medical students, optimizing research space allocation procedures, and planning for and acquiring further computational resources.

In addition to these processes of external review and accreditation, the University reviews program-level learning outcomes through the Institutional Educational Effectiveness Committee (IEEC) and the Office of Institutional Effectiveness. The IEEC was formed in May 2004 (8) and has as its primary charge “To perform a systematic, broad based university-wide evaluation of the institution's educational programs, to evaluate the extent to which educational goals are being achieved, and to provide feedback to improve the educational programs.” To ensure broad-based participation in assessment activities, the IEEC comprises faculty from each of the schools as well as academic administrators and student representatives (9) (10) (11). The committee reports to the Council of Deans (12).

The Office of Institutional Effectiveness (OIE) was created in September 2006 as a successor to the Office of Institutional Analysis; a search for a new director was undertaken, and the position was filled in May 2007. The office is responsible for providing leadership, staffing, and analytical support to the institutional effectiveness process within the university. The OIE also performs assessments of the administrative and student support functions, coordinates university reporting to outside entities, and has primary responsibility for educational program assessments. In that capacity, the office provides direction and support to the IEEC. The OIE reports to the Chief Academic Officer.

Student-Level Assessment
The primary method of assessing university processes at the level of the individual student is the annual Student Satisfaction Survey. The survey samples the student body and evaluates the student experience on campus. The results are compiled and shared with institutional leadership for follow-up. Examples of resulting actions include a review of the Jamail Student Center cleaning frequency and scheduling, an effort to inform students of the availability of wireless access points on campus, and a review of the heating and cooling in the student center (13) (14) (15).

Process of Assessment
The IEEC, in conjunction with the then Office of Institutional Analysis, began developing a program review template in the fall of 2004 (16). In June 2005, a consultant with particular experience in institutional effectiveness was engaged and visited the campus. The visit produced a review of assessment activities on campus, the initiation of an integrated approach to academic program review, and a template for assessment that was reviewed by the IEEC and the academic leadership of each school. The template served as a model that the academic leadership of each school further elaborated. In parallel with this conceptual development, a web-based application was created to allow staff in each academic program to enter program-specific learning objectives, measurement criteria and methods, and performance against those criteria (17). The template and the web-based interface were refined, and data entry began. To evaluate the development of the program reviews, a subset of 14 programs was selected for initial review. Employing a rubric to ensure consistency, four members of the IEEC reviewed the program review documents for completeness, adequacy, and clarity, and collected further comments on the overall quality of each document. This information was then used to refine the program review documents as well as the data collection process (18).

Further work by this committee included the development of a timeline for assessment efforts and the completion of data entry for all academic programs (19). All programs now have completed program reviews in place, and review of programs is ongoing (20).

Data from program-level assessment are then reviewed and developed into actionable information for program improvement. Examples of improvements include course redesign; the addition of quantitative methods to strengthen the data analysis component of coursework in the Nursing doctoral program; the implementation of practice examinations and the comprehensive review and redesign of the curriculum in the School of Medicine; and revisions to the School of Medicine curriculum for year 3 (2003-04) and year 4 (2004-05), including the revision and review of clerkships, the addition of a scholarly project in year 4, and an increased emphasis on basic science education in clerkships (21).
Sources

1. UTMB Compact with UT System 2006: http://www.utsystem.edu/osm/compacts/2006/UTMB07-08Compact.pdf
2. UTMB Compact with UT System 2005: http://www.utsystem.edu/osm/compacts/2005/UTMB06-07Compact.pdf
3. UTMB Compact with UT System 2004: http://www.utsystem.edu/osm/compacts/2004/UTMB052704.pdf
4. Institutional Compact Outcomes Table: hard copy located in the Office of Institutional Effectiveness
5. UTMB FY 2008 Budget Instructions: hard copy located in the Office of Institutional Effectiveness
6. UTMB Administrative Unit Assessment: hard copy located in the Office of Institutional Effectiveness
7. UTMB Graduate School of Biomedical Sciences External Program Reviews: hard copy located in the Office of Institutional Effectiveness
8. UTMB AEC Commission of IEEC, May 2004: hard copy located in the Office of Institutional Effectiveness
9. IEEC Membership Roster, August 2005: hard copy located in the Office of Institutional Effectiveness
10. IEEC Membership Roster, March 2006: hard copy located in the Office of Institutional Effectiveness
11. IEEC Membership Roster, August 2007: hard copy located in the Office of Institutional Effectiveness
12. UTMB Academic Infrastructure Organization Chart: http://www.utmb.edu/facts/general_information/UTMBAcademicOrgChart_Detail.pdf
13. UTMB Student Satisfaction Scores 2002-2007: http://www.utmb.edu/studentlife/SGA/documents/StudentSatisfactionSurveyScores02-07.pdf
14. UTMB Student Survey Responses 2006: http://intranet.utmb.edu/studentlife/SGA/SurveyResponse.htm
15. UTMB Student Survey Responses 2005: http://intranet.utmb.edu/studentlife/SGA/documents/04-05satisfactionsurveyresponses_000.pdf
16. IEEC Minutes, September 2004: hard copy located in the Office of Institutional Effectiveness
17. IEEC Minutes, March 2005: hard copy located in the Office of Institutional Effectiveness
18. UTMB Program Assessment Rubric: hard copy located in the Office of Institutional Effectiveness
19. UTMB Schedule of Program Assessment: hard copy located in the Office of Institutional Effectiveness
20. UTMB Program Reviews: hard copies located in the Office of Institutional Effectiveness
21. UTMB School Strategic Plans Table: hard copy located in the Office of Institutional Effectiveness