Assessment at The New School

At The New School, we're building a culture of assessment that is both systematic and consistent. This approach gives all faculty, staff, and programs the tools to measure student learning effectively. By aligning our assessment practices with our mission and strategic priorities, we keep our methods consistent across the institution and generate meaningful data that informs our decisions. This collaborative and transparent process helps us understand what and how well students are learning, allowing us to continuously improve our curriculum, teaching methods, and support services to foster creative growth and advance equity.

The New School's Commitment to Assessment

Building a sustainable and meaningful culture of assessment at The New School is one of our foremost goals. Our assessment principles, drafted in consultation with institutional groups, serve as our foundation. They are designed to evolve with your feedback, guiding us as we continue to shape our shared approach to assessment.

Growth as Educators

Assessment creates systematic evidence about what and how well students are learning, enabling faculty, staff, and institutions to enhance curriculum, teaching methods, and support services to maximize student development.

Supporting Creative Exploration

By understanding how students develop their voice and technical skills, we can better nurture experimentation, risk-taking, and authentic expression in our classrooms, studios, and support spaces.

Inspiring Innovation

Assessment sparks meaningful dialogue about pedagogy, practice, and the services that we provide students outside of the classroom.

Building Community Trust

Assessment helps us share our story with students, families, and the broader community - demonstrating how we foster creative growth.

Advancing Equity

Through thoughtful assessment, we can ensure our learning spaces welcome and support diverse voices, perspectives, and modes of expression.

Accountability and Transparency

Assessment demonstrates to stakeholders (students, parents, accreditors, policymakers) that we are effectively fulfilling our educational mission and responsibly using resources. Where program purposes lack specificity or agreement, assessment as a process pushes a campus toward clarity about where to aim and what standards to apply; assessment also prompts attention to where and how program goals will be taught and learned.

Informed Decision Making

Regular assessment generates data that guides strategic planning, resource allocation, and program and curricular development, ensuring decisions are based on evidence rather than assumptions or tradition.


Core Assessment Principles

Mission-Driven Assessment

  • All assessment activities align with institutional mission and goals
  • Assessment plans reflect institutional values and priorities
  • Outcomes connect to strategic planning initiatives
  • Assessment is participatory and equitable

Systematic and Sustainable

  • Regular, planned assessment cycles
  • Sustainable resource allocation
  • Integrated into normal operations
  • Documented procedures and timelines

Evidence-Based Decision Making

  • Utilize both quantitative and qualitative evidence, rooted in promising practices identified in published research
  • Clear documentation of decisions and the deliberative processes used to inform them
  • Course-to-program-to-institution analysis
  • Demonstrated use of assessment to inform changes including resource allocation

Principles for Teaching Assessment

Student Learning Focus

  • Direct, measurable evidence of student achievement
  • Allow different ways for students to demonstrate knowledge
  • Multiple assessment points throughout program
  • Clear connection between outcomes and curriculum
  • Regular review of learning outcomes

Faculty Engagement

  • Faculty co-ownership of assessment process
  • Collaborative curriculum development
  • Professional development opportunities

Principles for Student Support Assessment

Holistic Student Development

  • Assessment of academic and non-academic outcomes
  • Focus on student growth and development
  • Integration of support services
  • Consideration of diverse student needs

Quality

  • Regular evaluation of student support offerings
  • Student satisfaction measures
  • Usage patterns/response time analysis

Access and Equity

  • Assessment of service availability
  • Analysis of usage by student demographics
  • Identification of access barriers
  • Equity impact evaluation

Resource Utilization

  • Cost-effectiveness analysis
  • Staff utilization metrics
  • Facility usage patterns
  • Technology implementation effectiveness

Educational Effectiveness Assessment at The New School

For an interactive digest of Program Learning Outcome Assessment activities conducted over AY2425, explore this summary.

Assessment of Student Learning

Assessment as Inquiry

The assessment of student learning is a process of inquiry through which faculty examine what students are learning in their program, how they are learning it, and how their learning can be improved or better supported. When viewed in this way, assessment is similar to research - it involves articulating a research question, designing methodology to answer that question, and collecting and analyzing information.

Below are some examples of questions that faculty can examine through student learning assessment.


Assessing Outcomes

  1. Have students near the end of their program mastered a particular learning outcome?
  2. After students have completed lower-level courses in the major, are they prepared to succeed in advanced courses in the program?
  3. Did adding a prerequisite to a course improve student learning in that course?
  4. At which point in the curriculum do students in a program master a particular learning outcome?
  5. Did students in a particular course or set of courses learn as much in a remote setting as they did in face-to-face courses?
  6. What impact does a particular course requirement have on student learning later on in the program?
  7. Do students who take a particular course learn more if they take the course earlier or later on in their program?
  8. Are there any disparities in student learning?

Example Methods for Measuring Learning:

  • Use a set of criteria aligned with the program learning outcomes to evaluate a sample of student work in a particular course or at particular points in the program (e.g. in the first year, in key courses, in capstone courses).
  • Implement short assignments or self-assessments in the first week of a key course in the program to measure prior learning.
  • Survey students or conduct focus groups with them about their perceptions of their learning.

Assessing Inputs

  1. Are there sufficient learning opportunities in the program to support student mastery of a particular learning outcome?
  2. Are the course materials aligned with the program student learning outcomes?
  3. Do the projects and assignments that students work on in their courses allow them to demonstrate what they have learned?

Methodology Ideas:

  • Examine the course descriptions and syllabi of required courses in the program and identify the courses in which material related to a program learning outcome is first introduced, is developed further, and is mastered.

Guiding Principles of Assessment

Assessment of student learning is the iterative process through which faculty members and administrators reflect on the degree to which students are achieving established learning outcomes. These outcomes are established for every program as part of the accreditation process, and every course within a program should be designed to lead students through a scaffolded progression of learning toward those outcomes. Upon completion of the degree or certificate program, students will have produced work that demonstrates achievement or mastery of the program's learning outcomes.

In order to align student expectations with what is being taught and to promote a positive student learning experience, it is important that we evaluate the work students produce against the learning outcomes to ensure that coursework is building proficiency and skill. Our challenge is to engage in assessment in ways that teach us more about what we want to know as educators - that is, to strategically use assessment as a pedagogical tool.

We are committed to excellence in teaching and learning, based on academic best practices, and to continual exploration of our classrooms. Assessment enables faculty to carry out this process, leading towards student learning and growth. To this end, we believe that:

  • Program and course assessments are faculty-directed reflections on the work in which faculty already engage.
  • Assessment is not intended and will not be used to evaluate individual students or faculty members.
  • Program assessment is intended to illustrate the entirety of the student learning experience, including in program curricula and in co-curricular activities.
  • Assessment should include a balance of approaches, including some activities designed according to the specific needs of individual programs and circumstances and some that are consistent across the university, creating opportunities for cross-program comparisons.
  • Assessment should be performed as a routine aspect of teaching and learning.
  • Students should be considered as partners in the assessment process, acknowledging their lived experiences as part of culturally relevant teaching and learning practices.

Academic Programs

Degree-granting programs at The New School evaluate students' achievement of program learning outcomes in accordance with published Middle States Commission on Higher Education (MSCHE) and New York State Education Department (NYSED) standards. The Curriculum, Learning, and Academic Affairs (CLAA) team in the Provost’s Office collaborates with faculty and administrators to manage the assessment process, utilizing its findings to inform strategic planning and other academic initiatives.

Assessment is undertaken in an iterative cycle, and program faculty are encouraged to consider several approaches in their review, including but not limited to the examination of student work (e.g. papers, projects, portfolios) or the surveying of students and/or alumni on their perceived achievement of program learning outcomes. We recommend program leadership work closely with faculty and college deans to create a multi-year assessment plan to ensure that program assessment is informed by information from a variety of sources (e.g. direct assessment of student work, student surveys, faculty surveys). Whenever possible, students should be included in the development and execution of assessment initiatives as a matter of equity and transparency.

Through the assessment process, faculty members should consider the following:

  • What do we want our students to learn?
  • What methods or approaches will best determine whether students are learning what we expect them to?
    • Where would we look for evidence that students are learning what we said they should be?
  • How do we measure success? What criteria will we use to rank, level, or evaluate across samples of students' work?
  • What do the results of assessment tell us about how or where we can improve our courses, programs, and the institution to better aid our students in reaching their learning goals?

Why Does Assessing a Program Matter?

  • It gives faculty and administrators an ongoing opportunity to reflect on program priorities.
  • It provides information to the faculty about program strengths and weaknesses.
  • It provides information to the students about the program, including their responsibilities for meeting program expectations as well as what they can expect to learn in the program.
  • It allows the university to demonstrate its effectiveness in meeting its educational goals; and
  • It is a critical part of the re-accreditation process.

Shared Capacities

Shared Capacities underlie The New School’s distinctive approach to general education. Shared Capacities are the competencies or skills that undergraduates develop over the course of their education. The university community has identified 11 Shared Capacities, which range from basic skills, such as writing and oral communication and quantitative and scientific literacy, to competencies that are particularly emblematic of The New School. Each of these capacities has its own learning outcomes. For more information, visit TNS's Shared Capacities Initiative website.

Assessment Cycle

The Shared Capacities Assessment Cycle outlines which Shared Capacities and associated learning outcomes will be assessed each academic year. For more information, please contact the Assistant Provost of Curriculum and Learning at willsond@newschool.edu.

Equitable and Inclusive Assessment

At The New School, we believe that the assessment of student learning should be equitable, inclusive, and respectful of the entirety of a student's experiences. To facilitate this level of careful evaluation, the Curriculum and Learning team in the Provost's Office is proud to provide the best practices below, with links to relevant resources at the bottom of this card.

Best Practices for Equitable and Inclusive Assessment

  • Incorporate the student voice into program assessment and take action based on students' input. This can be done by asking students to reflect on their learning through a reflective assignment, portfolio, survey, focus group, or some combination of these approaches. Ensure that many student voices are included in the assessment.
  • Incorporate varied assignments that allow students to demonstrate what they have learned in multiple ways.
  • Provide students with feedback and opportunities to incorporate that feedback into their work.
  • Provide clear criteria to students about how their work will be evaluated.
  • Disaggregate assessment data to examine inequities in student learning and progression through the program (see the sketch after this list).
  • Examine whether diverse perspectives are represented in course materials.
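
As a hedged illustration of the disaggregation practice above, the sketch below assumes a simple set of rubric records - a student group label and whether the work met the outcome, both hypothetical fields - and reports the share of students meeting expectations in each group. It is a minimal sketch under those assumptions, not a prescribed tool; spreadsheets or institutional dashboards can serve the same purpose.

```python
from collections import defaultdict

# Hypothetical rubric results: (demographic_group, met_expectations).
# In practice these fields would come from your program's own records,
# with groups defined by the data you can collect responsibly.
records = [
    ("Group A", True), ("Group A", False), ("Group A", True),
    ("Group B", True), ("Group B", True), ("Group B", True),
]

totals = defaultdict(int)   # students evaluated per group
met = defaultdict(int)      # students meeting the outcome per group
for group, met_expectations in records:
    totals[group] += 1
    if met_expectations:
        met[group] += 1

# Report each group's rate so gaps between groups are visible at a glance.
for group in sorted(totals):
    share = met[group] / totals[group]
    print(f"{group}: {met[group]}/{totals[group]} met expectations ({share:.0%})")
```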

Further Reading
What are inclusive assessment practices? - Tufts Center for the Enhancement of Teaching and Learning
Socially Just Assessment Podcast Series - Campus Labs

Identifying and Collecting Evidence

All academic degree programs at The New School are expected to have an assessment plan in place that describes how and when the program will use evidence to determine if students are meeting faculty expectations for achievement of the program learning outcomes.

Programs are encouraged to incorporate multiple sources of evidence into a multi-year assessment plan (e.g. rubric scores on student capstone work, surveys of graduating students and faculty, meeting notes from faculty meetings), but at least once every four years, faculty should collate and directly evaluate culminating student work to determine the extent to which a learning outcome is being met by students who have completed most of the coursework in the program. All programs' learning outcomes will be assessed twice over an eight-year cycle.

We recommend identifying a culminating experience or course that aligns with all or most of the program learning outcomes, such as a capstone project at the undergraduate level, a qualifying exam or thesis at the Master's level, or a dissertation at the PhD level. If such a course or capstone work is unavailable, faculty might instead review the work produced by graduating majors in upper-level courses in the program.


Direct Evidence

  • Capstone experiences, papers, projects, or performances scored using a rubric or checklist
  • Qualifying papers or dissertations scored using a rubric or checklist
  • Scores on qualifying examinations or comprehensive examinations when scored using a rubric or test blueprint aligned with the program learning outcomes
  • Portfolios of student work
  • Employer or internship supervisor ratings of students' skills that are aligned with the program learning outcomes articulated by faculty

Indirect Evidence

  • Assignment grades if a rubric was not used to score the assignment
  • Graduating student ratings and reflections on the knowledge and/or skills gained from the program
  • Alumni outcomes such as satisfaction, employment rates, and enrollment in graduate programs
  • Course evaluation results from questions that ask about the course, not the instructor.

Why don't grades count as direct assessment?

One of the most common questions from faculty members is why assessment is necessary given that students receive grades. Doesn't a grade already demonstrate that a student has met the course learning goals?

Although a grade is a global indicator as to how one student has performed in a course, it does not directly indicate which learning goals the student has and has not met. A grade of "B" or a "C" may indicate that a student met most goals but not all. Even an "A" grade does not necessarily mean that a student achieved all the program's learning goals, since course assignments and tests may not be directly related to underlying learning goals. Also, grades are often partly based on behavior like course attendance and participation, which are not usually learning goals.

Similarly, overall grade point averages in a program do not indicate whether specific program goals are being achieved. If students in a program have a cumulative GPA of 3.5, what does that tell us about the overall strengths and weaknesses of learning in that program? Program learning is cumulative, but students' performances in individual classes do not necessarily indicate that they are achieving the broader program goals. To evaluate student learning at the program level, it is necessary to examine specific areas of learning separately.

Although grades cannot be used to demonstrate which specific program learning goals students have met, grades can on occasion be used to identify courses and topics that are particularly challenging for students.

References:
Walvoord, B. E. (2010). Assessment Clear and Simple: A Practical Guide for Institutions, Departments, and General Education, Second Edition. San Francisco: Jossey-Bass.
Maki, P. L. (2010). Assessing for Learning: Building a Sustainable Commitment Across the Institution. Sterling, VA: Stylus Publishing.
Suskie, L. (2018). Assessing Student Learning, Third Edition. Boston, MA: Anker Publishing Company.

Writing Student Learning Outcomes

Learning outcomes refer to the most important goals that the program or course has for its students. Programs and courses may have a large number of goals; learning outcomes reflect the most important of these goals.

A well-written learning outcome relates specifically to a program or course and to how the faculty envisions student learning. For example: "Students will write effectively" could apply to many courses or academic programs. In contrast, Kansas State University's English department states that students should be able to "research and write focused, convincing analytical essays in clear, grammatical prose" and "tailor writing for various audiences and purposes." This indicates what that department sees as essential writing skills. Other programs, even other English departments, may focus on different aspects of writing.

One advantage of well-written learning outcomes is that they help guide the choice of assessment methods. It is easy to imagine how the learning outcome stated above might be assessed: an English student could write a paper presenting original interpretations of literary works.

Guidelines for Developing Learning Outcomes

1. Frame learning goals around the desired outcome or end result of the learning, not around the process or means.

Remember that the ultimate goal of a program or course, and the one we’re focusing on, is to create students who are well-educated in their field, not just to offer classes and have people graduate (not that those things aren’t also important, of course!).

2. Be specific about what students who complete the program or course should be able to do.

Describe observable student behaviors, avoiding fuzzy terms when possible. This document, Choosing Verbs for Learning Outcomes,* may be useful for choosing language that accurately and concretely describes the learning outcome.

It is difficult to observe whether students "understand," "appreciate," or are "familiar with" a concept, but easier to determine whether they can "articulate," "explain," or "apply" a concept. Concrete verbs such as "define," "argue," and "create" are more helpful than vague verbs such as "know" or "understand" or passive verb phrases such as "is exposed to."

3. Find a balance between narrow and broad outcomes.*

Too broad: "Students will demonstrate information literacy skills."

Too narrow: "Students will be able to use the college's online services to retrieve information."

Better: "Students will be able to locate information and evaluate it critically for its validity and appropriateness.

*Based on Bloom's Taxonomy (Bloom 1956).

4. Create goals that are challenging, yet attainable.

Note that it is not necessary for every student to master every single outcome for a course or program to demonstrate success. In fact, such an outcome might indicate that the goals have been set too low.

Examples of Effectively Expressed Learning Outcomes

In these examples, the outcomes are broad enough to capture significant learning but are defined narrowly enough to be specific to the programs.

Degree in English

Present original interpretations of literary works in the context of existing research on these works

Degree in Environmental Science

Critically evaluate the effectiveness of agencies, organizations, and programs addressing environmental problems.

Degree in Theater

Use voice, movement, and understanding of dramatic character and situation to affect an audience.

Degree in Women's Studies

Use gender as an analytical category to critique cultural and social institutions

Helpful Links

Learning Taxonomies

Creating Student Learning Outcomes

Choosing Verbs for Learning Outcomes

Creating and Using Rubrics

Rubrics: A Direct Measure of Student Learning

One commonly used method for obtaining a direct measure of student learning is to select an assignment, project, performance, or other student work that aligns with some (or all) of the critical program or course learning outcomes and to grade it using a rubric.

A rubric is simply a list or chart that outlines the criteria or standards to be used to evaluate student work. Rubrics vary from simple checklists and numerical rating scales to the so-called "full rubric" used to describe a student's performance at each of several levels. Rubrics are useful for grading individual assignments, and many faculty may already be using them for this purpose. However, when rubrics for individual students are gathered and the information is collated, they can provide powerful evidence for student learning in a program.

The key to developing a good direct measure is alignment. You want to ensure that the learning outcomes, student work, and rubric have clear connections to each other.
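
As a hedged illustration of what collating individual rubrics can look like in practice, the sketch below assumes a full rubric with a few criteria scored on a 1-4 scale; the criteria names and the "meets expectations" threshold are invented for the example and should be replaced with whatever your own rubric defines.

```python
# Hypothetical rubric scores: one dict per student, mapping criterion -> level (1-4).
# Criteria names and the threshold below are illustrative only; substitute the
# criteria and achievement levels defined in your program's rubric.
scores = [
    {"analysis": 4, "evidence": 3, "clarity": 2},
    {"analysis": 3, "evidence": 2, "clarity": 3},
    {"analysis": 2, "evidence": 4, "clarity": 4},
]
MEETS_EXPECTATIONS = 3  # level treated as "meets expectations" in this sketch

# For each criterion, count how many students reached the defined level.
for criterion in scores[0]:
    meeting = sum(1 for s in scores if s[criterion] >= MEETS_EXPECTATIONS)
    print(f"{criterion}: {meeting}/{len(scores)} students at or above level {MEETS_EXPECTATIONS}")
```

Counts like these feed directly into the reporting questions later in this card (how many students were assessed, and how many met the expectations defined in your rubric).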


The AAC&U VALUE Rubrics

The Association of American Colleges and Universities (AAC&U) completed a major project as part of its Liberal Education and America's Promise (LEAP) initiative. The Valid Assessment of Learning in Undergraduate Education (VALUE) project brought together more than 100 faculty members, assessment specialists, and other academic professionals to develop consistent rubrics for evaluating the "essential learning outcomes" of undergraduate liberal education as defined in the LEAP initiative. These rubrics, which are easily adapted to suit the needs of particular programs or institutions, include critical thinking, creative thinking, written communication, oral communication, civic engagement, teamwork, and other topics. For more information, go to www.aacu.org/value/.

Assessment Procedural Guidance

Program Learning Outcome Assessment Process

In thinking about how to assess student learning in Program Learning Outcomes, please consider a process and approach that will produce results you are interested in knowing and that will be sustainable over time. While programs are not required to assess all of their outcomes at once in an assessment project, programs are also invited to conduct a larger assessment if it best meets the intent of what faculty want to learn about their students. The Provost's Office stands ready to assist. Below you will find:

  • Overview of recommended steps
  • Survey questions through which assessment activities and findings will be reported
  • Detailed instructions and recommended steps

Survey Questions to Report Program Learning Outcome Assessment

Programs will report their program learning outcomes assessment process and findings by submitting data through a Qualtrics form managed in the Provost's Office. When submitting an assessment, please complete one form for each Program Learning Outcome evaluated.

Program leadership (or other individuals) completing the assessment of a program learning outcome are asked to report their activities and findings by completing the survey/submission form for their program. The questions to which they will be expected to respond are included below.

As you plan which outcome to assess, which courses and/or student work you will use to complete the assessment, how you will revise or create evaluation rubrics, and who will undertake the assessment, please keep in mind the following questions so you will be prepared to report the assessment activities and findings in ways that are consistent across The New School's academic programs. We recommend that faculty create a document containing these questions and their responses while performing the assessment.

There is no maximum character limit for the open-ended questions.

Survey Questions:
  1. What types of student work were evaluated? Select all that apply:
    1. Artistic exhibition/performance
    2. Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
    3. Capstone work product (e.g., written project or non-thesis paper)
    4. Exam created by an external organization (e.g., professional association for licensure)
    5. Exit exam created by the program
    6. IRB approval of research
    7. Oral performance (oral defense, oral presentation, conference presentation)
    8. Portfolio of student work
    9. Publication or grant proposal
    10. Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
    11. Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
    12. Thesis or dissertation
    13. Other
  2. Please provide the instructions, prompt, or assignment guidance given to students for the work that was evaluated.
  3. How was the work scored, graded, rated, or analyzed and by whom?
  4. Please upload all rubrics or other rating instruments used.
  5. How many students were assessed?
  6. Of the students assessed, how many met the expectations for the Program Learning Outcome? This should be based upon the levels of achievement you defined in your rubric.
  7. Based upon these outcomes, please list actions planned (or already taken) to improve, enhance, or sustain student success in achieving this learning outcome in the future.

Detailed Instructions and Recommended Steps

The following guidance is provided to assist faculty, programs, and colleges in completing this learning outcome evaluation. Although this document seeks to answer common questions encountered during a learning outcome assessment, faculty may encounter unique situations or other factors that require additional discussion. Should you need additional guidance or support, please contact your College Assessment Representative.

  1. Identify which program learning outcome you will assess
    1. Using the learning outcomes provided in the Curriculum Maps, faculty should identify one learning outcome they would like to evaluate. This selection may be made based on several factors, including:
      1. Faculty curiosity about what will be learned
      2. Relative ease in assessing it this semester
      3. The frequency or presence of the outcome in the current program curriculum
      4. Alignment or relationship with other learning outcomes
      5. Desire to change, modify, or remove this outcome from the current list. Note: Assessment of program learning outcomes will be used as part of future efforts to change a program’s learning outcomes, pending the creation of a formal process
      6. Other assessment projects related to this outcome
      7. Other planned curricular changes that may be impacted by this learning outcome evaluation
  2. Identify which course(s) you will use to assess the learning outcome
    1. Programs may select any course to assess the outcome.
      1. Capstones are often used for assessment purposes, but any course may be used for this project
      2. Faculty may also choose to assess specific sections of a course rather than all sections; this does not present an issue, as long as the number of students assessed is meaningful
      3. Common courses that all students take, e.g., First Year Writing, Parsons First Year seminars, might also be considered
  3. Identify the student work that you will use to assess the learning outcome. We recommend that you:
    1. Consider how the type of work will evidence the learning outcome you are assessing
    2. Consider how you will be able to assess multiple “work products” efficiently
    3. Consider how you will be able to gather the student work, especially if from multiple courses/sections
  4. Design/identify the assessment rubric you will use
    1. Many courses and programs already use rubrics to evaluate student work - in some cases, these rubrics can be used to conduct assessments of the program learning outcomes
    2. If you want to create a new rubric or update an old rubric, more information on creating rubrics is available here
    3. If you would like assistance in creating, revising or evaluating a rubric, please feel free to contact Rita, Prisca, and Dale
  5. Identify which faculty will conduct the assessment
    1. Faculty are considered subject matter experts in the evaluation of student work and must be the ones performing this assessment
  6. Schedule a calibration session with the reviewers to ensure relatively consistent evaluation. Calibration sessions are intended to achieve a common application of the rubric across evaluators.
    1. Common steps in a calibration session include:
      1. Bringing together all those doing the evaluation, along with the assessment rubric and some student work samples
      2. Asking everyone to review the same student work in accordance with the rubric
      3. Discussing everyone's assessments - variabilities, agreements, and disagreements - in order to come to a common understanding of the levels of quality (e.g., did not meet, meets, exceeds, etc.) articulated in the rubric
      4. Making any necessary changes to the rubric based on group feedback.
  7. Review student work using the designed rubric
    1. Questions often arise during the evaluative process. As with the calibration session, these questions are best addressed collectively among the evaluators. The College Assessment Representative may also be of assistance
  8. Report your findings via the Qualtrics survey. The exact questions that will be asked are provided above so that the person submitting the results has everything needed at hand to submit the work
    1. Once a program has submitted their findings, they should notify their College Assessment Representative
    2. Technical issues regarding the form should be sent to College Assessment Representatives for resolution
  9. The purpose of program learning outcome assessment is to consider what improvements to curriculum, course content, and pedagogical approaches could be made to improve those outcomes for future students. Assessment should not be just an administrative project, but a process used to better understand ourselves and our learning environments. It is important to build regular, collective consideration of the results into the process of assessing student learning outcomes: what is learned from them, and what changes should be made. Faculty should always review findings and discuss how to modify courses, course sequencing, and/or teaching methods based on what was learned
  10. Questions that faculty may want to consider while performing this assessment include:
    1. What did you learn that was gratifying?
    2. Were there any surprising findings from the assessment, and if so, why were you surprised?
    3. What needs to be changed in the course curriculum or faculty pedagogies to positively impact future assessments?
    4. Was the rubric you used effective in gaining a better understanding of what students in the program know?
    5. How do you discuss assessment results within your academic community? Are there opportunities to include other members of your academic community (students, staff, alumni, administrators, etc.) in those conversations?
  11. Relax and celebrate!

Curriculum Mapping Process Guide

To effectively identify where learning outcomes are taught throughout a program's curriculum, each program at The New School keeps and maintains a curriculum map. Curriculum Maps are used as a tool to identify when, and to what level, student learning is taking place in a program's curriculum.

Individual curriculum maps are provided by the Provost's Office to both support this effort and serve as a foundation for further thinking and refinement of learning outcomes. Forms were also pre-filled with relevant information to minimize additional work.

Steps to Complete A Curriculum Map:

  1. Review the Program Learning Outcomes and Program Curriculum sections in your program's existing map
  2. For each course in the Program Curriculum section, determine the following:
    1. If a Program Learning Outcome (PLO) is taught within that course
    2. If yes, indicate the level to which that learning outcome is addressed by putting an 'I' (Introduced), 'D' (Developed), or 'A' (Advanced) within the corresponding box for that course (see the example excerpt after these steps).
    3. If no, please write ‘N/A’
  3. While no action is required, you may wish to review the Shared Capacities associated with individual courses by clicking the ‘Shared Capacities Map’ tab at the bottom of the document. Do not change or add information in this tab
  4. Once complete, please notify your College Assessment Representative.
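
For illustration only, a completed portion of a map might look like the hypothetical excerpt below; the course titles and outcome numbers are invented, and your map will list your program's actual curriculum and PLOs.

  Course                     PLO 1   PLO 2   PLO 3
  First-Year Studio            I       I      N/A
  Intermediate Seminar         D      N/A      I
  Capstone                     A       D       A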

Supplementary Information:

  1. Each Curriculum Map has a tab at the bottom of the document titled 'Program Learning Outcome Map'. This tab contains information pre-filled by the Provost's Office using existing Program Learning Outcomes (PLOs) and the program's current curriculum, generated from Coursedog.
    1. Curriculum maps are required for all degree or for-credit certificate programs.
    2. An additional tab may be included in the document, titled ‘Shared Capacities Map’. This tab provides information on what program courses are tagged with a Shared Capacity.
  2. Program Directors and faculty within the program (or department) should review and familiarize themselves with their current learning outcomes and the listed curriculum.
    1. If inaccuracies are identified in the curriculum, the College Assessment Representative should be contacted for additional discussion.
    2. Program Learning Outcomes are drawn from Provost’s Office records. If the listed outcomes do not reflect what you believe to be current, please inform your College Assessment Representative. Additional information will be provided on a revision process at a later date.
    3. If there are non-curricular degree requirements that are not listed on the map, please enter them at the bottom of the ‘Program Curriculum’ section. Any learning outcomes addressed in these requirements should also be annotated using the process detailed above.
  3. Following a review of the map, Program Directors should begin identifying when and where current PLOs are being taught in the curriculum, per the following guidelines:
    1. Collaboration with Faculty: Program Directors (or their representatives) should meet with faculty members to complete this form. The Curriculum, Learning, and Academic Affairs, and the Faculty Affairs team in the Provost’s Office can help facilitate these conversations upon request.
    2. Marking Levels of Proficiency: For each course, use the indicators, ‘I’, ‘D’, or ‘A’, to note the presence of a learning outcome within the curriculum, and to what level it is taught. If a PLO is not taught within a course, indicate with ‘N/A’ or leave the cell blank.
      1. Introduced (I): Often the first time a learning outcome is encountered within a program, this indicator represents when basic conceptual knowledge and information is formally conveyed to students.
      2. Developed (D): Courses that teach learning outcomes to this level refine information, techniques, and concepts provided at the ‘Introduced’ level, helping students feel a basic sense of subject mastery.
      3. Advanced (A): Courses at this level teach learning outcomes to the highest levels within a program, often requiring students to demonstrate a professional-level knowledge and skill, such as in a capstone experience.
    3. Program Directors are invited to add additional thoughts or observations in the ‘Notes’ column.
  4. While performing this mapping process, program directors and faculty should begin considering which of their learning outcomes they will assess by the end of the current academic year. General guidance for faculty at this time includes:
    1. All degree and for-credit certificate programs will be asked to assess one student learning outcome this semester.
    2. This assessment may be of any course and use any learning outcome.
    3. Faculty are asked to provide the course syllabus and the student work prompt in their program submission.
    4. Programs may desire to use capstone courses for this process as an opportunity to identify the level of knowledge students possess in a learning outcome at graduation. Programs may also be interested in evaluating first-year experience courses to determine the level of knowledge students possess at the beginning of coursework. Any course may be used for the learning outcome evaluation.
    5. Ultimately, the learning outcome evaluation is something that programs should consider as an opportunity to learn more about student knowledge and abilities at a certain point in time. This information can be used to leverage future curricular changes or other improvements.
  5. After revising an old curriculum map or completing a new one, Program Directors (or their representatives) should notify their College Assessment Representatives. College Assessment Representatives will then review the map and resolve any identified issues.
    1. After College Assessment Representatives verify map completion, they should annotate this in their internal records and on the institutional tracking document.

Examples of Assessment Success

Recognizing and Celebrating Your Work

In addition to filling out Curriculum Maps and completing a learning outcomes assessment, the Provost's Office appreciates opportunities to highlight, recognize, and celebrate diverse types of evaluations faculty perform at The New School. We’re interested in seeing the assessments that you already do to improve the student experience.

Faculty are encouraged to submit their own examples of assessments they’ve conducted that:

  • Show where they assessed student learning within their courses or programs;
  • Describe how they used what they learned to change or improve a course or other activity; or
  • Explain the ways in which the above were discussed with others to identify and develop best practices and other opportunities

Some examples of this may include:

  • Revised syllabi that responded to student feedback and/or student performance
  • Curriculum Reviews
  • The creation of new courses, or strategically retiring existing ones
  • Research performed to create a new minor, major, or concentration within an existing program
  • Approaching delivery of subject matter differently in response to the results of tests, projects, or other student work
  • Collaborations that led to a change in non-academic units (Advising, Financial Aid, Enrollment, etc.)

Frequently Asked Questions (FAQ’s) for Assessment Evidence Submissions

Are colleges required to submit assessment evidence to the Provost's Office?

    • Colleges are not required to do so, although we strongly encourage investing time in this activity to help us gain a better understanding of the great work faculty, staff, and others already perform in assessment.

What steps should be taken to assure student confidentiality?

    • Identifying information should be removed from uploads before submission. All collected information will be used internally, so it is not required that we notify students that their work is being used for assessment purposes.

Is there a specific date range for requested evidence?

    • The Curriculum and Learning team in the Provost's Office welcomes assessment evidence from any point in time.

What if we know an assessment project has occurred in a college or program, but we don’t have documentation of it?

    • College Assessment Representatives should contact the Curriculum and Learning team in the Provost's Office for assistance.

Will the Provost’s Office archive our assessment examples?

    • The Curriculum, Learning, and Academic Affairs team collects assessment evidence from across the institution for future use and consultation. Colleges and programs are invited to contact Dale directly with access requests.

List of Questions Asked on the PLO Assessment Qualtrics Form

The questions below are asked of those completing the submission form. We recommend that the answers to all of the questions below be readily available prior to clicking the submission link.

  1. Please select the program that you are reporting on. [Drop Down]
  2. If you have examples of assessment of student learning outcomes at the course level, please upload them here. [File Upload]
  3. If you have examples of related student work, please upload it here. [Shown if File in #2 is uploaded]
  4. If you have documentation demonstrating the use of assessment results to improve curriculum, please upload it here. [File Upload]
  5. If you have documentation demonstrating collective reflection on assessment results, please upload it here. [File upload]

Detailed Instructions and Recommendations:

The following guidance is provided to assist faculty, programs, and colleges in completing this exercise. Although this document seeks to answer common questions encountered during the identification of historical assessment work, faculty may encounter unique situations or other factors that require additional discussion. Should you experience such a situation, please contact your College Assessment Representative at your earliest convenience.

  1. College Assessment Representatives meet with faculty and other relevant parties within their academic units
    1. The majority of time spent in this activity will be in identifying different assessment and evaluation processes taking place within colleges and programs - we assess and evaluate in most aspects of what we do, but we don’t always talk about it with others. College Assessment Representatives should act as discussion facilitators within their units to help faculty and staff identify when and where assessments have taken place over the last three years.
    2. Rita Briedenbach, Prisca Wood, Caroline Dionne, and Dale Willson are all happy to sit in on conversations and assist in this process.
  2. Review identified documentation to ensure that it could be easily read by someone who is unfamiliar with the college and its programs
    1. Oftentimes assessments and evaluations are written in such a way that specialized knowledge is needed to fully understand them. While this in itself should not prevent an example from being submitted, please include, where possible, contextual information in each submission that would help readers better understand the activity.
