Dr. Tia Brown McNair
Vice President of the Office of Diversity, Equity & Student Success
Association of American Colleges and Universities
Additional registration required to attend.
The 2018 NASPA Assessment & Persistence Conference is designed to promote student learning and success by strengthening assessment, improving educational quality, and developing intentional persistence programming.
Institutional leaders must create an environment that builds capacity and encourage an organizational culture that makes comprehensive assessment part of strategic planning. Similarly, institutional leaders have a unique role to play in providing an environment and education that encourage student persistence, especially for under-served, low-income, adult, part-time, and minority students.
The NASPA Assessment & Persistence Conference has been designed to address these important issues in assessment and persistence, as well as to provide a forum for experienced professionals to advance their skills by discussing assessment and persistence with practitioners and policy-makers.
Attend this conference to:
NASPA encourages institutional team attendance at this event. By having a multidisciplinary team, colleges and universities increase the probability of successfully implementing improved assessment and persistence programs when they get back to campus. The best assessment of student learning and outcomes requires collaboration from both academic and student affairs. The committee encourages campuses to send representatives from the following areas in order to allow for substantive conversations around how to create change on your campus:
Let others know you are coming!
#APC18
The conference planning committee is seeking proposals under both the assessment theme and the persistence theme. Presenters for both themes will be asked to identify the intended audience for their session: beginner, intermediate, or advanced. Additionally, given the strong practical connection between assessment and persistence, the conference planning committee will select programs that demonstrate the integration of assessment and persistence for student learning and success.
The NASPA Assessment & Persistence Conference addresses emerging issues in assessment for new and experienced professionals. Practitioners, scholars, and policy-makers discuss data driven decision making. Transparency, accountability, and improvement are interwoven into the fabric of the conference. The conference planning committee will select innovative programs that relate to the conference themes and address fundamentals, as well as complex concepts.
The NASPA Assessment & Persistence Conference has been designed to address current issues in student persistence in higher education and connect the assessment of student learning to persistence and completion. The conference planning committee will select program proposals from various institutional types from community colleges to small colleges as well as large universities that provide institution-wide, proven interventions that connect student learning to persistence. The committee also seeks to highlight programs that share persistence strategies for first generation, low-income, minority, adult, part-time, transfer, veteran and other often under-served students.
Effective assessment becomes easier to understand and manage when grounded in a solid foundation of knowledge. Sessions in this theme provide an understanding of foundational assessment concepts by exploring topics that align with fundamental assessment skills and knowledge. Sessions in this theme may include the following topics:
A strong foundation of knowledge grounded in methods and measurements is necessary for effective assessment. Sessions in this theme provide foundational or advanced concepts of methods (e.g. case studies, portfolios, interviews/focus groups, use of national datasets, rubrics, etc.) and data analysis. Sessions in this theme may include the following topics:
Explaining the use of a specific method that aided in data acquisition that impacted a campus decision;
Discussing data analysis techniques, to include how to share results with various audiences; and
Sharing best practices for creating successful data analysis collaborations that positively impact the student experience.
Institutions and their departments rely on relevant and timely data to inform their practices. Sessions in this theme provide foundational to advanced concepts of how data is collected, analyzed, and acted upon. Sessions in this theme may include the following topics:
Demonstrating how institutional data is communicated to various stakeholders and acted upon;
Identifying strategies for data collaboration and integration between academic units and student affairs;
Using data from multiple sources to describe learning outcomes achievement; and
Collecting and using data in the accreditation process.
Institutions provide a variety of programs and structures to support persistence and retention. Among these structures, intentional and advanced consideration for typical transition hurdles (e.g. financial aid, enrollment challenges, successful transition to campus, etc.) is critically important. Session topics in this learning theme may include:
Discussion of how university programs (e.g. orientation, academic advising, bridge programs) improve student retention;
Examination of the curricular and co-curricular early intervention programs that impact student persistence and completion; and
Examination into the importance of strategic and inclusive enrollment management plans that view completion as an institutional priority.
This learning theme highlights institutional efforts to support persistence and student retention initiatives for specific and diverse student populations including first generation, low-income, minority (e.g. racial, religious, sexual orientation, etc.), adult, part-time, transfer, veteran, disabled, and other underserved students. Sessions in this theme may include the following topics:
Higher education leaders must create an infrastructure that connects assessment with student persistence, degree completion, and job placement. Sessions in this theme provide examples of how programs, initiatives, and strategies connect assessment to institutional outcomes. Sessions in this theme may include the following topics:
Looking for tips on writing an effective NASPA proposal? See sample submissions and formatting tips in our Program Submission Guidelines.
Please contact NASPA if you have any further questions about submitting a program proposal for the 2018 NASPA Assessment and Persistence Conference.
Tiki Ayiku
Senior Director of Educational Programs
Phone: 202-719-1184
Email: tayiku@naspa.org
The schedule will continue to be updated with any remaining edits.
Additional registration required to attend.
These two-hour sessions allow for a deeper dive into assessment and persistence topics.
Participants are invited to take part in afternoon roundtable-style discussions on pressing topics in assessment and persistence.
These #SASpeaks-style sessions are designed to pack a lot of information into a 25-minute session.
Registration rates are based on your NASPA membership. If you are not a member, please visit the NASPA Membership page for more information.
Register Online
Tonya Murphy
Membership Services Coordinator
Email: events@naspa.org
Phone: 202-265-7500 x1183
Cancellation: The cancellation deadline to receive a refund is April 6, 2018, less a $50.00 administrative fee. No refunds will be given after the deadline for any reason. All requests for cancellation and refunds must be in writing and sent to refund@naspa.org. This program may be cancelled or postponed due to unforeseen circumstances. In this case, fees will be refunded; however, NASPA will not be responsible for additional costs, charges, or expenses, including cancellation/change charges assessed by airlines, hotels, and/or travel agencies. NASPA is not responsible for weather or travel related problems and will not reimburse registration fees for these issues.
Group Registration Discount: NASPA offers a discount for members registering in groups of two or more individuals from a single institution. To apply for this discount send a request in writing to events@naspa.org prior to applying payment to registration orders. Please include all registrants’ full names, the institution name, and the title of the event. The membership department will follow up for any additional information required and provide a personalized discount code each member of your group can use when processing payment.
Purchase Orders: NASPA does NOT accept purchase orders as a form of payment. For registrants requiring an invoice to have a check payment processed, please use the Bill Me payment method to complete your registration. The resulting invoice can be found and downloaded under the My NASPA section of the website (must be logged in), by selecting the View Invoices link from the dropdown menu. Alternatively, email a request to Membership to have a PDF of your invoice sent to you. Bill Me registrations are considered complete and will hold your place in an event; however, the balance due must be settled prior to attending.
Click here to view NASPA’s complete Payment Policies and Procedures.
Additional Questions? Please contact the NASPA office at 202-265-7500 or via e-mail to events@naspa.org.
If you would like to exhibit at or sponsor the 2018 NASPA Assessment and Persistence Conference please fill out the exhibitor application form and e-fax back all pages to 202-204-8443 or scan and e-mail to kjerde@naspa.org by May 1, 2018. Questions? Contact Kristie Jerde by phone at 218-280-7578 or via email at kjerde@naspa.org.
All conference activities will take place at the Hilton Baltimore.
NASPA has arranged special room rates for conference attendees at the Hilton Baltimore starting at $199/night (not including 15.5% state and local taxes). The cut-off date to receive the conference room rate is Thursday, May 24, 2018. Rooms in the conference block may sell out prior to the cut-off date, so please make your reservation as soon as possible.
Hotel | Room Rate/Night
Hilton Baltimore, 401 West Pratt Street, Baltimore, MD 21201, 443-573-8700 | $199 - Single/Double
Baltimore is serviced by Baltimore/Washington International Airport (BWI). The hotel is approximately 10 miles from the airport.
Shuttles
For more information regarding shuttle providers to and from BWI, please visit the Transportation section of the BWI website.
Taxis
The taxi stands are located just outside of the baggage claim area on the lower level. For more information regarding taxi cab service, please visit the Transportation page of the BWI website. Taxi fare from BWI to the Hilton Baltimore is approximately $25 one-way.
Rental Cars
For more information on available rental car companies and contact information, please visit the Transportation page of the BWI website.
Temperatures in Baltimore in June are around 83 degrees F during the day and 67 degrees F in the evening. As the conference gets closer, please visit the Weather Channel for more information.
Julie Neisler, Research Assistant, University of Houston
The presenter will cover the basics of statistical testing, with a focus on advancing your skills and using statistical analysis in your work in assessment and research. A sample data set will be provided and participants will practice running a t-test, an ANOVA, a chi-square test, and a regression. Participants should have a general understanding of the following statistical terms: mean, standard deviation, the normal curve, variance, and hypothesis testing. We will be using the free statistical syntax-based software R, which participants should download prior to the session.
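As a rough illustration of the four tests named above, the sketch below runs each one in Python with SciPy rather than R (the workshop itself uses R), on made-up sample data invented purely for this example:

```python
# Sketch of the four tests named above, using Python's SciPy
# (the workshop uses R; the data here is illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(70, 10, 50)  # e.g., assessment scores for one cohort
group_b = rng.normal(75, 10, 50)
group_c = rng.normal(72, 10, 50)

# Independent-samples t-test: do two group means differ?
t_stat, t_p = stats.ttest_ind(group_a, group_b)

# One-way ANOVA: do three or more group means differ?
f_stat, f_p = stats.f_oneway(group_a, group_b, group_c)

# Chi-square test of independence on a 2x2 contingency table
# (e.g., retained vs. not retained by program participation).
table = np.array([[40, 10], [30, 20]])
chi2, chi_p, dof, expected = stats.chi2_contingency(table)

# Simple linear regression: predict an outcome from a predictor.
outcome = group_a * 0.5 + rng.normal(0, 5, 50)
slope, intercept, r, reg_p, se = stats.linregress(group_a, outcome)

print(t_p, f_p, chi_p, reg_p)
```

Each call returns a test statistic and a p-value, which is the common thread the workshop builds on across all four techniques.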
Darlena Jones, Director of Assessment and Research, Association for Institutional Research
Program assessment often uses survey research as a data collection tool. Survey research provides valuable insight into the student experience necessary for program review. In this workshop, we discuss the principles of survey research, explore the traits of a quality survey instrument, and give practical tips for reporting. Participants new to survey research will not be overwhelmed and those needing a refresher will find useful tips. Learn from a seasoned researcher on ways to incorporate survey research into your data collection.
Marjorie Dorime-Williams, Assistant Professor, University of Missouri
Calls for accountability mean more pressure to implement and report on assessment beyond student learning. The environment that supports student learning is just as important. Student affairs divisions must develop comprehensive, aligned, and organized assessment processes to promote a sustainable and useful process. The presenter will focus on implementing a comprehensive assessment process that looks at student learning outcomes and administrative or support outcomes within student affairs divisions. Further, the presenter will offer practical information and tools to develop comprehensive assessment practices.
Aiden Powell, Program Manager, Purdue University LGBTQ Center
Purdue University designed an innovative data collection tool in Banner that assesses the retention and academic performance of LGBTQIA+ college students. Along with rich data from over 6000 students, it revealed that 27% of LGBTQIA+ Purdue undergraduates, compared to 16% of all Purdue undergraduates, are at risk of failing to complete their degrees. The presenter will provide an overview of the implementation of the data collection system, and lead a discussion of barriers and strategies for closing the achievement gap for LGBTQIA+ students.
Brent Klingemann, Assistant Director, Assessment Development, University of Colorado Boulder
Amy Biesterfeld, Director, Strategic Planning and Assessment, University of Colorado Boulder
Does student learning occur in a residence hall? Definitely! But how do you define the "soft skills" that are being learned by college students in residential environments? And how do you measure these skills to determine the effect of residential programming efforts? This is the goal of "Residential Curriculum" assessment at CU-Boulder. This session will provide a brief recap of Residential Curriculum efforts nationally, steps to developing a curriculum for your own institution, and lessons learned throughout implementation.
Sonja Daniels, Associate Vice President for Campus Life, San Jose State University
Becky Varian, Director of the Center for Student Progress, Youngstown State University
Learn how two institutions mapped student behavior and got student feedback in real-time by adopting new tracking and assessment tools. Case studies will be presented on how assessment culture was scaled across departments and events. Outcomes on student engagement and happiness will be discussed. Join us and explore how in-depth student behavior data within one unified platform is essential to create a holistic understanding of student engagement and identify engaged and unengaged students.
Kimberly Moore, Assistant Dean of Students, Loyola University Chicago
The presenter will share the findings from a study focused on an innovative pre-enrollment retention strategy called compassionate enrollment. The results of the study confirm compassionate enrollment, when coupled with traditional post-enrollment strategies, leads to the retention of first-generation students and offers university leaders a new way of addressing retention rates. Through an interactive presentation, participants will leave with a deeper understanding of a new retention solution that has broad implications for research and practice.
Michael Nguyen, Research and Assessment, California State University, San Bernardino
David Giardino, University of Southern California
From developing annual plans, ideating and tracking goals to administering collaborative assessment strategies, the student affairs division is asked to do more with less than ever before. During this session, the presenters will introduce a new framework—DASH—that teaches leaders how to align assessment strategies with institutional and divisional goals. This six-pillar, digitally-based foundation will be explored, while the focus of the session will be on the development and implementation of this comprehensive initiative.
Demetri Morgan, Assistant Professor and Primary Investigator, Loyola University Chicago
Hilary Houlette, M.Ed Candidate and Research Assistant, Loyola University Chicago
Jessie Payne, M.Ed Candidate and Research Assistant, Loyola University Chicago
Rachel Fischer, M.Ed Candidate and Research Assistant, Loyola University Chicago
The mapping cross-difference interaction research study attempts to locate and visually represent the specific occurrences of diversity and cross-difference between students on college campuses using geospatial-mapping technology. The generation of student "heat maps" will aid student affairs practitioners and higher education researchers in their campus climate assessments. This presentation showcases our cross-difference mapping application and provides attendees with recommendations for incorporating geospatial technology into assessment.
Brittini Brown, Director for Assessment, Research, and Strategic Priorities, University of Maryland, Baltimore County
Nancy Young, Vice President for Student Affairs, University of Maryland, Baltimore County
Ken Schreihofer, IT Manager, University of Maryland, Baltimore County
Joel DeWyer, Associate Director, The Commons, University of Maryland, Baltimore County
Colleges and universities across the country have increasingly begun to use predictive analytics as a tool to enhance student success. The presenters will focus on how student affairs at the University of Maryland, Baltimore County (UMBC) utilized student IDs in conjunction with UMBC's custom portal to collect student engagement data not only to enhance assessment for divisional programs, but also to contribute to the UMBC's predictive analytics efforts. The presenters will share lessons learned, key implementation strategies, and engage the audience in discussion.
Ciji Heiser, Director, Student Affairs Assessment and Effectiveness, Western Michigan University
Each basic component of the assessment cycle that Maki described in 2010 holds opportunities to develop a more inclusive assessment practice. Inclusive assessment moves beyond data disaggregation and towards an assessment approach that leverages data to advance more equitable outcomes around student learning, persistence, and graduation across demographic groups. The presenter will draw upon literature written about higher education assessment as well as culturally responsive evaluation in order to examine how the assessment process can be developed and leveraged for equity.
Michelle Bombaugh, Assistant Director, Office of Academic Advocacy, University of South Florida
Leslie Tod, Director, Office of Academic Advocacy, University of South Florida
Kim Williams, Academic Advocate for Policy and Analytics, University of South Florida
Zulmaly Ramirez, Academic Advocate, First Year Students, University of South Florida
The presenters will address the use of predictive analytics in conjunction with a case management model to identify and support at-risk students. Presenters will discuss how the case management model evolved at their institution and the multi-pronged approach they have taken to identify students in order to provide the right support at the right time to address student persistence. Presenters will elaborate on how they track student cohorts, collaborate with on-campus partners on student issues, and work individually with students.
Anne M. Hornak, Professor, Central Michigan University
Frim Ampaw, Associate Professor, Central Michigan University
Matt Johnson, Associate Professor, Central Michigan University
Ellen Wehrman, Assistant Director, Sarah R. Opperman Leadership Institute, Central Michigan University
LeShorn Benjamin, Doctoral Student and Research Assistant, Central Michigan University
Quentrese Cole, Doctoral Student and Research Assistant, Central Michigan University
This session will explore methods used to effectively engage faculty and student affairs professionals in the development of a comprehensive assessment plan. Based on the insights gained from a faculty-administrator partnership, this session aims to examine and assess the program and learning outcomes of the six main units in student affairs. The facilitators will provide an overview of the campus partnership model and explain how the university is assisting in developing and assessing outcomes and facilitating professional development experiences for continual improvement.
Pamelyn Shefman, Director of Assessment and Planning, Division of Student Affairs and Enrollment Services, University of Houston
Daniel Kaczmarek, Director of Assessment and Evaluation, University of Buffalo
Colegate Moore, Assistant to the Vice President for Student Life and Director of Student Life Assessment and Planning, Elon University
Institutions of higher education should assist students in gaining competencies that they can demonstrate. Texts such as Learning Reconsidered, the Association of American Colleges & Universities' LEAP Essential Outcomes, the National Association of Colleges and Employers' Attributes that Employers Want, and the outcomes from the Council for the Advancement of Standards show that the competencies gained from higher education are numerous. What are student affairs educators doing to track those competencies? This panel will share examples where they are making strides.
Abbygail Langham, Director, Assessment & Strategic Planning, Auburn University
Emily Wlikins, Graduate Assistant, Assessment & Strategic Planning, Auburn University
Paul Jacobson-Miller, Consultant, Campus Success, Campus Labs
The last phases of the assessment cycle are to report and share; however, how do we know we have prepared well-crafted and comprehensive documents for these stages? How do we assess that reports are robust for accreditation and institutional reporting? The presenters will address an innovative peer review meta-assessment approach among student affairs staff in order to strengthen co-curricular assessment reports.
Lisa Maletsky, Coordinator of the Office for Student Persistence Research, University of Nevada, Reno
Jennifer Lowman, Director, Student Persistence Research, University of Nevada, Reno
Assessment has the power to make university values salient and map a path towards operationalizing values into real-world policies and programs. Unfortunately, many believe assessment is a bureaucratic afterthought, not a design mechanism. Using the Theory of Change framework and a logic model, we developed an assessment process to identify gaps in diversity and inclusion. We address ethical issues of incorporating marginalized voices, communicating across a multitude of institutional contexts, and navigating the political climate in a time of heightened fear and racism.
Gralon Johnson, ISU University Innovation Alliance Fellow, Iowa State University
Matthew Pistilli, Director, Student Affairs Assessment & Research, Iowa State University
Institutional data at Iowa State underscored clear disparities in retention and degree completion rates among historically underserved students. To address these, a cross-campus think tank identified what works well for these students, where there are gaps, and where there are opportunities. Additionally, a data symposium was held where over 200 campus leaders worked to further understand the challenges and identify ways to ameliorate them. The presenters will discuss how institutional will shifted to create collaborative efforts and improve student outcomes.
Stacy Ackerlind, Special Assistant to the Vice President for Student Affairs and Director, University of Utah
Andy Mauk, Associate Provost, University of North Carolina Wilmington
With the increased focus on persistence and completion, institutions are focused on leveraging data effectively to promote student success. More and more, student affairs assessment and institutional research are partnering to connect knowledge to develop meaningful actions. The presenters will focus on the critical collaboration between these two campus offices, and how the NASPA AER KC will foster this collaboration in the years ahead.
Teresa Dorman, Associate Dean, College of Sciences, University of Central Florida
Harrison Oonge, Assistant Dean, College of Undergraduate Studies, University of Central Florida
Using the framework of a multi-institutional collaboration, this session focuses on sharing student success data and facilitating faculty-driven analyses of those data that result in changed behaviors and practices to improve student success. Sharing comparative and cross-institutional data involves thoughtful discussion and an emphasis on assessing any changes designed to positively affect those outcomes. This session is intended for those interested in sharing challenging data and the discovery process involved with assessing faculty-driven changes intended to impact student success.
Vincent Nix, Dean, Student Services & Assistant Professor, United States Sports Academy
The 2018 Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) Principles of Accreditation requires, in new Principle 12.6, that "The institution provides information and guidance to help student borrowers understand how to manage their debt and repay their loans. (Student debt)" Doctoral teaching assistants worked with the dean of students to design, implement, assess, and re-tool an online orientation with a separate Financial Aid Literacy module. Mixed-method assessments provide accreditors with evidence of compliance.
Lan Song, Graduate Student, Abilene Christian University
Sophie Tullier, Assistant Director, Assessment and Research, University of Maryland, College Park
As institutions build their capacity to collect and analyze data, the desire to contribute to the profession through the publication of research findings is growing. One institution responded to this desire by creating an in-house research team comprised of student affairs staff. This session will provide an overview of the impetus, process, goals, challenges, and future opportunities of the research team. Those interested in starting their own research team will walk away with a set of "lessons learned" from the presenters' experience.
Darlena Jones, Director, Assessment and Research, Association for Institutional Research
Program assessment often uses survey research as a data collection tool. If you’re committed to a data-informed culture, you must be committed to quality data. Survey research is a popular method of data collection in student affairs but, if done poorly, can result in inferior data which leads to flawed decisions. In this session, the presenter will explore some funny, and not-so-funny, do’s and don’ts when writing a survey instrument. Come and learn from a seasoned researcher how to improve the quality of your survey instruments.
Julie Neisler, Research Assistant, University of Houston
Sexual assault campus climate surveys have shown that incident rates far exceed reporting rates. This session will discuss the operationalization of students' likelihood of reporting and will explore to what degree rape myth acceptance impacts this likelihood. Additionally, students' perceived school support will be examined as a possible moderator to this relationship.
Suja Rajan, Assessment Specialist, University of Alabama
Timothy Salazar, Director, Assessment and Planning, University of Alabama
At the University of Alabama, the Honors College, in collaboration with the Center for Service and Leadership, enables students to be agents of change by empowering them with leadership and communication skills. This program will measure course success and identify areas of improvement in course curriculum to achieve learning objectives.
Gianina Baker, Assistant Director, National Institute for Learning Outcomes Assessment (NILOA), University of Illinois
Natasha Jankowski, Director, National Institute for Learning Outcomes Assessment, University of Illinois
In this session, attendees will discuss current assessment practices at their institutions and the role equity can play in moving towards culturally responsive assessment. Acknowledging that learning happens in a variety of ways and places, colleges and universities are working to connect, capture, and document learning in both the curriculum and co-curriculum through several initiatives.
Kimberlyn Brooks, Associate Director, Undergraduate Education, Bowling Green State University
Andy Alt, Assistant Vice Provost, Bowling Green State University
Stephen Kampf, Assistant Vice President for Student Affairs & Director of Recreation and Wellness, Bowling Green State University
Cynthia Roberts, Assistant Director, Office of Student Retention, Bowling Green State University
Sima Sharghi, Graduate Student, Bowling Green State University
Audry Alabiso, Graduate Student, Bowling Green State University
Using data from multiple sources and working with graduate assistants from statistics and computer science, persistence models with predictive, and more importantly, prescriptive analytics for the freshman cohort were produced and refined. Based on the outcomes from the modeling, secondary support structures were identified, strategies were constructed, and systems were put in place to monitor persistence of students impacted by the strategies. At the conclusion of the registration period, the analytical model was evaluated and refined for improved accuracy.
Nicole Cavanaugh, Graduate Student, University of Southern California
Michael Nguyen, Professor, University of Southern California
Drawing on an applied study within an actual division of student affairs, this presentation aims to arm leaders with what they need to know to increase the likelihood of success before diving into an assessment initiative. As we worked alongside the division to design and implement the initiative, analysis revealed correlations among leadership, clarity, safety, and performance across student affairs departments. The session will present this information, use role play for deeper understanding, showcase a student affairs case study, and allow for Q&A.
Rebecca Goldstein, Associate Director, Assessment and Research, Florida Atlantic University
Antonio Perry, Director, Assessment and Academic Initiatives, Florida Atlantic University
Assessment offices are uniquely situated to use state performance-based funding metrics to showcase how student affairs contributes to the overall profile of an institution. In this session, participants will learn how an assessment office designed their annual reporting cycle based around metrics and university priorities to best showcase the efforts and impact of student affairs. Participants will have the opportunity to practice using this model and discuss the impact of metrics in creating a culture of assessment.
Eric Walsh, Assistant Director of Assessment and Coordinator of Survey Research, University at Albany
Data collection and analysis can be an overwhelming task, especially for someone new to assessment. Often, though, we make our work more difficult than it has to be. The presenter will introduce a framework for how to think about data collection, research design, and analysis. Drawing on lessons from a diverse set of fields ranging from data science and software development to the philosophy of science, we will explore strategies to increase efficiency and refine our research.
Scott Reinke, Coordinator, Ball State Achievements, Ball State University
Cole Heady, Institutional Effectiveness Analyst, Ball State University
Tipton Russell, Assistant Director, Institutional Effectiveness, Ball State University
The Ball State Achievements App is a unique method of gathering student engagement data for use in student success analyses. The presenters will share results of their work studying the relationships between achievements and retention and graduation. They will also showcase this project as an example of a highly successful collaboration among institutional research, enrollment management, and student affairs.
Marguerite Culp, Executive Director, Maggie Culp Consulting
In 2017, Dr. Maggie Culp (co-editor of Building a Culture of Evidence in Student Affairs) led an interactive session that explored eight traps into which student affairs professionals often fall when using assessment to guide major decisions, tell the student affairs story, and build a culture of evidence. In this follow-up session, Dr. Culp will identify additional traps, help participants assess the extent to which the traps exist on their campuses, and explore strategies designed to help participants avoid or escape from the most common traps associated with building a culture of evidence.
Michael Seals, Associate Director of Residential Learning, Purdue University
Amber Martin, Coordinator for Program Analysis, Purdue University
The expansion of residential learning communities (RLCs) at Purdue University was seen as a method to address the problem of low graduation and retention rates (compared to Big Ten peers) and, in particular, to improve retention of the ever-growing percentage of under-represented minority (URM) students on campus. The presenters will discuss the success of this persistence initiative as demonstrated by a comparison of eight years of cohort data, with special emphasis on the effect of RLCs on URM student persistence.
Molly Morin, Program Manager, LiFT Scholars Program, Indiana University-Purdue University Indianapolis
Mathew Palakal, Executive Associate Dean, IU School of Informatics and Computing at IUPUI, Indiana University-Purdue University Indianapolis
The Leading Informatics for Tomorrow (LiFT) Scholars Program is a multi-institutional scholarship/support program funded by the National Science Foundation. LiFT seeks to support the persistence and career readiness of students with unmet financial need at IUPUI/Ivy Tech pursuing degrees in IT fields. This session will provide an overview of the mission/history of the program, co-curricular activities, strengths of the program, and opportunities for growth. This session aims to help campus leaders brainstorm ways to support underrepresented students in STEM!
Linda Demyan, Training Analyst, California State University Channel Islands
Dorothy Ayer, Special Assistant to the VPSA & Strategic Operations Administrator, California State University-Channel Islands
This program will provide an overview and practical examples of how to utilize the ACPA/NASPA Professional Competencies to guide your professional development and staff training program. Additionally, this program provides direction and suggestions on how to assess your training program to ensure that each staff member is increasing their level of expertise in each of the ACPA/NASPA competencies. Presenters will demonstrate how assessment and training can and should connect through the use of the Professional Competency rubrics and learning outcomes.
Matthew Pistilli, Director, Student Affairs Assessment & Research, Iowa State University
The Council for the Advancement of Standards in Higher Education (CAS) publishes standards, guidelines, and self-assessment guides to help institutions demonstrate student learning, development, and achievement. Accrediting bodies increasingly recognize the role of student affairs and services in these outcomes. Working with CAS, assessment professionals developed a white paper discussing how this link could work. The presenter will review that paper, and engage participants in dialogue about how they do or could use CAS as part of their accreditation.
Bianca Evans, Assistant Dean, Diversity and Inclusion, Indiana University
Recruiting, retaining, and graduating underrepresented graduate students requires targeted, comprehensive approaches and welcoming, inclusive graduate environments. The presenter will examine and evaluate several initiatives targeting recruitment and retention of underrepresented graduate students at Indiana University: the "Getting You into IU" recruitment program, "Emissaries for Graduate Student Diversity Ambassadors," and the "Diversity Fellows Program." An overview of research and diversity initiatives will include examining the implementation and challenges of these initiatives, evaluating the programs, and identifying key factors contributing to student success.
Maureen Cochran, MSAP, MSTD Coordinator, Oregon State University
Daniel Newhart, Assistant Vice President, Director, and Assistant Professor, Oregon State University
For the past three years, the presenters' institution has been working towards a more coherent data collection strategy to examine student success. The presenters recently analyzed their first-year cohort, triangulating data they had not previously been able to triangulate, toward creating a more complete narrative around student success. This afternoon dialogue will discuss roadblocks, lessons learned, and considerations for the responsible use of data to tie co-curricular involvement to retention and persistence analyses.
Timothy Salazar, Director, Assessment and Planning, University of Alabama
Steven Hood, Associate Vice President, University of Alabama
Matthew Kerch, Executive Director, Housing and Residential Communities, University of Alabama
Kathleen Gillan, Director, Fraternity and Sorority Life, University of Alabama
This presentation will detail how strategic approaches to collecting and disseminating assessment data have been utilized by divisional leadership in decision-making. Through a collaborative approach of creating a shared vision between assessment professionals and divisional leaders, meaningful assessment processes have been negotiated. Examples will be provided of divisional leaders closing the loop through intentional data reporting that is meaningful and relevant to operational decision-making.
Stacy Ackerlind, Special Assistant to the Vice President for Student Affairs & Director, University of Utah
Darby Roberts, Director, Student Life Studies, Texas A&M University
Directors of student affairs assessment face unique challenges: managing up with division leadership, managing across the division with peers (often without the authority to implement change), and managing their own teams. They face issues such as politics, ethics, and competition for resources. This session is specifically for directors to talk about the challenges and successes of their role, and it will build on the conversation with directors of assessment begun at the 2017 NASPA Assessment and Persistence Conference in Orlando.
Danielle Glazer, Assessment and Research Analyst, University of Maryland, College Park
Sophie Tullier, Assistant Director, Research and Assessment, University of Maryland, College Park
While many assessment projects focus on issues of inclusion, equity, and accessibility in terms of content, less has been shared about how to make assessment tools and reports accessible themselves. This session discusses how student affairs professionals can design surveys and share results in ways that promote inclusivity.
Carrie Carroll, Senior Partnerships Manager, College Possible
Jeff Knudsen, Director, Data Analytics and Evaluation, College Possible
Low-income students face many barriers to college graduation. To better support retention efforts, College Possible launched the Catalyze partnership model in 2016, which embeds a proven coaching model and curriculum on college campuses. With a laser focus on results, the program evaluates both Catalyze program outcomes and attribution: whether the outcomes are actually a result of programming. This learning lab will demonstrate how innovative programming, collaborative partnerships, and adherence to a rigorous evaluation framework can lead to improved success for vulnerable students.
Ethan Kolek, Assistant Professor, Central Michigan University
Anne Hornak, Professor, Central Michigan University
Frim Ampaw, Associate Professor, Central Michigan University
Matt Johnson, Associate Professor, Central Michigan University
Quentrese Cole, Doctoral Student and Graduate Assistant, Central Michigan University
Le Shorn Benjamin, Doctoral Student and Graduate Assistant, Central Michigan University
This session will explore how focus groups can be an effective approach for summative and formative assessment in student affairs. Presenters will discuss the strengths of focus groups as a data collection method and provide examples of focus group assessment in four student affairs departments at one university. This session will review strategies for conducting focus groups as well as the successes and challenges in using focus group assessment results to inform programmatic change and understand student learning.
Michele Tencza, Senior Academic Program Advisor, University of Texas at San Antonio
Brandy Barksdale, Senior Academic Program Advisor, University of Texas at San Antonio
Under the University of Texas at San Antonio's student success initiative, two senior academic program advisors developed and implemented the Resilience and Retention Advising Program, helping students at risk due to academic dismissal or because they were not accepted into their competitive major. Student development theory, positive psychology, and best practices influenced the presenters' intentional action strategy. This session will discuss program vision, institutional mandates, the implementation learning curve, and program assessments.
Cheryl Stanley, Assistant Director, Office of Student Assessment, Tulsa Community College
Scott Mannas, Assessment Specialist, Tulsa Community College
Dashboard is the hot new buzzword in higher education, but are dashboards right for you and your division or unit? What does it take to implement effective dashboards? This learning lab will focus on the decision to implement dashboards, a quick checklist for reviewing dashboard software, challenges facing the implementation of dashboards, and creating buy-in from campus constituents who will use assessment dashboards to make their decisions.
Argyle Wade, Assistant Vice Provost and Associate Dean of Students, University of Wisconsin–Madison
Ning Sun, Program Assistant for Student Life Assessment, University of Wisconsin–Madison
Recognizing the broad range of positive impacts resilience can have on college student success, the division of student life at a large four-year research university adopted increasing student resilience as its divisional strategic priority. The presenters will share steps and lessons learned in developing and implementing the assessment of this divisional strategic priority, including an institutionally created assessment tool as well as results from a variety of department assessment plans.
Cheryl Stanley, Assistant Director, Office of Student Assessment, Tulsa Community College
Scott Mannas, Assessment Specialist, Tulsa Community College
As institutions continue to press forward in creating a culture of assessment, it is critical that student affairs play an important role in that process. In creating a new organizational structure at Tulsa Community College, the division of student affairs established a new office of student assessment. The goal of this office was to support assessment activities within student affairs. Participants in the presentation will explore a pre-development checklist, a multi-tiered training process, and how to establish your voice on campus.
Chris Crippen, Director, Center for Service and Learning, Brigham Young University
Casey Peterson, Associate Dean of Students, Brigham Young University
Moises Aguirre, Assistant Dean of Students, Assessment & Evaluation, Brigham Young University
It has been demonstrated that students are more likely to persist to graduation when they feel supported by student affairs personnel and programming. This presentation will demonstrate the causal links between individual outreach through integrated student involvement programming and persistence, particularly with students who have exhibited poor conduct. Also included is a discussion of the benefits of integrating collaborative assessment within student affairs.
Ling Ning, Quantitative Research Analyst, University of Colorado Boulder
Amy Biesterfeld, Director, Strategic Planning & Assessment, University of Colorado Boulder
Many factors influence students' sense of belonging and retention. The central research question has always been which factors are most prominent and how to align resources toward implementing programs that effectively improve students' first-year experience. The present study proposes to use a supervised machine learning algorithm to identify and rank-order the prominent factors within the residence hall environment impacting students' sense of belonging and retention, with particular focus on underrepresented student populations.
Christine Deacons, Director of Academic Support Programs and the Holman Success Center, Eastern Michigan University
Lauren Condon, Director of Student Union and Engagement, Cedar Crest College
Eastern Michigan University and Cedar Crest College will demonstrate how they utilized data collected through large-scale assessment initiatives to identify successful student pathways of engagement. Both institutions used mobile technology to integrate real-time measurement and feedback mechanisms capturing student engagement across events and services. Session participants will learn how these institutions have used this assessment data to identify patterns of involvement, set benchmarks, and highlight critical programming for student engagement.
Terrel Rhodes, Vice President for the Office of Quality, Curriculum and Assessment, Association of American Colleges and Universities (AAC&U)
Kate McConnell, Senior Director for Research and Assessment for the Office of Quality, Curriculum and Assessment, Association of American Colleges and Universities (AAC&U)
Assessment in higher education has seen vast improvements in recent years. Institutions have grown beyond using standardized tests to assess learning, and now implement new methods, such as using rubrics, to evaluate authentic student work. The presenters will discuss the development and utility of the Association of American Colleges and Universities VALUE rubrics by evaluating data displays and institutional examples. Attendees can evaluate whether their campus can benefit from utilizing the VALUE rubrics as a tool for co-curricular assessment.
Jennifer Lowman, Director, Persistence Research, University of Nevada, Reno
Lisa Maletsky, Coordinator, Persistence Research, University of Nevada, Reno
We examine tradeoffs made in the construction of a comprehensive and comparable model of campus involvement to predict persistence. Campus involvement indicators differ widely in how they capture the depth of involvement (e.g., quality, intensity, onset, and duration). Tradeoffs between depth and breadth are a common research challenge that we invite our audience to debate. The presenters will highlight the need for institutional researchers and program professionals to collaborate when evaluating the impact of decisions made in quantitative persistence research.
Sherry Woosley, Director of Analytics and Research, Skyfactor
Erin Bentrim, Senior Research Analyst for Student Affairs, University of North Carolina Charlotte
Most presentations focus heavily on the successful aspects of data analysis. But what happens when the results are the opposite of what was expected or just do not make sense? How do we respond when our data do not support our theories, programs, initiatives, or resources? In this session, we will explore six concrete strategies for making the most of unexpected data findings, using both national data and examples from specific campuses.
Frim Ampaw, Associate Professor, Central Michigan University
Anne Hornak, Professor, Central Michigan University
Matt Johnson, Associate Professor, Central Michigan University
Ethan Kolek, Assistant Professor, Central Michigan University
Quentrese Cole, Graduate Research Assistant, Central Michigan University
Le Shorn Benjamin, Graduate Research Assistant, Central Michigan University
This session will explore the importance of precollege characteristics, commonly referred to as "input measures" in student affairs assessment. The presenters will highlight the relevance of input measures, especially attitudes and perception constructs, in assessment activities. Further, they will explore possible uses of input measures within programmatic and division-wide assessments, and finally, will discuss practical and inexpensive procedures for incorporating input measures into current institution data collection.
Jillian Kinzie, Associate Director, Indiana University Center for Postsecondary Research, National Survey of Student Engagement Institute, Indiana University Bloomington
The assessment of inclusivity and cultural responsiveness represents an imperative for higher education. In 2017, the National Survey of Student Engagement (NSSE) added a module asking students more about inclusive educational practices and perceptions of their institution's cultural responsiveness. This session highlights findings from this set, examines how results vary by student characteristics, and includes a discussion about campuses' use of these findings to create environments that support students of all backgrounds and leverage the educational benefits of diversity.
Elizabeth Duszak, Assistant Director for Student Affairs Assessment, Evaluation, and Research, University of Utah
Carlyn Graham, Assessment Analyst, University of Utah
The presenters will delve into the use of statistical analyses for student affairs assessment projects. While some projects don't need analyses beyond simple descriptive statistics, other projects can benefit from more advanced statistical analyses. The presenters will address: the selection of various statistical tests in line with the data; examples of analyses conducted to answer specific assessment questions; the use of statistical software programs to employ these techniques; and the accurate interpretation of the results of these statistical tests.
Dan Stroud, Assessment Specialist, Midwestern State University
Mark McClendon, Director, Institutional Research and Assessment, Midwestern State University
A commonly asked question by student affairs professionals and faculty regarding assessment expectations is 'How will this help my program?' In too many cases, the response refers to compliance rather than commitment. As Barbara Walvoord suggests, assessment should be "a kind of 'action research,' intended not so much to generate broad theories as to inform local action." This session outlines strategies for enhancing commitment and transparency in assessment that will positively influence staff, faculty, and student learning outcomes.
Nicole Battaglia, Director, First Year Initiatives, Seton Hall University
Monica Burnette, Director, Projects and Planning, Seton Hall University
Rohan Thakkar, Consultant, Campus Success, Campus Labs
The persistence of first-generation college students continues to be a national focus, as statistics indicate high attrition for this population. To improve the academic and social integration of first-year, low-income, first-generation students, a small, private institution partnered student services and faculty to develop a 10-day orientation program called "Gen 1" to provide a holistic pre-college experience. This session will describe "Gen 1" and its formative and summative assessment, and show how the program yielded a 100% semester persistence rate.
Pamelyn Shefman, Director of Assessment and Planning, Division of Student Affairs and Enrollment Services, University of Houston
Christos Korgan, Director of Institutional Research and Effectiveness, University of Saint Katherine
Timothy Salazar, Director, Student Life Assessment and Planning, The University of Alabama
Vincent Nix, Dean of Student Services and Assistant Professor, United States Sports Academy
B. (Brenda) Woods, Director of Research and Assessment,
Knowing student affairs' impact on student persistence assists in telling the story of our places within our institutions. The institutions represented on the panel will share how they define and measure persistence as well as roadblocks and successes. Attendees of this session will be able to gather solid frameworks of how persistence data is used across a variety of institutions and how to apply these to their respective campuses.
Robert Snyder, Executive Director of Planning and Outreach, Division of Student Affairs, The George Washington University
Learn how the George Washington University's Division of Student Affairs Assessment Committee re-thought departmental program review in response to divisional culture and stakeholder feedback, resulting in a departmental continuous improvement project model. This model provides departments with flexibility and significantly expanded the number of reviews being completed, while also remaining rigorous and standards-driven. Participants will have an opportunity to consider how they might adopt this model and to provide feedback on the model as the committee continues to refine it.
Kristen Vickery, Director, Testing & Assessment, Anne Arundel Community College
Danielle Brookhart, Director, Orientation Programs, Anne Arundel Community College
How do you get student affairs staff to embrace assessment with excitement instead of fear? Learn how one institution created a culture of assessment and implemented an assessment framework grounded in CAS, with continued success. Information will be provided on strategies, meeting planning, organization, materials, accreditation planning, and training, all while staying student-focused.
Eric Walsh, Assistant Director of Assessment and Coordinator of Survey Research, University at Albany
Douglas Sweet, Director of the Office of Student Affairs Assessment and Planning, University at Albany
Emily Feuer, Assistant Director for Student Affairs Assessment and Planning, University at Albany
Student affairs assessment and planning, with help from institutional research, surveys all new students six weeks into the fall semester. The data collected is used to understand a student's experience, reimagine how we engage with these students, and ensure that they persist and succeed. The presenters will discuss how we have refined the instrument and increased response rates over time. We will also discuss the items that provided the most meaningful insight and explain how this insight was put into action.
Daniel Kaczmarek, Director, Assessment and Evaluation-Student Life, University at Buffalo
Rohan Thakkar, Consultant, Campus Labs
A friendship built from shared history, investment in the same community, and common interests and goals? When student affairs assessment professionals partner with Institutional Research, data collection, reporting, and decision-making can improve across the university. This session will guide participants through a reflection on their relationship with Institutional Research on their campus, provide insight through promising practices at the University at Buffalo and other institutions, and leave participants with concrete steps for developing this important campus connection.
Sophie Tullier, Assistant Director, Assessment and Research, University of Maryland, College Park
This session will examine beliefs about knowledge creation and student voice as student affairs professionals. Attendees will engage in an activity centered on identifying how our beliefs about knowledge creation align with the methods used to assess student learning and discuss how we might work towards greater alignment in our assessment data collection.
Dr. Tia Brown McNair is the Vice President in the Office of Diversity, Equity, and Student Success at the Association of American Colleges and Universities (AAC&U) in Washington, DC. She oversees both funded projects and AAC&U's continuing programs on equity, inclusive excellence, high-impact educational practices, and student success, including AAC&U's Network for Academic Renewal series of yearly working conferences. McNair also directs AAC&U's Summer Institute on High-Impact Educational Practices and Student Success. She serves as the project director for several AAC&U initiatives: "Truth, Racial Healing and Transformation," "Committing to Equity and Inclusive Excellence: Campus-Based Strategies for Student Success," and "Purposeful Pathways: Faculty Planning and Curricular Coherence." She directed AAC&U's projects on "Advancing Underserved Student Success through Faculty Intentionality in Problem-Centered Learning," "Advancing Roadmaps for Community College Leadership to Improve Student Learning and Success," and "Developing a Community College Roadmap." McNair chaired AAC&U's Equity Working Group, part of the General Education Maps and Markers (GEMs) project, which represented a large-scale, systematic effort to provide "design principles" for 21st-century learning and long-term student success. She is the lead author of the book Becoming a Student-Ready College: A New Culture of Leadership for Student Success (July 2016) and a co-author of the publication Assessing Underserved Students' Engagement in High-Impact Practices. Prior to joining AAC&U, McNair served as the Assistant Director of the National College Access Network (NCAN) in Washington, DC.
McNair's previous experience also includes serving as a Social Scientist/Assistant Program Director in the Directorate for Education and Human Resources at the National Science Foundation (NSF); Director of University Relations at the University of Charleston in Charleston, West Virginia; Statewide Coordinator for the Educational Talent Search Project at the West Virginia Higher Education Policy Commission; and Interim Associate Director of Admissions and Recruitment Services at West Virginia State University. She has served as an adjunct faculty member at several institutions, where she taught first-year English courses. McNair earned her bachelor's degree in political science and English at James Madison University and holds an M.A. in English from Radford University and a doctorate in higher education administration from George Washington University.