FAQs about Listening & Learning Results
What can we learn from the survey and listening session responses?
Open-ended survey questions (where participants write in their feedback) and listening sessions generate qualitative data. Qualitative data gives us a more in-depth understanding of the complexity, context, nuances, and connectedness of issues. It can help us answer “how” and “why” types of questions, rather than the “how much” or “how many” types of questions that quantitative data helps us answer. In this case, we used qualitative data to answer complex questions including:
- How do people understand and experience the School District of Philadelphia?
- Why do they think problems exist? What do they think we should do about them?
- What do they say they need from the School District of Philadelphia?
Qualitative analysis helps us understand people’s experiences and perspectives about what is important to them. It also helps us understand how issues are related to each other. When there is a consensus in the responses, that can be helpful in prioritizing, problem-solving, and decision-making. It can also be helpful to explore how various stakeholder groups experience issues differently.
Who analyzed the Listening & Learning feedback?
The research team that analyzed the Listening & Learning feedback consists entirely of staff from the Office of Evaluation, Research, and Accountability. The team members are trained researchers who are experienced in making sense of large qualitative and quantitative data sets. They followed best practices in qualitative research methods to enhance trustworthiness, confirm common themes, and minimize bias.
What were the steps for analyzing the feedback provided by survey and listening session participants?
The research team reviewed the survey responses and listening session transcripts (written records of what was said during each session). The team “coded” the responses, which is a way of labeling and organizing the text according to common topics or themes.
First, the research team coded the responses.
- The team read through the survey responses and listening session transcripts and looked for common topics and themes based on the feedback. Using these topics and themes, the team developed a coding system to consistently analyze all of the feedback.
- The coding process involved labeling a survey response or section of a listening session transcript with a “code” that represented the topic(s) in the feedback. These codes were documented using the Dedoose qualitative data analysis software.
- The coding process was iterative. As more feedback was reviewed, the team discussed, added, and reorganized codes.
Then, the research team analyzed the survey responses and listening session transcripts for common themes.
- The team read and re-read the feedback many times, sometimes reading an entire transcript and other times reviewing every excerpt that had been coded with a specific topic or subtopic.
- The research team met throughout the process to check and discuss the codes given to each survey response or transcript section and the common themes within those topics. Eventually, the team reached consensus about the topics respondents mentioned most often and what those topics meant.
- The researchers also examined where codes overlapped (code co-occurrences) to help make sense of how issues were related to one another (see the sketch below).
Important note: codes and code counts do not, by themselves, represent findings or conclusions. Rather, the coding system helps the researchers make sense of and summarize the data.
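To make the counting concrete, here is a minimal sketch, in Python, of how code applications and code co-occurrences can be tallied. The segments and code names below are invented for illustration; the research team’s actual coding and counting were done in Dedoose, not with a script like this.

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded excerpts: each entry is the set of topic codes
# applied to one segment of text (invented for illustration).
coded_segments = [
    {"Staffing", "School Climate & Safety"},
    {"Staffing"},
    {"Academics", "Staffing"},
    {"School Climate & Safety"},
]

# Tally how many times each code was applied across all segments.
code_counts = Counter(code for segment in coded_segments for code in segment)

# Tally co-occurrences: pairs of codes applied to the same segment.
co_occurrences = Counter(
    pair
    for segment in coded_segments
    for pair in combinations(sorted(segment), 2)
)

print(code_counts.most_common())
print(co_occurrences.most_common())
```

Codes that frequently appear on the same excerpt point to issues that participants tended to raise together, which is how co-occurrence analysis helps connect related issues.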
Who provided feedback during the Listening & Learning Tour?
Participants could provide feedback by completing a survey or by participating in a listening session. The survey asked respondents to select their role from a list of options. Table 1 shows the number of survey respondents by role. Table 2 shows the number of listening session participants by role, where available; some sessions included a mix of participant types.
Table 1. Listening & Learning Survey Respondents by Role
| Role | Number of Respondents¹ |
|---|---|
| School Staff² | 566 |
| Central Office Staff | 127 |
| Community Members | 36 |
| Parents/Guardians | 169 |
| Students | 13 |
| Total | 902³ |
¹ This reflects data captured from June 16, 2022 to September 13, 2022.
² Respondents were not asked to indicate their school-based role.
³ Some respondents indicated that they fit more than one role (e.g., both a parent and a teacher) and so are counted more than once across role types.
Table 2. Listening & Learning Session Participants by Role
| Session Type | Number of Sessions | Total Number of Participants⁴ |
|---|---|---|
| Teachers | 15 | 326 |
| School Leaders | 8 | 386 |
| School Support Staff (e.g., nurses, counselors, climate staff) | 2 | 43 |
| Central Office Staff | 6 | 223 |
| Parents/Guardians | 7 | 182 |
| Students | 10 | 189 |
| Community-led Sessions⁵ | 16 | 614 |
| Privately-hosted Sessions | 26 | N/A⁶ |
| Total | 90 | 1,963 |
⁴ This reflects data captured from June 21, 2022 to September 22, 2022.
⁵ Community sessions included a mix of community members, families, and SDP staff, and some followed a different protocol or process.
⁶ Privately-hosted sessions were not recorded, so the research team could not analyze them in the same way; however, session notes were reviewed where possible for overarching themes.
What topics did participants mention in the survey and listening sessions?
Participants discussed many different issues. Sometimes there was broad consensus and sometimes there was disagreement. To summarize this feedback, the research team followed a systematic coding process to categorize and summarize the topics and subtopics that participants mentioned.
We categorized each text segment into one of nine main topics and numerous subtopics:
- Each segment of text represented one respondent’s main point.
- Each segment included feedback from only one stakeholder.
- However, one respondent may have made multiple points, or a single complex point, that received multiple code applications. (We call this “double-coding”; the sketch below shows how such counts add up.)
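As a simple illustration of how double-coding and the combined counts work, consider the sketch below. All numbers and code names are invented for demonstration and are not the actual counts behind Table 3.

```python
from collections import Counter

# A single "double-coded" excerpt: one respondent raising two issues at once.
excerpt_codes = ["Staffing", "School Climate & Safety"]

# Toy tallies of code applications from each feedback source (invented numbers).
survey_counts = Counter({"Staffing": 10, "School Climate & Safety": 8})
session_counts = Counter({"Staffing": 6, "School Climate & Safety": 7})

# The double-coded excerpt increments every code applied to it,
# so one respondent can contribute to more than one count.
session_counts.update(excerpt_codes)

# The reported totals ("N") combine both feedback sources.
combined = survey_counts + session_counts
print(combined)
# Counter({'Staffing': 17, 'School Climate & Safety': 16})
```

This is why the counts in Table 3 measure code applications rather than unique respondents.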
Table 3 (below) shows the number of times the topic codes and subtopic codes were applied to a segment of text. The total counts (“N”) represent the survey and listening session feedback combined. Even though the survey and listening sessions followed different protocols and formats, we combined the code counts to better summarize the feedback.
Table 3. Topics and Subtopics from the Listening & Learning Feedback
| Topic (number of times coded) | Subtopics (number of times coded) |
|---|---|
| School Climate & Safety (N=1,176) | Mental Health & Socioemotional Support (n=483); Physical Safety & Discipline (n=359); Student-School Staff Relationships (n=242); Physical Health & Nutrition (n=107); Student-Student Relationships (n=59) |
| Staffing (N=1,025) | Retention & Morale (n=421); Vacancies & Allocation (n=328); Caring/Hard-working Staff (n=258); Staff Training & PD (n=250); Time Constraints, Expectations & Demands (n=127); School Staff Collaboration & Mentoring (n=113); Recruitment, Hiring & Promotion (n=100) |
| Academics (N=994) | Curricular Materials, Scope & Sequence and Academic Frameworks (n=335); Special Education (n=231); Student Achievement & Academic Supports (n=225); Instruction (n=185); Extracurricular & Enriching Opportunities (n=109); Testing/Assessments (n=86); English Learners (n=76); Course/Program Offerings (n=52) |
| Central Office (N=817) | Leadership & Decision Making (n=386); School-Central Office Relationships & Communication (n=386); Internal Collaboration & Functionality (n=190); Network/Assistant Superintendent Leadership (n=86) |
| Communications & Engagement (N=817) | School-Family Relationships & Communication (n=368); Listening to, Visiting & Learning from Stakeholders (n=326); Public Relations, Trust & Transparency (n=242) |
| Facilities (N=557) | Capital & Environmental Enhancements (n=281); Cleanliness & Maintenance (n=172) |
| Funding & Resource Distribution (N=346) | No subtopics |
| Diversity, Equity & Inclusion (N=318) | No subtopics |
| School Leadership (N=263) | No subtopics |
Note: Participants in the listening sessions most often responded to the specific questions that the facilitator asked. If more sessions asked about “Communications” than about “Facilities,” for example, the code counts reflect that. Code counts therefore should not necessarily be interpreted to mean that one topic was more important or urgent than another.