Welcome to the Youth PQA Crash Course. This PowerPoint presentation is designed to guide you through using the Youth Program Quality Assessment (PQA) for program self-assessment. Notes are included with each slide; they are meant for you as you instruct your staff and colleagues on how to use the tool. You may find it useful to print this presentation in handout or notes form for each participant.
People sometimes get the wrong idea about the Youth PQA because of unfavorable previous experiences with other assessment tools. The following information is key: this tool is not meant to judge you but to help you. The evidence that is gathered will not be used against you; it is a starting point for creating a program improvement plan. It will not only help identify issues you can change, it will also show you where your strengths are so that you can build upon them for maximum impact in your youth work. The key is that this only works if you allow yourself to assess honestly. Otherwise, you are wasting your time.
The Youth PQA is just the first step in a larger cycle of improvement, which we call the Youth Program Quality Intervention. Today we’ll focus on the ASSESS phase. After you conduct your assessment, the next step is to develop a plan for improvement based on the data, and then to carry out that plan to improve the quality of your program for kids. This sequence can take place over a month or a year, and it is meant to be repeated.
1) Quality is important because it produces positive outcomes, including academic achievement.
2) Quality happens at the point of service (POS), where adults and kids meet. It can be influenced by both management (PLC) and regulation (SAE), but what really counts is the point of service, where program experiences are co-constructed by youth and adults. Development, learning, adaptation, and resilience all happen through the basic formula: secure relationship + task over time + increasing complexity = developmental change. These experiences occur at the point of service.
The Youth PQA represents widely shared ideas about how programs can best promote youth development and learning. The structure of the pyramid above (which has parallels to Maslow’s Hierarchy of Needs) shows common trends in scoring. Youth programs tend to have higher scores in the area of Safe Environment. Scores then taper off in Supportive Environment and Interaction, and are often lowest in Engagement. So, the pyramid shape represents the way scores tend to look. The top two categories, Interaction and Engagement, while often the lowest scoring, are the most connected to positive youth outcomes.
Here’s a sample rubric. The written evidence goes in the white space on the right, then the score goes in the box: 1, 3, or 5 (no 2s or 4s!). This sample demonstrates how the tool addresses a common positive youth development topic. It’s generally agreed that youth do better when they feel a sense of belonging. This slide shows one indicator we use to get at that idea. Notice how it converts an inner state (whether youth feel like they belong) into a measurable behavior (whether staff provide get-to-know-you activities). Each booklet is a Form. To see how Form A breaks down, look at the last page (titled “Youth PQA Summary Sheet”). The boldface words are the domains, and the words labeled A, B, C, etc. are scales. Each row is an item, and an item contains 2 to 6 indicators like the one on this slide.
Here are some tips about Steps 1 and 2:
• Self-assessment only works if program staff are involved.
• All staff involved should familiarize themselves with the tool before collecting any data so that they know what to look for while observing.
• Try to get a good mix of program times and offerings.
These are examples of weak anecdotes. Take some time to identify what makes them ineffective. As a visual reference, you can create a chart on a board or easel paper with two sections: at the top, write Ineffective on one side and Effective on the other, divided by a vertical line in the middle. As the participants identify the things that make this anecdote ineffective, scribe those things onto the paper, then ask them what could make an anecdote more effective. Scribe their answers on the paper or board. Next, have them practice rewording the anecdote to make it an effective one.
These anecdotes contain more detail, actual quotes, and an objective account of what happened. The key to writing strong anecdotes is removing any judgment or interpretation: write down what you see and hear, not what you think about what you see and hear.
Compare these characteristics to the ones the participants identified and you scribed earlier. Highlight any that they did not mention.
Here are some tips about your scoring meeting:
• Remember, you are only completing one Form A.
• Often, several team members may have evidence that fits a particular indicator, and the score you write may depend on which evidence you use. In that case, use your judgment and decide as a team which evidence is most representative of your program, and score accordingly.
• The same anecdote can be used for more than one indicator. Use it wherever it applies. Always try to see multiple items in every interaction and cross-reference constantly.
• Since you must have evidence for every score you give, if you lack evidence that fits an indicator, collect more data.
• The conversation during the scoring meeting (Step 4) is the most important part of program self-assessment.
1. Goals should be specific, measurable, and doable. Typically, goals are at the item or indicator level. You may have an overall goal of raising a subscale, but you would most likely not list this on the planning form. Instead, you would list item-level or indicator-level goals within that subscale. Why? It’s much easier to improve one thing at a time.
2. It is important to be clear on who is working on meeting the goals and how success will be measured. An effective plan for measuring success should clearly indicate who is carrying out the improvement, what will be improved, and how the improvement will be measured.
3. Steps should be short, simple if possible, tangible, and should clearly designate who is responsible for completing them. Breaking a larger goal into short steps is a way to make it achievable.
4. Your organization wants you to succeed! Work together as a staff to connect services to your improvement initiative.
For more information, visit us online:
Web: cypq.org
E-mail: Amber@cypq.org