Dec 4, 2014

#aesa14 Work Life: Measuring the Effectiveness of your ESA

Why I chose this:
Phoebe, my Director, was presenting at the same time as this session, so she asked me to attend and gather information for her. I'm looking forward to hearing how other agencies figure out how, and whether, they are succeeding - and in what areas.

What we covered:

Motivation and a process to help motivate/engage employees.

ProMES (Productivity Measurement and Enhancement System) - teams work together. NEFEC has 80 employees in 12 teams. The system was phased in, but every employee is a member of a team.

"It's not that I'm lazy, it's that I just don't care."

ProMES measures individual performance as well as team performance. It is research-based and could be applicable to schools and/or districts, not just ESAs.

People want input into how they are measured. We don't want subjective feedback. The feedback is in the conversation.

Different from strategic planning and team-building: ProMES works from the bottom up.

What motivates the individual? What does the individual bring to the organization? What does the organization bring to the individual?

ProMES requires:
  • Controllable, quantitative measures
  • Clearly defined roles and expectations
  • Participation and team-based decision making
  • Accountability through ownership of the system
  • Evaluative and descriptive feedback

Teams are people who work together, though not necessarily grouped that way on your org chart.

The ProMES process:
  • Select Objectives - What do the individuals add to the org?
  • Define Indicators - Measures to see whether the team is meeting its objectives
  • Develop Contingencies - Scale each indicator's performance against its effectiveness (detailed below)
  • Feedback
  • Periodic Review of the System

Objectives/indicators
  • Manage resources effectively - % of total budget funded from outside
  • Develop programming code to meet district needs - Ratio of programming call tickets completed to received
  • Provide quality data-driven PD - % of PD generated based on district performance data
Team members have to discuss what they do and what they bring to the organization. "What are you paid to do?"

Develop Contingencies
  • Create a scale graph for each measure, plotting potential performance against effectiveness.
  • The effectiveness range reflects overall importance. Ex: -100 to +100 for a measure that is very important overall; -50 to +50 for one that is not quite as important.
  • Set the zero point - the level you don't want to fall below. It may not be at the halfway mark; it depends on the measure.
  • Check priorities at any given time: compare where you are to where you want to be, then prioritize based on the potential effectiveness gain. Ex: moving from 0 to 60% may be a 77-point gain, while moving from 60% to 100% may be only a 23-point gain (see the sketch after this list).
  • The system itself is partially subjective, but it should be based on the previous data the group wants to use as the baseline.
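
To make the contingency math concrete, here is a minimal sketch in Python, assuming a piecewise-linear contingency curve. The indicator, breakpoints, and zero point are hypothetical, chosen only to reproduce the 77-point vs. 23-point gains in the example above; the session didn't prescribe any particular curve shape or software.

    # Sketch of a ProMES-style contingency, assuming a piecewise-linear
    # curve. Breakpoints are hypothetical, picked to match the example
    # above (0 -> 60% is a 77-point gain; 60% -> 100% is only 23 points).
    def effectiveness(value, curve):
        # curve: (indicator_level, effectiveness_score) pairs, sorted by level
        if value <= curve[0][0]:
            return curve[0][1]
        if value >= curve[-1][0]:
            return curve[-1][1]
        for (x0, y0), (x1, y1) in zip(curve, curve[1:]):
            if x0 <= value <= x1:
                return y0 + (y1 - y0) * (value - x0) / (x1 - x0)

    # Hypothetical contingency for "% of total budget funded from outside".
    # The zero point (the level not to fall below) sits at 20%, and the
    # curve flattens past 60% - extra effort there buys less effectiveness.
    budget_curve = [(0, -23), (20, 0), (60, 54), (100, 77)]

    print(effectiveness(60, budget_curve) - effectiveness(0, budget_curve))    # 77.0
    print(effectiveness(100, budget_curve) - effectiveness(60, budget_curve))  # 23.0

A team would compare these potential gains across all of its indicators to decide where effort pays off most.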
Feedback
  • Based on what the group has determined it wants feedback on.
  • Use software to measure the effectiveness (a rough sketch follows this list).
  • Track effectiveness gains and losses from period to period.
  • An outside perspective is needed to provide objective feedback.
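
As a rough illustration of the software piece (no specific tool was named in the session), here is a sketch of a per-period feedback report, assuming overall effectiveness is the sum of the per-indicator scores, as in ProMES. The indicator names echo the examples above; the numbers are made up.

    # Sketch of a feedback report: overall effectiveness is the sum of
    # per-indicator scores; gains/losses are changes from last period.
    # Indicator names and scores are hypothetical.
    last_period = {"outside funding": 40, "ticket ratio": -10, "data-driven PD": 25}
    this_period = {"outside funding": 54, "ticket ratio": 5, "data-driven PD": 20}

    print("Overall:", sum(this_period.values()),
          "change:", sum(this_period.values()) - sum(last_period.values()))
    for name in this_period:
        print(f"  {name}: {this_period[name]} ({this_period[name] - last_period[name]:+d})")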
Periodic Review:
  • Monitor and adjust
  • Dedicated time to do this
  • Once folks are embedded and have ownership, they want to be successful
  • Things get better; use the gains as launchpads for further development. Ex: moving from "Did teachers enjoy the PD?" to data-driven PD.
  • You can adjust along the way. Ex: as you grow, move the bottom of the scale up to keep yourself from falling back.
Input is gathered from the stakeholders affected by the various teams/measures: interview them and find the areas of need/weakness to be addressed.

NEFEC achieved AdvancED accreditation through its use of ProMES goals, etc.
Moved toward organizational data-driven Goals and Objectives.
Developed an Organizational Management System.
Incorporated ProMES into the Performance Appraisal System - based on evidence of meeting, collecting data, and conducting feedback sessions, i.e., actually using the process. This counts for 15% of the individual performance eval.





