Honorarium Ideology

I have been following the honorarium discussion in an effort to see what happens organically, but I want to talk about it as the creator of the honorarium program.

First, a little history: it’s 2014.
https://dallasmakerspace.org/wiki/Board_of_Directors_Meeting_20140515#Modification_to_the_Event_Policy_Encouraging_Classes_.28Robert_Davidson.29

Establish the following policy under the Event’s section of the standing rules:

  • All Teachers will be eligible for a $50 Honorarium per approved class (http://en.wikipedia.org/wiki/Honorarium)
    This was done because $50 is the cost of one month of membership. I had people asking all the time for handouts, free membership, or reduced membership, so with the assistance of the 2014 board we made a deal: teach one class a month and it pays for your membership.

  • An Honorarium can be forfeited and given to a committee.
    Some people teach because they want the committee to succeed, and for those people we wanted to allow their time to further the goals of their respective committees.

  • Committees are also eligible for a matching honorarium: for every class sponsored by a committee, a matching $50 will go to the committee. (Committees will need to keep track of the classes they offered for the month and notify the treasurer at the end of the month.)
    This was really just documenting the training, as the calendar system did not support tracking payments.

  • Definition of Approved Classes: classes that are scheduled on the calendar and whose curriculum is approved by a majority of the BOD.
    Finally, Approved Classes: this is the lever for tweaking the program.

Specifically, I want to talk about the final bullet, “Approved” Classes. We knew there would be people who would abuse and game this system, but we felt that overall it was worth it to benefit the whole. I believe this is the lever DMS is currently failing to use, which has caused this systemic growth.

I did a lot of the auditing when the program came out and I would always ask myself the following questions:

Does this class benefit the Dallas Makerspace as an organization?
Does this class align with the goals of Dallas Makerspace?
Does this class provide more value to the organization than the $100 payment ($50 to the instructor, $50 to the committee)?

I have read every message in all the various posts, and I think it’s time to step back and ask: what is the problem we are trying to fix?

Personally, I don’t believe we have a problem with the financial side of the program. I believe the problem is a quality-control issue, specifically around “approved classes.” The program was designed to be simple; if people overcomplicate the message, it cannot be shared. Teach – Dallas Makerspace

I believe it’s time to sit down, define which approved classes qualify for honorarium, and depersonalize the program. Personal grudges need to be set aside so the program can be judged on factual information.

So with that being said, I will offer to implement this project by doing the following:

1. Build a task force to define qualifying programs. (Timeline of 3 months to complete this; if not completed, the entire program is shut off until it is defined.)

2. We must measure the quality of classes. There are off-the-shelf applications to handle this for us, so I will project-manage a paid developer to implement an off-the-shelf commercial solution to send out surveys and start monitoring classes. (If someone wants to volunteer, great, but it’s not going to be on Makerspace “finish whenever” time.)

3. If an instructor’s quality score drops below a certain amount, they will be ineligible for the honorarium program until they can raise their score teaching non-honorarium classes.

  1. Stretch goal: expand the calendar system to push events to third-party calendar systems such as Meetup and Eventbrite.
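The eligibility rule in point 3 can be sketched in a few lines. This is only an illustration: the 1-5 scale and the 4.0 cutoff are made-up placeholders, since the actual scale and threshold would be up to the task force to define.

```python
# Hypothetical sketch of the rule in point 3: an instructor whose average
# survey score falls below a threshold becomes ineligible for honorarium
# until non-honorarium classes raise the average back up.
# The 1-5 scale and 4.0 threshold are placeholders, not actual policy.

THRESHOLD = 4.0  # placeholder cutoff on a 1-5 scale


def average_score(scores: list[float]) -> float:
    """Running average of all survey scores an instructor has received."""
    return sum(scores) / len(scores)


def honorarium_eligible(scores: list[float]) -> bool:
    """True while the instructor's average meets the threshold."""
    return average_score(scores) >= THRESHOLD
```

An instructor averaging 4.5 would stay eligible; one averaging 3.0 would teach non-honorarium classes until the average recovers.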

The intent is to immediately increase the quality of classes and set up checks to make sure honorarium classes are held to a higher standard.
This will likely cause an immediate drop in the number of classes approved.
I expect costs will trend back toward their current level, but by then my hope is that we can measure whether the program is working.

Ultimately, my intent is to quantify the honorarium program. It worked well enough when we could see it working, but now I know that I and others are not confident the money being paid out is worth it overall.

Finally, I want to thank all of the teachers that used the program I really believe you guys are what makes Dallas Makerspace successful.

7 Likes

I think this is all great, but I’m waiting to hear calls of potential weaponization. When I said chairs should oversee this, I was told it could be used as a weapon. All I know is y’all must have had some hellacious issues in the past, given how much you distrust one another.

2 Likes

This is mutually assured destruction: the rules will be consistent for all applicants. It’s a pretty dumb idea to weaponize the program when you’d be shooting at yourself. That is also why there is an approval process, and why it’s set up the way it is, with an appeals process.

4 Likes

This sounds consistent with (our unstated) DMS goals and objectives, and with some level of automation it should help relieve the “approval” burden. The objective of consistent criteria is laudable also.

But I believe there are still two outstanding issues that this doesn’t address, because the burning platform is that we’re spending more money on honorariums than we think we can afford.

  1. Budget. Your proposal would be a temporary fix in that it would likely have the short term effect of dropping the number of classes available, thereby reducing the short term honorarium payments. But once the number of great classes with great instructors increases, it’s reasonable to ask if we should continue to have an unlimited budget for honorariums.

  2. Required classes. Your proposal would have the effect of weeding out inferior instructors, but in the case of approved classes with an existing curriculum another instructor will just step in. The number of classes and honorariums would not decrease, still leaving the potential funding problem. There are some committees that have a large number of required classes. IMO it’s reasonable to determine whether this is appropriate from a funding perspective.

2 Likes

First, I love the idea of a survey at the end of every class. The results need to be anonymous, and quantifiable. I just hope they aren’t like every satisfaction survey that I’ve been asked to do lately where “anything less than a 10 means we failed.” I’d like to see questions like the following:

  1. What would you say your knowledge of the topic was when you signed up for this class? 0-10
  2. Was the instructor knowledgeable in the topic? 0-10
  3. Did you ask any questions during the class? Yes/No
  4. If yes, do you feel like you got a sufficient answer to your question(s)? 0-10
  5. Was the length of the class sufficient for the information being presented? Yes/No
  6. What would you say your knowledge of the topic is after the class? 0-10
  7. Would you take another class from this instructor? Yes/No
  8. Would you recommend this class & instructor to your friends? Yes/No
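One way to make these answers “anonymous and quantifiable,” as requested, is to treat the survey as data: each question carries its answer type, so a single aggregator can summarize any batch of responses. A rough sketch (the dict-by-question-index response format is purely an assumption for illustration):

```python
# Sketch of the proposed survey as data. Question wording is condensed
# from the list above; the structure and response format are illustrative.

SCALE, YES_NO = "0-10", "yes/no"

QUESTIONS = [
    ("Knowledge of the topic when you signed up", SCALE),
    ("Was the instructor knowledgeable in the topic?", SCALE),
    ("Did you ask any questions during the class?", YES_NO),
    ("If yes, did you get a sufficient answer?", SCALE),
    ("Was the class length sufficient for the material?", YES_NO),
    ("Knowledge of the topic after the class", SCALE),
    ("Would you take another class from this instructor?", YES_NO),
    ("Would you recommend this class & instructor?", YES_NO),
]


def summarize(responses: list[dict]) -> dict:
    """Average each 0-10 question; report the 'yes' fraction for yes/no ones.

    Each response is a dict mapping question index -> answer, so skipped
    questions (e.g. question 4 when no questions were asked) are simply absent.
    """
    summary = {}
    for i, (text, kind) in enumerate(QUESTIONS):
        answers = [r[i] for r in responses if i in r]
        if not answers:
            continue
        if kind == SCALE:
            summary[text] = sum(answers) / len(answers)
        else:
            summary[text] = sum(1 for a in answers if a) / len(answers)
    return summary
```

A nice side effect of questions 1 and 6 both being 0-10 is that their difference gives a rough “knowledge gained” number per class.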
4 Likes

This should be two questions. One for the instructor, one for the material.

My question was more about the combination of the two. You may really like an instructor for one class that you take from them, but not like the way they teach another class.

1 Like

I’d also like some feedback about the process/class versus the teacher. I teach in a couple of different committees, some where we “wing it” and some where we have fully written slides for the entire class. If one or the other is a problem for students, it would be good to know. Personally, if I’m knowledgeable in the area, I prefer a more conversational “winging it” class, but if that’s not everyone’s thing, it would be cool to have data on that.

1 Like

The granularity of 1-10 is far too wide. I’d suggest looking at the restaurant kiosks at thousands of facilities that use 1 to 5 as well as an N/A.

2 Likes

Absolutely, I think this should be quantified. For example, if we can measure that for every dollar we put into honorariums we get two dollars out, then sure, let’s keep the budget unlimited. But if we are putting a dollar in and only getting 75 cents out, I think we need to adjust the “approved” lever so that the ROI increases.
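The dollar-in/dollar-out test above is just a ratio; the hard part is deciding how to measure the “value out” side (retention, new memberships, committee revenue, and so on). As a trivial sketch of the arithmetic, using only the numbers from the post:

```python
# Worked version of the ROI arithmetic above: dollars of value produced
# per honorarium dollar spent. How "value out" gets measured is the open
# question; the figures here are just the two examples from the post.


def honorarium_roi(value_out: float, dollars_in: float) -> float:
    """Return value produced per dollar spent on honorariums."""
    return value_out / dollars_in


# $2.00 out per $1.00 in -> ratio 2.0: keep the budget unlimited.
# $0.75 out per $1.00 in -> ratio 0.75: tighten the "approved" lever.
keep_unlimited = honorarium_roi(2.00, 1.00) > 1.0
tighten_lever = honorarium_roi(0.75, 1.00) < 1.0
```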

1 Like

This should be situation dependent. Some types of classes are more suited to presentations than others. IMO, no one should require an instructor to create a full slide deck for a one-off project class.

I like your idea. I suspect we don’t capture the data we need to do this analysis. And we’ll never have the data unless at some point we determine what we need to know and what data should be collected.

I agree wholeheartedly. I’m just saying I would be more open to writing something to keep me on track if students feel I’m getting distracted or something. Conversely, if they feel the class structure is too rigid for their learning style (some people learn through questions), then I’d like that feedback as well.

1 Like

I think this is also a failure of automation. First, we need to take a hard look at whether “Training Required” is needed. You did a lot of great work defining what equipment requires training, and I believe that is the path forward: a consistent methodology for deciding whether something is “Training Required.”

Finally, if equipment is deemed Training Required, then we need to look into automating the training and lowering the cost of entry to DMS.

For example, I would expect Laser and Woodshop to have the highest turnover; I hope these committees could look at ways to reduce the cost of training per student to DMS.

For example (from a quick YouTube search; I have not watched it).

1 Like

I like your questions. I think they’d apply to most classes and are good metrics for quality of content and instruction.

And it’s useful feedback for an instructor who wants to improve or tweak things.

If this is an email survey, there needs to be some mechanism where the instructor can see the feedback.

Fine if it’s anonymous, collated, whatever. But several are good questions that would be good tweaks for future classes. Not just data for auditors.

Absolutely, the teacher must be aware of the survey results so that they can adjust their behavior and habits. I do hesitate on 100% anonymity to the auditors; again, there are people who would bomb an instructor because they are a terrible person. But I agree the feedback to the teacher should be anonymous.

2 Likes

I agree with non-anonymous to auditors, for several reasons.

I strongly think it’s good for students to know the survey will be anonymous to the teacher, so they will feel freer giving honest feedback.

Another thought on the form of the feedback questions: email vs. hard copy.

My husband has managed the internal IT helpdesk at corporate bank offices, closely monitored email feedback on helpdesk tickets, and strongly encouraged feedback via email surveys.

They would still only get a 7-11% response rate.

When I used to teach at Brookhaven College, we did paper surveys that we had to turn in. Supplying names was optional for students. I’d just hand the surveys out; students filled them out and turned them in face down, and I didn’t touch them until everyone left, so it more or less preserved anonymity for those who cared. The surveys went in an envelope and were then turned in to the office.

Pros of email: it can be truly anonymous and bypass the teacher completely. Potentially fodder for data miners.

Cons of email: possibly a low response rate, especially with the time elapsed from the end of class to getting around to checking email (if it doesn’t get junk-filtered). And then there’s whatever method supplies de-identified info to the teacher (hopefully that could be automated).

Pros of paper: you could get immediate feedback and capture a much higher percentage of students, since most will fill one out if it’s handled right at the end of class. Have a box, like the finance box, that they could be returned to.

Cons of paper: an instructor with poor feedback could potentially game the system (fake good ones, dispose of bad). There’s also just training instructors to remember to do it before folks leave. Preserving the data would be totally manual, if anyone cares. Data miners. BUT if it’s just used as a red flag for patterns with certain instructors, reviewing them wouldn’t have to be complicated.

Soooo… opinion-wise, I’m on the fence. Just bringing up things I’ve thought of every time the survey idea comes up.

Whatever form it takes, I think a survey has value and is a good idea

Just for giggles, I converted your questions into a Google Forms quiz…
Click here if you want to see it in action.

Some notes:

  • it would be really handy to include the course and instructor info on the quiz. Generally speaking, we need to know which course and which instructor; that’s not so easy to do with a generic Google Form like this, but someone with more google-suite-foo than I have could probably make it happen
  • having an email address or other ID for respondents who want to be contacted would be another field. Not a big deal, but I hate having to ask for it. One option would be to require a sign-on to Google, which brings another perk as noted in the next bullet point, but has obvious drawbacks
  • Google forms cannot (natively, easily, quickly) limit to a single response without requiring logon, meaning if we want them anonymous, we cannot limit the number of times they can be submitted (obvious skullduggery observations).

Anyway, more for fiddling with whether we like the feel of those questions/format than anything else…

1 Like

I know I am not one for surveys. Can’t speak for others though.

1 Like