I wanted to get a sense of how associations really feel about their AMS systems. I knew how my clients felt, but how does the broader market feel?
So I conducted a non-scientific survey of association executives, asking for their opinions of their AMS. I received 183 responses, covering 27 different off-the-shelf AMS products as well as custom-built systems. I found the results very interesting.
The first question I asked was:
Mean = 56
Median = 57
Mode = 50
Based on these results, the "average" association finds its AMS to be just slightly better than OK at meeting the association's needs. In fact, the most common answer (the mode) was 50, or OK. Not good, not great, just OK.
The second question I asked was:
Mean = 59
Median = 65
Mode = 50
So when association executives are asked whether they would recommend their product to others, assuming it could meet their needs, the "average" respondent rated this just above neutral, meaning they would recommend it, but not very strongly. Oddly enough, the mean and median for this question are higher than for the first question. In other words, associations aren't terribly pleased with their AMS software, yet they are still likely to recommend it to their colleagues.
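To see how a group of responses can produce a mode of 50 ("OK") alongside a higher mean and median, here is a small sketch using Python's standard statistics module. The scores below are invented for illustration; they are not the actual survey data.

```python
import statistics

# Hypothetical ratings on a 0-100 scale, invented for this example.
# A cluster of "OK" (50) answers sets the mode, while a handful of
# higher ratings pull the mean and median above it.
scores = [30, 50, 50, 50, 55, 60, 65, 70, 80, 90]

print("Mean   =", statistics.mean(scores))    # 60
print("Median =", statistics.median(scores))  # 57.5
print("Mode   =", statistics.mode(scores))    # 50
```

This mirrors the pattern in the survey numbers above: the single most common answer can sit at "just OK" even while the averages land noticeably higher.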
So what does this all mean?
Where does your association rank? Is your AMS just OK, is it performing really well, or is it underperforming? And if things aren't better than just OK, what can you do to change that?