Session results are always a bit strange to go through. I want to share what they looked like for me this year. Nearly all the comments were outstanding - but there's always one that really leaves me scratching my head.
Like most speakers who take that part of their work seriously, I try to look at every survey, and I pay extra attention to the ones with written comments to see if there are things I can do better next time. You have to grow a bit of a thick skin, because there are always a few outliers that don't mesh with the rest of the results. You tell yourself to ignore those, but it doesn't always work.
It's also a bit disappointing how few surveys there were. In my own case the online survey app didn't work, so I assume a lot of other people hit the same problem. I don't know if there are more still to be entered -- usually there are more surveys than this -- but here's what I've got so far:
**Content Quality**

| Session | Excellent | Good | Fair |
|---------|-----------|------|------|
| BOOT102 | 22 | 7 | 0 |
| BP 105 | 8 | 1 | 0 |
| BP 114 | 12 | 5 | 0 |
| BP 115 | 21 | 2 | 2 |

**Impact on Purchase**

| Session | Yes | No | Und. |
|---------|-----|----|------|
| BOOT102 | 19 | 8 | 1 |
| BP 105 | 8 | 1 | 0 |
| BP 114 | 11 | 3 | 2 |
| BP 115 | 14 | 11 | 0 |

**Speaker Effectiveness**

| Session | Excellent | Good | Fair |
|---------|-----------|------|------|
| BOOT102 | 22 | 7 | 0 |
| BP 105 | 7 | 2 | 0 |
| BP 114 | 13 | 4 | 0 |
| BP 115 | 23 | 1 | 2 |
So here's where you have to scratch your head a bit. This is the set of comments from "BP115 - Performing Your Own IBM Lotus Domino Security Review". I'm including all the comments exactly as they were entered, for both repeats, in answer to the first question (the only place any negative comment showed up). I've merged the two repeats of the same session, but otherwise changed nothing.
# How would you rate the quality and relevance of the information in the Session/BoF/Lab?
Excellent: 21 Good: 2 Fair: 2
* Excellent - Very important information, ideal for any admin concerned about "securing" their environment.
* - Very good points.
* Excellent - This was all good information. It provides the framework for an in depth look at our servers security.
* Excellent - An excellent session, with great content that I'll be downloading straight away.
* Excellent - great content. the real worl examples really help to reinforce the information.
* Fair - This session was total fluff, a total waste of time. This could have been a great session, however the speaker chose to waste people's time by providing zero technical details. This was a waste of a session slot.
So -- how much would YOU pay attention to that last statement? There were 25 responses total, and one person really felt like they didn't get what they wanted. I have no way to know what the other couple of hundred people who came thought, since they didn't fill out surveys (or at least none that I received).
Given that I started the session by saying that it was not a technical session, but a process one -- I guess I don't have a lot of sympathy. Am I way off base here? At least this year nobody down-graded the session and commented that the room was cold so I guess we're ahead on that count.