I just want to add - besides commenting that my previous post was horrifically filled with typos! - that the needs analysis is a critical part of this.
I developed a package "Desktop Apps and You". It had about 20 - 30 questions and tasks for each Office app. NB: it also included about 20 questions/tasks on basic Windows use.
There were two routes.
1. A local sysadmin (a "coach") sat with the person, and walked them through it. They used a Coaching Guide.
2. Self-administered. They used a checklist-type document.
Each person was then scored, either by themselves or by a coach, from 1 to 5, with 1 meaning little or no knowledge and 5 meaning expert competency. Obviously, scoring by a coach gives more accurate numbers.
EXAMPLE (Word):
Self-assessment document.
#11 I know how to change page and paragraph formatting.
1 2 3 4 5
#12 I know how to set tabs, indent paragraphs and change line spacing.
1 2 3 4 5
Coach's document.
#11 - #12
Direction to participant: "Select one of the larger paragraphs, indent it by 0.5 inches, apply a 6-point space before and after it, and change the line spacing to 1.5 lines. Select a different paragraph and set tabs for that paragraph to 1.8 inches. Include a hanging indent of 1.8 inches as well."
Action: The participant makes the required changes. Styles can be used and discussed, but the participant may also use the ruler or the Paragraph command on the Format menu.
See? In the coach's case, the coach actually asks the participant to do the actions. They do NOT (ever) tell them, even if asked, how to do it. They just ask them to do X, Y, Z. The coach observes, and the [b]coach[/b] scores them.
This has some intuitive aspects. If someone does not know how to do something, but quickly figures it out...they are rated higher.
Whether self-assessed or coach-assessed, the numbers are crunched. Why?
Because by ALSO developing "chunked" learning material, it was easier to see things like: hmmmm, this office knows Word pretty well, but MOST of them do NOT really understand using tables very well. So... focused training, on tables, can be delivered. Again, a one-hour session in a boardroom, over lunch (provided, if the managers are good), is the most efficient way to get that knowledge out.
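(To illustrate the "crunching": here is a rough, hypothetical sketch of the arithmetic, NOT the actual tool we used. The names, topics, and the 3.0 threshold are made up for the example; the idea is simply to average the 1-to-5 scores per topic across an office and flag the weak spots.)

[code]
from statistics import mean

# Hypothetical example data: participant -> {topic: score on the 1-5 scale}
scores = {
    "Alice": {"Word basics": 4, "Tables": 2, "Outlook": 5},
    "Bob":   {"Word basics": 5, "Tables": 1, "Outlook": 4},
    "Carol": {"Word basics": 3, "Tables": 2, "Outlook": 4},
}

THRESHOLD = 3.0  # assumed cut-off: below this average, consider a focused session

# Collect every score given for each topic across the office
by_topic = {}
for person_scores in scores.values():
    for topic, score in person_scores.items():
        by_topic.setdefault(topic, []).append(score)

# Report the average per topic and flag candidates for "chunked" training
for topic, topic_scores in sorted(by_topic.items()):
    avg = mean(topic_scores)
    flag = "  <- candidate for a focused session" if avg < THRESHOLD else ""
    print(f"{topic}: {avg:.1f}{flag}")
[/code]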
I found - through bitter experience - that sending a group of people to a general course is mostly a waste of time. There will be people who know little, and people who know a lot. The people who do know are bored. The people who don't know are confused. The instructor is stuck, as they can't give the people who don't know the attention they need, and they can't give the people who do know much value. By filling a class with people who are basically all at the same level, everyone benefits.
Plus a general course by its nature IS general.
By getting some real numbers, on real subjects, we could direct our limited training resources efficiently.
Needs analysis, and focused "chunked" training.
In another office, we determined that most people did not know (again) how to use tables well at all. However... it was ALSO determined that... they never used them!
So...ummmm.....we are going to spend money training them on something they will never use?
Can you say stupid?
Needs analysis, and focused "chunked" training. This can be stand-up training; a focused desk drop (we have a series called Getting The Most Of... - of Word, of Word tables, of Outlook, of Your Keyboard, of IT Security, etc.); lunch-and-learn sessions; web-based tutorials; web-based classroom courses; take-home documents.
Good luck!
faq219-2884
Gerry
My paintings and sculpture