After objectives are ready, the next stage is writing the exam questions, known as items.
Security is a major concern in item development, so all items must be kept as confidential as possible. Everyone participating in this phase signed a non-disclosure agreement and agreed not to reveal item content to anyone. LPI also took other, undisclosed security precautions.
The procedure used to develop items for most IT certification exams is to fly a group of subject-matter experts to one location for a week or more, train them in item writing, and then have them work intensively to create the questions.
But this technique is expensive and exclusive (and not in a good way). So for Level 1 exams, LPI put out a public call across the Internet in August 1999 for item writers. We encouraged everyone who was interested and knowledgeable to help with item writing. LPI used a web interface called TIPS to collect most items.
During that initial exam development phase, item writers submitted items for each objective; the weight assigned to an objective determined how many items were written for it. This method of item collection was effective: more than 70 people submitted items for consideration. However, the process significantly lengthened this stage of exam development.
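The weight-to-item-count relationship can be sketched as a simple proportional allocation. The objective names, weights, and totals below are invented for illustration; the source does not describe the exact formula LPI used.

```python
# Hypothetical sketch: distribute a total item count across objectives
# in proportion to their assigned weights. All names and numbers are
# invented examples, not actual LPI objectives or weights.
def allocate_items(weights, total_items):
    """Return items per objective, proportional to weight.

    Rounds down, then hands any leftover items to the
    highest-weight objectives first.
    """
    total_weight = sum(weights.values())
    counts = {obj: (w * total_items) // total_weight
              for obj, w in weights.items()}
    leftover = total_items - sum(counts.values())
    for obj in sorted(weights, key=weights.get, reverse=True)[:leftover]:
        counts[obj] += 1
    return counts

weights = {"Hardware": 3, "GNU tools": 5, "Filesystems": 4}
print(allocate_items(weights, 60))
```

Under this scheme a heavily weighted objective simply receives proportionally more items, which matches the behavior the paragraph describes.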
Since then, LPI has developed new items for exam rotation in house by tapping the knowledge of subject-matter experts, online volunteers and participants in item-writing workshops.
Supervisors screened all submitted exam items, and accepted, rejected or reworded them. They focused on three criteria:
- Redundancy: Items that were substantially identical to previously submitted items were rejected.
- Phrasing and clarity: Items phrased in confusing or otherwise inappropriate ways were rejected or reworded. Supervisors paid particular attention to ensuring that questions could be understood by non-native English speakers.
- Accuracy: Supervisors rejected or reworded items that were not technically accurate.
Item Technical Review
Next, LPI used a group of 10 Linux experts to put items through a technical review. Each item was reviewed by at least two experts. Each expert classified items as approved, rejected or “other” for rewording or review by others.
The primary technical criteria were:
- Appropriateness of distractors (for multiple-choice items): Reviewers ensured that the distractor answer choices were incorrect but reasonably plausible.
- Phrasing and clarity: Reviewers ensured items were worded in appropriate language.
- Expected difficulty: Reviewers judged whether each item's difficulty was appropriate for the exam.
Supervisors then collected the reviews. Each item was:
- Accepted based on consensus.
- Rejected based on consensus.
- Accepted after further review: If reviewers did not agree, the supervisor might accept it, perhaps based on the opinion of another reviewer.
- Rejected after further review: If reviewers did not agree, the supervisor might reject it, perhaps based on the opinion of another reviewer.
- Accepted after revision: In some cases, reviewers might suggest rewording an item, and the supervisor might accept it after rewording.
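The dispositions above amount to a small decision procedure: unanimous reviews settle an item directly, a rewording suggestion routes it to revision, and a split sends it to further review. The sketch below is a hypothetical illustration of that logic, not LPI's actual tooling; the verdict labels and the tie-breaking argument are assumptions.

```python
# Hypothetical sketch of the supervisor's disposition logic described
# in the list above. Verdict labels ("approve", "reject", "other") and
# the tie-breaking mechanism are illustrative assumptions.
def resolve(reviews, further_review=None):
    """reviews: verdicts from at least two expert reviewers.

    further_review: optional verdict from an additional reviewer,
    consulted only when the original reviewers disagree.
    Returns the item's disposition.
    """
    if all(v == "approve" for v in reviews):
        return "accepted"   # accepted based on consensus
    if all(v == "reject" for v in reviews):
        return "rejected"   # rejected based on consensus
    if "other" in reviews:
        return "revise"     # reword, then accept after revision
    # Reviewers split approve/reject: defer to a further review.
    return "accepted" if further_review == "approve" else "rejected"

print(resolve(["approve", "approve"]))             # unanimous
print(resolve(["approve", "reject"], "approve"))   # split, extra reviewer decides
```

One design point worth noting: the split-decision branch only runs when neither consensus nor a rewording request applies, mirroring the order in which the list above presents the outcomes.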