The Process: Item Development

Once the objectives are finalized, we begin the process of writing questions, called items, for the exams. Security is a major concern in item development. Items are kept as confidential as possible: everyone involved in the process signs a non-disclosure agreement promising not to reveal item content to anyone. LPI also takes other undisclosed security precautions.

Item Writing

Historically, the items for most other IT certification exams were developed by flying a group of subject-matter experts to a single location for a week or more, training them in how to write items, and then having them work intensively to create the questions.

But this technique is expensive and exclusive. At LPI, during our initial exam development phase, we leveraged the power of the community over the internet, encouraging everyone who was interested and knowledgeable to help with item writing.

Since then, LPI has developed new items for exam rotation in house by tapping the knowledge of subject-matter experts, online volunteers, and participants in item-writing workshops.

Item Screening

Supervisors screen all submitted exam items and accept, reject, or reword them. They focus on three criteria:

  • Redundancy: Items that are substantially identical to previously submitted items are rejected.
  • Phrasing and Clarity: Items phrased in confusing or otherwise inappropriate ways are rejected or reworded. Supervisors take particular care to ensure that questions can be understood by non-native English speakers.
  • Accuracy: Supervisors reject or reword items that are not technically accurate.

Item Technical Review

Next, LPI uses a group of Linux experts to put items through a technical review. Each item is reviewed by multiple experts, and each expert classifies the item as approved, rejected, or “other”, meaning it needs rewording or review by someone else.

The primary technical criteria are:

  • Correctness
  • Appropriateness of distractors (for multiple-choice items): Reviewers ensure that the distractor answer choices are incorrect but reasonably plausible.
  • Phrasing and clarity: Reviewers ensure items are worded in appropriate language.
  • Relevance
  • Expected difficulty

Supervisors then collect the reviews to determine whether each item is (a simplified model of this step is sketched after the list):
  • Accepted based on consensus
  • Rejected based on consensus
  • Accepted after further review: If the reviewers do not agree, the supervisor may accept the item, perhaps based on the opinion of an additional reviewer.
  • Rejected after further review: If the reviewers do not agree, the supervisor may reject the item, perhaps based on the opinion of an additional reviewer.
  • Accepted after revision: In some cases, reviewers suggest rewording an item, and the supervisor may accept it after it has been reworded.
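
This last step amounts to a small decision procedure: unanimous reviews settle an item immediately, and anything else goes to the supervisor. The Python sketch below is purely illustrative; the resolve_item function, its labels, and the rule that consensus means unanimous agreement are assumptions made for this example, not a description of LPI's internal tooling.

    from collections import Counter

    # Review categories as described above; the constant names themselves
    # are only for illustration.
    APPROVED, REJECTED, OTHER = "approved", "rejected", "other"

    def resolve_item(reviews, supervisor_call=None):
        """Collapse several expert reviews into one outcome.

        reviews         -- list of labels, one per reviewer
        supervisor_call -- optional supervisor decision ("accept", "reject",
                           or "accept after revision") used when the
                           reviewers do not agree
        """
        counts = Counter(reviews)
        # Unanimous agreement settles the item without supervisor input.
        if counts[APPROVED] == len(reviews):
            return "accepted based on consensus"
        if counts[REJECTED] == len(reviews):
            return "rejected based on consensus"
        # Otherwise the supervisor decides, possibly after another review
        # or after the item has been reworded.
        if supervisor_call == "accept":
            return "accepted after further review"
        if supervisor_call == "reject":
            return "rejected after further review"
        if supervisor_call == "accept after revision":
            return "accepted after revision"
        return "pending further review"

    print(resolve_item([APPROVED, APPROVED, APPROVED]))           # accepted based on consensus
    print(resolve_item([APPROVED, REJECTED, OTHER], "accept"))    # accepted after further review

In practice the real process involves human judgment at every step; the sketch only shows how the five outcomes listed above relate to the presence or absence of reviewer consensus.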

Next: Exam Creation