BACKGROUND: Direct electronic acquisition of data from patients yields accurate, diagnostically valuable information, yet the mechanics of how best to capture historical information from patients through electronic interfaces are not well studied. We undertook an iterative usability experiment to answer two questions: 1) How can maximal electronic data input from a patient be achieved? and 2) Do varying structures for data entry promote differential documentation of specified data elements?

METHODS: The testing cycle comprised a series of four trials. Unstructured text entry, directed text entry, and closed-ended questions were tested in combination against outcomes of word count, time to task completion, and user preference. Covariates of interest included participants' technologic experience and ergonomic familiarity with keyboards, as well as self-reported educational status, literacy, and primary language.

RESULTS: Participants clearly preferred an initial set of closed-ended questions followed by unstructured text entry, and this ordering was not associated with decrements in word count or increases in time to completion. Compared with unstructured text entry, directed text entry yielded greater documentation of past medical history and of questions that parents wished to discuss with the clinician. A closed-ended question structure, compared with directed text entry, yielded greater capture of parents' questions for discussion.

CONCLUSIONS: Optimal design of an electronic interview for the capture of medical histories will benefit from a mixed structure of directed text entry and closed-ended questions. For historical or clinically relevant items where maximal data capture is desired, a closed-ended question structure is preferred.