Child-Maltreatment-Research-L (CMRL) List Serve

Database of Past CMRL Messages

Welcome to the database of past Child-Maltreatment-Research-L (CMRL) list serve messages. The table below contains all past CMRL messages (text only, no attachments) from Nov. 20, 1996 to March 6, 2018 and is updated quarterly.

Instructions: Postings are listed for browsing with the newest messages first. Click on the linked ID number to see a message.

Message ID: 9809
Date: 2015-04-29

Author: Andrea Sedlak

Subject: RE: prompting for responses among 10-14 year-olds

I see this topic has resurfaced on the listserv, which experienced some technical glitches last month. I'm reposting my earlier reply to Kelly to rejoin the conversation. (FYI, Kelly and I corresponded offline during the listserv hiatus.)

My earlier posting, 4/1/15:

Hi Kelly,

Let me know if I'm understanding what you want your interviewers to do here: you have a set of response alternatives that you're interested in using for coding answers, and your interviewers will have to listen to the respondent's open-ended answer and then code it into your response alternatives. That's not recommended practice. It requires your interviewers to "code on the fly/in the field," and the problem is that you have no way of ensuring that their classifications are standardized or appropriate. The rules of standardized measurement require that the interviewer read all the response alternatives and have the participant select one.

It also sounds like you've been concerned about the youth agreeing with the first response offered, or interrupting the interviewer during the reading of the question and responses. When a participant interrupts the interviewer, the interviewer should go back and re-read the whole question and the response alternatives. You can somewhat prevent those interruptions by telling the participant to wait until they hear all the possible answers and then pick one. It can also help to give the respondent a card with all the answers typed out on it. Tell them to wait to choose their answer until the interviewer has read all the available responses; they then follow along on the card as the interviewer reads the alternatives. The card also helps the participant remember the responses and minimizes questions like "what was that second thing again?"

If you are interested in using open-ended responses, then the interviewers should record those answers verbatim (i.e., not classify the answers during the interview). During data processing, have coders apply whatever classification scheme you want to use, and be sure to conduct reliability assessments of the coders' classification work.

You can refer to various classic texts on survey data quality (e.g., Floyd Fowler (1995), Improving Survey Questions, Sage). See also Sudman & Bradburn's comments on field coding (pp. 152-153) in their 1982 book, Asking Questions: A Practical Guide to Questionnaire Design, Jossey-Bass.

Hope this helps.

Andrea

______________________________________________
Andrea J. Sedlak, Ph.D.
Vice President
Westat, Inc.
1600 Research Blvd. RW2520
Rockville, MD 20850
(301) 251-4211
fax: (301) 315-5934

From: bounce-118950820-42416177@list.cornell.edu [mailto:bounce-118950820-42416177@list.cornell.edu] On Behalf Of Hallman, Kelly
Sent: Wednesday, March 04, 2015 11:35 AM
To: Child Maltreatment Researchers
Subject: prompting for responses among 10-14 year-olds

Hello,

I am working with children aged 10-14 in LMICs (low- and middle-income countries) around issues of pregnancy knowledge and risk. In my experience interviewing this age group, we have not prompted for responses (i.e., read the response options to them) due to concerns about children feeling compelled to say "yes" to something just to please the interviewer. I have a colleague who is insisting we read the response options to the interviewees. Is there an academic literature indicating what the best strategy is here? Even US or European studies would be useful.

Thanks,
Kelly

________________________________________
Kelly K. Hallman, PhD
Senior Associate
POPULATION COUNCIL
IDEAS. EVIDENCE. IMPACT.
www.popcouncil.org ________________________________________
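
A minimal sketch of the coder-reliability check Andrea recommends, assuming two coders have independently classified the same set of verbatim answers; the coder labels and category names below are hypothetical, and Cohen's kappa is one common chance-corrected agreement statistic for this purpose (not necessarily the one used in her projects):

    from collections import Counter

    def cohens_kappa(labels_a, labels_b):
        """Chance-corrected agreement between two coders over the same items."""
        if len(labels_a) != len(labels_b):
            raise ValueError("both coders must label the same items")
        n = len(labels_a)
        # Proportion of items where the two coders agree
        observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
        # Agreement expected by chance, from each coder's category frequencies
        freq_a, freq_b = Counter(labels_a), Counter(labels_b)
        categories = set(labels_a) | set(labels_b)
        expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
        return (observed - expected) / (1 - expected)

    # Hypothetical example: two coders classify ten open-ended answers
    # into made-up categories.
    coder1 = ["knows_risk", "unsure", "knows_risk", "no_risk", "unsure",
              "knows_risk", "no_risk", "knows_risk", "unsure", "knows_risk"]
    coder2 = ["knows_risk", "unsure", "no_risk", "no_risk", "unsure",
              "knows_risk", "no_risk", "knows_risk", "knows_risk", "knows_risk"]
    print(round(cohens_kappa(coder1, coder2), 2))  # about 0.68 for this toy data

Values near 1 indicate near-perfect agreement between coders; values near 0 indicate agreement no better than chance, which suggests the classification scheme or the coder training needs revision.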
