SRU Methodology Bits
The Methodology Bits Series summarizes information on methodological designs and lessons from the research field.
Block randomization and real-time group assignment are helpful design features for randomized controlled trials. While block randomization guarantees a balanced group split throughout the data collection process, real-time group assignment fosters assignment transparency and allows for on-the-spot group orientation. Some online applications offer both features, but you must pay for them and host your randomization information externally. In this Methodology Bits we describe an alternative paper-based solution, which can be implemented for free and hosted in-house. This solution works well for small and medium-sized studies with detail-oriented, well-trained staff.
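The balancing logic behind a permuted-block schedule can be sketched in a few lines of code. The sketch below is illustrative only (the function name, arm labels and block size are our assumptions, not part of the paper-based solution described above): each block contains an equal number of each arm, so the group split stays balanced after every completed block.

```python
import random

def permuted_block_sequence(n_participants, arms=("A", "B"), block_size=4):
    """Generate a permuted-block randomization sequence.

    Within each block every arm appears equally often, so the
    allocation is balanced after each completed block.
    """
    assert block_size % len(arms) == 0, "block size must be a multiple of the number of arms"
    per_arm = block_size // len(arms)
    sequence = []
    while len(sequence) < n_participants:
        block = list(arms) * per_arm      # e.g. ["A", "B", "A", "B"]
        random.shuffle(block)             # randomize order within the block
        sequence.extend(block)
    return sequence[:n_participants]

# Example: a sequence for 10 participants, two arms, blocks of 4
print(permuted_block_sequence(10))
```

In the paper-based version, a sequence like this would be generated once, printed, and sealed in sequentially numbered envelopes opened at the moment of assignment.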
Regularly monitoring interview follow-up rates is essential for planning the timelines and resources needed to achieve project targets. This Methodology Bits presents templates and suggestions that can help you choose what information to track and how to track it. They can be modified to suit your particular project needs.
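A tracking template of this kind reduces to a simple calculation: of the follow-up interviews that have come due, what share has been completed? The sketch below uses hypothetical tracking rows (the field layout and dates are our assumptions, not the templates described above) to show the idea.

```python
from datetime import date

def follow_up_rate(completed, due):
    """Share of due follow-up interviews completed so far."""
    return completed / due if due else 0.0

# Hypothetical tracking rows: (participant_id, wave, due_date, completed)
tracker = [
    (101, 2, date(2024, 3, 1), True),
    (102, 2, date(2024, 3, 8), True),
    (103, 2, date(2024, 3, 15), False),
    (104, 2, date(2024, 4, 1), False),  # not yet due, excluded from the rate
]

today = date(2024, 3, 20)
due = [row for row in tracker if row[2] <= today]   # interviews that have come due
done = [row for row in due if row[3]]               # of those, the completed ones
print(f"Wave 2 follow-up rate: {follow_up_rate(len(done), len(due)):.0%}")
```

Excluding not-yet-due interviews from the denominator keeps the rate interpretable as current performance rather than overall progress; whether to report one or both is a project-level decision.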
For researchers considering investing in in-house information technology capacity, this Methodology Bits shares guidelines and tips used by the C-UHS Survey Research Unit (SRU) when developing participant management databases for our projects. The aim of a participant management database is to provide a flexible relational structure that serves as a platform for primary data collection operations and reporting.
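A minimal sketch of what such a relational structure might look like is shown below, using Python's built-in SQLite support. The table and column names are illustrative assumptions, not the SRU's actual design; the point is the pattern of a core participant table linked to operational tables (here, contact attempts) that can be joined for reporting.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory database for demonstration
conn.executescript("""
CREATE TABLE participant (
    participant_id INTEGER PRIMARY KEY,
    enrolled_on    TEXT NOT NULL,
    status         TEXT NOT NULL DEFAULT 'active'
);
CREATE TABLE contact_attempt (
    attempt_id     INTEGER PRIMARY KEY,
    participant_id INTEGER NOT NULL REFERENCES participant(participant_id),
    attempted_on   TEXT NOT NULL,
    outcome        TEXT NOT NULL
);
""")

conn.execute(
    "INSERT INTO participant (participant_id, enrolled_on) VALUES (1, '2024-01-15')"
)
conn.execute(
    "INSERT INTO contact_attempt (participant_id, attempted_on, outcome) "
    "VALUES (1, '2024-02-01', 'completed')"
)

# Reporting query: contact attempts per participant
row = conn.execute("""
    SELECT p.participant_id, COUNT(a.attempt_id)
    FROM participant p LEFT JOIN contact_attempt a USING (participant_id)
    GROUP BY p.participant_id
""").fetchone()
print(row)
```

Keeping operational events in their own tables, keyed back to the participant record, is what makes the structure flexible: new event types (interviews, incentives, consent updates) become new tables rather than new columns.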
Traditional program evaluation employs linear, logic-based approaches to solve problems. These work well when a problem is well defined and understood, and when a single optimal solution likely exists within a selected range of possible solutions. Developmental evaluation has emerged as an alternative approach for problems that are dynamic, complex in nature, and require adaptive responses.
Planning and implementing safety protocols increases the likelihood of having a safe and positive research experience for both the research team and the participants. It is important to tailor protocols to the population, topic and methods of your study. Our research unit works with many hard-to-reach populations including those that are experiencing homelessness, housing instability, mental health issues and substance abuse. It is often necessary for us to interview participants in their homes and other locations in the community. As a result, we have created safety protocols that we adapt and strengthen based on study population and topic. This edition of Methodology Bits outlines how safety measures can be incorporated into your research study protocol. We draw from our experience of survey data collection with an emphasis on in-person interviewing.
Testing survey questionnaires before data collection is a necessary step for data quality. There are several methods available to test and evaluate survey questionnaires. Qualitative methods do not require large sample sizes and tend to provide rich information on questionnaire performance. When budget and time allow, a combination of methods is recommended. This Methodology Bits outlines some of the more popular qualitative methods available and goes into further detail about the cognitive interviewing method.
Many useful classification frameworks exist for epidemiological designs, each attempting to make the distinctions between these designs easier to understand. The Epidemiological Ladder presented here adds to this body of literature by proposing a quick reference guide to six well-established epidemiological designs.
Many in the research field are unsure what to do when they suspect a case of abuse and/or neglect. Knowing when and how to make a report can be a difficult process to navigate. This edition of Methodology Bits introduces the topic within the research context in the province of Ontario. Each case is unique, so in times of uncertainty it is recommended that you seek guidance from your Principal Investigator, Research Ethics Board (REB), and/or a relevant community organization.
Having a structured reporting strategy to document your project, set clear expectations, raise issues, plan actions, and communicate progress and decisions can be very helpful. The template presented here can be used by research staff as part of their communication strategy and can be adapted to specific project needs.
Qualitative data collection methods explore experience, feelings and thinking processes. They make sense of reality in terms of the meanings people give to it. They share a focus on language (verbal and non-verbal) rather than numbers and measures. However, they can vary significantly in the amount of control the researcher has over the information being collected and the level of real-world interaction observed during the data collection process.
Here we present a series of steps for working with a team to develop a codebook and complete a thematic coding analysis of qualitative data. We propose a step-by-step process that can be modified to accommodate different team sizes, dataset complexities and coding timelines. While teams can benefit from reviewing and revising the steps as their coding progresses, having an outlined schedule at the onset helps ensure consistency and shared understanding. It should be noted that this process can be multi-directional, with previous steps revisited as many times as needed. Depending on the size and expertise of your coding team, you may need one or many meetings in steps five and six to discuss progress, interpretations and new codes.
The importance of partnerships within social science research is becoming ever more evident as “researchers and funding agencies are increasingly showing interest in the application of research findings and focusing attention on engagement of knowledge-users in the research process”. Early engagement of knowledge-users, such as decision makers, policy makers and community members increases the usability of the research findings. Establishing a strong relationship with partners can also help to gain trust throughout the duration of the study. Strong relationships help those working on-the-ground to get involved with research and to engage in issues important to them. Perhaps even more importantly, partners will have greater knowledge of community needs and may have delivery modes in place to gather community input and feedback.
Mixed methods designs can be defined as the combined use of quantitative and qualitative data collection methods in a research and/or evaluation project, making possible a greater understanding of the phenomenon being studied than separate quantitative or qualitative designs could offer. This type of design gives the investigator the opportunity to utilize the strengths of both quantitative and qualitative data collection methods while compensating for some of their weaknesses.
It is important to think about survey quality during all research phases (e.g. survey design, sample selection, data collection, data analysis and knowledge dissemination). Since resources are usually limited, personnel and funds need to be strategically allocated to maximize the relevance and quality of your data. Here we describe core strategies you can use to ensure and monitor data quality during the survey data collection phase.
Group communication can be an insightful method of data collection as it brings together individuals with a shared interest or experience in a defined area. Here we compare the benefits and limitations of using: concept mapping, Delphi, and focus groups, as methods of group-based data collection. Each of these methods gathers a group of purposively sampled participants to focus on a specific topic and uses the expertise of participants to foster levels of agreement (e.g. group norms, consensus) and/or rationales for agreement and disagreement. Participants have the opportunity to formulate and reconsider their own opinions and rationales after being exposed to the rest of the group’s input and ideas through either real time discussion or organized feedback loops.
Surveying hard-to-reach populations can be a challenge due to:
1. the lack of a sampling frame (list) from which one can draw a representative sample; and/or
2. the existence of structural barriers (e.g. language, literacy, cultural differences, power imbalances, legal problems) which can systematically affect recruitment and follow-up efforts.
Thanks to advancements in technology over the past two decades, there are now a significant number of survey administration options to choose from. The variety of ways one can categorize these options can make the researcher's task of selecting one seem complicated. However, all the main options available can be broken down into three main choices:
1. Interviewer administered versus self-administered surveys
2. Face-to-face, telephone, mail, fax, kiosk, email or web surveys
3. Computer-based or paper surveys
When choosing a study design, it is not always clear which direction to go. Each option has strengths and weaknesses to be considered, and understanding the main differences between the designs is a first step in the right direction. The CRICH Survey Research Unit has developed a new reference table to help you understand these differences: The Epidemiological Ladder.