Scientific breakthroughs aren’t possible without the painstaking process of clinical trials. So what happens when many of those trials leave out large portions of the population?
Researchers found that most studies (74.6%) had eligibility criteria that directly and/or indirectly exclude adults with intellectual disability. Approximately one-third of studies had direct exclusion criteria based on cognitive impairment or diagnosis of intellectual disability. Nearly 65% of studies indirectly excluded adults with intellectual disability based on factors likely associated with intellectual disability (e.g., functional capacity, inability to read/write, and/or research staff discretion).
I've always found it amazing how many studies take place among university students; because they check for factors like age and gender, a list of criteria at the end, they call it a reliable study. But they are testing available people, people available to university researchers: a population already separated from other working professions. Factors that make participation difficult or foreign to people whose day-to-day realities demand effort in other directions, like surviving on a street with multiple dangers, needing to feed or shelter a family, being able to walk across a room, or being able to feel welcome, get left out, and yet those studies are assumed to be "representative" enough to validate the small conclusions they assess. Life involves how all the pieces fit together, not "fixing up" fragments and sending the person back out there because science has told them they can do it now.
Researchers are supposed to disclose the nature of their "convenience sample" in the limitations section of their report. That's the section meta-analyses mine to see whether a study is relevant to a particular population. When we look through the What Works Clearinghouse for tools that may help us in Special Ed, we can filter for certain factors like Special Ed, Language Learner, etc. Oftentimes this rules every tool, technique, or program out of the search. Try it and see: https://youth.gov/federal-links/what-works-clearinghouse
A "new study" figured that out?