Does Prolific have language they recommend providing in a consent form to explain the payment process for participants? We are especially curious about how to explain that compensation may change in the event our median study payment falls below the $6.50 threshold due to misestimated completion times (e.g., "How to Resolve Underpaying Studies"). If Prolific doesn't have this language, I'm curious how others have handled this issue with their IRB?
No worries
You might be right, there could be several factors at play. But indeed, I eventually reached my target of 150 participants within 5 hours (which of course is still very fast).
Best,
Anouk
Hello @Moses_Rivera, welcome back!
Does the Custom Allowlist override "Exclude participants from previous studies"? For example, if I exclude all participants from Study 1 (using "Exclude participants from previous studies"), but I want to make one exception for one participant by adding them to the Custom Allowlist, will that work?
If you do so, your new study will be visible ONLY to that one participant you add in the Custom Allowlist. The way this pre-screener works is indeed the one described here:
-
From what I can guess, I don't think this is what you want: if I understood correctly, you want to launch your study for everybody (or for those with certain characteristics, applying other filters) PLUS one person who already joined your previous study, while making sure all the rest of the old participants don't join it.
-
If instead you want the study to be visible to ONLY that one participant, you actually need only the Custom Allowlist and you can forget about any "Exclude" or "Blocklist" pre-screener.
If 1. is correct, you may want to create a study with a Custom Blocklist containing ALL participants of your previous study EXCEPT the one you want to recruit again. If you want to force that one person to participate, rather than just leave the possibility open, create TWO studies: one with a Custom Blocklist containing ALL participants of your previous study (or, easier, the "Exclude from previous studies" prescreener), PLUS a separate, identical study with that one person in your Custom Allowlist (a small sketch of the ID bookkeeping follows this post).
Hope it helps!
Veronica
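If it helps, here is a minimal Python sketch of the ID bookkeeping for option 1, assuming you have the previous study's demographics export on disk; the file name, the participant-ID column name, and the example ID are assumptions, so adjust them to match your own export:

```python
# Build a Custom Blocklist: everyone from the previous study EXCEPT the one
# participant who should remain eligible for the new study.
import csv

EXCEPTION_ID = "5a1b2c3d4e5f6a7b8c9d0e1f"  # hypothetical Prolific ID of the exception

with open("previous_study_demographics.csv", newline="", encoding="utf-8") as f:
    previous_ids = {row["Participant id"] for row in csv.DictReader(f)}

blocklist = sorted(previous_ids - {EXCEPTION_ID})

# Paste the printed IDs into the Custom Blocklist field of the new study.
print("\n".join(blocklist))
```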
@timtak Thank you for the information! I haven't examined my data closely yet, but I wanted to get a general idea of whether this was a major potential issue, because I am currently still recruiting and it would affect whether I have to recruit more people.
hi @Carly_Gray , one way would be to pilot the study, determine that you need to increase the completion time, and then submit an amendment. I know that's not exactly what you're looking for, but I would shy away from a sort of "if … then …" in the consent form because that might lead participants to think they will be paid more if they take more time. That's just one thought though …
Hello Josh, Hello everyone,
I have a question regarding the demographic data that Prolific provides:
We ran a survey over the last few weeks, subdivided into 9 sessions (one session per day). I downloaded the demographic data for each of these sessions, but the variable Ethnicity is included in only one of the downloads; the others do not have it. See my overview below. I downloaded all of this demographic data last Friday (12.11.).
Any idea why we got Ethnicity only for one of those sessions?
Session 1 (19 variables)
Session 2 (19 variables)
Session 3 (19 variables)
Session 4 (19 variables)
Session 5 (19 variables)
Session 6 (20 variables) here, Ethnicity is included!!!
Session 7 (19 variables)
Session 8 (19 variables)
Session 9 (19 variables)
Thank you for your response and best regards,
Arna and Hendrik
Hello @Arna_Wommel, welcome to the Forum!
Just brainstorming a bit on your issue. The Prolific demographics export by default contains the 19 categories listed here, in addition to the responses to all prescreeners applied to the study:
As you see, Ethnicity is not included as a default category. Could it be that you selected Ethnicity as a prescreener only in Session 6?
Let me know,
Veronica
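One quick way to pin down where the exports differ is to compare the column headers of the nine files directly. A small pandas sketch; the file names are placeholders for your own downloads:

```python
# Compare the column sets of the nine session exports and report any columns
# that are not shared by all of them (e.g. Ethnicity appearing only once).
import pandas as pd

files = {f"Session {i}": f"session_{i}_demographics.csv" for i in range(1, 10)}
columns = {name: set(pd.read_csv(path, nrows=0).columns) for name, path in files.items()}

shared = set.intersection(*columns.values())
for name, cols in columns.items():
    extra = cols - shared
    if extra:
        print(f"{name} has extra columns: {sorted(extra)}")
```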
Hi everyone,
I have two questions regarding deception in surveys. Concretely, I would like to know whether Prolific considers the following two survey elements as deception requiring the corresponding pre-screening and debriefing:
- Online experiment where one half of participants see a more positively framed description of something and the other half see a more negative one (involves only slight variations in the text)
- A real-effort task that gives participants the opportunity to earn an additional financial reward. Later in the survey they can donate this reward to charity - this is the only option I currently see for eliciting donations from participants (I obviously cannot ask for their participation fee, and I also cannot track any payments they make from their own money outside the platform). Of course, if you have any other ideas, I would be more than grateful!
Note that before the real-effort task they are only informed that they can earn money, not that they can donate this additional money to charity. This is because giving them this information upfront would likely change their effort levels, which may be a function of their propensity to give to charity.
Thanks a lot in advance!
Aja
I don't know for certain, so perhaps I should stay quiet, but my opinion is that neither is deception (with a proviso).
The Prolific prescreening question regarding "deception" is
" Would you be happy to take part in a study where you are intentionally given inaccurate information about other participants and the study? You would be debriefed after the study."
I don't think that slightly different levels of valence constitute "inaccurate information," myself.
I don't see why offering participants the option to donate to charity might constitute deception.
However, if you mean that participants must donate the effort reward money to charity, then I think, imho, that would constitute deception since some participants (me at least) would be hoping to get the money for themselves.
Tim
Hello Veronica,
Thank you for your feedback. Yes, I saw that by default it should only be those 19 variables, but in Session 6 we received Ethnicity additionally. The pre-screening was the same in all sessions: UK nationality and excluding participants from our previous sessions, that's it.
So it is strange that we received this information for only one of our sessions. I cannot explain it.
Best regards,
Arna and Hendrik
Hi there,
I'm trying to use the balancing demographics feature, which doesn't allow me to increase the number of places in my study. Because of this, and some technical glitches that were to be expected, I had to approve some participants whose data is not actually usable, so I'm stuck with a smaller sample size than I wanted. I just wanted to duplicate this study to open up a few extra slots to make up for the shortfall, but it seems I'm not allowed to do that either?
I know I can just start a new study from scratch on Prolific, but then the generated completion code is different from the original one, which is a problem for my integration with Gorilla: it would require me to make a new version of the study on Gorilla as well. I'd really appreciate a) understanding why these restrictions are in place, and b) knowing how to overcome them (aside from opening extra places the first time around in future studies).
Thanks
Dear Child Lab Canada
I am sorry I can't help you, but I can report that "duplicate" was greyed out for Moses and myself recently, above. We did not work out why.
The "duplicate study" function is also greyed out in the study that I remade.
I was using the gender balancing feature (which I have not done in the past) but I don't see why this should be related. Perhaps gender balancing triggers a "demographic balancing" (representative sample) flag, and since representative samples now once again cost money, "duplicate study" may be blocked to prevent researchers from incurring costs without directly clicking on the demographic balancing checkbox where the cost is clearly stated.
Gender balancing may be destined to cost money in future (once the genders level out) and Duplicate Study may be greyed out for that do-NOT-let-researchers-incur-costs-by-mistake reason. All this is just supposition.
It could be a bug, that @Josh may be able to help us with. Otherwise or in any event a support request may be a good idea. If you contact support, please let us know what they have to say.
I am sorry I can't help more.
Tim
Thanks, Tim!
I thought so myself, but I wanted to be on the safe side
In the demographics CSV, there is a column for started date time, completed date time, and time_taken. Under status, some entries are "Returned". The Returned people do not have a completed date time, which makes some sense, but they do have a time_taken. Is that the time from starting to the point at which they returned their slot?
Hi, I would like to prescreen participants according to multiple categories.
I do not want the filters to be additive (for example, female AND US resident AND married), but alternative (female OR US resident OR married).
Is there a way I can do this without having to recruit my own custom sample?
Thanks!
Hello @Arnout_van_de_Rijt, welcome to the Forum!
As you correctly said, an AND logic is applied across multiple screeners, but unfortunately there is no OR functionality. An OR logic is applied only within a single filter, so you can get all the people who gave any of the selected responses to that filter.
Sorry I can't help more.
Veronica
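One workaround, if your design and budget allow it, is to recruit without those three filters, ask the same questions inside the survey itself, and apply the OR logic yourself afterwards. A minimal pandas sketch; the file name, column names, and response labels are assumptions about your own survey export:

```python
# Keep every respondent who matches ANY of the three criteria (OR logic),
# which Prolific's prescreeners cannot express across different filters.
import pandas as pd

df = pd.read_csv("survey_responses.csv")

eligible = df[
    (df["sex"] == "Female")
    | (df["country_of_residence"] == "United States")
    | (df["marital_status"] == "Married")
]
eligible.to_csv("eligible_participants.csv", index=False)
```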
Hello @DrJBN,
I just went through some Prolific export I have from past experiments and looked into the issue.
To me it seems that time_taken for RETURNED submissions is quite unreliable (implausibly large numbers).
So personally I just wouldnât use this data.
Veronica
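For anyone who wants to check this in their own export, a short pandas sketch; the file name is a placeholder, the status and time_taken column names are taken from the question above, and the status spelling may vary between exports:

```python
# Inspect time_taken for returned submissions, then treat it as missing,
# since the recorded values tend to be implausibly large.
import pandas as pd

df = pd.read_csv("prolific_demographics.csv")
is_returned = df["status"].str.upper() == "RETURNED"

print(df.loc[is_returned, "time_taken"].describe())  # eyeball the distribution

df.loc[is_returned, "time_taken"] = float("nan")  # exclude from any timing analysis
```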
Mmmh, maybe @Josh can look into your experiment if you provide him with your study name and details (via DM).
I honestly cannot come up with a better explanation!
Cheers,
Veronica
Hi Josh, thank you for the help.
Is it possible to relaunch a completed survey study? I need to collect a few more participants, but in this rerun I do not want to enroll the same participants who took part in the first distribution. Thanks!
Hi!
I'm making the switch from mTurk to Prolific and I have a question about screening.
-
Can participants see the Prolific demographic screening filters you use? For example, if I'm recruiting teachers between the ages of 22 and 30 who live in the United States and have 5-9 drinks per week, are those filters visible to participants? (Do they know, "Oh, I'm seeing this specific task because I'm a 26-year-old teacher in North Carolina who drinks alcohol every day"?)
-
If I do a screening survey (survey 1) first to find eligible participants, can I/how do I make my next survey (survey 2) only available to the people that were eligible on my screening survey (survey 1)?
Update - found this link - How do I recruit a custom sample? – Prolific, which answered this question.
thank you,
Cassidy
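For anyone arriving at this thread later, the linked approach boils down to exporting the screener (survey 1) responses, pulling out the Prolific IDs of the eligible people, and pasting them into a Custom Allowlist for survey 2. A minimal sketch; the file name, the eligibility column and its values, and the ID column name are assumptions about your own screener export:

```python
# Turn the screening survey's export into a Custom Allowlist for survey 2.
import pandas as pd

screener = pd.read_csv("survey1_responses.csv")
eligible_ids = screener.loc[screener["eligible"] == "yes", "participant_id"].astype(str)

# Paste the printed IDs into the Custom Allowlist when setting up survey 2.
print("\n".join(eligible_ids))
```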