“Exclude participants from previous studies” filter not working? I already submitted a help request but was wondering if anyone else has noticed this recently. Details: I have an ongoing study over Zoom. I run a prescreen study every week or two and recruit batches of participants. In the most recent prescreen, 5 (out of 30) participants had already completed the prescreen in the past, despite the filter. I just launched another prescreen and again I see at least one repeat participant.
EDIT: There was a question in the Prolific participant Reddit about this—it seems participants have noticed repeat studies as well.
In other words, may I confirm: the custom blacklist (whether generated in a duplicated study or added by you manually) was in place, but participants whose IDs were on that blacklist were still allowed to take the survey?
A custom blacklist is generated automatically when a study is duplicated via the duplicate study option in the Action menu on the study page.
If so, or in any event, I think we need the help of @Jon on this one m(._.)m
Having a look on Reddit, I see there is a thread about this from a year ago.
I think our engineering team would agree with you @timtak! They’re on the case already, so I’ll follow along and relay any further updates when I know more.
In the meantime, it’s probably best to assume the Exclude Participants from Previous Studies feature is out of service for the moment—either entirely unavailable, or not returning the desired results if you can still see it in your app. You should still be able to build a custom blocklist to exclude previous study participants, albeit in a more labour-intensive way.
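For anyone building that custom blocklist by hand, the gist is to collect the participant IDs from your previous studies’ exported CSVs and paste the combined list into the study’s blocklist field. A minimal sketch in Python, assuming the export has a header column named `Participant id` (check your own file and adjust the column name if it differs; the file names here are hypothetical):

```python
import csv


def collect_previous_participants(export_paths, id_column="Participant id"):
    """Gather unique participant IDs from prior study export CSVs.

    id_column is an assumption about the export's header name --
    verify it against your own CSV before running.
    """
    ids = set()
    for path in export_paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                pid = (row.get(id_column) or "").strip()
                if pid:
                    ids.add(pid)
    # Sorted for a stable, easy-to-diff list.
    return sorted(ids)
```

You could then join the result with commas (`", ".join(ids)`) and paste it into the blocklist, since that field accepts a comma-separated list of IDs.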
I’ve had the same problem, 6 participants from a previous study have been able to return and start the study again. No new data was collected, because they were blocked in Qualtrics, but they still marked the survey as completed with NOCODE.
What should I do? I don’t want to have to pay these participants again for no contribution, but if I reject their work, they will get marked down for a mistake that was Prolific’s.
It’s much worse than I thought, at least 40 participants were allowed to start the survey and Qualtrics did not block all of them. I will therefore have to delete all their data.
Can someone please advise on how I should proceed in terms of approving or rejecting participants that completed the same survey TWICE because Prolific’s filter was not working properly? That’s a lot of money wasted and lost data that I cannot afford.
Sorry to hear about that @Leo_C, I appreciate the annoyance and extra hassle on this one! Rest assured, though, a fix is hopefully not far off.
Our support team can help you with rejecting the duplicate participants from your study in a way that won’t penalise them for their double participation. To do so, raise a support ticket and share the study IDs of your current study with the duplicate entries and the previous one you had attempted to duplicate from. If you can share the participant IDs you’ve so far identified as duplicate as well, that’ll help greatly.
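To compile the list of duplicate participant IDs for that support ticket, one option is to intersect the ID columns of the two studies’ exports. A small sketch, again assuming a `Participant id` header column in Prolific’s CSV export (adjust if your file differs; the paths are placeholders):

```python
import csv


def find_duplicates(current_export, previous_export, id_column="Participant id"):
    """Return participant IDs that appear in both study exports.

    id_column is assumed from the export format -- check your
    own CSV header before relying on it.
    """
    def load_ids(path):
        with open(path, newline="", encoding="utf-8") as f:
            return {
                row[id_column].strip()
                for row in csv.DictReader(f)
                if row.get(id_column, "").strip()
            }

    # Set intersection gives exactly the repeat participants.
    return sorted(load_ids(current_export) & load_ids(previous_export))
```

Calling `find_duplicates("current_study.csv", "previous_study.csv")` would give you a clean list to paste into the ticket alongside the two study IDs.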
Updating on this: our engineers have released a fix for the blocklist issue, so the exclude participants from previous studies feature should now be excluding participants from previous studies again.
NB on my use of “should” above - if your study is live and a participant who should be excluded already has access to it, this fix will not automatically remove them from the study.
For new studies, this feature is now switched back on, so you should once again be able to exclude previous participants with a few clicks (though it's worth double-checking, just in case!).
Apologies again for the inconvenience this caused; hopefully it should be back up and trouble-free from now on.
Do other researchers know of this issue? Will there be any communication of this issue? I only knew about repeat participants because I message participants to invite them to the full study and I can see our message history.
Our Support team is leading on getting the message out, so hopefully everyone who is affected will be notified in due course (if not already). My understanding is that it’s a relatively small number of studies that experienced this issue.
Generally speaking when there’s a major outage, planned downtime, or otherwise, we’ll get the message out via email, on Twitter, and here in the community as well.