#HaveYourSay - Custom Screeners, Screener of the Month, Participant IDs & Study Icons

Who We Are

Hi Researchers!

We are the Researcher Experience Squad (our word for ‘team’).

Our aim is to provide you with an amazing experience from the moment you register to the successful completion of your studies.

Our Objective for Q2

Our participant eligibility system finds the participants you request. This quarter, we’re transitioning to an improved version.

We are doubling down on providing you with a reliable, stable and fast platform to recruit the participants you need.

We’ve got a number of ideas below that we’d love feedback on.

We Want Your Feedback On…

:mag: Custom Screeners

We want to make it easier for you to search for your target demographics. So, we’d love to get your thoughts on a feature that would allow you to create your own screener:

Would you be willing to create your own screener (for a fee to compensate participants), even if you might not be guaranteed to get the sample you need?

:calendar: Screener of the Month

Is there a screener you wish was available on Prolific?

Reply with new screeners you’d like to be added. And if you agree with someone else’s suggestion, give it a ‘like’!

If it’s feasible, the most popular suggestion will be made available!

:passport_control: Friendly Participant IDs

We want to update participant IDs. Let us know what has and has not worked for you:

What has been your experience with handling participant IDs?

Have you had any issues with them?

:camera_flash: Study Icons

In the past, we have offered researchers the ability to add a logo to their account which would display with their studies.

Would you want to display an icon or institution logo with all your studies?

Would the icon need to change depending on the study you are running?

Hello,

I would like to suggest adding Cybersecurity/Information Security to the ‘Subject’ screener to make it easier for researchers to filter participants and to save time and money. My suggestion might not prove popular, and thus might not be made available, but I would appreciate it if you would take it into consideration, given the importance of the subject.

In my studies, I use Computer Science/IT as the required Subjects and a survey question to filter out participants, but the number of those who do not have expertise in cybersecurity is high (and, to some extent, expected). It would be extremely beneficial to know (in advance) how many participants specialize in cybersecurity before deciding whether or not to run relevant studies.

The next point might be irrelevant. I noticed that even though I set my custom pre-screening to include participants who specialize in certain areas, I sometimes end up having data from participants whose subject areas do not match my predefined requirement.

Thank you.

Hey @ScienField, thanks for your suggestion!

We’re working on a way to allow you to suggest new screeners, so watch this space. In the meantime, you can use our two-study approach to find your audience.

I noticed that even though I set my custom pre-screening to include participants who specialize in certain areas, I sometimes end up having data from participants whose subject areas do not match my predefined requirement.

I’m very sorry that’s happened to you!

We have a help page which has guidance on resolving this kind of issue: Participants who don’t match the prescreening selected.

Our prescreening data relies on self-report, and we screen participants into studies on the basis of this information. It’s worth checking that the data in your Prolific export matches with the prescreening filters you’ve applied.

If so, then you can message your participants to ask for clarification about their inconsistent response. It’s possible there was a genuine misunderstanding arising from differences in how the screening questions are worded.

If participants have provided deliberately false information, and can cite no reasonable explanation for the inconsistency, then they can be justifiably rejected.

In future studies, you can validate your screening criteria by repeating the screening questions at the beginning of your survey, so that participants who provide inconsistent information are not allowed to proceed further. For this to work, the questions must be worded exactly as they appear on Prolific.
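If it helps, the export cross-check described above can be automated. Here is a minimal Python sketch; note that the column names (“Participant id”, “Subject”) and the required-subject set are illustrative assumptions, so check them against the headers in your actual Prolific export:

```python
import csv
import io

# Assumed prescreening criterion for illustration; replace with the
# subjects you selected in your study's filters.
REQUIRED_SUBJECTS = {"Computer Science", "IT"}

def flag_mismatches(export_csv, id_col="Participant id", subject_col="Subject"):
    """Return the IDs of participants whose self-reported subject
    does not fall within the required set."""
    reader = csv.DictReader(io.StringIO(export_csv))
    return [row[id_col] for row in reader
            if row[subject_col] not in REQUIRED_SUBJECTS]

# Toy export with one mismatching participant (IDs are made up).
sample = """Participant id,Subject
5f1a2b3c,Computer Science
6a7b8c9d,Physics
7e8f9a0b,IT
"""
print(flag_mismatches(sample))  # ['6a7b8c9d']
```

Any IDs this flags are the participants you would then message for clarification, as described above.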

Let me know if anything is unclear :slight_smile:

Thank you, Josh. I appreciate it. As for the differences in the Subjects: I did not use a survey question to filter participants by subject area; I used the built-in screener to invite only participants who specialize in the aforementioned areas. But on inspecting the exported data, I found, for instance, cases where participants’ reported areas were physics/biological sciences, etc. (subjects totally unrelated to my criteria). I am not sure whether they changed their answers before I downloaded the file (which I think should not be allowed, given that subject areas are unlikely to change within a short period unless there is an honest mistake), or whether there is a technical error.

That shouldn’t be happening. If you don’t mind, could you report these participants to us here, so we can investigate and take action if necessary?

(btw, I love your username) :slight_smile:
