🤔 Ask Anything Thread (Newbie Friendly)

Hey Sonja, welcome aboard! :tada:

I’m very sorry that you’ve not been able to get the participants you need :frowning:

Most screeners whose answers may change over time have to be updated at set intervals, so participants are prompted to answer them again once their response has expired. Unfortunately, some participants don’t update their answers despite our prompts.

How often they’re prompted to update depends on the screener. If you provide me with a list of your filters, I can tell you the interval after which participants are asked to update their answers.

Sorry that I can’t be more helpful!

Hey @Gabriele_Zippilli, welcome to the community! :tada:

@timtak is completely right about your third question, so nothing more to add there! :slight_smile:

Can I refuse their answers without being caught in some kind of penalty?

Yes, you can. If their in-study answers do not match their prescreening answers then they can be rejected. However, they must be asked the exact same question as the one asked by our prescreening survey to avoid misinterpretation. You can read more about this here: Participants who don't match the prescreening selected – Prolific

Do you suggest I set the completion time to 4 minutes?

Yes, you can do this. It’ll make your estimated reward per hour more accurate :slight_smile:

I hope that helps! Let me know if I can answer any other questions

Hey Josh!

Thanks for your reply! It would be useful to know when the answers are updated on each screener.

The filters I am using are:
Sex: Male
First language: English
Living with spouse/romantic partner: Yes
Has children: No
Partner pregnant: Yes

As you might imagine, the answers to the family and relationships questions are the ones that most often don’t match as the answers have changed since the participants filled in their profile.

Besides waiting for participants to potentially update their profiles, are there any other options for me to find the right users from the participant pool on Prolific? I was thinking about making a separate pre-screener questionnaire but I am unsure how helpful that would be.

Best,
Sonja

At the moment, we only require participants to update one of your chosen filters: “Partner Pregnant”. It must be updated every year.

But I agree that “Living with spouse/romantic partner” & “Has children” should be subject to more regular updates. I’ll raise the issue with the team to see if we can make it more relevant.

Are there any other options for me to find the right users from the participant pool on Prolific?

In this case, I think your demographic needs are too niche for us to be able to properly support your studies.

We’re actively looking at ways of expanding our demographic offering so hopefully we’ll be able to support your research in the future.

Sorry that I can’t be more helpful :frowning:

Hi! A participant just told me they could not enter their code. No one else seems to have had an issue with it, however. Is there someone who can assist me with this?

Hey @Zententia, welcome to the community! Glad you’re here :tada:

Have they submitted their data? If so, has it shown up on your dashboard as ‘NOCODE’?

Yes, that is correct! Most of my participants seem to be submitting the code just fine, but a handful have ended up with NOCODE submissions.

Don’t worry, NOCODE is not a problem :slight_smile:

It usually occurs for one of two reasons:

1) The participant reached the end of your survey, but was not redirected back to Prolific via the completion URL for some reason. Therefore, they had no way to access the completion code and had to submit without one, or instead submitted the wrong code.

  • These cases should be approved as normal if there is no issue with the submission; you should check your survey data to ensure that they fully participated in your study, and you can also use their completion time to gauge the likelihood that they did reach the end of your survey (see the sketch after this list).

2) The participant decided to leave your study early, or could not proceed because of some technical issue. Instead of returning their submission, they have submitted without a completion code or with the wrong completion code by mistake, and appear in your ‘awaiting review’ list.

  • You can identify these cases by checking your survey data to see that they have provided incomplete (if any) data, and those that experienced technical issues may also have short completion times (e.g. a few seconds).

  • If you feel they do not deserve to be penalised with a rejection, we recommend that you send a message to these participants asking them to “return their submission” on Prolific or find out if they experienced an issue and would like the opportunity to participate again.
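
If it’s useful, here’s a minimal sketch of how you might triage NOCODE submissions from your exported data. The field names (`participant_id`, `time_taken`, `completed_survey`) and the 60-second threshold are illustrative assumptions rather than Prolific’s actual export schema, so adapt them to your own study:

```python
# Minimal triage sketch for NOCODE submissions. The field names and the
# 60-second threshold are assumptions for illustration; adapt them to
# your own export and study length.

submissions = [
    {"participant_id": "abc123", "time_taken": 412, "completed_survey": True},
    {"participant_id": "def456", "time_taken": 8,   "completed_survey": False},
    {"participant_id": "ghi789", "time_taken": 350, "completed_survey": False},
]

for sub in submissions:
    if sub["completed_survey"] and sub["time_taken"] > 60:
        # Case 1: full data and a plausible completion time, so probably
        # a failed redirect -> approve as normal.
        action = "approve"
    elif sub["time_taken"] <= 60:
        # Case 2: incomplete data and a very short completion time, so
        # probably an early exit or a technical issue -> message them and
        # ask them to return their submission rather than rejecting.
        action = "message participant to return submission"
    else:
        # Incomplete data but a long completion time: review manually.
        action = "review manually"
    print(sub["participant_id"], "->", action)
```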

Hope that helps! Let me know if there’s anything else I can help you with :slight_smile:

Ok that’s fine then, thank you!


Hi there, I tried to submit this as part of automatically reporting a participant, but it kept freezing for some reason; I’m not sure why. Anyhow, I recently ran a pilot study in which I pre-screened on COVID-19 vaccination attitudes to be ‘anti’, but one participant reported “strongly agreeing” with all positive COVID-19 vaccination statements within our study, e.g. “I think COVID-19 vaccinations are important/safe/effective”, etc.

They still submitted the study fine and passed all attention checks, so I have approved them, but I didn’t want them in my sample to begin with. This was a pilot for a larger study; we only recruited 10 this time, but we want to recruit ~700 next time. What if 1 in 10 are pro-vaccination in the larger study? Is there a way of compensating for this after the fact? Sorry if this is the wrong place to ask this. Many thanks for any advice!

Hey @Lotty, welcome to the community! :tada:

I’m sorry that you’ve been having trouble with the report feature! I’ll get the team to look into it :slight_smile:

Why a participant might submit info that does not match their profile

If a participant gives you info that does not match their prescreening criteria, then you can reject their submission. It could be the case that their view has changed, but they have not updated their demographic profile yet.

Avoiding this in your full study

To avoid this in your full study, ask participants to reconfirm their answers to the crucial screening questions at the very beginning of your study. If their answers no longer match, you can ask them to ‘return’ their submission & redirect them out of the study.
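
The check itself is just a comparison against the answer they were prescreened on. Here’s a minimal sketch, assuming a custom Python-backed survey; the answer strings are hypothetical, and in a tool like Qualtrics you’d express the same thing as branch logic:

```python
# Minimal sketch of an in-study reconfirmation check. The answer strings
# are hypothetical; use the exact wording and options shown on Prolific.

PRESCREENED_ANSWER = "Against"  # the answer the participant was recruited on

def should_screen_out(reconfirmed_answer: str) -> bool:
    """True if the reconfirmed answer no longer matches the prescreening."""
    return reconfirmed_answer != PRESCREENED_ANSWER

if should_screen_out("In favour"):
    # End the survey here and ask the participant to return their
    # submission on Prolific (no rejection needed).
    print("Your answers no longer match the study criteria. "
          "Please return your submission on Prolific.")
```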

Rejecting participants on this basis

If you choose to reject, please note that rejections can only be made on the basis of inconsistent screening responses if the questions within your survey are presented exactly as they appear on Prolific. This is to avoid rejecting participants because of subjective misinterpretations arising out of any differences in wording between your questions and our screening questions.

So, for your COVID question, you would have to ask “Please describe your attitudes towards the COVID-19 (Coronavirus) vaccines:” and present the same options we present to participants.

Does that make sense? :slight_smile: Hopefully that helps!

Thanks so much for your quick reply, Josh! That is really helpful.

Just to clarify… am I right in thinking that we should add a question at the beginning that is identical to the Prolific question, and screen participants out right at the beginning if they answer differently to how they did on Prolific? But if they go on to answer more specific questions about coronavirus vaccines very positively, there’s nothing we can do about that, because the wording isn’t identical? Just making sure I’ve understood correctly, thanks again :slight_smile:

Hello.
I recently ran a study here and found a mismatch between the Prolific ID entered by a participant on the study page and their ID displayed on the system. When I asked the participant, they confirmed that the ID in the datafile is indeed theirs. The two IDs look very different, so it is not likely that this is due to a typo.
My study requires participants to manually enter their ID, and I know they can copy & paste their ID directly from Prolific, but is it possible that there is more than one ID displayed on Prolific and that this person accidentally copied a different one when they participated in my study?

Am I right in thinking that we should add a question at the beginning that is identical to the Prolific question, and screen participants out right at the beginning if they answer differently to how they did on Prolific?

Yes, doing it right at the beginning will make it as easy as possible for participants.

But if they go on to answer more specific questions about coronavirus vaccines very positively, there’s nothing we can do about that, because the wording isn’t identical?

If this happens, you can make the case that the participant is not providing consistent answers, and is unlikely to be providing good quality data. I think it’s unlikely to happen if you screen them at the start, as suggested above, but if it does happen please report those participants via our report feature. Hopefully, it’ll work this time!

Hey @Helena_szw, welcome to the community! :tada:

Sometimes participants accidentally paste things that they’ve copied from other sources. We’ve had all sorts pasted before! :laughing: So, I wouldn’t worry too much. As long as the participant has confirmed that the ID on their datafile is, in fact, theirs, everything should be fine.

That’s why we recommend automating the collection of IDs.
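
If you enable URL parameters in your study settings, Prolific appends the participant’s ID to your study URL as the PROLIFIC_PID query parameter, so your survey can record it without any typing. A minimal sketch of pulling it out (the example URL is made up):

```python
# Minimal sketch: extracting the PROLIFIC_PID query parameter that
# Prolific appends to your study URL. The example URL is made up.
from urllib.parse import parse_qs, urlparse

study_url = "https://example.com/my-survey?PROLIFIC_PID=5f9a1b2c3d4e5f6a7b8c9d0e"

params = parse_qs(urlparse(study_url).query)
prolific_pid = params.get("PROLIFIC_PID", [""])[0]
print(prolific_pid)  # store this with the response instead of asking for it
```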

I hope that helps!

Okay, thank you for the quick response!


Brilliant- thanks Josh!


Heya! I’ve got a question about a (perhaps) rather funky filter I’d like to apply.

  • Abstractly speaking, I’d like to combine filters via AND & OR.
  • Concretely speaking, here’s an example: men younger than 45 who are either students or have completed an academic education.

Is there any way to achieve this?

Thank you so much Josh. One last question (hopefully) related to this.
When I filter on ‘against’ COVID-19 vaccines, it says there are 657 available.
We are hoping to recruit 700 altogether, and are happy to “fill up” the rest of the sample with those who answered ‘neutral’ instead. I’m happy to do this in two different phases, but if we ask to recruit 657 out of the 657 available, do you think it’s feasible that all of them will actually take part? Or do you think that risks us waiting around a long time to get the full sample, or is there no way to tell?

Do you think it would be more sensible to, say, ask for 400-500, and then get the rest from the neutrals? Hope this makes sense!

Hey Thomas, welcome to the community! :tada:

Yes, this is definitely possible. You just have to use these specific filters:

  1. Age
  2. Highest education level completed

When separate filters are combined, they operate in an AND relation, whereas different options within a single filter operate in an OR relation.

So, as long as you choose all the levels of education within filter (2), you’ll get those who are below 45 AND who, for example, have only completed high school (and hence should still be students) OR have done a PhD (and hence should have completed their education).
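
In code terms, the selection behaves like this (a sketch; the option labels are illustrative, not our exact wording):

```python
# Sketch of how combined filters select participants: separate filters
# are ANDed together, while the options ticked within a single filter
# are ORed. Option labels here are illustrative.

MAX_AGE = 45                      # filter 1: Age (younger than 45)
EDUCATION_OPTIONS = {             # filter 2: tick every level you accept
    "High school", "Undergraduate degree", "Graduate degree", "Doctorate",
}

def matches(age: int, education: str) -> bool:
    # AND across filters; OR within a filter's ticked options.
    # A Sex filter, if added, would be ANDed in exactly the same way.
    return age < MAX_AGE and education in EDUCATION_OPTIONS

print(matches(30, "High school"))  # True: under 45 AND a ticked option
print(matches(50, "Doctorate"))    # False: fails the Age filter
```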

Does that make sense? :slight_smile:

(P.S. we’ve just launched a £10k Research Grant Competition! You can enter if you’re in need of funds, or, if you know someone who is, share it with them.)