🤔 Ask Anything Thread (Newbie Friendly)

Hi @Josh , thanks for the reply!

I paused the study because I was worried about the average dropping below the minimum, so I haven’t let any more people take it. When I paused it, 250/500 had already taken it, so I’m worried the average won’t change. Further, I looked at my Group 1 data, and there were hardly any outliers in terms of time: only a few people took longer than 20 minutes and no one took anywhere near the maximum amount of time, whereas in Group 2, at least half are taking >20 minutes, with several taking the maximum amount of time.

My main concern is with the average compensation dropping below the minimum and having to pay Group 2 more (which lowers the sample size I can afford). If the problem was common to both groups, I’d say I must’ve underestimated how long the study would take and take the hit. But I have 500 people in Group 1 who indicate otherwise. Does Prolific have a way of investigating something like this, or should I just plan to stop my study early if my Group 2 participants are being “underpaid”?

1 Like

Thanks for the additional info!

Could you provide me with the name of the study with the slower completion times, so I can investigate this for you? :slight_smile:

Both branches of the study are called “Social Experiences and Reactions Survey”. The branch with Group 1 is completed, and the branch with Group 2 is active but paused. I can message you with more details if necessary.

Thank you for looking into this!

1 Like

Hi @Josh, I have a question related to Samuel’s: is there any functionality on Prolific that would allow the participant to participate as much or as little as they want? That is, they could perform as many tasks as they want until they get bored, then opt out and get paid a per-hour amount commensurate with the time they spent, whereas another user who completed fewer tasks or spent less time would be paid less.

1 Like

Thank you for the details! I’ll look into it and see if anything unusual was going on :slight_smile:

Hey @CHRISTIAN_WILLIAMS, welcome to the community! Glad you’re here :tada:

get paid a per-hour amount commensurate with the time they spent?

Unfortunately, participant rewards are fixed as soon as you publish a study. It’s not possible to automatically pay people different amounts.

But what you can do is pay everyone the minimum amount (£5/hour), then offer bonus payments according to how many tasks a participant completes.
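To make that concrete, here’s a purely illustrative sketch of how the numbers might work out (the hourly rate is the minimum mentioned above, but the estimated time, bonus rate, and task count are made up for the example):

```python
# Purely illustrative numbers: a fixed base reward at the minimum hourly rate,
# plus a per-task amount paid afterwards as a bonus payment.
hourly_minimum = 5.00       # GBP per hour (minimum rate mentioned above)
estimated_minutes = 12      # assumed estimated completion time
bonus_per_task = 0.20       # hypothetical bonus per completed task

base_reward = hourly_minimum * estimated_minutes / 60
tasks_completed = 8         # example participant
bonus = tasks_completed * bonus_per_task
print(f"Base £{base_reward:.2f} + bonus £{bonus:.2f} = £{base_reward + bonus:.2f}")
```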

Let me know if you have any other questions! :slight_smile:

Oh, interesting solution. Thank you, @Josh.

As far as implementing this, what do you think is the simplest route? Would we need to manually identify how many tasks each participant performed and then provide them with that bonus? Or are we able to pass a URL parameter back to Prolific that includes the number of tasks completed, and have that be used to automatically calculate the bonus? I assume that’s not possible, but wanted to check.

1 Like

Hi there, I’m nearly ready to upload my questionnaire to Prolific. I have had funding agreed by my University and plan to give my bank details so that I pay Prolific and the Uni reimburses me at a later date. However, I’m concerned that if my questionnaire takes longer than the estimated time, then I will be charged more than what the University has agreed to pay me. Is there a way of capping the amount I pay Prolific, even if this means getting fewer participants than my ideal sample? I want to avoid going over the amount that my University has agreed to fund for this project. Many thanks.

1 Like

Are we able to pass a URL parameter back to Prolific that includes the number of tasks completed, and have that be used to automatically calculate the bonus?

Unfortunately, this won’t be possible. But, to make the manual process easier, you can:

  • Use your external survey software to track how many tasks each participant completes
  • Group participants according to the number of tasks they’ve completed
  • Export the IDs of the participants in each group, and ‘bulk pay’ those IDs with their relevant bonus (there’s a rough sketch of this below)
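To give a rough idea of the manual workflow, here’s a minimal sketch in Python, assuming your survey export is a CSV with participant_id and tasks_completed columns. The file name, column names, and bonus rate are all assumptions, and you should double-check the exact format the bulk bonus box currently expects:

```python
# Minimal sketch (not official Prolific tooling): group participants by task
# count and print one bonus line per participant, grouped for bulk payment.
import csv
from collections import defaultdict

BONUS_PER_TASK = 0.10  # hypothetical bonus in GBP per completed task

groups = defaultdict(list)  # tasks_completed -> list of participant IDs

# "survey_export.csv" and its column names are assumptions about your export.
with open("survey_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        groups[int(row["tasks_completed"])].append(row["participant_id"])

# One "participant_id,amount" line per participant, grouped by task count,
# ready to paste into the bulk bonus box one group at a time (double-check
# the exact format Prolific currently expects).
for tasks, ids in sorted(groups.items()):
    bonus = round(tasks * BONUS_PER_TASK, 2)
    print(f"# Group: {tasks} task(s) completed, bonus £{bonus:.2f}")
    for pid in ids:
        print(f"{pid},{bonus:.2f}")
```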

Does that make sense? :slight_smile:

1 Like

Hello,

I published my research questions and now I have a returned status for each participant ID. What does this mean? What do I need to do next to move forward?

Thank you

1 Like

Hi Edgar
Something does not seem to be right. The “RETURNED” status implies that something did not go well with the study, such as participants being unable to proceed to the questions or unable to submit their answers.

Perhaps if you post the link to the questions here, we can see whether we can submit them (bearing in mind that you’ll have to delete our attempts).

Tim

1 Like

Thanks! I have a study that timed out in terms of completion (I had to Stop the study because recruitment had slowed to a halt). But I would like to preclude those who participated in that study from participating in a new study I am running. Since that old study isn’t under “Completed”, it is not showing up when I go to the filter for “Participated in my previous studies”. Is there anything I can do about this?

1 Like

Never mind, I figured it out 😊

1 Like

Thanks Josh, much appreciated.

1 Like

Hey @Lucy_Sainsbury, welcome to the community! :tada:

There isn’t a way of capping spend on Prolific, but there are a number of ways you can make sure you don’t spend too much:

  • Pilot your study with a decent-sized group to get a more accurate estimated completion time
  • Only release your study to 80% of your sample size, so you only use 80% of your budget. Then release it to the last 20% once you’re sure that the first batch hasn’t gone over budget (rough numbers sketched below).
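As a rough illustration of the 80%/20% idea, here’s a quick sanity check you could run before publishing. Every number below is a placeholder, including the service fee rate, so check your own account for the actual figures:

```python
# Illustrative budget check before publishing (all numbers are placeholders).
# Prolific adds a service fee on top of participant rewards; the rate below is
# an assumption, not the real fee.
sample_size = 200
reward_per_participant = 1.25   # GBP, e.g. an estimated 15 minutes at £5/hour
assumed_fee_rate = 0.33         # placeholder service fee rate

def cost(n_participants):
    return n_participants * reward_per_participant * (1 + assumed_fee_rate)

first_batch = int(sample_size * 0.8)
print(f"Full sample ({sample_size}): £{cost(sample_size):.2f}")
print(f"First 80% ({first_batch}): £{cost(first_batch):.2f}")
print(f"Last 20% ({sample_size - first_batch}): £{cost(sample_size - first_batch):.2f}")
```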

Does that make sense? :slight_smile:

Hi Tim
My URL is https://vaughn-e.com and the survey tool is https://app.prolific.co. The majority showed a returned status and 3 showed a timed-out status.

Thanks

As @timtak has said, it’s likely that something has gone wrong. In addition to letting Tim check, it would be a good idea to also message participants to see what problems they encountered :slight_smile:

Hi Edgar

I think I understand the issue.

I am afraid that Prolific (the site for which this is the forum/community) is not a survey tool but a participant recruitment service. For a survey tool you’ll need to use a site such as Qualtrics, Gorilla, Typeform, or Google Forms, and then include a link to that site in your Prolific study so that participants are sent there. Then include a link back to Prolific at the end of your survey so that Prolific knows who has completed it, and who to pay.

The support guides for integrating these sites and others with Prolific are here:
https://researcher-help.prolific.co/hc/en-gb/sections/360004085479-Survey-software-integrations

I use Google Forms because it is free.
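In case it helps, here is a very rough sketch of the round trip. The URLs and parameter names below are placeholders, so use the exact formats given in the integration guides above:

```python
# Very rough sketch of the round trip between Prolific and an external survey
# tool -- placeholder URLs and parameter names; follow the guides for the
# exact format your survey software needs.

# 1. The study URL you enter on Prolific points at your survey tool, and can
#    pass the participant's ID along as a URL parameter:
study_url = "https://your-survey-tool.example/my-survey?PROLIFIC_PID={{%PROLIFIC_PID%}}"

# 2. At the end of the survey, redirect participants back to Prolific using the
#    completion link shown on your Prolific study page, for example:
completion_url = "https://app.prolific.co/submissions/complete?cc=YOUR_COMPLETION_CODE"
```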

Tim

Hello,
I am new to Prolific and I am interested in options for screening participants out (instead of in). Is there an option to specify that if a respondent meets a certain criterion, they should not take part in the study?
Thanks!
Stela

Hi there,
I have just finished a study here, thanks!
When I download the demographic data, the first_language_1 field shows DATA EXPIRED for all participants. As far as I remember, it was not like this in my previous studies. What does this mean?
Thanks,
Anna