Lt. Commander
Join Date: Dec 2007
Posts: 120
# 10
10-22-2010, 10:19 PM
Personally, I think a few things would solve it.

1) Flat rewards for missions.

2) No rewards for missions under X minutes.

3) Authors whose missions are consistently rated well earn a personal author rating. A good author rating unlocks Silver, Gold, and Latinum Author status, which lets your new missions reach rewards status faster. However, any infraction for a mission that breaks the content guidelines busts you back down to the start for 30 days.

4) If you're going to go the player-testing route, actually hire the players. It will provide a much cleaner standard in terms of results and professionalism, it will face less criticism than simply elevating a few players to unpaid positions, and it gives them resume credentials for the work they're doing. You can always give them the choice between cash and Cryptic Points for a week's work.
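The author-tier idea in (3) could be sketched roughly like this. The tier names and the 30-day bust come from the post; the rating thresholds, function names, and everything else are my own assumptions for illustration:

```python
# Sketch of the author-rating tiers from point (3). Tier names and the
# 30-day penalty are from the post; the thresholds are assumed values.
from datetime import date, timedelta

TIERS = [
    ("Latinum", 4.5),  # assumed: minimum average rating for each tier
    ("Gold", 4.0),
    ("Silver", 3.5),
]

def author_status(avg_rating, last_infraction=None, today=None):
    """Return the author's tier name, or None if unranked or recently busted."""
    today = today or date.today()
    # A content-guideline infraction within the last 30 days resets the author.
    if last_infraction and today - last_infraction < timedelta(days=30):
        return None
    for name, threshold in TIERS:
        if avg_rating >= threshold:
            return name
    return None
```

So an author averaging 4.2 stars would sit at Gold, but an infraction five days ago would drop them back to unranked until the 30 days pass.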

Personally, I think the smart move is just to hire 10 or so remote employees as Foundry testers. Give them a separate green name for the forums. Give them a code of conduct and an NDA preventing them from disclosing their player identity. Break it and you're out. Do well and you get brought into official content design, or maybe a role assisting on something like a guest author program.

You can go by an hourly payscale (easy enough to check) or make it commission-based, where they get, say, $4 for every star that the missions they approve are rated by players. Or combine the two: say, $10 an hour, with an additional $10 for each 4-star-rated mission and $20 for each 5-star one. That way testers get bonuses for greenlighting quality missions.
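The combined hourly-plus-bonus payscale above works out like this. The $10/hour base and the $10 and $20 bonuses are from the post; the function name and the rounding of fractional star ratings are assumptions:

```python
# Sketch of the hourly-plus-bonus payscale. Dollar figures come from the
# post; rounding fractional ratings to whole stars is an assumption.
HOURLY_RATE = 10  # dollars per hour

def weekly_pay(hours_worked, approved_mission_ratings):
    """Base hourly pay plus a bonus per approved mission, keyed to its rating."""
    pay = hours_worked * HOURLY_RATE
    for rating in approved_mission_ratings:
        stars = round(rating)
        if stars >= 5:
            pay += 20   # bonus for a 5-star mission
        elif stars == 4:
            pay += 10   # bonus for a 4-star mission
    return pay
```

A 40-hour week with one 5-star and one 4-star approval would pay $400 + $20 + $10 = $430, for example.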

Get caught fixing the ratings and you're out. Approve too many missions that end up rated low and you get terminated and replaced. Approve too few missions per week and you get laid off, replaced, and put back into the queue of applicants. If players are generating too few missions, the supervisor sets things up so you can meet your quota by creating missions to submit to other testers.

By having professional UGC testers, you have standards in place that would be harder to enforce with volunteers.

Alternatively, if hiring isn't an option, you can incentivize accurate reviewing. Anyone who rates a mission within one point of its average rating gets entered into a daily drawing for 250 Cryptic Points, for example. That puts people in the mentality of assigning the rating they'd expect other players to give the mission, which could strip some bias out.
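The drawing mechanic could be sketched as follows. The one-point window and the 250 CP prize are from the post; the data shapes, function names, and one-entry-per-accurate-review rule are assumptions:

```python
# Sketch of the accurate-reviewer drawing. The 1-point window and 250 CP
# prize come from the post; the draw mechanics are assumed.
import random

def drawing_entrants(reviews, averages):
    """reviews: {reviewer: {mission: rating}}; averages: {mission: avg rating}.
    A reviewer earns one entry per review within 1 point of the average."""
    entrants = []
    for reviewer, ratings in reviews.items():
        for mission, rating in ratings.items():
            if abs(rating - averages[mission]) <= 1.0:
                entrants.append(reviewer)  # one entry per accurate review
    return entrants

def daily_winner(entrants, rng=random):
    """Pick one entry at random; the winner gets 250 Cryptic Points."""
    return rng.choice(entrants) if entrants else None
```

Giving one entry per accurate review (rather than one per reviewer) also rewards volume, so prolific accurate reviewers win more often.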