Crowdsourced software testing, the application of crowdsourcing to software testing activities, has become a common practice. Although crowd testing is collaborative by nature, no available research critically assesses the key collaboration activities offered by current crowdsourced testing platforms. In this paper, we review the process used in crowd testing platforms, identifying the workflow for managing crowd testing from submitting the testing requirements to reviewing the testing report. This understanding of the current process is then used to identify a set of its limitations, which has led us to propose three process improvements: improving the assignment of the crowd manager, improving the building of the test team, and monitoring testing progress. We designed and implemented these process improvements and evaluated them using two techniques: 1) a questionnaire and 2) a workshop. The questionnaire shows that the process improvements are sound and strong enough to be added to crowd testing platforms. The workshop was useful for assessing the design and implementation of the process improvements; participants were satisfied with them but asked for further modifications. Moreover, because crowd testing requires participation from a large number of people, the automation that the proposed improvements bring to managing the current process was highly appreciated.