There have been numerous recent attempts to “decentralize” social media platforms, loosely referred to as Web3. Such efforts, often underpinned by blockchain solutions, offer decentralized equivalents of well-known services (e.g., forums, social networks, video sharing sites, microblogs). One particularly challenging function to implement in such designs is content moderation, due to the lack of central control. Consequently, these platforms often rely on user-controlled moderation, whereby each user must curate their own personal block list to filter out content they do not wish to see. This paper presents a first study of user-controlled moderation on one exemplar Web3 social microblogging platform, memo.cash. Based on a dataset covering 391K posts, we study the factors that lead users to “mute” each other. We find that the most important factor is a user's platform action count, rather than the presence of features such as hate speech. We also show that the followership network plays a pivotal role in determining users' visibility on the platform, which in turn influences their muting behavior. These findings lead us to design tooling that automates the muting process on a per-user basis. We model this as a recommendation problem and experiment with a number of state-of-the-art recommender engines, showing that our system can generate effective personalized mute lists for users.