Considering the existence of multiple edge dependencies in realistic interdependent networks, we propose a model of edge-coupled interdependent networks with conditional dependency clusters (EINCDCs). In this model, each edge in network A depends on a dependency cluster of size $m$ in network B. If the fraction of failed edges within a dependency cluster in network B exceeds the failure tolerance $\alpha$, the corresponding edge in network A that depends on that cluster fails accordingly. Adopting the self-consistent probabilities approach, we establish a theoretical analytical framework to treat this model quantitatively. Specifically, we study the robustness of the system under random attacks, verified by numerical simulations, with respect to the cluster size and the failure tolerance, for systems composed of two networks A and B constructed from the Random Regular (RR), Erdős-Rényi (ER) and Scale-Free (SF) models. Our results show that both networks A and B undergo a first-order or hybrid phase transition when the dependency cluster size does not exceed 2, whereas non-monotonic behavior emerges when the cluster size exceeds 2. In particular, when the failure tolerance is in the range from 0 to 0.5, the robustness of the system weakens as the number of size-2 dependency clusters grows, and this tendency reverses when the failure tolerance is in the range from 0.5 to 1. Moreover, owing to the asymmetric interdependency between the two networks, network B always undergoes a first-order phase transition, whereas network A can exhibit different types of phase transitions depending on the dependency cluster size. In addition, the failure tolerance may have opposite effects on the two networks as the dependency cluster size grows.
The conclusions of this study may provide useful insights and enrich the understanding of the robustness of edge-coupled interdependent networks.
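As a toy illustration only (the function and its inputs are hypothetical, not taken from the paper), the conditional dependency rule described above, where an edge in network A fails once the fraction of failed edges in its size-$m$ dependency cluster in network B exceeds the tolerance $\alpha$, can be sketched as:

```python
def dependent_edge_fails(cluster_failed, alpha):
    """Conditional dependency rule (sketch).

    cluster_failed: list of booleans, one per edge in the size-m
                    dependency cluster in network B (True = failed).
    alpha:          failure tolerance in [0, 1].

    Returns True when the fraction of failed edges in the cluster
    exceeds alpha, i.e. when the dependent edge in network A fails.
    """
    m = len(cluster_failed)  # dependency cluster size
    return sum(cluster_failed) / m > alpha


# Cluster of size m = 3 with tolerance alpha = 0.5:
print(dependent_edge_fails([True, True, False], 0.5))   # 2/3 > 0.5 -> True
print(dependent_edge_fails([True, False, False], 0.5))  # 1/3 <= 0.5 -> False
```

With $\alpha = 0$ any failure in the cluster propagates (the strongest coupling), while $\alpha = 1$ decouples the edge from its cluster entirely, which is consistent with the tolerance-dependent robustness behavior reported above.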