The rapid growth of information on the World Wide Web (WWW) has made it necessary to make this information available not only to people but also to machines. Ontologies and tokens are widely used to add semantics to data and information processing. An ontology is a formal, explicit specification of a shared conceptualization: formal means the specification is encoded in a logic-based language, explicit means the concepts and properties are machine readable, and a conceptualization is a model of how people think about things in a particular subject area. In recent years many ontologies have been developed on a variety of topics, resulting in increased heterogeneity of entities among the ontologies. Concept integration has therefore become vital over the last decade as a tool to minimize heterogeneity and empower data processing. Various techniques exist to integrate concepts from different input sources based on semantic or syntactic match values. In this paper, an approach is proposed to integrate concepts (ontologies or tokens) using edit-distance or n-gram match values between pairs of concepts, with concept frequency used to guide the integration process. The performance of the proposed techniques is compared with semantic-similarity-based integration techniques on quality parameters such as recall, precision, F-measure, and integration efficiency over different numbers of concepts. The analysis indicates that edit-distance-based integration outperforms both n-gram integration and semantic-similarity techniques.
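The two syntactic match values mentioned above can be sketched as follows. This is an illustrative sketch, not the paper's exact algorithm: it normalizes Levenshtein edit distance to a [0, 1] score and uses a Dice coefficient over character bigrams for the n-gram score; the function names and the merge threshold are assumptions for illustration only.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def edit_similarity(a: str, b: str) -> float:
    """Normalize edit distance to a [0, 1] match value."""
    if not a and not b:
        return 1.0
    return 1.0 - edit_distance(a, b) / max(len(a), len(b))

def ngram_similarity(a: str, b: str, n: int = 2) -> float:
    """Dice coefficient over the sets of character n-grams."""
    grams = lambda s: {s[i:i + n] for i in range(len(s) - n + 1)}
    ga, gb = grams(a), grams(b)
    if not ga or not gb:
        return 0.0
    return 2 * len(ga & gb) / (len(ga) + len(gb))

# Concept pairs whose match value exceeds a chosen threshold would be
# candidates for integration; concept frequency could then decide
# which label survives the merge.
print(edit_similarity("organisation", "organization"))   # → 0.9166...
print(ngram_similarity("organisation", "organization"))  # → 0.8181...
```

A pair such as "organisation"/"organization" scores highly under both measures, showing how syntactic matching can catch spelling variants that a dictionary-based semantic matcher might miss.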