2009 International Association of Computer Science and Information Technology - Spring Conference 2009
DOI: 10.1109/iacsit-sc.2009.61
Swarm Based Text Summarization

Cited by 56 publications (22 citation statements)
References 14 publications
“…In designing the prototype, there are two activities involved: (i) determining the representation of the PSO parameters and (ii) constructing the algorithm for the optimisation. The PSO parameters include how to determine the personal best position (pBest), the global best position (gBest), and the velocity [15].…”
Section: Signature Representation (mentioning)
confidence: 99%
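The pBest, gBest, and velocity terms mentioned in this excerpt are the quantities updated by the standard PSO step. As a rough illustration only (the cited prototype's exact representation, inertia weight, and acceleration constants are not given in the excerpt, so the values below are assumptions), a minimal Python sketch of one particle's update could look like this:

```python
import random

# Assumed illustrative PSO constants: inertia weight W and
# acceleration coefficients C1 (cognitive) and C2 (social).
W, C1, C2 = 0.7, 1.5, 1.5

def update_particle(position, velocity, p_best, g_best):
    """One standard PSO update: pull each dimension toward the particle's
    personal best (pBest) and the swarm's global best (gBest)."""
    new_velocity, new_position = [], []
    for x, v, pb, gb in zip(position, velocity, p_best, g_best):
        r1, r2 = random.random(), random.random()
        v_next = W * v + C1 * r1 * (pb - x) + C2 * r2 * (gb - x)
        new_velocity.append(v_next)
        new_position.append(x + v_next)
    return new_position, new_velocity
```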
“…Most of the techniques employed in single-document summarization are also employed in multi-document summarization. There exist some notable difficulties [8]. (1) The degree of redundancy of information available in a group of topically-related documents is significantly larger than the redundancy of information within a single document, since each document incorporates the important concepts as well as the required shared background.…”
Section: Introduction (mentioning)
confidence: 99%
“…Where w_i is the weight of feature i and f_i is the score of feature i. Binwahlan et al. (2009) proposed a text summarization model based on Particle Swarm Optimization (PSO) to determine the feature weights. Bossard and Rodrigues (2011) used a genetic algorithm to approximate the best weight combination for their multi-document summarizer.…”
Section: Proper Noun (mentioning)
confidence: 99%
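For illustration, the weighted scoring this excerpt describes combines per-feature scores f_i with weights w_i (the weights being what PSO or a genetic algorithm tunes). Below is a minimal Python sketch with hypothetical feature names and weight values, not those of the cited models:

```python
def sentence_score(feature_scores, weights):
    """Score a sentence as sum_i(w_i * f_i), where the weights w_i
    would come from an optimizer such as PSO."""
    return sum(weights[name] * feature_scores[name] for name in weights)

# Assumed example weights w_i and feature scores f_i for one sentence.
weights = {"title_similarity": 0.3, "sentence_length": 0.2,
           "keyword_frequency": 0.5}
features = {"title_similarity": 0.8, "sentence_length": 0.4,
            "keyword_frequency": 0.6}
print(sentence_score(features, weights))  # ≈ 0.62
```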