In order to deal with huge data sets, alternative designs with proper functions are desirable. These functions may frequently perform operations such as sorting, searching, and updating a DBMS. Apart from time and space metrics, energy, the pattern and size of the input, and whether an exact match or an approximation is required are also key issues to be considered, so there is a persistent need to seek performance improvements. Among state-of-the-art approaches, binary search relies on a divide-and-conquer strategy: it compares the key with the middle element of the array in each iteration and accordingly moves the search interval to a new sub-range. In this paper, an analysis of the state-of-the-art bisection algorithm is presented over certain parameters, including dynamic alterations of the input. Further, the analysis has been extended to parallel processing.
Keywords: Searching, Parallel Processing, Time Taken for Searching, Comparison, Order
I. INTRODUCTION
Binary search is a searching algorithm whose complexity is O(log n) in both the average and the worst case [1][2]; the best-case complexity is O(1). It follows the divide-and-conquer approach. The necessary and sufficient condition for this searching technique is that the working array must be sorted [3]. The idea is to divide the array into two sub-arrays, initially bounded by the highest value (the last element, if ascending order is followed) and the least value, i.e. the first element of the array, as per the discussed sorting mechanism [3]. In binary search, the target element is compared with the middle element of the sorted array [4]. If it matches the element being searched for, the index of the matched element is returned; otherwise, it is noted whether the middle element is larger or smaller than the target. If the middle element is larger, the search is repeated in the left sub-array of the main array; otherwise it proceeds to the right sub-array. These iterations continue until the key is found or the interval shrinks past the first or last element of the array [5]. It follows that when searching for a key in a given array, the number of comparisons is at most of the order of log2 of the number of elements, since each comparison halves the remaining interval; thus far fewer comparisons are needed than in linear search [6][7]. The factors that cumulatively affect total execution time are the processing speed and the number of processing elements executing simultaneously [7]. The single-instruction-multiple-data (SIMD) approach beats vector and array processors on certain features, especially the capacity to operate on multiple data elements at a time. Reducing execution time is truly mandatory in the context of modern data and performance scenarios.
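The iterative procedure described above can be sketched as follows; this is a minimal illustration, and the function name and return convention (index of the key, or -1 if absent) are our own choices, not taken from the paper:

```python
def binary_search(arr, key):
    """Search a sorted list `arr` for `key` by repeated bisection.

    Returns the index of `key` if present, else -1.
    """
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2        # middle element of the current interval
        if arr[mid] == key:
            return mid                 # key found: return its index
        elif arr[mid] > key:
            high = mid - 1             # key can only lie in the left sub-array
        else:
            low = mid + 1              # key can only lie in the right sub-array
    return -1                          # interval exhausted: key is absent

print(binary_search([2, 5, 8, 12, 16, 23, 38], 23))  # -> 5
```

Each pass through the loop halves the interval [low, high], which is why the number of comparisons is logarithmic in the array length rather than proportional to it, as it is for linear search.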
Since the need for storage capacity is growing exponentially, there is always a demand for performance improvement in time, space, order, iterations, recursion, etc. In sequential processing the number of elements those ca...
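One common way to bring parallel processing to bear on searching, consistent with the multiple-processing-elements idea above, is to partition the sorted array into contiguous segments and search each segment concurrently. The paper does not specify its parallel scheme, so the sketch below is an assumption: it uses Python threads as stand-ins for processing elements, and the helper names are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor
from bisect import bisect_left

def search_segment(arr, lo, hi, key):
    # Binary search restricted to the half-open slice arr[lo:hi).
    i = bisect_left(arr, key, lo, hi)
    return i if i < hi and arr[i] == key else -1

def parallel_search(arr, key, workers=4):
    # Split the sorted array into `workers` contiguous segments and
    # search them concurrently; each worker mimics one processing element.
    n = len(arr)
    bounds = [(i * n // workers, (i + 1) * n // workers) for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as ex:
        results = ex.map(lambda b: search_segment(arr, b[0], b[1], key), bounds)
    # At most one segment can contain the key, so the single non-negative
    # result (if any) is the answer; max() over {-1, ..., index} yields it.
    return max(results)

print(parallel_search(list(range(0, 100, 2)), 42))  # -> 21
```

Each segment search still costs O(log(n/p)) comparisons for p workers, so the benefit over a single binary search is modest for one query; the partitioned layout pays off mainly when many independent queries are issued at once, which is the scenario the measurements that follow are concerned with.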