Volume 2, Number 2, August 2004

IJCIS

The International Journal of Computing and Information Sciences (IJCIS) is a peer-reviewed journal committed to the timely publication of original research, survey, and tutorial contributions on the analysis and development of computing and information science. The journal mainly serves researchers and developers dealing with information and computing. Papers that provide both theoretical analysis and carefully designed computational experiments are particularly welcome. The journal is published two to three times per year and is distributed to librarians, universities, research centers, and researchers in computing, mathematics, and information science. The journal maintains strict refereeing procedures through its editorial policies in order to publish only papers of the highest quality. Refereeing is done by anonymous reviewers. Reviews typically take four to six months to obtain, occasionally longer, and the publication process requires several additional months.

Paper 1: Numerical Solving of Geometric Constraints By Bisection

Samy Ait-Aoudia and Imad Mana

Abstract: In computer-aided design, geometric modeling by constraints enables users to describe shapes by relationships, called constraints, between geometric elements. The problem is to derive these geometric elements automatically. Many resolution methods have been proposed for solving systems of geometric constraints. Geometric methods can be very efficient but are only applicable to particular kinds of problems (typically ruler-and-compass constructive problems). Some schemes cannot be solved by any geometric method, and a numerical method is then necessary. We use the bisection method to solve these cases. The bisection method enables users to reliably find all solutions to a system of non-linear equations within a region defined by bounds on each individual co-ordinate of the geometric objects, but it is very time consuming. To overcome this problem, we first present a method that speeds up the resolution process. As this improvement is still insufficient, we then propose to distribute the processing over several PC machines operating in parallel.
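
The abstract does not detail the authors' algorithm. As a rough, hypothetical illustration of the general idea, the sketch below applies interval bisection to two distance constraints (the intersection of two unit circles), discarding boxes whose interval enclosure provably excludes a solution and splitting the rest until they are small. The constraint system, tolerance, and all names are illustrative choices, not taken from the paper.

```python
# Minimal sketch of bisection-based constraint solving (illustrative only, not the
# authors' implementation). Two distance constraints define the intersection of
# two unit circles centred at (0, 0) and (1, 0); boxes that provably exclude a
# root are discarded, the rest are split until they are small enough.

def sq_range(lo, hi):
    """Interval of x**2 for x in [lo, hi]."""
    if lo <= 0.0 <= hi:
        return 0.0, max(lo * lo, hi * hi)
    return min(lo * lo, hi * hi), max(lo * lo, hi * hi)

def constraint_ranges(box):
    """Interval enclosures of both constraint residuals over the box."""
    (xlo, xhi), (ylo, yhi) = box
    x2 = sq_range(xlo, xhi)
    y2 = sq_range(ylo, yhi)
    xm2 = sq_range(xlo - 1.0, xhi - 1.0)                 # (x - 1)**2
    c1 = (x2[0] + y2[0] - 1.0, x2[1] + y2[1] - 1.0)      # x^2 + y^2 - 1
    c2 = (xm2[0] + y2[0] - 1.0, xm2[1] + y2[1] - 1.0)    # (x - 1)^2 + y^2 - 1
    return c1, c2

def bisect(box, tol=1e-3):
    """Return small boxes that may contain a solution of both constraints."""
    if any(lo > 0.0 or hi < 0.0 for lo, hi in constraint_ranges(box)):
        return []                                # box provably contains no root
    (xlo, xhi), (ylo, yhi) = box
    if max(xhi - xlo, yhi - ylo) < tol:
        return [box]                             # small enough: report as candidate
    xm, ym = (xlo + xhi) / 2.0, (ylo + yhi) / 2.0
    out = []
    for bx in ((xlo, xm), (xm, xhi)):
        for by in ((ylo, ym), (ym, yhi)):
            out.extend(bisect((bx, by), tol))
    return out

if __name__ == "__main__":
    hits = bisect(((-2.0, 2.0), (-2.0, 2.0)))
    print(len(hits), "candidate boxes near the two intersection points")
```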

[.ZIP]|[.PDF]

 

Paper 2: Modeling and Analysis of an On-Line FMS Scheduler Using Colored Petri Nets

Ali A. Al-Titinchi and Kasim M. Al-Aubidy

Abstract: An interactive hierarchical model based on colored Petri nets (CPN) for a general flexible manufacturing system (FMS) scheduler is presented in this paper. The proposed model is partitioned into two levels to cope with the complexity of manufacturing systems. The first level (cell level) supervises job scheduling between cells; the second level (machine level) supervises the scheduling of manufacturing jobs between machines inside the active cells. The goal of the scheduler model is to translate the output of the manufacturing route planner into a graphical representation and to achieve the sequencing, dispatching and monitoring of the real-time execution of these manufacturing plans. The proposed CPN model also generates feedback status information to monitor abnormal conditions. Sub-CPN priority structures are embedded in the main model to control the selection among jobs competing for the limited resources in the system. The hierarchical organization enhances the real-time response of the sequencing and monitoring capabilities. The designed scheduler has been evaluated on different case studies and the achieved results are very encouraging.
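
As a loose, hypothetical illustration of the priority idea described above (not a colored Petri net model and not the authors' scheduler), the toy sketch below lets job tokens compete for a single machine token inside one cell, with a priority rule resolving the conflict. All class names, job names, and priority values are invented.

```python
# Toy illustration: jobs compete for a shared machine resource and a priority
# rule picks the winner, mimicking the "sub-CPN priority structure" idea from
# the abstract in plain Python rather than in a Petri net formalism.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Job:
    priority: int                    # lower value = served first
    name: str = field(compare=False)

class Cell:
    """One manufacturing cell: a waiting place and a single machine place."""
    def __init__(self):
        self.waiting = []            # place holding job tokens, kept as a priority heap
        self.machine_free = True

    def add_job(self, job):
        heapq.heappush(self.waiting, job)

    def fire_start(self):
        """Transition 'start': enabled when a job token and the free-machine token exist."""
        if self.machine_free and self.waiting:
            job = heapq.heappop(self.waiting)
            self.machine_free = False
            return job
        return None

    def fire_finish(self):
        """Transition 'finish': returns the machine token to the free place."""
        self.machine_free = True

if __name__ == "__main__":
    cell = Cell()
    cell.add_job(Job(2, "drill housing"))
    cell.add_job(Job(1, "mill flange"))         # higher priority, starts first
    print("started:", cell.fire_start().name)   # -> mill flange
    cell.fire_finish()
    print("started:", cell.fire_start().name)   # -> drill housing
```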

[.ZIP]|[.PDF]

 

Paper 3: Iterative Window Size Estimation on Self-Similarity Measurement for Network Traffic Anomaly Detection

Mohd Yazid Idris, Abdul Hanan Abdullah and Mohd Aizaini Maarof

Abstract: An iterative method for estimating the optimum sample time (or simply window size) for self-similarity measurement of network traffic is introduced. The main purpose of this measurement is to identify anomalies in network traffic: when the traffic is close to the self-similarity model it is considered normal, and otherwise it is not. Since this model is related to a long-range dependence process, providing data over a long period of time increases the closeness of the network traffic to the model. On the other hand, increasing the time range is one of the factors that increase the detection loss probability, as an intrusive pattern may hide inside the normal data. The purpose of this method is therefore to minimize both the curve-fitting error of the self-similarity measurement and the detection loss probability in anomaly detection. The iterative method was applied to network traffic data provided by the Lincoln Laboratory at the Massachusetts Institute of Technology (MIT). The results show that the method is able to estimate an optimum window size that reduces the detection loss probability while maintaining a low error rate.
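
The abstract does not give the authors' estimator or iteration rule. The sketch below is a generic stand-in: it estimates the Hurst parameter of a traffic series with the aggregated-variance method and compares a few candidate window sizes by their least-squares fitting error, which is one of the quantities the abstract mentions. The window sizes, aggregation levels, and synthetic data are assumptions for illustration only.

```python
# Rough sketch (not the authors' procedure): Hurst-parameter estimation with the
# aggregated-variance method, and a simple loop over candidate window sizes that
# reports the curve-fitting error for each.
import math
import random

def hurst_aggregated_variance(series, levels=(1, 2, 4, 8, 16, 32)):
    """Return (H, fit_error) from the slope of log(variance) vs. log(aggregation level)."""
    xs, ys = [], []
    for m in levels:
        blocks = [sum(series[i:i + m]) / m for i in range(0, len(series) - m + 1, m)]
        mean = sum(blocks) / len(blocks)
        var = sum((b - mean) ** 2 for b in blocks) / len(blocks)
        if var > 0:
            xs.append(math.log(m))
            ys.append(math.log(var))
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)
    residual = sum((y - (ybar + slope * (x - xbar))) ** 2 for x, y in zip(xs, ys)) / n
    return 1.0 + slope / 2.0, residual       # variance scales as m^(2H - 2)

if __name__ == "__main__":
    random.seed(0)
    traffic = [random.gauss(100, 10) for _ in range(4096)]   # stand-in for packet counts
    for window in (256, 512, 1024, 2048):                    # candidate window sizes
        h, err = hurst_aggregated_variance(traffic[:window])
        print(f"window={window:5d}  H={h:5.2f}  fit error={err:.4f}")
```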

[.ZIP]|[.PDF]

 

Paper 4: Efficient Electronic Payment Systems by Using a Sparse Elliptic Curve Cryptography

Essam Al-Daoud, Khalid Al-Tahat and Hamed Al-Fawareh

Abstract: This paper introduces new techniques and algorithms to reduce the size of the elliptic curve parameters. The basic idea is to use sparse coefficients and sparse base points. The sparse elements are introduced with a compact representation, thus reducing the public key parameters by about 37-49 percent. Elliptic curve applications such as e-payment and e-commerce can be implemented with better performance using the suggested approach.
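
The paper's actual encoding is not described in the abstract. As a hypothetical illustration of why sparse elements admit a compact representation, the sketch below stores a sparse binary-field element as the list of its set-bit positions and compares the size with the usual fixed-width encoding. The 163-bit example and the encoding itself are assumptions, not the authors' scheme, and the printed saving applies only to this toy element.

```python
# Illustrative sketch only: a field element with few non-zero bits is encoded as
# the list of its set-bit positions instead of a full fixed-width integer.

def dense_size_bits(bit_length):
    """Size of the usual fixed-width encoding of a field element."""
    return bit_length

def sparse_encode(value, bit_length):
    """Encode a sparse element as the positions of its set bits."""
    positions = [i for i in range(bit_length) if (value >> i) & 1]
    index_width = bit_length.bit_length()      # bits needed per position index
    return positions, len(positions) * index_width

if __name__ == "__main__":
    bit_length = 163                           # e.g. a binary-field curve size
    # a deliberately sparse element: x^162 + x^7 + x^3 + 1 viewed as a bit pattern
    element = (1 << 162) | (1 << 7) | (1 << 3) | 1
    positions, sparse_bits = sparse_encode(element, bit_length)
    print("set-bit positions:", positions)
    print(f"dense encoding : {dense_size_bits(bit_length)} bits")
    print(f"sparse encoding: {sparse_bits} bits "
          f"({100 * (1 - sparse_bits / bit_length):.0f}% smaller)")
```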

[.ZIP]|[.PDF]

 

Paper 5: A Heuristic for Minimizing Makespan in Proportional Parallel Flowshops

Ameer Al-Salem

Abstract: This paper addresses the problem of minimizing the makespan on two-stage parallel flow shops with proportional processing times. The problem of scheduling n independent jobs on m proportional parallel flow shops to minimize the makespan is known to be NP-hard and hence unlikely to be solvable in polynomial time. A multi-phase heuristic algorithm to minimize the makespan on two proportional parallel flow shops is proposed. A simulation study is conducted to examine the effectiveness of the proposed heuristic for small and large problems. The results of quite extensive computational experiments on small problems show that the proposed algorithm yields solutions within a few percent of the optimal solutions. The computational results for large problems show that the proposed algorithm is more effective than the existing SKV algorithm.
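
Neither the authors' multi-phase heuristic nor the SKV algorithm is specified in the abstract. The sketch below shows a generic list-scheduling heuristic of the same flavour: jobs are taken longest-first and each is appended to whichever two-stage flow shop ends up with the smaller makespan, with stage-2 times proportional to stage-1 times. The job data and proportionality ratios are invented.

```python
# Hedged illustration of the problem setting; this is NOT the authors' multi-phase
# algorithm or the SKV algorithm. Each of the two parallel flow shops has two
# stages, and a job's stage-2 time is a fixed multiple of its stage-1 time.

def flowshop_makespan(stage1_times, ratio):
    """Makespan of one two-stage flow shop processing jobs in the given order."""
    c1 = c2 = 0.0
    for p1 in stage1_times:
        c1 += p1                          # stage-1 completion of this job
        c2 = max(c2, c1) + ratio * p1     # stage-2 starts when both are ready
    return c2

def assign_jobs(jobs, ratios):
    """Greedy heuristic: longest jobs first, each to the shop whose resulting makespan is smallest."""
    shops = [[] for _ in ratios]
    for p1 in sorted(jobs, reverse=True):
        best = min(range(len(shops)),
                   key=lambda k: flowshop_makespan(shops[k] + [p1], ratios[k]))
        shops[best].append(p1)
    return shops, max(flowshop_makespan(s, r) for s, r in zip(shops, ratios))

if __name__ == "__main__":
    jobs = [7, 3, 9, 4, 6, 2, 8, 5]       # made-up stage-1 processing times
    ratios = (0.5, 0.5)                   # stage-2 time = 0.5 * stage-1 time in each shop
    assignment, cmax = assign_jobs(jobs, ratios)
    print("assignment:", assignment)
    print("makespan  :", cmax)
```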

[.ZIP]|[.PDF]

 

Paper 6: Running Multicast Applications Using Reconfigurable Parallel System

Buhari and Hamid R. Arabnia

Abstract: In this paper we address the important problem of multicasting using a reconfigurable network topology called the Multi-Ring network. The Multi-Ring network is shown to have a rich set of desirable properties that support applications in medical imaging, scientific computing, and artificial intelligence. The most important property of the Multi-Ring topology is its scalability: it is feasible and cost-effective to interconnect hundreds of thousands of computing nodes using the Multi-Ring network topology. Efficient multicasting on a platform composed of thousands of computing nodes is the challenge addressed in this paper.

[.ZIP]|[.PDF]

 

Paper 7: Constructing An Automatic Lexicon for Arabic Language

Riyad Al-Shalabi and Ghassan Kanaan

Abstract: In this paper we have designed and implemented a system for building an automatic lexicon for the Arabic language. A lexicon is a collection of representations of words used by a natural language processor as a source of word-specific information; this representation may contain information about the morphology, phonology, syntactic argument structure and semantics of the word. A good lexicon is badly needed for many natural language applications such as parsing, text generation, and noun phrase and verb phrase construction. Our Arabic lexicon contains word-specific information, including morphological information such as the root (stem) of the word, its pattern and its affixes, and the part-of-speech tag of the word, which classifies it as a noun, verb or particle; lexical attributes such as gender, number, person, case, definiteness, aspect, and mood are also extracted and stored with the word in the lexicon. Many rules based on the grammar of the Arabic language are used in our system to identify the part-of-speech tag and the related lexical attributes of the word [13]. We have tested our system on vowelized and non-vowelized Arabic text documents taken from the Holy Qur'an and on 242 Arabic abstracts chosen randomly from the proceedings of the Saudi Arabian national computer conference, and we achieved an accuracy of about 96%. We discuss the factors behind these errors and how this accuracy rate can be enhanced.
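
The paper's rule set is not reproduced in the abstract. The sketch below is a heavily simplified, hypothetical example of the affix-stripping and pattern-matching style of root extraction the abstract describes, using only a handful of common Arabic prefixes, suffixes and morphological patterns; it is not the system built in the paper.

```python
# Small, heavily simplified sketch of affix stripping and pattern matching for
# Arabic root extraction. The affix and pattern lists are tiny illustrative
# samples, not the rule set used in the paper.

PREFIXES = ["وال", "ال", "و", "ف", "ب", "ل"]
SUFFIXES = ["ون", "ات", "ين", "ها", "ة"]
# Pattern templates: the letters ف, ع, ل mark the three root-consonant slots.
PATTERNS = ["فاعل", "مفعول", "مفعل", "فعال", "فعل"]

def strip_affixes(word):
    """Strip at most one known prefix and one known suffix, keeping >= 3 letters."""
    for p in PREFIXES:
        if word.startswith(p) and len(word) - len(p) >= 3:
            word = word[len(p):]
            break
    for s in SUFFIXES:
        if word.endswith(s) and len(word) - len(s) >= 3:
            word = word[:-len(s)]
            break
    return word

def extract_root(word):
    """Return (root, pattern) if a pattern matches the stripped stem, else (stem, None)."""
    stem = strip_affixes(word)
    for pattern in PATTERNS:
        if len(stem) != len(pattern):
            continue
        root, ok = [], True
        for stem_ch, pat_ch in zip(stem, pattern):
            if pat_ch in "فعل":
                root.append(stem_ch)          # root-consonant slot
            elif stem_ch != pat_ch:
                ok = False                    # literal pattern letter must match
                break
        if ok and len(root) == 3:
            return "".join(root), pattern
    return stem, None

if __name__ == "__main__":
    for w in ["الكاتب", "مكتوب", "والمعلمون"]:
        print(w, "->", extract_root(w))
```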

[.ZIP]|[.PDF]

Contacts

Editor-in-Chief
Prof. Jihad Mohamad Alja'am 
Email: journal.editor.ijcis@gmail.com

The Journal Secretary
Eng. Dana Bandok
Ontario, Canada 
Email: sec.ijcis@gmail.com 

 
