• Volume 15, Issue 1, 2004 Table of Contents
    • Study of Application Framework Meta-Model Based on Component Technology

      2004, 15(1):1-8.

      Abstract (4565) HTML (0) PDF 640.02 K (5926) Comment (0) Favorites

      Abstract:This paper proposes an application framework meta-model based on UML (unified modeling language) notation from the perspective of the construction and composition of application frameworks, gives the semantics of the framework elements, and discusses in particular the description methods for hot-spots and their implementation mechanisms. Finally, the paper presents an example of the application framework meta-model, a telecommunication integrated business system framework, and proposes a reuse method for the application framework.

    • Key Techniques of a Hierarchical Simulation Runtime Infrastructure—StarLink

      2004, 15(1):9-16.

      Abstract (4164) HTML (0) PDF 687.67 K (5861) Comment (0) Favorites

      Abstract:A distributed RTI cannot be well adapted to large-scale simulations in a wide area network, especially in maintaining the consistency of the whole federation state. A hierarchical RTI (runtime infrastructure), however, is very useful for reducing the computational cost of large-scale simulations. This paper explains why the hierarchical simulation runtime infrastructure StarLink can be applied to the wide area network, and discusses three implementation techniques of StarLink: a CORBA-based networking technique, a multi-threaded data-transfer technique, and an XML-based data-exchange technique. In addition, test methods for StarLink are discussed, and a new idea is put forward for testing RTI interface services using a common framework test program. StarLink can effectively provide powerful technological support for large-scale simulations.

    • Workflow Model Analysis Based on Time Constraint Petri Nets

      2004, 15(1):17-26.

      Abstract (5029) HTML (0) PDF 807.43 K (5660) Comment (0) Favorites

      Abstract:The ultimate goal of workflow management is to ensure that the right person executes the right activity at the right time. To make enterprises more competitive, the time-related restrictions of business processes should be considered in workflow models. A workflow model that considers time-related factors requires time specification and verification before it goes into production, so as to guarantee time coordination in workflow executions. By extending the elements of WF-nets with time attributes, this paper investigates the integration of the time constraints imposed on business processes into their workflow models; the new nets are called TCWF-nets. Based on an analysis of the schedulability of business activities, a time-consistency verification method is put forward to ensure safe time interactions between activities during workflow execution. The schedulability analysis method can not only check the time feasibility of a given workflow schedule under the time constraints imposed on the business process, but also give an optimal schedule that guarantees the minimum duration of workflow execution for a specific case. Research results show that this method supports time modeling and analysis in business processes, and has important value in enhancing both the time-management functionality of current WFMSs and their adaptability to dynamic business environments.
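The TCWF-net formalism itself is not given in the abstract. As a rough, hypothetical illustration of the kind of time-consistency check described, consider sequential activities carrying [min, max] duration intervals and a global deadline:

```python
# Hypothetical sketch (not the paper's TCWF-net formalism): verify that a
# sequence of activities, each with a [min, max] duration interval, can
# satisfy a global deadline.

def check_schedule(activities, deadline):
    """activities: list of (name, min_dur, max_dur); deadline: number."""
    earliest = sum(lo for _, lo, _ in activities)  # best-case finish time
    latest = sum(hi for _, _, hi in activities)    # worst-case finish time
    if earliest > deadline:
        return False, "infeasible: even best-case finish exceeds deadline"
    if latest <= deadline:
        return True, "always feasible"
    return True, "conditionally feasible: depends on actual durations"

acts = [("receive", 1, 2), ("review", 2, 5), ("approve", 1, 3)]
print(check_schedule(acts, 10))  # → (True, 'always feasible')
```

A real TCWF-net analysis would also handle branching, parallelism and per-activity time windows; this sketch only shows the interval arithmetic at the core of such a check.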

    • A General Model for Component-Based Software Reliability

      2004, 15(1):27-32.

      Abstract (4692) HTML (0) PDF 609.23 K (6562) Comment (0) Favorites

      Abstract:The approach of aggregating components into complex software systems is maturing with the rapid development of component technology. How to derive software reliability from the system architecture and the components' reliabilities remains to be answered. Software is static, while the development process is dynamic. To enable reliability tracing through a dynamic process, this paper presents a general model for component-based software reliability, the component probability transition diagram, based on function abstractions. Unlike related work that emphasizes mathematical modeling, the model presented here focuses on reliability tracing through the dynamic development process.
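The component probability transition diagram is not reproduced in the abstract. As a hypothetical sketch of the general idea (deriving system reliability from component reliabilities and execution-path probabilities, in the style of classical architecture-based models rather than necessarily the paper's own model), one might write:

```python
# Hypothetical sketch: expected system reliability over execution paths of a
# small transition diagram. Each component has its own reliability; each
# path has an occurrence probability. This follows classical
# architecture-based reliability models, not necessarily the paper's model.

def path_reliability(path, component_rel):
    r = 1.0
    for c in path:
        r *= component_rel[c]
    return r

def system_reliability(paths, component_rel):
    """paths: list of (probability, [components]) covering all executions."""
    return sum(p * path_reliability(comps, component_rel)
               for p, comps in paths)

rel = {"A": 0.99, "B": 0.95, "C": 0.98}
paths = [(0.7, ["A", "B"]), (0.3, ["A", "C"])]
print(round(system_reliability(paths, rel), 4))  # → 0.9494
```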

    • Illumination-Constrained Inpainting for Single Image Highlight Removal

      2004, 15(1):33-40.

      Abstract (5029) HTML (0) PDF 829.55 K (6817) Comment (0) Favorites

      Abstract:Specular highlight detection and removal are long-standing problems in computer vision, and improved results have a great impact on other computer vision algorithms. In this paper, a specular detection and removal algorithm is proposed. First, through a comparison between highlight and diffuse chromaticity, a user-interactive detection method for a single colored surface is developed. Second, a removal algorithm is proposed by introducing an image inpainting algorithm to this field and adding an illumination constraint. Unlike traditional inpainting algorithms, the method described in this paper utilizes additional cues embedded in the specular region. By integrating the embedded information, such as the observed pixel color and the illumination chromaticity, this method overcomes a shortcoming of traditional inpainting methods, which cannot preserve shading variation in the highlight region. Experimental results show that this method gives a better illumination chromaticity estimation and more convincing results.

    • Enhancing Content-Based Image Retrieval by Exploiting Relevance Feedback Logs

      2004, 15(1):41-48.

      Abstract (3737) HTML (0) PDF 752.67 K (5669) Comment (0) Favorites

      Abstract:Relevance feedback (RF) has been successfully used in content-based image retrieval (CBIR). However, most CBIR systems seldom reuse the latent semantic correlation among images revealed by RF logs to guide retrieval across sessions. In this paper, the co-occurrence of images in an RF record is regarded as a kind of semantic homogeneity in a certain context, and the image-retrieval problem is cast as an authority-image-finding task. Records in the RF logs first extend the result of a traditional CBIR system, producing a relevance graph of images related to the query in multiple contexts. Then a modified HITS algorithm is applied to this graph to distill a consensus about semantic relevance. As a result, both visual content and semantic relevance are maintained in image retrieval, and the efficiency is much improved compared with traditional CBIR methods. Experimental results on the Corel database with 60,000 images demonstrate its superiority in both objective criteria and semantic clustering capability.
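The paper's modification of HITS is not detailed in the abstract; a minimal sketch of the standard HITS iteration it builds on, run over a toy relevance graph with hypothetical image names, looks like this:

```python
# Minimal sketch of the standard HITS iteration (the paper applies a
# modified version to an image relevance graph; the modification is not
# described in the abstract). edges[u] lists the nodes that u points to.
import math

def hits(edges, iterations=50):
    nodes = set(edges) | {v for vs in edges.values() for v in vs}
    hub = {n: 1.0 for n in nodes}
    auth = {n: 1.0 for n in nodes}
    for _ in range(iterations):
        # authority score: sum of hub scores of nodes pointing in
        auth = {n: sum(hub[u] for u in nodes if n in edges.get(u, ()))
                for n in nodes}
        # hub score: sum of authority scores of nodes pointed to
        hub = {n: sum(auth[v] for v in edges.get(n, ())) for n in nodes}
        for scores in (auth, hub):  # L2-normalize to keep values bounded
            norm = math.sqrt(sum(s * s for s in scores.values())) or 1.0
            for n in scores:
                scores[n] /= norm
    return auth, hub

auth, hub = hits({"img1": ["img3"], "img2": ["img3"], "img3": []})
# img3, endorsed by both other images, gets the highest authority score
```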

    • A Virus Coevolution Genetic Algorithm for Project Optimization Scheduling

      2004, 15(1):49-57.

      Abstract (4494) HTML (0) PDF 514.23 K (6382) Comment (0) Favorites

      Abstract:In this paper, a virus coevolution genetic algorithm (multi-mode project scheduling virus coevolution genetic algorithm, MPS-VEGA) for the precedence- and resource-constrained multi-mode project scheduling problem is presented, and the encoding of the solution and the operators, such as selection, crossover, mutation and virus_infection, are given. MPS-VEGA is used to obtain the optimal scheduling sequences and resource modes for the activities of the project so that the project cost is minimized. It transmits evolutionary genes not only vertically between parent and child generations via the genetic operators, but also horizontally within the same generation via the virus_infection operator, thereby performing a global search and a local search respectively. The schema theorem is adopted to analyze the performance of MPS-VEGA. Theoretical analysis and experimental results show that MPS-VEGA outperforms the GA. For multi-mode project scheduling problems with different optimization objectives, MPS-VEGA can simultaneously give the optimal scheduling sequences subject to the precedence constraints and the optimal resource modes for the activities of the project.
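As a hypothetical sketch of a virus_infection operator of the general kind the abstract describes, a "virus" carrying gene values for a few loci can overwrite those loci in host chromosomes, spreading good partial solutions horizontally within one generation. The representation and infection policy below are illustrative assumptions, not the paper's:

```python
# Hypothetical virus-infection operator: overwrite a few loci of each host
# chromosome with the virus's gene values. Representation and infection
# policy are illustrative assumptions, not taken from the paper.
import random

def virus_infect(host, virus):
    """host: list of genes; virus: dict {locus: gene value}."""
    infected = list(host)
    for locus, gene in virus.items():
        infected[locus] = gene
    return infected

random.seed(0)  # reproducible toy population
population = [[random.randint(0, 3) for _ in range(8)] for _ in range(4)]
virus = {2: 1, 5: 0}  # a good partial assignment for loci 2 and 5
population = [virus_infect(h, virus) for h in population]
# after infection, every chromosome carries gene 1 at locus 2, gene 0 at 5
```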

    • Review Articles
    • An Overview of the Core Technology of the JVT Draft

      2004, 15(1):58-68.

      Abstract (6926) HTML (0) PDF 882.18 K (9444) Comment (0) Favorites

      Abstract:As a mainstream standard to be jointly produced by the ITU-T and ISO, the JVT (joint video team) draft is a new video coding standard. Its goal is to achieve significant compression performance gains relative to all existing video coding standards in bit savings, processing quality and compression efficiency, through measures that include slice technology, high-resolution variable pixels, different block sizes and shapes, intra/inter prediction, and coding from multiple reference pictures. In addition, JVT is designed to offer transparent transmission over different networks and to increase network friendliness through information encapsulation and precedence-control technologies. Although the basic coding framework of the standard is similar to that of currently popular video standards, JVT includes many new coding features, which are presented in this paper. First, a broad view of video coding standards is given. Second, the structure of JVT is presented. Then both the detailed technologies that improve the coding efficiency of the VCL (video coding layer) and the key terms used in the NAL (network abstraction layer) are introduced, which is the core of the paper. Finally, the current problems and the challenges for JVT's future research are discussed and analyzed.

    • Evidence Ullage Analysis in D-S Theory and Development

      2004, 15(1):69-75.

      Abstract (4167) HTML (0) PDF 713.71 K (5252) Comment (0) Favorites

      Abstract:In this paper, it is found that there are errors in the D-S combination rules even when the evidences do not conflict. The error, the lack of anti-jamming ability, and the opposition that arise during evidence combination in traditional D-S theory are studied. An effort is made to prove that the evidence ullage is caused by ill-suited definitions and ill-suited rules in the combination process. New generic definitions and combination rules are also proposed. The evidence ullage in the compatible-evidence combination process is identified in terms of these definitions and rules, and then resolved. The experimental results show that the methodology developed here can not only resolve conflicts efficiently, but also produce reasonable results.
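For reference, the classical Dempster combination rule that the paper critiques can be sketched as follows; the frame of discernment and the mass values are hypothetical:

```python
# Classical Dempster's rule of combination over a small frame of
# discernment. Focal elements are frozensets; masses are dicts
# {focal_set: mass}. This is the traditional rule the paper critiques,
# not the authors' proposed replacement.

def dempster_combine(m1, m2):
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: evidences cannot be combined")
    k = 1.0 - conflict  # normalization constant
    return {s: m / k for s, m in combined.items()}

A, B = frozenset({"a"}), frozenset({"b"})
m1 = {A: 0.6, frozenset({"a", "b"}): 0.4}
m2 = {A: 0.5, B: 0.5}
print(dempster_combine(m1, m2))
```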

    • A Study on Architecture of Massive Information Management for Digital Library

      2004, 15(1):76-85.

      Abstract (4148) HTML (0) PDF 814.07 K (5597) Comment (0) Favorites

      Abstract:This paper investigates the challenging issues and technologies in managing very large digital contents and collections, and gives an overview of the work and enabling technologies in the related areas. Based on an analysis and comparison of the related work, a novel architecture of massive information management for digital libraries is designed. The key components and core services are described in detail. Finally, a case study, THADL (Tsinghua University architecture digital library), which complies with the architectural framework, is presented.

    • A Long-Term Learning Based Similarity Retrieval of Multimedia Database

      2004, 15(1):86-93.

      Abstract (4553) HTML (0) PDF 715.59 K (5573) Comment (0) Favorites

      Abstract:An approach is presented for multimedia similarity query using an on-line analysis of feedback-sequence logs. The approach is based on the accumulation of users' feedback sequences and on-line collaborative filtering to predict the semantic correlation between the media objects in the database and the query sample. Edit distance is used to evaluate the similarity between the current retrieval's feedback sequence and the prefixes of the records in the feedback logs. A prototype image retrieval system has been implemented. Integrated with the retrieval method based on the generalized Euclidean distance, the performance of similarity query is noticeably improved. Experiments on 11,000 images demonstrate that this method outperforms conventional ones.
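The edit distance used to match feedback sequences against log-record prefixes is standard; a compact sketch, with strings standing in for sequences of image identifiers, is:

```python
# Sketch of the edit (Levenshtein) distance used to compare a feedback
# sequence against log-record prefixes. Strings here stand in for
# sequences of hypothetical image IDs; feature integration is omitted.

def edit_distance(a, b):
    """Minimum number of insertions, deletions and substitutions
    turning sequence a into sequence b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # delete ca
                           cur[j - 1] + 1,              # insert cb
                           prev[j - 1] + (ca != cb)))   # substitute
        prev = cur
    return prev[-1]

print(edit_distance("kitten", "sitting"))  # → 3
```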

    • Mining Frequent Closed Patterns by Adaptive Pruning

      2004, 15(1):94-102.

      Abstract (4677) HTML (0) PDF 850.61 K (5835) Comment (0) Favorites

      Abstract:The set of frequent closed patterns determines exactly the complete set of all frequent patterns and is usually much smaller than the latter. Yet mining frequent closed patterns remains a memory- and time-consuming task. This paper develops an efficient algorithm to solve this problem. A compound frequent-itemset tree is employed to organize the set of frequent patterns, which consumes much less memory than other structures. The tree is grown quickly by integrating depth-first and breadth-first search strategies, opportunistically choosing between two different structures to represent projected transaction subsets, and heuristically deciding whether to build unfiltered pseudo-projections or filtered projections. Efficient pruning methods are used to reduce the search space. Balancing the efficiency and scalability of tree growth and pruning maximizes the performance. The experimental results show that the algorithm is a factor of five to three orders of magnitude more time-efficient than several recently proposed algorithms, and is also the most scalable. It can be used in the discovery of non-redundant association rules, sequence analysis, and many other data mining problems.
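For orientation, the notion of a frequent closed pattern (a frequent pattern no proper superset of which has the same support) can be illustrated by a brute-force sketch of the definition; the paper's tree-based algorithm is of course far more efficient than this:

```python
# Brute-force sketch of the *definition* of frequent closed patterns: keep
# a frequent pattern only if no proper superset has the same support. This
# only illustrates the mining goal, not the paper's algorithm.
from itertools import combinations

def closed_frequent_patterns(transactions, min_support):
    items = sorted({i for t in transactions for i in t})
    support = {}
    for r in range(1, len(items) + 1):
        for combo in combinations(items, r):
            s = sum(1 for t in transactions if set(combo) <= t)
            if s >= min_support:
                support[frozenset(combo)] = s
    return {p: s for p, s in support.items()
            if not any(p < q and s == support[q] for q in support)}

db = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}]
print(closed_frequent_patterns(db, 2))
```

Here {a, b} and {a, c} are closed with support 2, and {a} is closed with support 3, while {b} and {c} are pruned because a superset has the same support.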

    • Direction Relation Query Processing Using R-Trees

      2004, 15(1):103-111.

      Abstract (4342) HTML (0) PDF 837.22 K (5533) Comment (0) Favorites

      Abstract:Direction relations deal with order in space. Recently, direction relation query processing has gradually gained attention in geospatial database applications such as spatial data mining (SDM) and GIS (geographic information systems). The processing of direction relation queries needs spatial join operations. Until now, research on the processing of spatial joins has primarily focused on topological and distance relations; there is little work on processing joins with direction predicates. This paper presents an efficient method for processing direction relation queries using R-trees. A quad-tuples model is defined to represent direction relations between the MBRs (minimum bounding rectangles) of spatial objects. An algorithm for processing the filter step using R-trees is given, and the refinement step is further decomposed into three different operations. The presented method can efficiently process direction relation queries between objects of any data type in a 2D space. Using both direction and distance constraints to restrict the search space when traversing R-trees, this paper also presents an algorithm for direction relation query processing in SDM. Performance evaluation of the proposed method is conducted on real-world datasets, and the experimental results show that it performs well with respect to both I/O and CPU time.
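The quad-tuples model is not detailed in the abstract. A hypothetical sketch of the underlying idea, deriving a coarse direction relation between two MBRs by comparing their axis intervals, might look like:

```python
# Hypothetical sketch: a coarse direction relation between two
# axis-aligned MBRs from their x- and y-interval positions. The paper's
# quad-tuples encoding is not given in the abstract; this mapping is an
# illustrative assumption.

def interval_relation(lo1, hi1, lo2, hi2):
    """'before', 'after', or 'overlap' of interval 1 relative to 2."""
    if hi1 < lo2:
        return "before"
    if lo1 > hi2:
        return "after"
    return "overlap"

def direction(mbr_a, mbr_b):
    """mbr = (xmin, ymin, xmax, ymax); direction of a relative to b."""
    ax = interval_relation(mbr_a[0], mbr_a[2], mbr_b[0], mbr_b[2])
    ay = interval_relation(mbr_a[1], mbr_a[3], mbr_b[1], mbr_b[3])
    ns = {"before": "south", "after": "north", "overlap": ""}[ay]
    ew = {"before": "west", "after": "east", "overlap": ""}[ax]
    return ns + ew or "overlap"

print(direction((5, 5, 6, 6), (0, 0, 2, 2)))  # → northeast
```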

    • A Mechanism to Efficiently Transport Wireless Real-Time IP Services over Cellular Links

      2004, 15(1):112-119.

      Abstract (3700) HTML (0) PDF 583.42 K (4788) Comment (0) Favorites

      Abstract:When transporting real-time IP services such as VoIP over a cellular communication system with limited channel bandwidth and a high bit error rate, there are problems such as the relatively large overhead imposed by protocol headers and poor data-discarding policies. In this paper, a new scheme, ROHC/UDP Lite, is proposed to solve these problems by effectively combining ROHC (robust header compression) with the UDP-Lite protocol. With ROHC/UDP Lite, it becomes possible to efficiently transport IP-based packet-switched voice services in future IP cellular systems.
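The key mechanism borrowed from UDP-Lite (RFC 3828) is a checksum that covers only part of the datagram, so bit errors in the uncovered payload (tolerable for voice codecs) do not force the receiver to discard the packet. A simplified sketch, omitting the real header layout, is:

```python
# Sketch of UDP-Lite's partial checksum idea (RFC 3828): the checksum
# covers only the first `coverage` bytes, so errors beyond the coverage
# do not cause discard. Uses the standard Internet one's-complement
# checksum; real header fields are omitted, so this is illustrative
# rather than wire-accurate.

def internet_checksum(data):
    if len(data) % 2:
        data += b"\x00"  # pad to a 16-bit boundary
    total = sum((data[i] << 8) | data[i + 1] for i in range(0, len(data), 2))
    while total >> 16:
        total = (total & 0xFFFF) + (total >> 16)  # fold carries
    return ~total & 0xFFFF

def udplite_ok(datagram, coverage, expected):
    """Accept the datagram if the covered prefix checksums correctly."""
    return internet_checksum(datagram[:coverage]) == expected

packet = b"HDRAudio-payload"
cksum = internet_checksum(packet[:4])       # cover the first 4 bytes only
corrupted = packet[:4] + b"XXXXX-payload"   # bit errors beyond coverage
assert udplite_ok(corrupted, 4, cksum)      # still accepted for decoding
```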

    • Study on RTI Congestion Control Based on the Layer of Interest

      2004, 15(1):120-130.

      Abstract (4258) HTML (0) PDF 986.59 K (4947) Comment (0) Favorites

      Abstract:The data explosion in large-scale distributed simulations cripples simulation performance and restricts scalability and persistence. Because HLA (high level architecture) has inherent limitations in defining the level of demand for data, the RTI (runtime infrastructure) cannot effectively exploit simulation features for congestion control. The LOI (layer of interest) depicts the differences in the demand for data among various receivers and makes some encouraging technologies feasible, such as QoS (quality of service), congestion control, and a layered DDM (data distribution management) strategy. In this paper, the LOI is combined into the RTI's working mechanisms, and a phase filtering based on the LOI is put forward to control congestion in the RTI. Finally, experimental results are presented to demonstrate efficient congestion control in the RTI using the LOI-based phase filtering, together with guarantees of critical data delivery and transportation stability.

    • Secure Authentication Protocol for Trusted Copyright Management Based on Dynamic License

      2004, 15(1):131-140.

      Abstract (3926) HTML (0) PDF 824.36 K (6379) Comment (0) Favorites

      Abstract:Trusted software copyright protection is one of the most important issues in digital rights management. However, most current solutions cannot meet the demands of the End User License Agreement (EULA) in security and efficiency. In this paper, a new, secure authentication protocol for trusted copyright protection based on a dynamic license is proposed to solve this problem. A third-party Certificate Authority (CA) is adopted for atomic authorization and forced revocation of the software license dynamically, according to the software and hardware identities and their usage status. Thus, under the control of the dynamic license, the copyright is protected safely and the software entity can be transferred freely without copyright damage or resource leakage. Considering the integrity and security of the dynamic license, symmetric and public-key cryptography algorithms are used for data encryption and digital signature respectively, while random verification of the code signature is adopted to resist possible attacks and runtime cracking. Analysis shows that the proposed protocol is feasible and secure with high integrity. It can meet the demands of the EULA and provides a new and reliable approach to software copyright management.

    • Review Articles
    • A Survey of Research on Key Management for Multicast

      2004, 15(1):141-150.

      Abstract (7654) HTML (0) PDF 861.11 K (8919) Comment (0) Favorites

      Abstract:The absence of a security mechanism has limited the use of multicast. Key management for multicast enables the group members in a multicast session to generate, refresh and transfer the keys used for encryption and authentication. Beyond the maintenance of keys, issues of scalability, reliability and robustness should be carefully considered. In this paper, the existing problems in key management for multicast are analyzed, and some typical schemes are reviewed.

    • Reliable Detection of Spatial LSB Steganography Based on Difference Histogram

      2004, 15(1):151-158.

      Abstract (4651) HTML (0) PDF 1.21 M (6635) Comment (0) Favorites

      Abstract:The detection of hidden messages in images is of great importance both for network information security and for improving the security of steganographic algorithms. Based on statistical observations of the difference histograms of images, a new steganalytic technique capable of reliably detecting spatial LSB (least significant bit) steganography is proposed. Translation coefficients between the difference histograms are defined as a measure of the weak correlation between the LSB plane and the remaining bit planes, and are then used to construct a classifier to discriminate stego-images from carrier images. The algorithm can not only reliably detect the existence of hidden messages embedded in images by LSB replacement, but also estimate the amount of hidden messages accurately. It has a distinct physical meaning and can be implemented conveniently. Experimental results show that for raw losslessly stored images, the new algorithm performs better than the RS (regular singular) steganalysis method and improves the computation speed significantly. The new approach is also applicable to color images.
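The difference histogram on which the method is built is simply a histogram of differences between adjacent pixel values; a small sketch over a hypothetical grayscale array:

```python
# Sketch of the difference histogram underlying the steganalysis method: a
# histogram of differences between horizontally adjacent pixel values. The
# paper's translation coefficients and classifier are not reproduced; the
# tiny grayscale array below is hypothetical.
from collections import Counter

def difference_histogram(image):
    """image: list of rows of pixel values; returns a Counter over
    differences of horizontally adjacent pixels."""
    h = Counter()
    for row in image:
        for left, right in zip(row, row[1:]):
            h[right - left] += 1
    return h

img = [[10, 10, 12],
       [12, 13, 13]]
print(difference_histogram(img))  # → Counter({0: 2, 2: 1, 1: 1})
```

LSB embedding perturbs the correlation between adjacent pixels, which shows up as a characteristic change in this histogram's shape.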

Contact Information
  • Journal of Software
  • Sponsored by: Institute of Software, CAS, China
  • Postal Code: 100190
  • Phone: 010-62562563
  • Email: jos@iscas.ac.cn
  • Website: https://www.jos.org.cn
  • ISSN 1000-9825
  • CN 11-2560/TP
  • Domestic Price: 70 RMB
Copyright: Institute of Software, Chinese Academy of Sciences. Beijing ICP No. 05046678-4
Address: 4# South Fourth Street, Zhong Guan Cun, Beijing, Postal Code: 100190
Phone: 010-62562563 Fax: 010-62562533 Email: jos@iscas.ac.cn
Technical Support:Beijing Qinyun Technology Development Co., Ltd.

Beijing Public Network Security No. 11040202500063