• Volume 19, Issue 10, 2008 Table of Contents
    • LCS and MSC Reasoning of Hybrid Terminological Cycles in Description Logic EL

      2008, 19(10):2483-2497.

      Abstract:This paper analyzes current research progress and the open problems of terminological cycles in description logics. Building on the work of F. Baader, the LCS (least common subsumer) and MSC (most specific concept) reasoning of hybrid terminological cycles in the description logic EL is studied further. The syntax and semantics of hybrid terminological cycles in EL are given. To support LCS and MSC reasoning over hybrid terminological cycles, a TBox-completion is presented and the description graph is redefined. Using the TBox-completion and the description graph, LCS and MSC reasoning algorithms w.r.t. greatest-fixpoint semantics are presented. The correctness of the algorithms is proved, and it is shown that LCS and MSC reasoning w.r.t. greatest-fixpoint semantics can be computed in polynomial time. The algorithms thus provide a theoretical foundation for LCS and MSC reasoning over hybrid terminological cycles in EL.

    • Extended Fuzzy Description Logics with Comparisons Between Fuzzy Membership Degrees

      2008, 19(10):2498-2507.

      Abstract:The representation and application of fuzzy knowledge on the semantic Web often involve comparisons between fuzzy membership degrees. However, current fuzzy extensions of description logics do not support the expression of such comparisons. This paper proposes an extended fuzzy description logic, written FCALCN (fuzzy comparable ALCN), that supports comparisons of fuzzy membership degrees together with the concept constructors of the description logic ALCN (attributive concept description language with complements and number restrictions). FCALCN introduces new forms of atomic concepts to support comparisons between fuzzy membership degrees. A reasoning algorithm for FCALCN is proposed, and the reasoning problems of FCALCN with an empty TBox are proved to be PSpace-complete. FCALCN can represent expressive fuzzy knowledge involving comparisons of fuzzy membership degrees on the semantic Web and enables reasoning over it.

    • Two Values Passing CPS Transformation for Call-by-Name Calculus with Constants

      2008, 19(10):2508-2516.

      Abstract:In this paper, a new CPS (continuation-passing style) transformation for Plotkin's call-by-name λ-calculus with constants is proposed. It is based on evaluation-context transformation and on the feature that two values, instead of one, are passed to the continuation each time. With encodings, a CPS language is introduced. Then, Plotkin's simulation theorem is proved by establishing a one-to-one correspondence between the source language and the CPS language. Compared with Plotkin's work, a reduction-closed CPS language is defined in which all continuations are explicitly expressed as functional encodings, and both the soundness and completeness directions of the simulation theorem are simpler to prove.
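
      The paper's two-value-passing transformation is not reproduced here; as background, ordinary one-value CPS can be sketched in a few lines of Python (all function names below are illustrative, not from the paper):

```python
# A minimal illustration of continuation-passing style (CPS):
# instead of returning a value, each function receives a
# continuation `k` and passes its result to it.

def add_cps(x, y, k):
    # pass the sum to the continuation instead of returning it
    return k(x + y)

def square_cps(x, k):
    return k(x * x)

# compute (3 + 4)^2 in CPS: the continuation of the addition
# is "square the sum, then hand it to the final continuation"
result = add_cps(3, 4, lambda s: square_cps(s, lambda r: r))
print(result)  # 49
```

      In the paper's variant, the continuation would receive two values at each step rather than the single value shown here.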

    • Dynamics Analysis Method of Cellular Automata for Complex Networking Storage System

      2008, 19(10):2517-2526.

      Abstract:Inherent dynamic rules are concealed in large-scale network storage systems because of the complexity of data-transmission behaviors. This paper studies object-based storage systems and proposes two storage cellular-automata models, SNCA and OSDCA, addressing the macro and micro aspects respectively. These models can be used to analyze the behaviors and rules of complex, dynamic network storage by capturing the intelligent and autonomous properties of storage objects. In SNCA, the lifetime attribute of a storage object is used to analyze data-flow rules in the storage network and to ascertain the degree of congestion from the macro aspect, based on a special lattice network topology; simulation results show that data-object flow has a global correlation with the phase transition of the data flow. In OSDCA, data migration and replication mechanisms are combined to analyze hotspot-migration rules based on the load distribution among storage nodes; simulation results show that the data distribution in the OBS system exhibits self-organizing characteristics.

    • An Approach to Solve Deadlock Problem of Component Connection

      2008, 19(10):2527-2538.

      Abstract:Procedure-call-based component connections latent in hard code not only restrict the flexibility of software but also threaten its reliability through deadlock connection loops. To solve this problem, first, a formal semantic model called the call-based connector is built, which explicitly separates connections from components. Second, mapping rules that convert call-based connectors into a component-connection directed graph are proposed. Then two algorithms are provided: TPDCC (two-phase deadlock connection check), which finds all existing deadlock connection loops, and DCEMRF (deadlock connection elimination based on maximum reuse frequency), which finds the locations where the fewest connections must be removed to eliminate the loops. Finally, an application and experimental results show that the presented approach is feasible and effective; it can enhance software reliability and, because it describes and stores component connections separately in the semantics, can also serve as a basis for designing and implementing adaptive connectors.
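
      A deadlock connection loop is a directed cycle in the component-connection graph. TPDCC itself is not specified in the abstract; a generic DFS-based cycle check over such a graph might look like this (a stand-in sketch, not the paper's algorithm):

```python
# Detect a deadlock loop (directed cycle) in a component-connection
# graph modeled as an adjacency dict. Standard three-color DFS.

def find_cycle(graph):
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {v: WHITE for v in graph}
    stack = []

    def dfs(v):
        color[v] = GRAY
        stack.append(v)
        for w in graph.get(v, []):
            if color.get(w, WHITE) == GRAY:       # back edge: cycle found
                return stack[stack.index(w):] + [w]
            if color.get(w, WHITE) == WHITE:
                cycle = dfs(w)
                if cycle:
                    return cycle
        stack.pop()
        color[v] = BLACK
        return None

    for v in list(graph):
        if color[v] == WHITE:
            cycle = dfs(v)
            if cycle:
                return cycle
    return None

# components A -> B -> C -> A form a deadlock connection loop
conns = {"A": ["B"], "B": ["C"], "C": ["A"], "D": ["A"]}
print(find_cycle(conns))  # ['A', 'B', 'C', 'A']
```

      TPDCC enumerates all such loops; DCEMRF then chooses which edges to cut, which this sketch does not attempt.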

    • Design Calculus Based Approach to Modeling Use Case

      2008, 19(10):2539-2549.

      Abstract:This paper proposes a formal approach to modeling use cases that captures requirements from multiple views: class diagrams, use-case sequence diagrams, use-case state diagrams, the specification mapping, and the system invariant. By defining the formal semantics of these views, each aspect of the requirements is given an exact formal description. As a result, an integrated specification of a method can be built by integrating the formal descriptions of its interaction specification and its functional specification. At the same time, properties of use-case models can be specified and analyzed through proofs in the design calculus. As an application, rules for checking the consistency of use-case models are studied. An example illustrates the feasibility of the proposed method.

    • Value Equality Analysis in C Program API Conformance Validation

      2008, 19(10):2550-2561.

      Abstract:In attempting to apply static analysis to API (application programming interface) conformance validation, a gap is noticed between the numeric-value-based temporal properties in API specifications and the variable-symbol-based predicates extracted through static analysis. Therefore, this paper first investigates value-equality relations between variable symbols in C programs, and then designs an ECS (equality class space) based value-equality analysis method. This flow-sensitive method maintains the correspondence between the variable-symbol domain and the numeric-value domain during API conformance validation and, in addition, effectively supports the optimization of subsequent analyses by hiding all information irrelevant to value equality.
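
      Tracking which variable symbols must hold equal values is commonly done with a union-find structure; the sketch below is a simplified stand-in for the paper's ECS idea (class name and API are hypothetical):

```python
# Track value-equality classes of variable symbols with union-find.

class EqualityClasses:
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, x, y):          # record an assignment like x = y
        self.parent[self.find(x)] = self.find(y)

    def equal(self, x, y):
        return self.find(x) == self.find(y)

ecs = EqualityClasses()
ecs.union("a", "b")   # a = b
ecs.union("b", "fd")  # b = fd, so a, b, fd now share one class
print(ecs.equal("a", "fd"))  # True
print(ecs.equal("a", "c"))   # False
```

      A flow-sensitive analysis would maintain one such class space per program point and merge or split classes as assignments are encountered.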

    • FJ Extended Calculus for Multi-Version Class Dynamic Update

      2008, 19(10):2562-2572.

      Abstract:Aiming to resolve the problem of type safety when dynamically updating O-O (object-oriented) software, a simple formal system, the MCUFJ calculus (multi-version class dynamic updatable calculus based on FJ (Featherweight Java)), is established with the goal of understanding the foundations of dynamically updating classes. MCUFJ is formulated as an extension of the core calculus Featherweight Java with an update operator. Multi-version classes allow objects of different versions to coexist. This study also discusses which kinds of change are type-safe, such as adding, deleting, or modifying methods/fields, or changing the types of methods/fields, and derives restrictions on type-safe updating. The paper proves these results formally. The calculus can serve as a foundation for dynamic updating in Java and other O-O languages.

    • Main Memory Database TPC-H Workload Characterization on Modern Processor

      2008, 19(10):2573-2584.

      Abstract:In 1999, the execution-time breakdown of database systems on modern computer platforms was analyzed by Ailamaki, et al. The primary motivation of such studies was to improve the performance of disk-resident databases (DRDBs), which have formed the mainstream of database systems until now; the typical benchmark used was TPC-C. However, continuing hardware advancements have "moved up" the memory hierarchy: larger and larger on-chip and off-chip caches, steadily increasing RAM space, the commercial availability of huge flash memory (solid-state disks) on top of regular disks, and so on. To reflect this trend, the target of workload-characterization research moves up the memory hierarchy as well. This paper focuses on main-memory databases (MMDBs) and the TPC-H benchmark. Unlike a DRDB, whose performance is I/O-bound and may be optimized by high-level mechanisms such as indexing, an MMDB is basically CPU- and memory-bound. The paper first compares the execution-time breakdown of a DRDB and an MMDB and proposes a strategy for optimizing memory-resident aggregation. It then explores the difference between column-oriented and row-oriented storage models in CPU and cache utilization, measures MMDB performance on different generations of CPUs, analyzes the influence of indexes, and gives a strategy for main-memory database index optimization. Finally, it analyzes each query of the full TPC-H benchmark in detail and obtains systematic results that help design micro-benchmarks for further analysis of CPU cache stalls. The results are expected to benefit the performance optimization of MMDBs and the architectural design of next-generation memory-oriented databases.
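
      The row-versus-column distinction the paper measures can be illustrated with a toy layout sketch (Python, illustrative only; the paper works with real CPU performance counters, not this simplification):

```python
# Row-oriented vs. column-oriented layouts for the same table.
# Column storage keeps each attribute contiguous, which is why
# scans touching one column tend to use the CPU cache better;
# this sketch only shows the two layouts, not the cache effect.

rows = [{"id": i, "price": i * 0.5, "qty": i % 7} for i in range(1000)]

columns = {
    "id":    [r["id"] for r in rows],
    "price": [r["price"] for r in rows],
    "qty":   [r["qty"] for r in rows],
}

# scanning one attribute touches every row object in the row layout...
total_row = sum(r["price"] for r in rows)
# ...but only one contiguous list in the column layout
total_col = sum(columns["price"])

print(total_row == total_col)  # True: same data, different layout
```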

    • Mining the Frequent Patterns in an Arbitrary Sliding Window over Online Data Streams

      2008, 19(10):2585-2596.

      Abstract:Because of the fluidity and continuity of data streams, the knowledge embedded in stream data is likely to change as time goes by. Thus, in most data-stream applications, people are more interested in information from recent transactions than from old ones. This paper proposes a method for mining the frequent patterns in an arbitrary sliding window over data streams. As the stream flows, its contents are captured in a compact prefix tree by scanning the stream only once, and obsolete and infrequent items are deleted by periodically pruning the tree. To differentiate the patterns of recently generated transactions from those of historical transactions, a time-decaying model is applied. Extensive simulations show that the proposed method is efficient and scalable, and superior to other analogous algorithms.
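
      The time-decaying idea can be sketched on plain item counts (a much-simplified stand-in: the paper maintains a prefix tree of patterns, and the decay factor and threshold below are hypothetical values):

```python
# Time-decayed frequency counts over a stream: each arriving batch
# multiplies old counts by a decay factor, so recent transactions
# dominate, and low counts are periodically pruned.

DECAY = 0.9          # per-batch decay factor (hypothetical)
THRESHOLD = 0.5      # minimum decayed count to keep an item

counts = {}

def process_batch(batch):
    # age existing counts, pruning items that decayed below threshold
    for item in list(counts):
        counts[item] *= DECAY
        if counts[item] < THRESHOLD:
            del counts[item]
    # then add the new transactions
    for tx in batch:
        for item in tx:
            counts[item] = counts.get(item, 0.0) + 1.0

process_batch([{"a", "b"}, {"a", "c"}])
process_batch([{"a"}, {"b"}])
print(sorted(counts))         # ['a', 'b', 'c']
print(round(counts["a"], 2))  # 2.8 (2 * 0.9 decayed, plus 1 new)
```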

    • Credible Association Rule and Its Mining Algorithm Based on Maximum Clique

      2008, 19(10):2597-2610.

      Abstract:Existing association-rule mining algorithms mainly rely on the support-based pruning strategy to prune the combinatorial search space. This strategy is not very effective for mining potentially interesting low-support patterns. To solve this problem, the paper presents a novel kind of association pattern called the credible association rule (CAR), in which every item has the same support level; the confidence directly reflects the credibility of the rule instead of the traditional support. The paper also proposes the MaxCliqueMining algorithm, which creates 2-item credible sets from an adjacency matrix and then generates all rules based on maximum cliques. Several propositions are proved that show the properties of CARs and the feasibility and validity of the algorithm. Experimental results on an alarm dataset and the Pumsb dataset demonstrate the effectiveness and accuracy of this method for finding CARs.
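
      The clique step can be illustrated with a plain Bron-Kerbosch enumeration over a graph of items whose pairwise sets passed the credibility test (a generic stand-in; the paper's MaxCliqueMining and its thresholds are not reproduced here):

```python
# Enumerate maximal cliques of the 2-item "credible pair" graph
# with an unpivoted Bron-Kerbosch recursion.

def bron_kerbosch(R, P, X, adj, out):
    if not P and not X:
        out.append(sorted(R))     # R is a maximal clique
        return
    for v in list(P):
        bron_kerbosch(R | {v}, P & adj[v], X & adj[v], adj, out)
        P.remove(v)
        X.add(v)

# adjacency of items whose pairwise credibility passed the threshold
adj = {
    "a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"},
    "d": {"c"},
}
cliques = []
bron_kerbosch(set(), set(adj), set(), adj, cliques)
print(sorted(cliques))  # [['a', 'b', 'c'], ['c', 'd']]
```

      Each maximal clique groups items that are pairwise credible, from which candidate rules can then be generated.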

    • Structural Query Expansion Based on Weighted Query Term for XML Documents

      2008, 19(10):2611-2619.

      Abstract:The main reason for low precision in information retrieval (IR) is that it is difficult for users to submit a precise query expression for their query intentions. Furthermore, XML documents have characteristics not only in their content but also in their structure, which makes submitting precise query expressions even harder. To solve this problem, this paper puts forward a new query-expansion method based on relevance feedback, which helps users construct a content-and-structure query expression that satisfies their intentions. The method has two steps: the first expands keywords to find weighted keywords that represent the user's intentions; the second performs structural expansion based on the weighted keywords. Finally, a full-fledged content-and-structure query is formalized. Experimental results show that the method obtains better retrieval results: the average precision at prec@10 and prec@20 is 30% higher than that of the original query.

    • Service Selection Approach Considering the Trustworthiness of QoS Data

      2008, 19(10):2620-2627.

      Abstract:As the number of web services with similar functions increases, QoS-based service selection at runtime has become an important research topic. Existing QoS-based selection approaches assume that the QoS data coming from service providers and users are valid and trustworthy, which is often not the case in reality. This paper proposes a service-selection approach that considers the trustworthiness of QoS data, classifying and computing QoS attributes according to the source of the data. For QoS attributes whose data come from service providers, statistics of past runtime data are used to revise the providers' claims; for QoS attributes whose data come from users, feedback similarity is used to weight the users' reports. Furthermore, an implementation framework and a set of experiments are given, which show that the approach can effectively weaken the influence of untrustworthy QoS data on service selection and thus strengthen its accuracy.
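
      The feedback-similarity idea can be sketched as down-weighting reports that disagree with the rest (a simplified stand-in; the paper's actual similarity measure is not given in the abstract, and all names below are hypothetical):

```python
# Weight each user's QoS report by its closeness to the mean of the
# other reports, so outlier (possibly untrustworthy) reports count less.

def similarity_weights(reports):
    weights = []
    for i, r in enumerate(reports):
        others = [x for j, x in enumerate(reports) if j != i]
        mean_others = sum(others) / len(others)
        weights.append(1.0 / (1.0 + abs(r - mean_others)))
    total = sum(weights)
    return [w / total for w in weights]

def trusted_score(reports):
    w = similarity_weights(reports)
    return sum(wi * ri for wi, ri in zip(w, reports))

# four consistent response-time ratings and one inflated outlier
reports = [4.0, 4.2, 3.9, 4.1, 9.0]
print(round(trusted_score(reports), 2))       # pulled back toward 4
print(round(sum(reports) / len(reports), 2))  # raw mean: 5.04
```

      The weighted score discounts the 9.0 outlier, whereas the raw mean is dragged well above the consistent reports.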

    • ServLoc: Secure Location Service for Wireless Sensor and Actuator Network

      2008, 19(10):2628-2637.

      Abstract:To provide a secure location service, this paper proposes a range-free secure localization protocol, the ServLoc localization protocol. By authenticating messages, having hidden actors passively receive localization requests, filtering false location reports, and so on, the ServLoc localization protocol can defend against location attacks, keep locations private, and locate sensor nodes in a distributed way. In addition, the paper proposes a voting-based location verification scheme, the ServLoc verification protocol, together with defenses against actor attacks. The analysis shows that the ServLoc verification protocol balances security against effective localization in WSANs, and that the ServLoc scheme can provide a secure location service even when the WSAN is under location attack.

    • A Method for Multipath Routing Based on Network Coding in Wireless Sensor Network

      2008, 19(10):2638-2647.

      Abstract:Reliability is crucial in many wireless sensor network (WSN) applications. Most existing approaches are redundancy-based, employing multipath or retransmission schemes; however, such designs often waste energy and thus shorten the network lifetime. To address this issue, this paper proposes an energy-aware method that applies a network-coding scheme to multipath routing. By encoding a group of data packets into independent new packets and transmitting them along multiple paths, the method offsets the effect of link failures with little extra overhead. Another strength of the design is that it needs only small-scale linear operations. An approximate method for effectively estimating the number of paths needed is also employed. Comprehensive simulations verify the theoretical results in the paper.
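
      The smallest instance of coding over multiple paths is XOR parity: send k data packets plus their XOR on separate paths, and any k of the k+1 packets recover the data. This is a toy GF(2) case, not the paper's more general random linear combinations:

```python
# Two data packets plus an XOR parity packet over three paths:
# losing any one packet still allows full recovery.

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

p1 = b"sensor-A"
p2 = b"sensor-B"
parity = xor_bytes(p1, p2)   # redundant packet sent on a third path

# suppose the path carrying p2 fails: recover it from p1 and parity
recovered = xor_bytes(p1, parity)
print(recovered == p2)  # True
```

      The paper generalizes this to more paths by mixing packets with random linear coefficients, which costs only small linear operations at each node.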

    • Multicast Fast Handover Algorithm Based on Neighbor Information Exchange

      2008, 19(10):2648-2658.

      Abstract:Handover latency and packet-loss rate are important criteria for determining whether a mobile multicast algorithm can serve real-time multicast applications. This paper proposes a multicast fast-handover algorithm based on neighborhood information exchange (M-FMIPv6/NIE). Before the L2 trigger event occurs, M-FMIPv6/NIE can configure the new care-of address (nCoA) and request the new access routers (ARs) to join the multicast tree. Performance analysis and simulation results indicate that the multicast service-disruption time of M-FMIPv6/NIE is shorter than that of existing multicast fast-handover algorithms, and that it performs well in buffer size and packet-loss rate.

    • Rectangle and Boomerang Attacks on DES

      2008, 19(10):2659-2666.

      Abstract:In spite of being replaced by AES (advanced encryption standard), DES (data encryption standard) still plays an important role as an encryption standard: DES and triple DES remain widely used in many areas, especially the financial sector. Recently, new cryptanalytic techniques have been introduced, of which the Rectangle attack and the Boomerang attack have proved very powerful, so it is necessary to re-evaluate their effect on DES. This paper examines the strength of DES against the Rectangle attack and the Boomerang attack. Using the best differential characteristics of DES, the paper obtains a Rectangle attack on up to 12-round DES and a Boomerang attack on 11-round DES. The Rectangle attack on 12-round DES requires 2^62 chosen plaintexts with time complexity equivalent to 2^42 12-round encryptions, while the Boomerang attack on 11-round DES requires 2^58 adaptively chosen plaintexts and ciphertexts with time complexity equivalent to 2^38 11-round encryptions. Because the differential characteristics used are the best known, these attacks are believed to be the best results the Rectangle and Boomerang attacks can achieve on DES.

    • An Integrated Indexing Structure for Large-Scale Cross-Media Retrieval

      2008, 19(10):2667-2680.

      Abstract:This paper proposes a novel integrated indexing structure for large-scale cross-media retrieval. First, a cross-reference graph (CRG) supporting cross-media retrieval is generated by hyperlink analysis of webpages; the CRG is then refined using users' relevance feedback. Retrieval proceeds in steps: when the user submits a query media object, candidate media objects are quickly identified by searching the cross-reference graph, and the distances of the candidate vectors are then computed to obtain the answer set. Analysis and experimental results show that the performance of the algorithm is superior to that of a sequential scan.

    • A Conceptual Framework for Developing Adaptive Pen-Based User Interface

      2008, 19(10):2681-2693.

      Abstract:This paper presents CFAPUI, a conceptual framework for developing adaptive pen-based user interfaces. CFAPUI propos