
11.6 Problems

11.1 ACID properties Which ACID properties are illustrated by the following transactions? (a) While a new row is being added to a table by one process, another process reads the entire table, but the new row is not included. (b) While a new row is being added to a table by one process, the process is interrupted and the transaction does not complete within a specified time-out period.

The transaction is rolled back. (c) A disk crash wipes out one copy of a database. Transactions are routed to a back-up location.

(d) A data aging process inspects the date at which data were entered and removes them if a lifetime tag associated with the data is exceeded. (e) A complicated transaction is broken into a series of actions so that if one action is interrupted, the series can restart at that point of interruption. However, other processes see no change to the database until the entire transaction is complete.

(f) A transaction requires intermediate quantities to be stored, which are deleted at the completion of the transaction.

11.2 Database layers and programming How do the database layers of physical, conceptual, and user levels correspond to the layering of computer code between assembly code and compiled C programs?

11.3 Resolution of uncommitted dependency Transaction B begins with an update request at time t1 and would ordinarily finish by time t3. In the meantime transaction A begins at time t2, with t1 < t2 < t3, with a request for retrieval. Illustrate how the locking protocol would operate to enable transaction A to reflect the changes made by transaction B, and discuss the cost in performance.
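A minimal sketch of how an exclusive write lock resolves this schedule, using Python threads as stand-ins for the two transactions (the sleep intervals and one-row table are illustrative assumptions, not part of the problem):

```python
import threading
import time

db = {"x": 0}                  # a one-row stand-in for the table
write_lock = threading.Lock()  # exclusive lock taken by the updater
events = []

def transaction_b():
    with write_lock:           # B acquires the write lock at t1
        events.append("B: lock acquired")
        time.sleep(0.2)        # B's update runs until t3
        db["x"] = 42
        events.append("B: committed")
    # lock released only after B commits

def transaction_a():
    time.sleep(0.05)           # A arrives at t2, with t1 < t2 < t3
    with write_lock:           # A blocks here until B releases the lock
        events.append(f"A: read x = {db['x']}")

tb = threading.Thread(target=transaction_b)
ta = threading.Thread(target=transaction_a)
tb.start(); ta.start()
tb.join(); ta.join()
```

A therefore reads B's committed value rather than an uncommitted one; the performance cost is the blocking interval t3 - t2, during which A sits idle.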

11.4 Replication for reduced communication Data replication can provide both robustness and faster response time. A further potential benefit is reduction of long-range communication costs.

While on the one hand it is undesirable to replicate all records, on the other hand it is clearly beneficial to replicate records that will be frequently accessed. Suppose that the access probabilities for remote data records are as follows. The probability of a record being randomly selected over its lifetime is 0.1. The probability of another access of a record, given a selection, is 0.7. The probability of a record being selected for the first time, given that a neighboring record has been selected, is also 0.7. Devise a replication strategy that will lead to minimal communication cost.
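One way to quantify the trade-off, under the assumption (my reading of the stated probabilities) that repeated accesses form a geometric sequence, is a copy-on-first-access sketch:

```python
p_first = 0.1    # record accessed at least once over its lifetime
p_again = 0.7    # another access, given an access just occurred
p_neigh = 0.7    # neighbor first accessed, given this record was accessed

# Expected accesses to a record once it has been touched:
# geometric series 1 + 0.7 + 0.7**2 + ... = 1 / (1 - 0.7)
exp_accesses = 1.0 / (1.0 - p_again)      # about 3.33

# Copy-on-first-access: the first access goes to the remote store and a
# local replica serves the rest, so expected remote transfers per touched
# record drop from exp_accesses to 1.
remote_saved = exp_accesses - 1.0         # about 2.33

# Prefetching a neighbor when a record is first touched avoids even the
# neighbor's first remote access, with expected benefit:
neigh_saved = p_neigh * exp_accesses      # about 2.33
```

Both savings exceed the single transfer a replica costs, which suggests replicating a record, together with its neighbors, on first access rather than replicating everything up front.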

11.5 Sufficient sampling A field has the spatial autocorrelation function sinc(2x/d), where d is some constant and x is the distance variable. (a) What is the maximum permitted separation of sensors to be able to reconstruct the field perfectly? (b) Why is sampling at exactly this frequency a poor idea in practice? (c) Why is perfect reconstruction not possible in real systems?

11.6 Data storage times Consider a three-tier sensor network.
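For part (a) of Problem 11.5, under the common convention (an assumption here) that sinc(u) = sin(πu)/(πu), the reasoning runs:

```latex
% Assuming \operatorname{sinc}(u) = \sin(\pi u)/(\pi u):
R(x) = \operatorname{sinc}\!\Big(\frac{2x}{d}\Big)
\quad\xrightarrow{\;\mathcal{F}\;}\quad
S(f) = \frac{d}{2}\,\operatorname{rect}\!\Big(\frac{f d}{2}\Big),
\qquad S(f) = 0 \ \text{for}\ |f| > \frac{1}{d},
```

so the field is band-limited to f_max = 1/d, and the Nyquist criterion gives a maximum sensor separation of Δx = 1/(2 f_max) = d/2.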

The first tier generates the raw data and keeps it for a period T1. It passes to the second tier its decisions and associated likelihoods. The second tier keeps these for a period T2 > T1.

It may also request raw data from the first tier within T1, with probability p1 = 0.1. If it does so, the data are stored for a period T2.

The second tier passes to the third tier its decisions and associated likelihoods. The third tier requests raw data with probability 0.8 if tier 2 had requested raw data, or 0.02 otherwise. Such a request, if it occurs, is generated with certainty within T2, uniformly distributed in time. (a) What value of T1 (with respect to T2) is required if raw data requests from the third tier must be met with probability 0.9? (b) Suppose that, in fact, requests to higher tiers for raw data are related to the closeness of the decisions (i.e., first and second choices have similar likelihoods).

Devise an adaptive procedure to adjust storage times, assuming costs are known for failing to meet data requests or for taking memory away from other node operations.

11.7 Refinement with age As data age they are progressively refined into more compact (lossy) representations.
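For part (a) of the data-storage-times problem (11.6), a closed-form sketch under my reading of the model: a tier-3 request preceded by a tier-2 request is always served (the raw data are kept a full T2), while any other request is served only if it arrives before T1 expires.

```python
def p_met(ratio, p1=0.1, p3_if=0.8, p3_else=0.02):
    """P(a tier-3 raw-data request is served), with ratio = T1 / T2."""
    a = p1 * p3_if          # tier 2 requested: raw kept a full T2, always served
    b = (1 - p1) * p3_else  # tier 2 did not request: raw kept only T1
    # uniform request time on [0, T2] => served with probability T1/T2
    return (a + b * min(ratio, 1.0)) / (a + b)

# Smallest T1/T2 that meets the 0.9 target under this model
a, b = 0.1 * 0.8, 0.9 * 0.02
ratio_min = (0.9 * (a + b) - a) / b   # roughly 0.46
```

The assumed helper `p_met` and the interpretation of "stored for a period T2" are mine; the point of the sketch is that most of the 0.9 target is already covered by the tier-2-requested branch, so T1 can be well under T2.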

Suppose, e.g., that data might be stored in three buffers: raw, reduced resolution, histogram.

The value per bit stored increases as the data are compressed, yet the value of the data prior to compression is generally higher than that after compression. Given fixed total memory, one optimization is to maximize the expected value by partitioning the data among these classes based upon their values and request probabilities. To simplify, suppose the compression ratios at each step are 100:1 and the histogram must be stored permanently as it is frequently accessed.

Thus the optimization reduces to partitioning the memory between the raw and reduced resolution data. Suppose that a raw data record has probability 0.1 of being requested within a period T of first being acquired, and 0.01 otherwise. The revenue per bit is 1 unit. Corresponding figures for the reduced resolution records are twice as large.

How should the memory be partitioned to maximize the expected value?

11.8 Interest diffusion Repeat the problem in Example 11.8 but with sink 1 having interest 4, sink 2 having interest 2, and the cost of each link being 1.

11.9 Adaptive sampling The sampling density required in time and space depends on the variability of the phenomenon, the noise, and the reconstruction method. Suppose linear interpolation is to be used and noise is negligible.

The function sin x is to be sampled.
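The error of linear interpolation grows with the square of the sample spacing and with the local curvature |f''(x)|, which for sin x is |sin x| itself; this is what makes adaptive density attractive. A small sketch that measures the worst-case error of uniform sampling (the interval [0, 2π] and spacing values are my choices):

```python
import math

def max_interp_error(h, probes_per_gap=200):
    """Worst |sin(x) - linear interpolant| on [0, 2*pi] with spacing h."""
    worst = 0.0
    x0 = 0.0
    while x0 < 2 * math.pi:
        x1 = x0 + h
        y0, y1 = math.sin(x0), math.sin(x1)
        for i in range(probes_per_gap + 1):
            t = i / probes_per_gap
            x = x0 + t * (x1 - x0)
            worst = max(worst, abs(math.sin(x) - (y0 + t * (y1 - y0))))
        x0 = x1
    return worst

coarse = max_interp_error(0.2)
fine = max_interp_error(0.1)
```

Halving h roughly quarters the error, consistent with the standard bound of (h²/8)·max|f''| per interval; an adaptive sampler can exploit this by spacing samples so that h(x)²·|sin x| stays roughly constant, placing samples densely near the extrema and sparsely near the zero crossings.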