A method of encoding data into a chain reaction code includes generating a set of input symbols from input data. Subsequently, one or more non-systematic output symbols are generated from the set of input symbols, each of the one or more non-systematic output symbols being selected from an alphabet of non-systematic output symbols, and each non-systematic output symbol being generated as a function of one or more of the input symbols. As a result of this encoding process, any subset of the set of input symbols is recoverable from (i) a predetermined number of non-systematic output symbols, or (ii) a combination of (a) input symbols which are not included in the subset of input symbols that are to be recovered, and (b) one or more of the non-systematic output symbols.

Patent: RE43741
Priority: Oct 05 2002
Filed: Nov 17 2011
Issued: Oct 16 2012
Expiry: Oct 01 2023
12. A method of encoding data into a chain reaction code having systematic output symbols and non-systematic output symbols, the method comprising:
generating, from the data, a set of input symbols, the input symbols comprising systematic output symbols;
computing systematic keys for the set of input symbols;
generating, from the set of input symbols and corresponding systematic keys, a plurality of intermediate input symbols;
encoding the plurality of intermediate input symbols into one or more non-systematic output symbols, wherein one or more intermediate input symbols are encoded into one non-systematic output symbol, wherein each of the one or more non-systematic output symbols is selected from an alphabet of non-systematic output symbols, and wherein each non-systematic output symbol is generated as a function of one or more of the input symbols,
wherein any subset of the set of input symbols is recoverable from (i) a predetermined number of non-systematic output symbols, or (ii) a combination of (a) input symbols which are not included in the subset of input symbols that are to be recovered and (b) one or more of the non-systematic output symbols;
wherein computing systematic keys for the plurality of input symbols comprises:
(i) computing L unique keys D(0)-D(L−1), wherein L is a predefined number;
(ii) constructing a modified decoding matrix having k columns and L rows, wherein k corresponds to the number of input symbols, and wherein for any value of j between 0 and L−1 the row entries along the jth row are computed as a function of the key D(j); and
(iii) solving the set of linear equations described by the modified decoding matrix, wherein the systematic keys are computed as a function of the solutions of the linear equations;
wherein computing L unique keys is done using a random number generator.
1. A method of encoding data into a chain reaction code having systematic output symbols and non-systematic output symbols, the method comprising:
generating, from the data, a set of input symbols, the input symbols comprising systematic output symbols;
computing systematic keys for the set of input symbols;
generating, from the set of input symbols and corresponding systematic keys, a plurality of intermediate input symbols;
encoding the plurality of intermediate input symbols into one or more non-systematic output symbols, wherein one or more intermediate input symbols are encoded into one non-systematic output symbol, wherein each of the one or more non-systematic output symbols is selected from an alphabet of non-systematic output symbols, and wherein each non-systematic output symbol is generated as a function of one or more of the input symbols,
wherein any subset of the set of input symbols is recoverable from (i) a predetermined number of non-systematic output symbols, or (ii) a combination of (a) input symbols which are not included in the subset of input symbols that are to be recovered and (b) one or more of the non-systematic output symbols;
wherein computing systematic keys for the plurality of input symbols comprises:
(i) computing L unique keys D(0)-D(L−1), wherein L is a predefined number;
(ii) constructing a decoding matrix having k columns and L rows, wherein k corresponds to the number of input symbols, and wherein each row corresponds to a function of the key D(j), wherein j is equal to a value between 0 and L−1;
(iii) initializing a set S to contain no symbols;
(iv) applying chain reaction decoding to the decoding matrix to identify an output node to be added to set S;
(v) adding the output node to set S;
(vi) updating the decoding matrix to remove the output node;
(vii) comparing size of set S to k;
(viii) if the size of set S is less than k, repeating steps (iv)-(vi); and
(ix) if the size of set S is equal to k, sorting the elements in set S from smallest to largest and using the sorted set S to create the systematic keys.
13. A method of decoding a chain reaction code having systematic output symbols and non-systematic output symbols into a set of input symbols, the input symbols comprising data which is sought, the method comprising:
providing a first subset of the set of input symbols, the first subset of input symbols comprising one or more systematic output symbols;
providing one or more non-systematic output symbols, wherein each non-systematic output symbol is selected from an alphabet of non-systematic output symbols, and wherein each non-systematic output symbol is generated as a function of one or more of the input symbols; and
recovering a remaining subset of the input symbols comprising one or more input symbols not included in the first set of input symbols, the remaining subset of input symbols recovered from: (i) a predetermined number of non-systematic output symbols; or (ii) a combination of (a) one or more input symbols from the first subset, and (b) one or more non-systematic output symbols;
wherein recovering a remaining subset of the input symbols comprises:
(i) creating a matrix b, wherein the number of rows in b corresponds to the number of provided non-systematic output symbols and wherein the number of columns in b corresponds to the number of input symbols;
(ii) creating a matrix c, wherein the number of rows in c corresponds to the number of systematic keys and wherein the number of columns in c corresponds to the number of input symbols;
(iii) creating a matrix A as the inverse of matrix c;
(iv) creating a matrix h from the product of b and A;
(v) creating a set E, wherein E is the set of indices of the non-provided systematic symbols;
(vi) creating a set r, wherein r is the set of indices of the provided systematic symbols;
(vii) dividing matrix h into sub-matrices He and Hr, wherein He corresponds to the indices of the systematic symbols not provided and wherein Hr corresponds to the indices of the systematic symbols provided;
(ix) creating vector y from the product of Hr with a vector formed by the provided systematic symbols;
(x) creating vector b from the provided non-systematic output symbols and vector y;
(xi) solving the system of equations for x, wherein the system of equations is He*x=y+b; and
(xii) using x to recover input symbols.
6. A method of encoding data into a chain reaction code having systematic output symbols and non-systematic output symbols, the method comprising:
generating, from the data, a set of input symbols, the input symbols comprising systematic output symbols;
computing systematic keys for the set of input symbols;
generating, from the set of input symbols and corresponding systematic keys, a plurality of intermediate input symbols;
encoding the plurality of intermediate input symbols into one or more non-systematic output symbols, wherein one or more intermediate input symbols are encoded into one non-systematic output symbol, wherein each of the one or more non-systematic output symbols is selected from an alphabet of non-systematic output symbols, and wherein each non-systematic output symbol is generated as a function of one or more of the input symbols,
wherein any subset of the set of input symbols is recoverable from (i) a predetermined number of non-systematic output symbols, or (ii) a combination of (a) input symbols which are not included in the subset of input symbols that are to be recovered and (b) one or more of the non-systematic output symbols;
wherein computing systematic keys for the plurality of input symbols comprises:
(i) computing L unique keys D(0)-D(L−1), wherein L is a predefined number;
(ii) determining whether it is possible that k symbols can be decoded using the L keys, wherein k corresponds to the number of input symbols;
(iii) if it is determined that k symbols cannot be decoded using the L keys, aborting the current attempt to compute systematic keys for the plurality of input symbols;
(iv) initializing a systematic set, a non-systematic set, and an unvisited set, wherein the systematic set is initialized to be empty, wherein the non-systematic set is initialized to be empty, and wherein the unvisited set is initialized to contain keys D(0)-D(L−1);
(v) removing a key, c, from the unvisited set;
(vi) determining whether it is possible that k symbols can be decoded using the union of the unvisited set and the systematic set;
(vii) if it is possible to decode k symbols in step (vi), adding key c to the non-systematic set;
(viii) if it is not possible to decode k symbols in step (vi), adding key c to the systematic set;
(ix) repeating steps (v)-(viii) until the systematic set contains at least k symbols; and
(x) using the systematic set as the systematic keys.
2. The method of claim 1 wherein computing L unique keys is done using a random number generator.
3. The method of claim 1 wherein computing L unique keys is done using a fixed-list of reusable keys.
4. A computer-readable medium comprising code for performing the method of claim 1.
5. An encoder with a processor and the computer-readable medium of claim 4.
7. The method of claim 6 wherein the current attempt to compute systematic keys is followed by another attempt to compute the systematic keys by restarting the method at step (i).
8. The method of claim 6 wherein computing L unique keys is done using a random number generator.
9. The method of claim 6 wherein computing L unique keys is done using a fixed-list of reusable keys.
10. A computer-readable medium comprising code for performing the method of claim 6.
11. An encoder with a processor and the computer-readable medium of claim 10.
14. The method of claim 13 wherein step (iii) creates the inverse matrix using Gaussian elimination.
15. The method of claim 13 wherein step (iii) creates the inverse matrix using chain reaction decoding.
16. The method of claim 13 wherein step (xi) solves the system of equations using Gaussian elimination.
17. The method of claim 13 wherein step (xi) solves the system of equations using chain reaction decoding.
18. The method of claim 13 wherein step (xi) solves the system of equations using inactivation decoding.
19. A computer-readable medium comprising code for performing the method of claim 13.
20. A decoder with a processor and the computer-readable medium of claim 19.

This application claims the benefit of U.S. Provisional Application No. 60/319,597 entitled “Systematic Encoding and Decoding of Chain Reaction Codes,” filed Oct. 5, 2002, the contents of which are herein incorporated by reference in their entirety for all purposes.

The present invention relates to systems and methods for encoding and decoding data of all types, and more particularly to systems and methods for encoding and decoding data using chain reaction codes.

Transmission of data between a sender and a recipient over a communications channel has been the subject of much literature. Preferably, but not exclusively, a recipient desires to receive an exact copy of data transmitted over a channel by a sender with some level of certainty. Where the channel does not have perfect fidelity (which covers most physically realizable systems), one concern is how to deal with data lost or garbled in transmission. Lost data (erasures) are often easier to deal with than corrupted data (errors) because the recipient cannot always tell when corrupted data is data received in error. Many error-correcting codes have been developed to correct for erasures and/or for errors. Typically, the particular code used is chosen based on some information about the infidelities of the channel through which the data is being transmitted and the nature of the data being transmitted. For example, where the channel is known to have long periods of infidelity, a burst error code might be best suited for that application. Where only short, infrequent errors are expected, a simple parity code might be best.

Another consideration in selecting a code is the protocol used for transmission. In the case of the Internet, a packet protocol is used for data transport. That protocol is called the Internet Protocol or “IP” for short. When a file or other block of data is to be transmitted over an IP network, it is partitioned into equal-size input symbols and input symbols are placed into consecutive packets. The “size” of an input symbol can be measured in bits, whether or not the input symbol is actually broken into a bit stream, where an input symbol has a size of M bits when the input symbol is selected from an alphabet of 2^M symbols. In such a packet-based communication system, a packet oriented coding scheme might be suitable.

A transmission is called reliable if it allows the intended recipient to recover an exact copy of the original file even in the face of erasures in the network. On the Internet, packet loss often occurs because sporadic congestion causes the buffering mechanism in a router to reach its capacity, forcing it to drop incoming packets. Protection against erasures during transport has been the subject of much study.

The Transport Control Protocol (“TCP”) is a point-to-point packet control scheme in common use that has an acknowledgment mechanism. Using TCP, a sender transmits ordered packets and the recipient acknowledges receipt of each packet. If a packet is lost, no acknowledgment will be sent to the sender and the sender will resend the packet. With protocols such as TCP, the acknowledgment paradigm allows packets to be lost without total failure, since lost packets can just be retransmitted, either in response to a lack of acknowledgment or in response to an explicit request from the recipient.

Although acknowledgment-based protocols are generally suitable for many applications and are in fact widely used over the current Internet, they are inefficient, and sometimes completely infeasible, for certain applications as described in Luby I.

One solution that has been proposed to solve the transmission problem is to avoid the use of an acknowledgment-based protocol, and instead use Forward Error-Correction (FEC) codes, such as Reed-Solomon codes, Tornado codes, or chain reaction codes, to increase reliability. The basic idea is to send output symbols generated from the content instead of just the input symbols that constitute the content. Traditional erasure correcting codes, such as Reed-Solomon or Tornado codes, generate a fixed number of output symbols for a fixed length content. For example, for K input symbols, N output symbols might be generated. These N output symbols may comprise the K original input symbols and N-K redundant symbols. If storage permits, then the server can compute the set of output symbols for each content only once and transmit the output symbols using a carousel protocol.

One problem with some FEC codes is that they require excessive computing power or memory to operate. Another problem is that the number of output symbols must be determined in advance of the coding process. This can lead to inefficiencies if the loss rate of packets is overestimated, and can lead to failure if the loss rate of packets is underestimated.

For traditional FEC codes, the number of possible output symbols that can be generated is of the same order of magnitude as the number of input symbols the content is partitioned into. Typically, but not exclusively, most or all of these output symbols are generated in a preprocessing step before the sending step. These output symbols have the property that all the input symbols can be regenerated from any subset of the output symbols equal in length to the original content or slightly longer in length than the original content.

“Chain Reaction Coding” as described in U.S. Pat. No. 6,307,487 entitled “Information Additive Code Generator and Decoder for Communication Systems” (hereinafter “Luby I”) and in U.S. patent application Ser. No. 10/032,156 entitled “Multi-Stage Code Generator and Decoder for Communication Systems” (hereinafter “Raptor”) represents a different form of forward error-correction that addresses the above issues. For chain reaction codes, the pool of possible output symbols that can be generated is orders of magnitude larger than the number of the input symbols, and a random output symbol from the pool of possibilities can be generated very quickly. For chain reaction codes, the output symbols can be generated on the fly on an as needed basis concurrent with the sending step. Chain reaction codes have the property that all input symbols of the content can be regenerated from any subset of a set of randomly generated output symbols slightly longer in length than the original content.

Other descriptions of various chain reaction coding systems can be found in documents such as U.S. patent application Ser. No. 09/668,452, filed Sep. 22, 2000 and entitled “On Demand Encoding With a Window” and U.S. patent application Ser. No. 09/691,735, filed Oct. 18, 2000 and entitled “Generating High Weight Output symbols Using a Basis.”

Some embodiments of a chain reaction coding system consist of an encoder and a decoder. Data may be presented to the encoder in the form of a block, or a stream, and the encoder may generate output symbols from the block or the stream on the fly. In some embodiments, for example those described in Raptor, data may be pre-encoded off-line using a static encoder, and the output symbols may be generated from the plurality of the original data symbols and the static output symbols.

In some embodiments of a chain reaction coding system, the encoding and the decoding process rely on a weight table. The weight table describes a probability distribution on the set of source symbols. That is, for any number W between 1 and the number of source symbols, the weight table indicates a unique probability P(W). It is possible that P(W) is zero for a substantial number of values of W, in which case it may be desirable to include only those weights W for which P(W) is not zero.
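As an illustration only (the particular numbers and names below are not taken from Luby I or Raptor), such a weight table might be represented and sampled as follows, with the zero-probability weights simply omitted:

```python
import random

# Hypothetical weight table for a small block: P(W) for the weights W that occur.
# Weights with P(W) = 0 are omitted, as suggested above.
weight_table = {1: 0.05, 2: 0.50, 3: 0.30, 4: 0.15}

def sample_weight(table, rng):
    """Draw a weight W according to the distribution P(W) given by the table."""
    weights, probs = zip(*sorted(table.items()))
    return rng.choices(weights, weights=probs, k=1)[0]

rng = random.Random(1234)   # seeded so that sender and receiver can reproduce the draws
print(sample_weight(weight_table, rng))
```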

In some embodiments of a chain reaction coding system the output symbols are generated as follows: for every output symbol a key is randomly generated. Based on the key, a weight W is computed from the weight table. Then a random subset of W source symbols is chosen. The output symbol will then be the XOR of these source symbols. These source symbols are called the neighbors or associates of the output symbol hereinafter. Various modifications and extensions of this basic scheme are possible and have been discussed in the above-mentioned patents and patent applications.
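A minimal sketch of this basic scheme is given below. The function names are illustrative, the symbols are modeled as small integers for brevity, and the way the key seeds the pseudorandom choices is an assumption; the actual key-to-weight and key-to-neighbor mappings of Luby I and Raptor are more involved.

```python
import random

def generate_output_symbol(key, source_symbols, weight_table):
    """Sketch: the key determines a weight W from the weight table and a random set of
    W source symbols (the neighbors, or associates); the output symbol is their XOR."""
    rng = random.Random(key)                  # the key drives all pseudorandom choices
    weights, probs = zip(*sorted(weight_table.items()))
    w = rng.choices(weights, weights=probs, k=1)[0]
    neighbors = rng.sample(range(len(source_symbols)), w)
    value = 0
    for i in neighbors:
        value ^= source_symbols[i]            # XOR of the chosen source symbol values
    return value, neighbors

# Usage: symbols modeled as integers; in practice they are fixed-length byte strings.
source = [0x11, 0x22, 0x33, 0x44, 0x55, 0x66]
table = {1: 0.1, 2: 0.5, 3: 0.4}
value, neighbors = generate_output_symbol(key=7, source_symbols=source, weight_table=table)
print(hex(value), neighbors)
```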

Once an output symbol has been generated, it may be sent to the intended recipients along with its key, or an indication of how the key may be regenerated. In some embodiments, many output symbols may make up one transmission packet, as for example described in the U.S. patent application Ser. No. 09/792,364, filed Feb. 22, 2001 and entitled “Scheduling of multiple files for serving on a server.”

In certain applications it may be preferable to transmit the source symbols first, and then to continue transmission by sending output symbols. Such a coding system is referred to herein as a systematic coding system. On the receiving side, the receiver may try to receive as many original input symbols as possible, replace the input symbols not received by one or more output symbols and use them to recover the missing input symbols. The transmission of output symbols may be done proactively, without an explicit request of the receiver, or reactively, i.e., in response to an explicit request by the receiver. For example, for applications where no loss or a very small amount of loss is anticipated, it might be advantageous to send the original input symbols first, and to send additional output symbols only in case of erasures. This way, no decoding needs to be performed if there were no losses. As another application, consider the transmission of a live video stream to one or more recipients. Where there is expectation of some loss, it may be advantageous to protect the data using chain reaction coding. Because of the nature of a live transmission, the receiver may be able to buffer a specific part of the data only for at most a predetermined amount of time. If the number of symbols received after this amount of time is not sufficient for complete reconstruction of data, it may be advantageous in certain applications to forward the parts of the data received so far to the video player. In certain applications, and where appropriate source coding methods are used, the video player may be able to play back the data in a degraded quality. In general, where applications may be able to utilize even partially recovered data, it may be advantageous to use a systematic coding system.

Straightforward modifications of embodiments of chain reaction coding systems as described in Luby I or Raptor to produce systematic coding systems generally lead to inefficiencies. For example, if in a chain reaction coding system the first transmitted symbols comprise the original symbols, then it may be necessary to receive a number of pure output symbols of the same order of magnitude as the number of original symbols in order to be able to recover the original data. In other words, reception of the original symbols may only minimally help the decoding process, so that the decoding process has to rely entirely on the other received symbols. This leads to an unnecessarily high reception overhead.

What is therefore needed is a systematic version of a chain reaction coding system, which has efficient encoding and decoding algorithms, and has a similar reception overhead as a chain reaction coding system.

The present invention provides systems and methods for encoding and decoding data using systematic chain reaction encoding and decoding processes. These systems and methods can be used in numerous applications, one being a data communication system in which data is communicated faster, more reliably, and with less computational expense.

In one embodiment of the present invention, a method of encoding data into a chain reaction code is presented. Initially, a set of input symbols is generated from the data. Subsequently, one or more non-systematic output symbols are generated from the set of input symbols, each of the one or more non-systematic output symbols being selected from an alphabet of non-systematic output symbols, and each non-systematic output symbol being generated as a function of one or more of the input symbols. As a result of this encoding process, any subset of the set of input symbols is recoverable from (i) a predetermined number of non-systematic output symbols, or (ii) a combination of (a) input symbols which are not included in the subset of input symbols that are to be recovered, and (b) one or more of the non-systematic output symbols.

Additional embodiments and features of the invention will be better understood in view of the following drawings and detailed description.

FIGS. 1A and 1B illustrate exemplary embodiments of a non-systematic chain reaction encoder and decoder, respectively.

FIG. 2 illustrates the generation of an output symbol from the original input symbols used in the chain reaction encoding process.

FIG. 3 illustrates an exemplary decoding graph used in the chain reaction decoding process.

FIG. 4 illustrates a decoding matrix for the decoding graph shown in FIG. 3.

FIG. 5 illustrates an exemplary procedure for obtaining a modified decoding graph used in the chain reaction decoding process.

FIG. 6 illustrates a modified decoding equation used in the chain reaction decoding process.

FIG. 7A illustrates an exemplary method for encoding data using systematic chain reaction codes in accordance with the present invention.

FIG. 7B illustrates an exemplary method for decoding systematic chain reaction codes in accordance with the present invention.

FIG. 7C illustrates a block diagram of a communications system employing systematic coding and decoding in accordance with one embodiment of the present invention.

FIG. 8A illustrates the operation of the systematic encoder in accordance with one embodiment of the present invention.

FIG. 8B illustrates the operation of the systematic decoder in accordance with one embodiment of the present invention.

FIG. 9A illustrates one embodiment of the systematic encoder in accordance with the present invention.

FIG. 9B illustrates one embodiment of the systematic decoder in accordance with the present invention.

FIG. 10 illustrates one method for generating the systematic keys in accordance with the present invention.

FIG. 11 illustrates a second method for generating the systematic keys in accordance with the present invention.

FIG. 12 illustrates a third method for generating the systematic keys in accordance with the present invention.

FIG. 13 illustrates a fourth method for generating the systematic keys in accordance with the present invention.

FIG. 14 illustrates a method for decoding a chain reaction code having systematic and non-systematic symbols in accordance with the present invention.

FIGS. 15-17 illustrate the encoding and decoding processes in an exemplary embodiment of the present invention.

For clarity and convenience, features and components which are identified in earlier drawings retain their reference numerals in subsequent drawings.

I. Non-Systematic Chain Reaction Encoder & Decoder

FIGS. 1A and 1B depict exemplary embodiments of a non-systematic chain reaction encoder 130 and decoder 170, respectively, as described in Luby I and Raptor. While not referred to as such in Luby I and Raptor, these embodiments are referred to herein as “non-systematic” to differentiate their architecture and operation from the systematic encoders and decoders presented below.

Referring now to FIG. 1A, the non-systematic encoder 130 accepts as input the symbols IS(0), IS(1), . . . , and keys I0, I1, . . . generated by key generator 120. The number of input symbols may or may not be known in advance. In some embodiments, the non-systematic encoder 130 generates an output symbol for each key I. In FIG. 1A the output symbols are denoted B(I0), B(I1), . . . , corresponding to the keys I0, I1, . . . . The number of generated output symbols is potentially limitless. Key generator 120 may have access to a random number generator from which it generates the keys. Alternatively, the keys I may be generated by some other mechanism. Encoder 130 may include static and dynamic encoders, as described for example in Raptor. It may have access to an additional key generator used to describe a static encoder.

There are various methods for obtaining the output symbols from the input symbols for which reference is made to Luby I and Raptor. One illustrative embodiment of such an encoding method is given in FIG. 2. It describes the generation of an output symbol 270 from the original input symbols. The original input symbols are denoted 210(a)-210(f). In some embodiments the first step of the coding process is static encoding, as described in Raptor. This step may produce the source symbols, denoted 220(a)-220(f), and 260(a)-260(c). In some embodiments, static encoding may be systematic, so that the values of the source symbols 220(a)-220(f) are equal to those of 210(a)-210(f). In some embodiments, there may be no static encoding, in which case the input symbols coincide with the source symbols. The source symbols can be produced off-line, or on-line, as the data symbols become available.

Once the source symbols have been created, the output symbols are generated from the source symbols. In some embodiments, the output symbol's value is the XOR of the values of some of the source symbols. For each output symbol, key generator 120 produces a key, from which the weight of the output symbol is determined from a weight table 250. Once the weight W is determined, W random or pseudorandom source symbols are chosen, and the value of the output symbol is computed as the XOR of the values of these source symbols. For example, in FIG. 2, the weight of the output symbol 270 is equal to 3 and its value is determined as the XOR of the source symbols 220(a), 220(d), and 260(b). The weight of an output symbol will also sometimes be referred to as the degree of the output symbol in this document. If a source symbol S contributes to the value of an output symbol O, then S and O are called neighbors. For example, in the situation depicted in FIG. 2 the output symbol 270 is a neighbor of each of the source symbols 220(a), 220(d), and 260(b).

Various embodiments of the chain reaction decoder 170 of FIG. 1B are described in detail in Luby I and Raptor. In some embodiments the decoding process starts as soon as enough output symbols have been collected. In some embodiments the number of collected output symbols is slightly larger than the number of original input symbols. In other embodiments, the number of collected output symbols needed to start the decoding process can be significantly smaller than the number of original input symbols.

In some embodiments, for each received output symbol, key regenerator 160 calculates the corresponding key for the symbol and, from the key, determines the neighboring source symbols.

One embodiment of a decoding process for chain reaction codes can be described in terms of the corresponding Decoding Graph, as exemplified in FIG. 3. This graph consists of two sets of nodes: the source nodes, corresponding to the source symbols, and the output nodes, corresponding to the received output symbols. An output node is connected to a source node if the source symbol corresponding to the source node is a neighbor of the output symbol corresponding to the output node. In this case said output node and said source node are called neighbors.

In some embodiments the decoding starts by identifying an output node O1 of degree one. Then the unique neighbor of O1 is declared recovered and is removed from the Decoding Graph, and the process is continued by identifying another output node O2 of degree one. For example, in the situation depicted in FIG. 3, O1 could be the output node denoted 330(a). Removal of its unique neighbor, 320(b), from the Decoding Graph leads to another output node of degree one, namely 330(c). The process continues until all the source nodes are recovered, or until there are no output nodes of degree one left.

For example, in the situation of FIG. 3, the following sequence of output nodes may be chosen to recover the corresponding source nodes:

Output node    Recovered source node
330(a)         320(b)
330(c)         320(e)
330(h)         320(h)
330(d)         320(i)
330(b)         320(a)
330(j)         320(f)
330(e)         320(c)
330(f)         320(g)
330(g)         320(d)

In this case decoding is successful.
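The peeling process described above can be sketched as follows. This is an illustrative toy decoder, not the scheduler of Luby I or Raptor: output nodes are given as (neighbor indices, value) pairs, and the node labels do not correspond to FIG. 3.

```python
def chain_reaction_decode(output_nodes, num_source):
    """Repeatedly pick an output node of degree one, recover its unique neighbor,
    and remove that source node from the graph, XORing its value out of all its neighbors.
    output_nodes: list of (list_of_source_indices, value) pairs."""
    nodes = [[set(nbrs), val] for nbrs, val in output_nodes]
    recovered = {}
    while len(recovered) < num_source:
        node = next((n for n in nodes if len(n[0]) == 1), None)
        if node is None:
            return None                      # no output node of degree one left: decoding fails
        (s,) = node[0]
        recovered[s] = node[1]
        for n in nodes:                      # remove source node s from the Decoding Graph
            if s in n[0]:
                n[0].discard(s)
                n[1] ^= recovered[s]
    return recovered

# Usage: 3 source symbols, 4 received output symbols (each value is the XOR of its neighbors).
src = [0xA, 0xB, 0xC]
outs = [([0], src[0]),
        ([0, 1], src[0] ^ src[1]),
        ([1, 2], src[1] ^ src[2]),
        ([0, 1, 2], src[0] ^ src[1] ^ src[2])]
print(chain_reaction_decode(outs, 3))        # {0: 10, 1: 11, 2: 12}
```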

In some embodiments, the graph interpretation may be used to set up a schedule for the actual computations required for the decoding, as illustrated in Luby I or Raptor. Moreover, the idealized decoder described above may be changed in a variety of ways to reduce the resources required, and to speed up the decoding process, as described in the above mentioned patents and patent applications.

In some embodiments, the decoder may output the sequence of output nodes that were used to recover the corresponding input nodes. For example, in the case outlined above, the decoder may output the indices corresponding to the output nodes 330(a), 330(c), 330(h), 330(d), 330(i), 330(b), 330(j), 330(e), 330(f), and 330(g).

It is sometimes advantageous to consider a matrix representation of the Decoding Graph, and an interpretation of the decoding algorithm in terms of this matrix, called the Decoding Matrix hereinafter. In some embodiments of the present invention the Decoding Matrix corresponding to the Decoding Graph has as many rows as there are output nodes, and as many columns as there are source nodes, and has entries 0 or 1. There is a 1 at position (k,j) of the Decoding Matrix if the j-th source node is a neighbor of the k-th output node.

FIG. 4 is an illustration of the Decoding Matrix for the Decoding Graph of FIG. 3. As is known to those skilled in the art, the decoding problem can be phrased in terms of solving a system of equations given by the Decoding Matrix. If M denotes the Decoding Matrix corresponding to the Decoding Graph, and if the vector of values of the output symbols is denoted by b, and if there are K source nodes, then the unknown source symbol values x1, x2, . . . , xK satisfy the matrix equation:
M·x=b,
where x is the column vector (x1, x2, . . . , xK). The chain reaction decoding is successful if there is a permutation of rows and columns of M such that the resulting matrix is a lower triangular matrix, i.e., such that the values in the matrix above the main diagonal are zero. For example, by performing the permutation (3→2, 8→3, 2→5, 10→6, 5→7, 6→8, 7→9) on the rows, and the permutation (2→1, 5→2, 8→3, 9→4, 1→5, 3→7, 7→8, 4→9) on the columns of M, a lower triangular matrix is produced. Stated in terms of matrices, this means that the chain reaction decoding algorithm produces permutation matrices P and Q such that P·M·Q is a lower triangular matrix. There are various methods for solving a system of linear equations, as is known to those of skill in the art. For example, it is possible to use the Gaussian elimination algorithm.
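For illustration only, the system M·x=b can be solved over GF(2) by Gaussian elimination as sketched below. Symbol values are shown as single bits here; with multi-bit symbols the row operations would XOR entire symbol values. This is not the scheduling-based decoder of Luby I or Raptor, just a generic solver.

```python
def solve_gf2(M, b):
    """Solve M·x = b over GF(2) by Gaussian elimination.
    M: list of rows (lists of 0/1); b: list of right-hand-side values (bits or whole symbols).
    Returns x, or None if no unique solution exists."""
    rows, cols = len(M), len(M[0])
    M = [row[:] for row in M]
    b = b[:]
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] == 1), None)
        if pivot is None:
            return None                       # column without a pivot: rank deficient
        M[r], M[pivot] = M[pivot], M[r]
        b[r], b[pivot] = b[pivot], b[r]
        for i in range(rows):
            if i != r and M[i][c] == 1:       # eliminate column c from every other row
                M[i] = [a ^ p for a, p in zip(M[i], M[r])]
                b[i] ^= b[r]
        r += 1
    return b[:cols]                           # after elimination, row j holds x_j

# Usage: three unknowns, three equations over GF(2).
M = [[1, 0, 0],
     [1, 1, 0],
     [0, 1, 1]]
b = [1, 0, 1]
print(solve_gf2(M, b))                        # [1, 1, 0]
```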

The matrix view of the decoding is for illustrative purposes only and not restrictive. In particular, the actual operations of the decoder may differ substantially from the preceding discussions, as described in Luby I, Raptor, and the above mentioned patent applications.

In some embodiments, if a multi-stage chain reaction coding system is used, as described in Raptor, the Decoding Graph may be augmented by a secondary graph which describes the relationship among the source symbols given by the particular static encoding used. For example, where a low-density parity-check code is used for the static encoding process, then a number of output nodes equal to the number of check symbols in this code may be added to the Decoding Graph, their value set to 0, and the Decoding Graph may be augmented by the graph of the low-density parity-check code between the source nodes and the check nodes, and the Decoding Graph may be replaced by the new graph. The choice of low-density parity-check codes is not essential to this application. In general, for any type of static encoding, the corresponding parity-check matrix defines a bipartite graph by which the Decoding Graph may be augmented. This new graph will be referred to as the Modified Decoding Graph hereinafter.

FIG. 5 is an illustrative embodiment of a procedure for obtaining the Modified Decoding Graph. The source nodes are denoted 510(a)-510(f), the output nodes are denoted 520(a)-520(g), and the check nodes are denoted 530(a)-530(d). The source nodes correspond to the source symbols. The graph between the output nodes and the source nodes is the Decoding Graph, given by the neighborhood structure of the output nodes. The graph between the check nodes and the source nodes describes the relations among the source nodes. For example, node 530(a) shows that the XOR of the values of the source symbols corresponding to source nodes 510(a), 510(b), 510(e), and 510(f) is zero.

To the Modified Decoding Graph corresponds a Modified Decoding Matrix consisting of zeros and ones, which has as many columns as there are source nodes, and as many rows as the aggregate number of output nodes and check nodes. Correspondingly, the Modified Decoding Matrix consists of two sets of rows, one corresponding to the output nodes, and one corresponding to the check nodes. Where there are L output nodes, C check nodes, and K source nodes, the Modified Decoding Matrix may be decomposed into a submatrix Mo consisting of L rows and K columns, and a matrix Mc consisting of C rows and K columns. If x1, . . . , xK denote the unknown values of the source symbols, and b1, . . . , bL denote the known values of the received output symbols, the task of the decoder may be to solve the system of equations given by Mo·x=b and Mc·x=0. The combined system of equations would be the one given in FIG. 6.
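In matrix form, the combined system described above (presumably the system depicted in FIG. 6) amounts to stacking the two sub-matrices; the notation below is a reconstruction from the text, not a copy of the figure:

\[
\begin{pmatrix} M_o \\ M_c \end{pmatrix} \cdot x
=
\begin{pmatrix} b \\ 0 \end{pmatrix},
\qquad
x = (x_1, \ldots, x_K)^{T},
\quad
b = (b_1, \ldots, b_L)^{T},
\]

where the zero block on the right-hand side has C entries, one for each check node.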

In some embodiments of a chain reaction decoder a different decoder, called an Inactivation Decoder, may be used. This Decoder is described in greater detail in the commonly assigned co-pending U.S. patent application Ser. No. 10/459,370, entitled “Systems and Process for Decoding a Chain Reaction Code through Inactivation,” herein incorporated by reference, and referred to as the “Inactivation Decoder.”

II. Systematic Chain Reaction Encoder & Decoder and Methods of Operation

FIG. 7A illustrates an exemplary method for encoding data using systematic chain reaction codes in accordance with the present invention. As used herein, the term “output symbol(s)” refers to symbols of a chain reaction code, examples of which are described in Luby I and Raptor. Systematic and non-systematic output symbols are, accordingly, specific types of chain reaction code symbols, a systematic output symbol comprising a transmitted input symbol, and a non-systematic output symbol comprising an output symbol that is a function of one or more input symbols.

The method of FIG. 7A may be used in a variety of applications, such as encoding data for transmission across a real-time channel, for example a path through the Internet, a broadcast link from a television transmitter to a television recipient, or a telephone connection from one point to another. The communications channel can also be a storage channel, such as one or multiple CD-ROMs, disk drives, Web sites, or the like. The communications channel might even be a combination of a real-time channel and a storage channel, such as a channel formed when one person transmits an input file from a personal computer to an Internet Service Provider (ISP) over a telephone line, the input file is stored on a Web server, and is subsequently transmitted to a recipient over the Internet.

Referring now to FIG. 7A, the encoding process begins at 702 when a set of input data is received, and a set of input symbols is generated therefrom. Exemplary embodiments of this process are described in Luby I and Raptor, although other techniques may be used in alternative embodiments under the present invention. As described in this document and those referred to or incorporated herein by reference, the input data may be of any format and type, including live data in which the entire set is not known a priori.

Next, one or more non-systematic output symbols are generated from the input symbols. In a particular embodiment of that process, intermediate input symbols are initially generated from the input symbols (704). Subsequently, one or more non-systematic output symbols are generated from the intermediate input symbols (706). In alternative embodiments under the invention, the process of 706 may be omitted and the non-systematic output symbols are generated from the input symbols. Each of these processes is illustrated in greater detail below.

As will be further described below, the input symbols are provided by an input symbol generator for the input data. As explained above, the input data may be data obtained in real-time from a secondary device, such as a video capture module, or it can be static, for example when the input data resides in a file or a buffer created by a secondary application. In other applications of the present invention the input data may be acquired by a combination of a real-time and a static method, for example by receiving the data from a secondary device or application, such as a network card, and storing it on a storage device for further processing by the input symbol generator.

FIG. 7B illustrates an exemplary method for decoding systematic chain reaction codes in accordance with the present invention. Initially, at 712, a first subset of the input symbols is acquired. The application will ordinarily determine how this process is accomplished. For instance, when used in a communication system, this process is performed by receiving input symbols of a chain reaction code transmitted across a communication channel. As explained above, in particular embodiments of the present invention a communications channel can be a real-time channel, a storage channel, or a combination thereof. In a particular embodiment further illustrated below, acquisition of the input symbols is accomplished by transmitting the input symbols to a receiver, the transmitted input symbols comprising systematic output symbols. Because of expected channel losses, some of the transmitted input symbols (i.e., the systematic output symbols) may be lost. Accordingly, only a subset of the original set of input symbols may be acquired by the receiver.

Next at 714, one or more non-systematic output symbols are acquired. Typically, the acquisition of the non-systematic output symbols will follow the same modality as the input symbols, although other means may be used in alternative embodiments.

The method continues at 716, where one or more of the input symbols which were not acquired are recovered. In a specific embodiment of this process, the missing input symbols may be recovered either from the non-systematic output symbols, or from a combination of non-systematic output symbols and the acquired input symbols.

The recovery process at 716 may be used to recover one, several, or all of the missing input symbols. Once the desired number of missing input symbols is recovered, they may be added to the acquired input symbols to re-form the original set of input symbols, and accordingly, a copy of the original data.

FIG. 7C is a block diagram of an exemplary communications system 700 that uses systematic coding and decoding in accordance with one embodiment of the present invention. In the communication system 700, an input file 721, or an input stream 725, is provided to an input symbol generator 726. Input symbol generator 726 generates a sequence of one or more input symbols (IS(0), IS(1), IS(2), . . . ) from the input file or stream, with each input symbol having a value and a position (denoted in FIG. 7 as a parenthesized integer). As explained above, the set of possible values for an input symbol, i.e., its alphabet, is typically an alphabet of 2^M symbols, so that each input symbol codes for M bits of the input file. The value of M is generally determined by the use of communication system 700, but a general purpose system might include a symbol size input for input symbol generator 726 so that M can be varied from use to use. The output of input symbol generator 726 is provided to a systematic encoder 728.
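A minimal sketch of an input symbol generator for the common case where M is a multiple of 8 is shown below; the function name and the zero-padding convention are assumptions for illustration, not details taken from the patent.

```python
def generate_input_symbols(data: bytes, symbol_size: int):
    """Split data into input symbols of symbol_size bytes (M = 8 * symbol_size bits),
    zero-padding the last symbol.  Returns a list of (position, value) pairs."""
    padded = data + b"\x00" * (-len(data) % symbol_size)
    return [(i // symbol_size, padded[i:i + symbol_size])
            for i in range(0, len(padded), symbol_size)]

# Usage
print(generate_input_symbols(b"hello world", symbol_size=4))
# [(0, b'hell'), (1, b'o wo'), (2, b'rld\x00')]
```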

The non-systematic key generator 727 generates keys I0, I1, I2, . . . corresponding to the input symbols provided to the encoder 728, the non-systematic keys being used to compute the values of the non-systematic output symbols B(I0), B(I1), B(I2), . . . output from the encoder 728. Each non-systematic key I0, I1, I2, . . . is generated so that a large fraction of the keys for the same input file are unique. In one embodiment, the non-systematic key generator 727 comprises the key generator 120 illustrated in FIG. 1A above and described in Luby I and Raptor, although in other embodiments another type of device operable to generate non-systematic keys may be used.

Systematic key generator 730 generates systematic keys C0, C1, C2, . . . corresponding to the input symbols provided to the encoder 728, these keys being used to recover one or more of the input symbols not received, as will be further described below. It may use random numbers generated by random number generator 735 to generate the keys. The generation of the systematic keys will be subsequently described in greater detail. The outputs of non-systematic key generator 727 and the systematic key generator 730 are provided to encoder 728.

From each non-systematic key I provided by the non-systematic key generator 727, encoder 728 generates a non-systematic output symbol, with a value B(I), from the input symbols provided by the input symbol generator. The non-systematic output symbol generated may be that as described in Luby I (single stage encoding/decoding) or the output symbol described in Raptor (multiple stage encoding/decoding). The operation of an exemplary systematic encoder 728 will be described in more detail below. The value of each output symbol is generated based on its key, and on some function of one or more of the input symbols.

In some embodiments, the number K of input symbols is used by the systematic encoder 728 to select the associates. If K is not known in advance, such as where the input is a streaming file, K can be just an estimate. The value K might also be used by systematic encoder 728 to allocate storage for input symbols and any intermediate symbols generated by systematic encoder 728.

Systematic encoder 728 forwards the input symbols IS(0), IS(1), . . . , together with the systematic keys C0, C1, . . . , CK−1, or an indication of how to regenerate the systematic keys, to transmit module 740. When transmitted, the symbols IS(0), IS(1), . . . are herein referred to as “systematic output symbols”. Systematic encoder 728 may create a copy of the input symbols for the generation of further output symbols before forwarding the input symbols to the transmit module.

Systematic encoder 728 also provides the non-systematic output symbols B(I0), B(I1), B(I2), . . . to transmit module 740. Transmit module 740 is also provided the non-systematic keys (I0, I1, I2, . . . ) for each such output symbol from the non-systematic key generator 727. Transmit module 740 transmits the systematic and non-systematic output symbols, and depending on the keying method used, transmit module 740 might also transmit some data about the keys of the transmitted output symbols, over a channel 745 to a receive module 750. Channel 745 is assumed to be an erasure channel, but that is not a requirement for proper operation of communication system 700. Modules 740, 745 and 750 can be any suitable hardware components, software components, physical media, or any combination thereof, so long as transmit module 740 is adapted to transmit output symbols and any needed data about their keys to channel 745 and receive module 750 is adapted to receive symbols and potentially some data about their keys from channel 745. The value of K, if used to determine the associates, can be sent over channel 745, or it may be set ahead of time by agreement of encoder 728 and decoder 755.

As explained above, channel 745 can be a real-time channel, such as a path through the Internet or a broadcast link from a television transmitter to a television recipient or a telephone connection from one point to another, or channel 745 can be a storage channel, such as a CD-ROM, disk drive, Web site, or the like. Channel 745 might even be a combination of a real-time channel and a storage channel, such as a channel formed when one person transmits an input file from a personal computer to an Internet Service Provider (ISP) over a telephone line, the input file is stored on a Web server and is subsequently transmitted to a recipient over the Internet.

Receive module 750 receives the non-systematic and/or systematic output symbols from the channel 745 which it supplies to a decoder 755. Data corresponding to the keys of the received output symbols are provided to the non-systematic key regenerator 760, and the systematic key regenerator 780. In the illustrated embodiment of FIG. 7, a set of systematic output symbols denoted by IS(x), IS(y), . . . , IS(z) is received along with a set of non-systematic output symbols B(Ia), B(Ib), B(Ic), . . . In alternative embodiments, the receive module 750 may receive systematic output symbols exclusively, or a combination of systematic and non-systematic output symbols.

The non-systematic key regenerator 760 regenerates the non-systematic keys for the received non-systematic output symbols and provides these keys to the systematic decoder 755. In one embodiment, the non-systematic key regenerator 760 comprises the key regenerator 160 illustrated in FIG. 1B above and described in Luby I and Raptor, although in other embodiments another type of device operable to regenerate non-systematic keys may be used. Systematic key regenerator 780 regenerates the systematic keys C0, C1, . . . and provides them to the systematic decoder 755. The systematic key regenerator 780 may have access to some shared information with the systematic key generator 730 which facilitates the regeneration of the systematic keys. Alternatively, systematic key regenerator 780 may regenerate the keys based on additional information transmitted through channel 745. In some embodiments, systematic key regenerator 780 may have access to the same random number generator 735 which may be used to generate the systematic keys. This can be in the form of access to the same physical device if the random numbers are generated on such device, or in the form of access to the same algorithm for the generation of random numbers to achieve identical behavior.

Decoder 755 uses the non-systematic keys provided by non-systematic key regenerator 760 and the systematic keys provided by systematic key regenerator 780, together with the corresponding output symbols, to recover the input symbols (again IS(0), IS(1), IS(2), . . . ). The recovered input symbols are forwarded to the input file reassembler 765. Systematic decoder 755 may forward the received systematic output symbols IS(x), IS(y), . . . , IS(z) directly to the input file reassembler 765, before recovering the remaining input symbols. In particular, if all input symbols are received, the decoder may choose to just forward the received data to input file reassembler without further computation. Input file reassembler 765 generates a copy 770 of input file 721 or input stream 725.

In the following the operations of the systematic encoder 728 and decoder 755 will be described in greater detail. In some embodiments of the present invention these units may use chain reaction encoding and decoding, as described above.

FIG. 8A illustrates the operation of the systematic encoder 728 in a specific embodiment of the invention. Initially, the systematic encoder 728 receives the input symbols IS(0), IS(1), . . . , IS(K−1) from input symbol generator 726 in FIG. 7. The input symbols may be known in their entirety at the start of the encoding, or they may only be partially known.

In this embodiment, the systematic encoder 728 has access to the non-systematic key generator 727, which generates as many non-systematic keys I0, I1, . . . as the number of non-systematic output symbols generated. In addition, the systematic key generator 730 generates as many systematic keys C0, C1, . . . , CK−1 as there are input symbols. Systematic encoder 728 passes the original input symbols to the transmit module 740, these symbols being transmitted as the systematic output symbols. The systematic encoder 728 also operates to generate non-systematic output symbols B(I0), B(I1), . . . for each of the keys I0, I1, . . . generated by non-systematic key generator 727. The operation of the systematic key generator 730 is further described below.

Systematic key generator 730 and systematic key regenerator 780 (FIG. 7) may have access to some shared information so systematic key regenerator 780 can succeed in generating the same keys as the systematic key generator 730. In some embodiments the shared information may be transmitted to the systematic key regenerator 780. In other embodiments the systematic keys may be a deterministic function of other parameters of the code, e.g., the number of input symbols and the weight table.

In some embodiments, the systematic keys may have been pre-computed for some or all relevant values of the number of input symbols. In some embodiments, the systematic keys may be re-used for different sets of input symbols. In other embodiments, the systematic keys may be re-computed for every input block, using some shared information between the systematic key generator 730 and the systematic key regenerator 780.

FIG. 8B illustrates the operation of the systematic decoder 755 in a specific embodiment of the invention. Systematic decoder 755 receives systematic and non-systematic output symbols from receive module 750 denoted as IS(x), IS(y), . . . , IS(z), and B(Ia), B(Ib), . . . , respectively. In a particular embodiment, systematic decoder 755 has access to the systematic key regenerator 780, and to non-systematic key regenerator 760. The output of the systematic chain reaction decoder is the set of initial input symbols IS(0), IS(1), . . . , IS(K−1).

FIG. 9A illustrates the systematic encoder 728 in more detail. The systematic encoder 728 includes a chain reaction decoder 910, and a chain reaction encoder 920. Additionally, it may have access to a memory device (not shown) to store intermediate symbols S(0), S(1), . . . , S(K−1).

Upon receiving the input symbols IS(0), IS(1), . . . , IS(K−1), and the systematic keys C0, C1, . . . , CK−1, chain reaction decoder 910 computes a set of intermediate input symbols S(0), S(1), . . . , S(K−1) using, for example, the decoding methods for chain reaction codes described in the patents and patent applications incorporated herein. In some embodiments of the present invention the intermediate input symbols may be stored in memory, or on disk. In other embodiments, the intermediate input symbols may be forwarded to chain reaction encoder 920 as they become available.

Chain reaction encoder 920 uses the intermediate input symbols generated by chain reaction decoder 910, together with non-systematic keys I0, I1, I2, . . . generated by non-systematic key generator 727, to generate non-systematic output symbols B(I0), B(I1), . . . . In some embodiments, this encoding process may be accomplished using the input symbol encoding process described in either Luby I or Raptor, with the modification that the intermediate input symbols of the present invention are used as the input symbols of Luby I. In a particular embodiment the non-systematic output symbols are supplied to the transmit module 740 after the input symbols IS(0), IS(1), . . . , IS(K−1). This is, however, not essential to the functioning of this invention. Further, the order of transmission from the transmit module 740 may vary as well.
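The flow of FIG. 9A can be illustrated with a deliberately tiny, hand-built example. The mapping from a key to its set of intermediate-symbol neighbors is fixed by hand here (in the real system it comes from the key, the weight table, and a pseudorandom generator), and the "decoding" step reduces to back-substitution only because the hand-picked systematic vectors happen to be triangular; chain reaction decoder 910 performs this step in general.

```python
# Hypothetical encoding vectors: key -> indices of intermediate symbols whose XOR is the symbol value.
SYSTEMATIC_VECTORS = {0: {0}, 1: {0, 1}, 2: {1, 2}}        # systematic keys C0, C1, C2
NONSYSTEMATIC_VECTORS = {10: {0, 2}, 11: {0, 1, 2}}        # non-systematic keys I0, I1

def encode(vectors, key, intermediate):
    value = 0
    for i in vectors[key]:
        value ^= intermediate[i]
    return value

def systematic_encode(input_symbols):
    """Sketch of FIG. 9A: recover intermediate symbols S so that encoding S under the
    systematic keys reproduces the input symbols, then encode S under non-systematic keys."""
    S = [0, 0, 0]
    S[0] = input_symbols[0]           # C0:  S0        = IS(0)
    S[1] = input_symbols[1] ^ S[0]    # C1:  S0 ^ S1   = IS(1)
    S[2] = input_symbols[2] ^ S[1]    # C2:  S1 ^ S2   = IS(2)
    non_systematic = {k: encode(NONSYSTEMATIC_VECTORS, k, S) for k in NONSYSTEMATIC_VECTORS}
    return S, non_systematic

IS = [0xA, 0xB, 0xC]
S, B = systematic_encode(IS)
# The systematic property: re-encoding S under the systematic keys reproduces the input symbols.
assert [encode(SYSTEMATIC_VECTORS, k, S) for k in (0, 1, 2)] == IS
print(S, B)
```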

FIG. 9B is an illustrative embodiment of the systematic decoder 755, which includes a chain reaction decoder 930 and a chain reaction encoder 940. The input to the systematic decoder includes the received output symbols, some of which comprise the received systematic output symbols IS(x), IS(y), IS(z), . . . , and some of which may comprise received non-systematic output symbols B(Ia), B(Ib), . . . . In some embodiments, the decoder may copy the received systematic symbols to a memory device, and directly forward them to input file reassembler 765.

Chain reaction decoder 930 uses the symbols IS(x), IS(y), . . . , IS(z), B(Ia), B(Ib), . . . , the systematic keys Cx, Cy, . . . , Cz generated by the systematic key regenerator 780, and the non-systematic keys Ia, Ib, . . . generated by non-systematic key regenerator 760 to produce intermediate input symbols S(0), S(1), . . . , S(K−1). The systematic keys Cx, Cy, . . . , Cz correspond to the received input symbols IS(x), IS(y), . . . , IS(z). In some embodiments, the recovered intermediate symbols may be stored to a secondary storage before being passed to the chain reaction encoder 940. In other embodiments, these intermediate symbols may be passed directly to the chain reaction encoder 940.

Chain reaction encoder 940 uses the intermediate input symbols and the systematic keys Cu, Cv, . . . , Cw corresponding to erased systematic output symbols IS(u), IS(v), . . . , IS(w) to generate and output the missing original input symbols IS(u), IS(v), . . . , IS(w). As an exemplary embodiment, for each of the initial keys Cu, Cv, . . . , Cw, the decoder identifies a weight W and W symbols among the intermediate input symbols S(0), . . . , S(K−1), and XORs the values of these symbols to obtain the erased input symbols IS(u), IS(v), . . . , IS(w) corresponding to the systematic keys Cu, Cv, . . . , Cw. The amount of computational resources used by chain reaction encoder 940, in one embodiment, will be proportional to the number of systematic output symbols that are erased. For example, if all the systematic output symbols are received, then the decoder may not perform any computations, and forward the received symbols to input file reassembler 765.
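A minimal sketch of the regeneration step performed by chain reaction encoder 940 is shown below; the mapping from a systematic key to its neighbor indices is passed in explicitly here, whereas in the real system it is derived from the key and the weight table.

```python
from functools import reduce
from operator import xor

def regenerate_erased(intermediate, erased_keys, neighbor_sets):
    """For each erased systematic key, XOR the corresponding W intermediate input symbols
    to regenerate the missing input symbol."""
    return {k: reduce(xor, (intermediate[i] for i in neighbor_sets[k])) for k in erased_keys}

# Usage: intermediate symbols S(0)..S(3); hypothetical neighbor sets for erased keys Cu and Cv.
S = [0x1, 0x2, 0x4, 0x8]
neighbors = {"Cu": [0, 2], "Cv": [1, 2, 3]}
print(regenerate_erased(S, ["Cu", "Cv"], neighbors))   # {'Cu': 5, 'Cv': 14}
```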

In particular embodiments, the chain reaction encoder 940 and chain reaction decoder 910 will have access to the same weight table, and use the same static encoding/decoding, if static encoding is used. Similarly, chain reaction encoder 920 and chain reaction decoder 930 may have access to the same weight table, and use the same static encoding/decoding.

Methods for Calculating the Systematic Keys

In a specific embodiment of the present invention, the systematic keys are calculated by systematic key generator 730 before symbol transmission, and re-computed by the systematic key regenerator 780 after symbol reception. The systematic keys are used by the chain reaction decoders 910 and 930 to obtain the intermediate input symbols S(0), S(1), . . . , S(K−1).

In particular embodiments of the present invention the systematic keys are calculated in such a way that unique and efficient chain reaction decoding of K symbols is possible using exactly K output symbols generated with these keys. Here decoding can be any of the decoding methods described in Luby I, Raptor, or Inactivation Decoding, or more generally decoding methods based on the Gaussian elimination algorithm as for example described in Inactivation Decoding.

FIG. 10 is an exemplary embodiment of the systematic key generation process. One input to the systematic key generator may be the number K of input symbols IS(0), IS(1), . . . , IS(K−1). Systematic key generation starts by setting a variable j equal to 0. During the algorithm a matrix M with K columns, which, initially, has zero rows, is updated by adding rows as the algorithm progresses. For every different value of j the algorithm generates a different key D(j) at 1020. This key may be generated by the methods described in Luby I or Raptor, and may use the random number generator 135 shown in FIG. 1. Next at 1030, the key D(j) is used to compute the entries of the j-th row of the matrix M. One possible embodiment of such a computation would be to use key D(j) in the chain reaction coding process. In this case, using the weight table, the key D(j) identifies a weight W and W values among the values 0, 1, . . . , K−1. It then may set a 1 at position m of the jth row of M if m is one of the random or pseudorandom values generated, and set the other values of the jth row to zero.

At 1040, a determination is made as to whether the presently configured matrix M has K rows that are linearly independent over the binary field GF(2), the binary field GF(2) referring to the set consisting of 0 and 1 in which multiplication and addition are performed modulo the integer 2. This determination in 1040 can be performed in a variety of ways. For example, Gaussian elimination over the binary field GF(2) could be used to check this. However, there are many other ways, as is known to those skilled in the art. For example, if the teachings of Inactivation Decoding are applied to the matrix M, then M contains K linearly independent rows only if the Inactivation Decoder applied to M is successful.

If the test in 1040 is positive, and rows r(0), r(1), . . . , r(K−1) of M are discovered to be linearly independent, then the systematic keys C0, C1, . . . , CK−1 are set to the keys D(r(0)), . . . , D(r(K−1)), and the keys are output. If the test in 1040 is negative, then the counter j is incremented in 1060, and the computation is repeated from 1020 on.
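
The first method can be sketched as follows (illustrative only). Rows of the matrix M are held as K-bit integers, so the linear-independence test of 1040 reduces to an incremental Gaussian elimination over GF(2). The helper row_from_key is a hypothetical stand-in for the weight-table lookup that maps a key D(j) to the positions set to 1 in the jth row, and make_key stands in for the key generation of 1020.

import random

def row_from_key(key, k):
    # Hypothetical stand-in for the weight-table lookup: derive a weight W
    # and W distinct positions in 0..k-1 from the key, returned as a k-bit mask.
    rng = random.Random(key)
    w = rng.randint(1, k)                      # the real W comes from the weight table
    return sum(1 << p for p in rng.sample(range(k), w))

def generate_systematic_keys(k, make_key):
    # Sketch of the FIG. 10 loop: keep adding rows derived from fresh keys
    # until k of them are linearly independent over GF(2); the keys of those
    # rows become the systematic keys C0, ..., C(k-1).
    basis = {}                                 # pivot bit -> (reduced row, key of that row)
    j = 0
    while len(basis) < k:
        key = make_key(j)                      # key D(j), e.g. from a random number generator
        row = row_from_key(key, k)
        for pivot in sorted(basis, reverse=True):   # reduce against higher pivots first
            if row >> pivot & 1:
                row ^= basis[pivot][0]
        if row:                                # row is independent of those kept so far
            basis[row.bit_length() - 1] = (row, key)
        j += 1
    return [key for _, key in basis.values()]

Here make_key(j) might, for example, return j itself or a value drawn from a seeded pseudorandom generator, so that a receiver could reproduce the same sequence of keys.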

Other equivalent or substantially similar methods of generating the systematic keys can be envisioned by those skilled in the art. For example, instead of generating the keys D(j) one at a time during the course of the algorithm, a set of L such keys could be generated beforehand, and key D(j) could be taken from this pool of keys at step j of the algorithm. Herein, L could be a function of the number of input symbols.

A second method for generating the systematic keys is exemplified in FIG. 11. In this method, the input to this algorithm consists of the number K of input symbols, and a number L which is typically larger than or equal to K. In some embodiments, L may be the number of output symbols to be collected to guarantee, with high probability, that the decoding is successful, as described in Luby I or Raptor.

At 1110, L keys D(0), . . . , D(L−1) are generated. This process may be accomplished through the use of a random number generator 735. In other embodiments, these keys may be generated from a fixed list of re-usable keys. This process may also provide an indication of how the keys were generated. For example, if a random number generator is used, the seed for the generator may be recorded for future use by the systematic key regenerator.

Using the keys D(0), D(1), . . . , D(L−1) a Modified Decoding Graph is set up in 1120 as described above and exemplified in FIG. 5. This process may employ the knowledge of the specific weight table for the code, as well as the knowledge of any static encoding used, as described in Raptor.

At 1130, the Modified Decoding Graph is decoded using any of the methods presented earlier. As a by-product of the decoding, the indices r(0), r(1), . . . , r(K−1) of those output nodes that trigger the recovery of an input node are recorded. At 1140, the systematic keys are output as C0=D(r(0)), . . . , CK−1=D(r(K−1)).

FIG. 12 illustrates a third method for computing the systematic keys. Similar to the method of FIG. 11, the keys D(0), . . . , D(L−1) are generated in 1210, and the Decoding Graph is set up using these keys, and possibly the weight table. Next, a set S is initialized as the empty set in 1230. The set S will contain the indices of those output symbols which are used in the chain reaction decoding process to recover the value of an input node. In 1240, the chain reaction decoding process is applied to the Decoding Graph by identifying an output node of degree one. The index of this output node is added to the set S in accordance with the above-mentioned role of this set. A test is performed at 1250 as to whether the set S already has the right number of elements. If not, the algorithm loops back to 1240, where another output node of degree one is chosen to continue the decoding process. If the size of S is K, then the elements of S are sorted, starting with the smallest element, to yield the sorted elements S0, . . . , SK−1, and the systematic keys are calculated as C0=D(S0), . . . , CK−1=D(SK−1) in 1260.
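
The loop of FIG. 12 can be sketched as a standard peeling pass over the Decoding Graph (illustrative only). The graph is supplied as neighbor_lists, where entry j holds the input indices connected to output node j; in the actual system these lists would be derived from key D(j) and the weight table.

from collections import defaultdict

def select_triggering_outputs(k, neighbor_lists):
    # Sketch of the FIG. 12 method: run chain reaction (peeling) decoding and
    # record, in the set S, the index of every output node that triggers the
    # recovery of an input node.
    unresolved = [set(nbrs) for nbrs in neighbor_lists]
    uses = defaultdict(list)                   # input index -> output nodes touching it
    for j, nbrs in enumerate(unresolved):
        for i in nbrs:
            uses[i].append(j)
    ripple = [j for j, nbrs in enumerate(unresolved) if len(nbrs) == 1]
    S = []
    while ripple and len(S) < k:
        j = ripple.pop()
        if len(unresolved[j]) != 1:
            continue                           # stale entry: its last neighbor was recovered elsewhere
        i = unresolved[j].pop()                # output node j triggers recovery of input node i
        S.append(j)
        for jj in uses[i]:                     # peel input node i out of the remaining output nodes
            unresolved[jj].discard(i)
            if len(unresolved[jj]) == 1:
                ripple.append(jj)
    if len(S) < k:
        raise ValueError("chain reaction decoding failed")
    return sorted(S)                           # sorted indices S0 <= ... <= S(K-1)

The systematic keys are then C0=D(S0), . . . , CK−1=D(SK−1), as in 1260.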

FIG. 13 illustrates a fourth method for computing systematic keys in accordance with the present invention. In this method it is assumed that a decoding algorithm is available which, on input K and a set of keys, can decide whether the original K symbols are decodable from the given set of keys. Examples of such algorithms are provided by the decoders described in Luby I, Raptor, or Inactivation Decoding.

At 1310, L keys D(0), . . . , D(L−1) are generated. Similar to the above description, this process may be accomplished through the use of a random number generator 735, or the keys may be generated from a fixed set of re-usable keys. At 1315, the decoder is used to decide whether or not it is possible to decode the K symbols from the set of keys D(0), . . . , D(L−1). If decoding is not successful, then the given set of keys does not contain the systematic keys as a subset, and the algorithm aborts at 1325. Otherwise, three sets are initialized at 1330. These sets are called Systematic, Non_Systematic, and Unvisited, respectively. At the end of the algorithm, the set Systematic will contain the set of systematic keys. Originally, at 1330, the sets Systematic and Non_Systematic are initialized to empty sets, while the set Unvisited contains all the original keys D(0), . . . , D(L−1). At processes 1335 through 1360, a key is removed from the set Unvisited and a decoding attempt is made on the keys contained in the sets Systematic and Unvisited. If the attempt is successful, then the chosen key C does not belong to the set of systematic keys. On the contrary, if decoding is not successful, then the key does belong to the set of systematic keys. The procedure consisting of removal of an unvisited key and decoding (1335), a test as to whether decoding was successful (1340), and the subsequent addition of the chosen key to the set Systematic or Non_Systematic based on the outcome of the decoder (1345 and 1350) is repeated as long as the set Systematic contains fewer elements than the number K of original input symbols.
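
The following sketch captures the key-classification loop of FIG. 13 (illustrative only). The predicate decodable(k, keys) is a hypothetical wrapper around any decoder of the kind mentioned above; it merely reports whether K symbols can be decoded from the given key set.

def systematic_keys_by_elimination(k, keys, decodable):
    # Sketch of the FIG. 13 method: a key is systematic exactly when removing
    # it from the pool (given the keys kept so far) makes decoding fail.
    if not decodable(k, keys):
        raise ValueError("K symbols are not decodable from D(0)..D(L-1); abort (1325)")
    systematic, non_systematic = [], []
    unvisited = list(keys)
    while len(systematic) < k and unvisited:
        key = unvisited.pop()                          # remove one unvisited key (1335)
        if decodable(k, systematic + unvisited):       # still decodable without it? (1340)
            non_systematic.append(key)                 # key is not needed (1350)
        else:
            systematic.append(key)                     # key is essential (1345)
    return systematic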

FIG. 14 illustrates a method for decoding a chain reaction code having systematic and non-systematic symbols in accordance with the present invention. At 1410, non-systematic keys Ia, Ib, . . . corresponding to the received non-systematic output symbols B(Ia), B(Ib), . . . are used to generate a matrix B which has as many rows as there are received non-systematic output symbols and as many columns as there are input symbols. For each key, the same mechanism as for encoding chain reaction codes is used to generate a weight W and a set J1, J2, . . . , JW of indices of input symbols from which the output symbol corresponding to the key is generated. Then, in the corresponding row of the matrix B, the positions corresponding to J1, J2, . . . , JW are set to 1, while the other positions in that row are set to 0. The procedure is repeated until all keys corresponding to non-systematic received symbols are exhausted.

Next, at 1420, a similar procedure is applied to construct, from the systematic keys C0, C1, . . . , CK−1, a square matrix C with as many rows and columns as the number of input symbols. This process also computes the inverse of the matrix C, called A. Computing the inverse of C can be performed in a variety of ways, as is known to those of skill in the art. For example, a Gaussian elimination algorithm can be used to calculate A. In other embodiments, a version of chain reaction decoding can be utilized to perform this step. This is further illustrated in an example later in this disclosure.

At 1430, the product of the matrices B and A is calculated over the binary field GF(2) to obtain a matrix H. Next, at 1440, two sets of indices E and R are determined: E is the set of indices of the non-received systematic symbols, while R is the set of indices of the received systematic symbols. For example, assume there are 11 input symbols with indices 0, 1, 2, . . . , 10. If, after the transmission, the systematic symbols corresponding to the indices 0, 3, 9, 10 are received, then R={0,3,9,10}, while E={1,2,4,5,6,7,8}. The matrix H, computed in 1430 as the product of B and A, is then subdivided into two submatrices HE and HR: HE is the submatrix of H obtained by taking the columns of H corresponding to the indices of the systematic symbols not received, and HR is the submatrix of H corresponding to the indices of the received systematic symbols. In the example above, HE would be the submatrix of H formed by the columns 1, 2, 4, 5, 6, 7, and 8 of H.

At 1450, the matrix HR is multiplied with the vector formed by the received systematic symbols IS(x), IS(y), . . . , IS(z). For example, in the scenario above, HR would be multiplied with the values of the systematic symbols 0, 3, 9, 10 (in this ordering). The actual multiplication can be performed in a variety of ways, as is known to those skilled in the art. The result of this multiplication, called the vector y in the following, may be stored for future use. At 1460, the non-systematic received output symbols are used to set up a vector b. Where there are L such symbols, the number of entries in the vector b is L. This step may only be logical; in other words, it may not require any computations. Next, the result of the previous multiplication, stored in the vector y, is component-wise XOR'd with the entries of the vector b, i.e., each of the non-systematic received output symbols is XOR'd with the corresponding symbol of the vector y. The result of this operation may be stored in place of the received non-systematic symbols, or it may be stored at a different location.

Once this XOR has been computed, a system of linear equations is set up using the matrix HE corresponding to the erased systematic symbols. The solution x of the system HE*x=y+b then corresponds to the values of the erased systematic symbols. These values are output in 1470. Again, this process can be performed in a variety of ways, for example using Gaussian elimination, or any of the variants of chain reaction decoding disclosed in Luby I, Raptor, or Inactivation Decoding.
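
The matrix formulation of FIG. 14 can be sketched as follows (illustrative only). Matrix rows are held as K-bit integers, symbol values as equal-length byte strings, and neighbors(key, k) is again a hypothetical stand-in for the weight-table selection. The sketch assumes that the matrix C is invertible and that enough non-systematic symbols were received for the final system to be solvable, as discussed above.

def gf2_inverse(rows, n):
    # Invert an n x n GF(2) matrix given as n integers (bit j of a row = column j).
    a, inv = list(rows), [1 << i for i in range(n)]
    for col in range(n):
        p = next(r for r in range(col, n) if a[r] >> col & 1)   # assumes invertibility
        a[col], a[p] = a[p], a[col]
        inv[col], inv[p] = inv[p], inv[col]
        for r in range(n):
            if r != col and a[r] >> col & 1:
                a[r] ^= a[col]
                inv[r] ^= inv[col]
    return inv

def xor_into(acc, sym):
    for i, b in enumerate(sym):
        acc[i] ^= b

def decode_missing_systematic(k, sys_keys, received_sys, nonsys_keys,
                              received_nonsys, neighbors):
    # sys_keys must be listed in order C0..C(K-1) so that row i of C describes IS(i);
    # received_sys maps a systematic index to its symbol value; received_nonsys
    # lists the non-systematic symbol values in the same order as nonsys_keys.
    B = [sum(1 << j for j in neighbors(key, k)) for key in nonsys_keys]   # 1410
    C = [sum(1 << j for j in neighbors(key, k)) for key in sys_keys]      # 1420
    A = gf2_inverse(C, k)
    H = [0] * len(B)                                                      # 1430: H = B*A
    for i, brow in enumerate(B):
        for j in range(k):
            if brow >> j & 1:
                H[i] ^= A[j]
    R = sorted(received_sys)                                              # 1440
    E = [j for j in range(k) if j not in received_sys]
    rhs = []                                                              # 1450/1460: y XOR b
    for i, hrow in enumerate(H):
        acc = bytearray(received_nonsys[i])
        for j in R:
            if hrow >> j & 1:
                xor_into(acc, received_sys[j])
        rhs.append(acc)
    # 1470: solve H_E * x = y + b by Gauss-Jordan elimination over GF(2).
    HE = [sum(1 << c for c, j in enumerate(E) if hrow >> j & 1) for hrow in H]
    for col in range(len(E)):
        p = next(r for r in range(col, len(HE)) if HE[r] >> col & 1)
        HE[col], HE[p] = HE[p], HE[col]
        rhs[col], rhs[p] = rhs[p], rhs[col]
        for r in range(len(HE)):
            if r != col and HE[r] >> col & 1:
                HE[r] ^= HE[col]
                xor_into(rhs[r], rhs[col])
    return {j: bytes(rhs[c]) for c, j in enumerate(E)}                    # erased index -> value

In the example above with eleven input symbols, received_sys would map the indices 0, 3, 9, and 10 to their symbol values, and the function would return the values of the seven erased systematic symbols.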

This matrix view of the decoding is for illustrative purposes only and not restrictive. Many variations of this decoding procedure will become apparent to those of skill in the art upon review of this disclosure.

III. Exemplary Systematic Coding and Decoding

A brief example of some aspects of the operation of some embodiments of a systematic chain reaction coding system will now be given with reference to FIGS. 15-17. In all the examples given, the effect of the weight table is stated only implicitly, in terms of the list of neighbors of a given symbol given its key.

Computing the Systematic Keys

FIG. 15A describes a Decoding Graph used to obtain systematic keys C0, C1, . . . , C8. It is assumed that 12 keys D(0), D(1), . . . , D(11) have already been generated, for example by the operation in 1110 of FIG. 11. The graph in FIG. 15A describes the Modified Decoding Graph between the input nodes denoted 1520(a), . . . , 1520(i), and the output nodes denoted 1530(a), . . . , 1530(l), using the keys D(0), . . . , D(11). Chain reaction decoding may now be applied to this graph to obtain the systematic keys as the keys of those output nodes which trigger the recovery of an input node in the course of chain reaction decoding.

In operation, node 1530(a) may be used to recover the input node 1520(b). Accordingly, the first systematic key C0 is then equal to the first of the generated keys, namely D(0). Recovery of input node 1520(b) causes output node 1530(c) to become of degree 1, and hence to trigger recovery of node 1520(e). Continuing in this way, it can be seen that the nodes colored light gray in FIG. 15A can be used to recover the input nodes. The sequence of output nodes used to recover the input nodes is equal to 1530(a), 1530(b), 1530(c), 1530(d), 1530(e), 1530(f), 1530(g), 1530(h), 1530(j). As a result, the sequence of systematic keys may be chosen as shown in FIG. 15B.

It should be noted that the recovery process for the illustrated chain reaction decoding is only conceptual. In particular, no XOR operation is performed in this particular example.

Systematic Encoding

As outlined in FIG. 9A, a systematic chain reaction encoder consists of a chain reaction decoder 910 and a chain reaction encoder 920. Accordingly, the operation of systematic chain reaction encoding is divided into two parts. These two parts are exemplified in FIG. 16A and FIG. 16B, respectively.

FIG. 16A exemplifies the operation of the chain reaction decoder 910. The input symbols are denoted by IS(0), . . . , IS(8). The keys C0, C1, . . . , C8 are used to set up the graphical dependency between the input symbols and the intermediate input symbols S(0), . . . , S(8). For example, the key C0 shows that IS(0) is equal to the value of S(1), while the key C4 shows that IS(4) is equal to the XOR of the values of S(2), S(5), and S(7). Chain reaction decoding can now be applied to obtain the values S(0), S(1), . . . , S(8). The schedule to obtain these values may have been forwarded to the chain reaction decoder 910 from the systematic key generator 730 in FIG. 7, since this schedule was set up to obtain the keys C0, C1, . . . , C8. Unlike the operation of the systematic key generator, this step may employ XOR'ing the values of the individual symbols.

In the example of FIG. 16A the schedule may first produce the value of S(1), which in turn may produce the value of S(4) using the value of IS(1). This triggers the recovery of the values of S(0), and S(7), etc.

FIG. 16B exemplifies the operation of the chain reaction encoder 920 in FIG. 9A by showing the generation of the first 11 non-systematic output symbols O(0), . . . , O(10). (The illustrated output symbols O(i) refer to the previously described output symbols B(Ii).) As was described before, the output of the systematic encoder consists of the systematic output symbols IS(0), . . . , IS(8), followed by the output symbols O(0), . . . , O(10), . . . . This particular ordering is only exemplary, and other orderings can be used in alternative embodiments under the present invention.

Systematic Decoding

FIGS. 17A and 17B exemplify an embodiment of the process of systematic chain reaction decoding. It is assumed that the received systematic output symbols are IS(1), IS(6), and IS(7), while the received non-systematic output symbols are O(0), O(3), O(4), O(6), O(7), O(8), O(9), and O(10). The task of the decoder is to compute the values of the missing systematic output symbols, i.e., the values IS(0), IS(2), IS(3), IS(4), IS(5), and IS(8). FIG. 17A is an example of how the chain reaction decoder 930 and the chain reaction encoder 940 in FIG. 9B may be combined into one decoder. In some applications, such a combination may lead to computational savings.

Using the keys C1, C6, and C7 corresponding to the received systematic output symbols, and the keys corresponding to the received non-systematic output symbols, a graph is set up between the received output symbols, and the intermediate input symbols S(0), . . . , S(8). A connecting line is drawn between an output symbol and all the intermediate input symbols whose XOR yields the value of the output symbol. The individual connections are the same as the ones shown in FIG. 16A and FIG. 16B. The particular ordering of the received output symbols may not be equal to the ordering chosen to represent the Decoding Graph.

This graph is extended by another layer of nodes, corresponding to the erased systematic output symbols. This graph corresponds to the upper part of FIG. 17A, in which the input symbols IS(0), IS(2), IS(3), IS(4), IS(5), and IS(8) are connected via dotted lines to those intermediate input symbols of which they are the XOR. Again, these connections may be verified against the corresponding connections in FIG. 16A.

The process of decoding in this particular example may start by applying chain reaction decoding to the lower graph; every time one of the intermediate symbols is recovered, its value may be XOR'd into the values of all the neighbors of this symbol among the non-received original symbols in the upper part of the figure. Originally, the values of these symbols may be set to zero.

For example, output symbol O(4) may be used to recover the value of S(3). The value of S(3) may then be XOR'd into the current value of IS(5). After this step, the value of IS(5) is equal to that of S(3). Recovery of S(3) reduces the degree of the output node O(10) to one. This output node in turn recovers the value of the intermediate symbol S(6). This value is XOR'd into the current value of IS(5), so that after this step the value of IS(5) is recovered. The process may continue until all the non-received systematic input symbols are recovered.

FIG. 17B illustrates the process by which the missing output symbols are recovered. The recovered symbols are framed in rectangles. The recovered systematic output symbols are framed in gray rectangles. The labels of the edges in this figure describe the symbols used for the recovery.

For example, symbol O(4) is used to recover S(3). Symbol O(10) is used to recover S(6). S(3) and S(6) together recover IS(5). Recovery of S(6) triggers the recovery of S(8) (using O(9)) and the recovery of S(0) (using the received systematic output symbol IS(7)). Recovery of S(8) triggers the recovery of IS(3). Recovery of S(0) triggers the recovery of S(4) (using IS(1)). On the other hand, using O(0), the recovery of S(8) triggers that of S(1), which together with S(4) recovers IS(2). Furthermore, recovery of S(1) leads to recovery of IS(0), since these values are identical. Using O(8), and the recovered value of S(4), the value of S(5) is obtained. This, in turn, recovers the value of IS(8), since the latter is the XOR of S(5), S(4), and S(0), and all these values are known at this stage. Using IS(6) and S(4), the value of S(7) is obtained. Using O(7), this recovers the value of S(2), which together with S(7) recovers the value of the last remaining input symbol, namely IS(4).

The foregoing description has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and obviously many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the invention and its practical application to thereby enable others skilled in the art to best utilize the invention in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims appended hereto.

Documents Herein Incorporated by Reference:

U.S. Pat. No. 6,307,487 to Michael G. Luby, entitled “Information Additive Code Generator and Decoder for Communication Systems” (referred to herein as Luby I);

U.S. patent application Ser. No. 09/792,364, filed Feb. 22, 2001, entitled “Scheduling of Multiple Files for Serving on a Server”;

U.S. patent application Ser. No. 10/032,156, filed Dec. 21, 2001, entitled “Multi-Stage Code Generator and Decoder for Communication Systems” (referred to herein as “Raptor”); and

U.S. patent application Ser. No. 10/459,370, filed Jun. 10, 2003, entitled “Systems and Processes for Decoding Chain Reaction Codes through Inactivation” (referred to herein as “Inactivation Decoding”).
