A device includes a first memory circuit and a processing circuit. The first memory circuit is configured to store first hash data. The processing circuit is coupled to the first memory circuit. The processing circuit is configured to: at least based on a volume of the device, define a size of a distinguishable identification (id) and a size of second hash data; based on a combination of at least one bit of each of the distinguishable id and ids of the device, generate the second hash data; and compare the first hash data with the second hash data, in order to identify whether the device is tampered. A method is also disclosed herein.
1. A device, comprising:
a first memory circuit configured to store first hash data; and
a processing circuit coupled to the first memory circuit and configured to:
at least based on a volume of the device, define a size of a distinguishable identification (id) and a size of second hash data;
based on a bit sequence formed by at least one bit of each of the distinguishable id and ids of the device in sequence, generate the second hash data; and
compare the first hash data with the second hash data, in order to identify whether the device is tampered.
15. A method, comprising:
receiving first hash data that has a defined size associated with a volume of a device;
combining at least one bit of each of a distinguishable identification (id), a program id, and a factory id of the device in sequence, to form id data based on a bit sequence formed by the at least one bit of each of the distinguishable id, the program id, and the factory id of the device in sequence;
processing the id data through a selected hash algorithm, to generate second hash data; and
comparing the first hash data with the second hash data, to perform an authentication of the device.
8. A method, comprising:
calculating a bit length of a distinguishable identification (id) based on at least one parameter that is associated with a volume of a device;
combining at least one bit of each of a factory id, the distinguishable id, and a program id of the device in sequence, to generate first hash data having a bit length that is defined based on the volume of the device based on a bit sequence formed by the at least one bit of each of the factory id, the distinguishable id, and the program id of the device in sequence; and
initiating authentication of the device, when the first hash data is matched with second hash data.
2. The device of
the size of the distinguishable id is defined further based on a collision rate of the ids and 0/1 probability, and
the size of the second hash data is defined further based on the collision rate of the ids.
3. The device of
based on the volume of the device and a collision rate of the ids, define a size of the first hash data,
wherein the size of the first hash data is equal to the size of the second hash data.
4. The device of
select a hash algorithm; and
based on the selected hash algorithm, process the distinguishable id and the ids, to generate the first hash data.
5. The device of
select the at least one bit of each of the distinguishable id, a factory id and a program id of the ids; and
combine the selected at least one bit of each of the distinguishable id, the factory id and the program id of the ids.
6. The device of
a second memory circuit coupled to the processing circuit, and configured to store the distinguishable id and the ids comprising a factory id and a program id.
7. The device of
9. The method of
10. The method of
calculating the bit length of the first hash data based on the volume of the device and a collision rate of ids,
wherein the bit length of the first hash data is the same as a bit length of the second hash data.
11. The method of
comparing the first hash data with the second hash data, to perform an off-line authentication.
12. The method of
selecting at least one bit of the distinguishable id, at least one bit of the factory id and at least one bit of the program id, to be combined to form id data; and
generating the second hash data based on the id data.
13. The method of
selecting a hash algorithm based on a security requirement; and
processing the id data via the selected hash algorithm.
14. The method of
generating the second hash data based on the distinguishable id, the factory id and the program id, in response to a power-on event or a request.
16. The method of
defining a size of the distinguishable id based on the volume of the device, a collision rate of ids and 0/1 probability,
wherein the defined size of the first hash data is further associated with the collision rate of ids.
17. The method of
selecting a first number of bits from the distinguishable id, to be combined to form the id data;
selecting a second number of bits from the factory id, to be combined to form the id data; and
selecting a third number of bits from the program id, to be combined to form the id data.
18. The method of
each one of the first number, the second number and the third number is greater than one, and
at least one of the first number, the second number and the third number is different from the others.
19. The method of
defining a size of the second hash data based on the volume of the device and a collision rate of ids,
wherein the size of the second hash data is the same as the defined size of the first hash data.
20. The method of
when the first hash data and the second hash data are mismatched, determining that the device has been tampered.
The present application is a continuation application of the U.S. application Ser. No. 15/907,190, filed Feb. 27, 2018, which claims priority to U.S. Provisional Application No. 62/565,903, filed Sep. 29, 2017, all of which are herein incorporated by reference.
In Internet of Things (IoT) applications, authentication is important to ensure security during usage of an IoT device. Moreover, uniqueness of a device identification (ID) is necessary in order to provide secure usage of the IoT device.
Aspects of the present disclosure are best understood from the following detailed description when read with the accompanying figures. It is noted that, in accordance with the standard practice in the industry, various features are not drawn to scale. In fact, the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion.
The following disclosure provides many different embodiments, or examples, for implementing different features of the provided subject matter. Specific examples of components, materials, values, steps, arrangements or the like are described below to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. Other components, materials, values, steps, arrangements or the like are contemplated. For example, the formation of a first feature over or on a second feature in the description that follows may include embodiments in which the first and second features are formed in direct contact, and may also include embodiments in which additional features may be formed between the first and second features, such that the first and second features may not be in direct contact. In addition, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
Further, spatially relative terms, such as “beneath,” “below,” “lower,” “above,” “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. The spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. The apparatus may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein may likewise be interpreted accordingly. The terms mask, photolithographic mask, photomask, and reticle are used to refer to the same item.
The terms used in this specification generally have their ordinary meanings in the art and in the specific context where each term is used. The use of examples in this specification, including examples of any terms discussed herein, is illustrative only, and in no way limits the scope and meaning of the disclosure or of any exemplified term. Likewise, the present disclosure is not limited to various embodiments given in this specification.
It will be understood that, although the terms “first,” “second,” etc., may be used herein to describe various elements, these elements should not be limited by these terms. These terms are used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the embodiments. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
As used herein, the terms “comprising,” “including,” “having,” “containing,” “involving,” and the like are to be understood to be open-ended, that is, to mean including but not limited to.
Reference is made to
As illustratively shown in
For illustration, the factory ID 111 includes, but is not limited to, data indicating a serial number of a wafer.
The PUF ID 112 is a unique identifier obtained from a challenge-response mechanism employing variations of circuits in manufacturing processes, in some embodiments. For illustration, the relation between a challenge and a corresponding response is determined by statistical variations in logic components and interconnects in an integrated circuit. The PUF ID 112 in electronic circuits is used to distinguish integrated circuits (ICs) from one another. Using a PUF to distinguish one IC from another is a valuable way to authenticate ICs. The applications for such authentication include, for example, anti-counterfeiting, inventory control, multi-factor authentication, and secret key generation for cryptographic and other security applications. An effective authentication mechanism can be executed in a number of ways, but typically involves the use of digital challenges (strings of 1's and 0's) which, when applied to a typical PUF circuit, generate corresponding digital responses (other strings of 1's and 0's) that differ from one IC to another. The PUFs include, for example, an analog PUF, a metal resistance PUF, an SRAM PUF and a circuit delay PUF, which includes, for example, a ring oscillator PUF. The types of PUF discussed above are given for illustrative purposes. Various types of PUF are within the contemplated scope of the present disclosure.
The program ID 113 is an identifier allocated to a program that indicates, for illustration, conditions and/or procedures of a process, and the program ID 113 is able to be programmed during manufacturing.
The IDs discussed above are given for illustrative purposes. Various types of IDs stored in the memory circuit 110 are within the contemplated scope of the present disclosure. For example, in various embodiments, the PUF ID 112 is replaced with an ID generated by a true random number generator (TRNG) (not shown). For illustration, the ID generated by the TRNG is also a unique identifier allowing the device to be distinguished from other devices. The true random number generator applied in the present disclosure is known and discussed in, for example, the U.S. application Ser. No. 15/724,671, filed Oct. 4, 2017, which claims the benefit of U.S. Provisional No. 62/491,541, filed Apr. 28, 2017, all of which are herein incorporated by reference in their entireties.
For illustration, the true random number generator, as discussed above, represents an electrical circuit that provides a sequence of random numbers. In some embodiments, the term “true random number” refers to the fact that the random number is generated taking account of physical noise of the circuitry environment, including physical noise of at least one of circuits/sub-circuits described in this document. Such circuits/sub-circuits include, for example, circuits 100, 110, 120, 230, etc., and corresponding sub-circuits 111, 112, 113, 121, 123, etc. During operation, one or more microscopic phenomena inside and/or outside of the true random number generator cause, for illustration, one or more low-level, statistically random entropy noise signals to be present within the true random number generator. The true random number generator utilizes the one or more low-level, statistically random entropy noise signals to provide the sequence of random numbers.
The one or more microscopic phenomena inside of the true random number generator include shot noise, flicker noise, burst noise, transit noise, and/or any other statistically random microscopic phenomenon existing inside of the true random number generator, in some embodiments. The one or more microscopic phenomena outside of the true random number generator include intermodulation noise, crosstalk, interference, atmospheric noise, industrial noise, extraterrestrial noise and/or any other statistically random microscopic phenomenon existing outside of the true random number generator, in some embodiments. The microscopic phenomena discussed above are given for illustrative purposes. Various microscopic phenomena are within the contemplated scope of the present disclosure.
The above implementation of the memory circuit 110 is given for illustrative purposes. Various implementations of the memory circuit 110 are within the contemplated scope of the present disclosure. In some embodiments, the memory circuit 110 includes, but is not limited to, a static random access memory (SRAM).
In some embodiments, the processing circuit 120 includes a data generating circuit 121, a memory circuit 122 and a comparing circuit 123. The data generating circuit 121 is configured to generate, for illustration, hash data HD1, according to the factory ID 111, the PUF ID 112, the program ID 113, or the combination thereof. In some embodiments, the hash data HD1 include values that are derived by a hash function used to map data of arbitrary size to data of fixed size. In some embodiments, the data generating circuit 121 is also referred to as a hash engine.
The memory circuit 122 is configured to store the hash data HD1 generated by the data generating circuit 121. In some embodiments, the memory circuit 122 is a one-time programmable (OTP) memory circuit. The comparing circuit 123 is configured to compare the hash data HD1 from the memory circuit 122 and other hash data that are further generated by the data generating circuit 121.
The above configuration of the processing circuit 120 is given for illustrative purposes. Various configurations of the processing circuit 120 are within the contemplated scope of the present disclosure. For example, in various embodiments, the memory circuit 122 is configured outside the processing circuit 120 and is independent from the processing circuit 120. For another example, in various embodiments, the hash data HD1 are not generated by the data generating circuit 121, and, for illustration, the hash data HD1 are generated outside the device 100 and are pre-stored in the memory circuit 122.
In operation S401, for illustration of
In some embodiments, the processing circuit 120 is configured to define the bit length of the PUF ID 112. In various embodiments, an additional circuit (not shown) is configured to define the bit length of the PUF ID 112.
For illustration, the bit length of the PUF ID 112 is defined according to a number (i.e., volume) of the device 100, where the number of the device 100 is denoted by K. For example, the data generating circuit 121 defines a minimum bit length PUF_length of the PUF ID 112 by formulas Eq. 1, Eq. 2, and Eq. 3 as follows.
PUF_length > min_length / H(p)   (Eq. 1)
min_length > 2 log2(K) − log2(2·Cr)   (Eq. 2)
H(p) = −p·log2(p) − (1 − p)·log2(1 − p)   (Eq. 3)
In the formulas Eqs. 1-3, H(p) is the Shannon entropy, p is the 0/1 probability (i.e., the probability that a given bit of the PUF ID 112 takes a particular one of the two values), and Cr is a maximum limit of a collision rate of the IDs of the device 100. In some embodiments, the number K, the collision rate Cr and the 0/1 probability p are associated with and/or determined by the volume of the device 100.
For example, by the formulas Eqs. 1-3, if the number K is one billion, the collision rate Cr is 0.2 per billion, and the 0/1 probability p is 0.6, the data generating circuit 121 then defines 94 bits as the minimum bit length PUF_length. For another example, by the formulas Eqs. 1-3, if the number K is 100,000, the collision rate Cr is 0.2 per billion, and the 0/1 probability p is 0.6, the data generating circuit 121 then defines 67 bits as the minimum bit length PUF_length. Based on the above, the bit length or size of the PUF ID 112 is able to be flexibly adjusted according to the volume of the device 100.
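To make the relationship between the volume and the ID size concrete, the following sketch evaluates Eqs. 1-3 in Python. The function names and the ceiling-based rounding are illustrative assumptions rather than anything specified in the disclosure, so the exact bit counts may differ slightly from the figures quoted above depending on the rounding convention used.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) of a biased bit (Eq. 3)."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def min_puf_length(K, Cr, p):
    """Minimum PUF ID bit length per Eqs. 1 and 2 (rounding assumed).

    K  : volume, i.e., the number of devices
    Cr : maximum allowed collision rate of the IDs
    p  : 0/1 probability of a PUF bit
    """
    min_length = 2 * math.log2(K) - math.log2(2 * Cr)   # Eq. 2
    return math.ceil(min_length / shannon_entropy(p))   # Eq. 1

# Scenarios from the text above.
print(min_puf_length(K=1_000_000_000, Cr=0.2e-9, p=0.6))  # about 94 bits
print(min_puf_length(K=100_000, Cr=0.2e-9, p=0.6))        # about 67 bits
```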
In some other approaches, the bit length or size of the PUF ID is fixed. Accordingly, more data than necessary, corresponding to the PUF ID, is stored. As a result, the chip area used for storing the data is over-designed and wasted.
Compared to the above approaches, the bit length or size of the PUF ID 112 in the present disclosure is able to be flexibly adjusted according to the volume of the device 100. With the flexibly adjusted PUF ID 112, overdesign of the memory circuit 110 including the PUF ID 112 is effectively avoided.
In various embodiments, the PUF ID 112 is replaced with an ID generated by a true random number generator (TRNG), as discussed above. Corresponding to the operation S401 discussed above, a minimum bit length TRNG_length of the ID generated by the TRNG is also able to be defined according to the volume (or the number K) of the device 100 in various embodiments. In some embodiments, the minimum bit length TRNG_length of the ID generated by the TRNG is also defined based on the algorithms as discussed above with respect to the PUF ID 112.
After operation S401, operations S402-S406 are performed. In some embodiments, the operations S402-S406 are performed in a manufacture stage.
In operation S402, with reference to
In operation S403, with reference to
The above ID configurations in the memory circuit 110 and the remote database 230 are given for illustrative purposes. Various ID configurations in the memory circuit 110 and the remote database 230 are within the contemplated scope of the present disclosure.
In operation S404, with reference to
For illustration in
hash_length > 2 log2(K) − log2(2·Cr)   (Eq. 4)
In the formula Eq. 4, Cr is a maximum limit of a collision rate of hash data.
For example, by the formula Eq. 4, if the number K is one billion, and the collision rate Cr is 0.2 per billion, the data generating circuit 121 then defines 91 bits as the minimum bit length hash_length. For another example, by the formula Eq. 4, if the number K is 100,000, and the collision rate Cr is 0.2 per billion, the data generating circuit 121 then defines 65 bits as the minimum bit length hash_length.
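A corresponding sketch for Eq. 4, in the same style as the sketch above; the rounding is again an assumption, so the first scenario lands at 91 or 92 bits depending on how the fractional part is handled, and the second at about 65 bits, consistent with the values quoted above.

```python
import math

def min_hash_length(K, Cr):
    """Minimum hash-data bit length per Eq. 4 (rounding assumed)."""
    return math.ceil(2 * math.log2(K) - math.log2(2 * Cr))

print(min_hash_length(K=1_000_000_000, Cr=0.2e-9))  # about 91-92 bits
print(min_hash_length(K=100_000, Cr=0.2e-9))        # about 65 bits
```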
In operation S405, for illustration in
In some embodiments, the processing circuit 120 selects at least one bit of the factory ID 111, at least one bit of the PUF ID 112, and at least one bit of the program ID 113, and combines the at least one bit of the factory ID 111, the at least one bit of the PUF ID 112, and the at least one bit of the program ID 113, to form the hash data HD1.
In some embodiments, the data generating circuit 121 selects T bits of the ID data corresponding to the factory ID 111, the PUF ID 112 and the program ID 113, in order to generate the hash data HD1. For illustration in
In some embodiments, the data generating circuit 121 selects T1 bits from the factory ID 111, T2 bits from the PUF ID 112, and T3 bits from the program ID 113. Each one of T1, T2 and T3 is larger than 1, and a sum of T1, T2 and T3 equals T. For illustration, the selected T1 bits are included in data F[T1:1], the selected T2 bits are included in data U[T2:1], and the selected T3 bits are included in data P[T3:1]. The data generating circuit 121 then combines the data F[T1:1], the data U[T2:1] and the data P[T3:1] in sequence to form the ID data (F[T1:1], U[T2:1], P[T3:1]). The above combination of the data F[T1:1], the data U[T2:1] and the data P[T3:1] is given for illustrative purposes. Various combinations of the data F[T1:1], the data U[T2:1] and the data P[T3:1] are within the contemplated scope of the present disclosure. For another illustration, the data generating circuit 121 combines the data U[T2:1], the data P[T3:1] and the data F[T1:1] in sequence to form the ID data (U[T2:1], P[T3:1], F[T1:1]).
As discussed above, the data generating circuit 121 selects a hash algorithm for processing the ID data, in order to generate the hash data HD1. Accordingly, for illustration, the data generating circuit 121 processes the ID data (F[T1:1], U[T2:1], P[T3:1]) via the selected message digest algorithm MD5 to generate the hash data HD1. For another illustration, the data generating circuit 121 processes the ID data (U[T2:1], P[T3:1], F[T1:1]) via the selected secure hash algorithm SHA-1 to generate the hash data HD1.
The combination of bits of the factory ID 111, the PUF ID 112, and the program ID 113, for forming the hash data HD1, as discussed above, is given for illustrative purposes. Various ways of combining or mixing bits to form the hash data HD1 are within the contemplated scope of the present disclosure. For example, in various embodiments, bits of the factory ID 111, the PUF ID 112, and the program ID 113 are concatenated to form various permutations of the hash data HD1. Various permutations of bits of the factory ID 111, the PUF ID 112, and the program ID 113 are within the contemplated scope of the present disclosure.
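As a minimal sketch of this combine-then-hash step, the Python fragment below assumes the three IDs are available as integers, that the least significant T1, T2 and T3 bits are the ones selected, and that the digest is truncated to hash_length bits. The helper names, the selection rule, and the truncation are illustrative assumptions; hashlib's MD5 and SHA-1 merely stand in for the selected hash algorithm.

```python
import hashlib

def select_bits(value, n):
    """Keep the n least significant bits of an ID (selection rule assumed)."""
    return value & ((1 << n) - 1)

def make_hash_data(factory_id, puf_id, program_id,
                   t1, t2, t3, hash_length, algorithm="md5"):
    """Combine F[T1:1], U[T2:1] and P[T3:1] in sequence and hash the result."""
    # Form the ID data (F[T1:1], U[T2:1], P[T3:1]) as one bit sequence.
    id_data = (select_bits(factory_id, t1) << (t2 + t3)) \
              | (select_bits(puf_id, t2) << t3) \
              | select_bits(program_id, t3)
    id_bytes = id_data.to_bytes((t1 + t2 + t3 + 7) // 8, "big")

    # Process the ID data through the selected hash algorithm.
    digest = hashlib.new(algorithm, id_bytes).digest()

    # Keep the top hash_length bits of the digest as the hash data (truncation assumed).
    return int.from_bytes(digest, "big") >> (8 * len(digest) - hash_length)

# Made-up ID values, with T1 = 16, T2 = 94, T3 = 8 and 91-bit hash data.
hd1 = make_hash_data(0xBEEF, 0x1234_5678_90AB_CDEF_1234_567, 0x5A,
                     16, 94, 8, 91, algorithm="md5")
print(hex(hd1))
```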
In operation S406, the memory circuit 122 receives and stores the hash data HD1 generated by the data generating circuit 121, as illustratively shown in
In various embodiments, the processing circuit 120 further includes an OTP programming circuit (not shown) that is coupled to the memory circuit 122 and configured to program the memory circuit 122. For illustration, after the OTP programming circuit writes (or programs) the hash data HD1 generated in operation S405 into the memory circuit 122, the processing circuit 120 disables the OTP programming circuit. Accordingly, the hash data HD1 stored in the memory circuit 122 cannot be changed anymore. As a result, security performance of the device 100 is effectively improved.
In some embodiments, the hash data HD1 is generated outside the processing circuit 120 and pre-stored or programmed in the memory circuit 122.
After operation S406, operations S407-S410 are performed. In some embodiments, the operations S407-S410 are performed in a deployed stage, with reference to
To identify whether the device 100 is tampered or altered, an authentication or verification is performed. In some embodiments, data in the device 100 will be compared with pre-configured data. When the comparison result is a mismatch, the device 100 is determined to have been tampered or hacked.
In some embodiments, the authentication is performed off-line, by the device 100 itself. In some other embodiments, this authentication is performed online, by the device 100 communicating with the remote database 230.
For illustration of the authentication performed by the device 100 itself, in operation S407, with reference to
In some embodiments, during usage of the device 100, in response to a power-on event or a request, the processing circuit 120 generates the hash data HD2 based on the factory ID 111, the PUF ID 112 and the program ID 113. For illustration, when the device 100 is powered on, the processing circuit 120 generates the hash data HD2 based on the factory ID 111, the PUF ID 112 and the program ID 113. For another illustration, when the device 100 receives a request, the processing circuit 120 generates the hash data HD2 based on the factory ID 111, the PUF ID 112 and the program ID 113. Various conditions of generating the hash data HD2 are within the contemplated scope of the present disclosure.
Similar to operation S405, in some embodiments, the data generating circuit 121 selects at least one bit of the PUF ID 112, at least one bit of the factory ID 111, and at least one bit of the program ID 113, and combines the selected bits of the factory ID 111, the PUF ID 112 and the program ID 113 to form the hash data HD2.
As discussed above with respect to generating the hash data HD1, the data generating circuit 121 also selects T bits of ID data corresponding to the factory ID 111, the PUF ID 112 and the program ID 113, in order to generate the hash data HD2. The bit length of the ID data for generating the hash data HD2 is the same as the aforementioned bit length of the ID data for generating the hash data HD1. Correspondingly, the bit length of the ID data for generating the hash data HD2 is also greater than the bit length of the hash data HD2.
To generate the hash data HD2, the data generating circuit 121 also selects T1 bits from the factory ID 111, T2 bits from the PUF ID 112 and T3 bits from the program ID 113, and a sum of T1, T2 and T3 equals T. The data generating circuit 121 then combines the data F[T1:1], the data U[T2:1] and the data P[T3:1] to form corresponding ID data. As discussed above, various combinations of the data F[T1:1], the data U[T2:1] and the data P[T3:1] are within the contemplated scope of the present disclosure. Afterwards, the data generating circuit 121 selects a hash algorithm for processing the corresponding ID data, in order to generate the hash data HD2.
For illustration, if the data generating circuit 121 processes the ID data (F[T1:1], U[T2:1], P[T3:1]) via the selected message digest algorithm to generate the hash data HD1 in operation S405, the data generating circuit 121 then processes the corresponding ID data via the selected message digest algorithm to generate the hash data HD2 in operation S407. For another illustration, if the data generating circuit 121 processes the ID data (U[T2:1], P[T3:1], F[T1:1]) via the selected secure hash algorithm SHA-1 to generate the hash data HD1 in operation S405, the data generating circuit 121 then processes the corresponding ID data via the selected secure hash algorithm SHA-1 to generate the hash data HD2 in operation S407.
In operation S408, the processing circuit 120 compares the hash data HD1 and the hash data HD2 to perform an off-line authentication, in order to identify whether the device 100 is tampered or altered. In some embodiments, the off-line authentication indicates that the device 100 is able to perform the authentication on itself, without being connected to or accessing a database and/or a network/cloud. For illustration, the comparing circuit 123 of the processing circuit 120 compares the hash data HD1 stored in the OTP memory circuit 122 and the hash data HD2 generated by the data generating circuit 121, to determine whether the hash data HD2 matches the hash data HD1. If the comparing circuit 123 determines that the hash data HD2 matches the hash data HD1, the processing circuit 120 determines that the off-line authentication is successful. The successful off-line authentication indicates that the device 100 is not tampered and is secure for usage. Accordingly, when the comparing circuit 123 determines that the hash data HD2 matches the hash data HD1, the device 100 is authenticated, in operation S409. As a result, the device 100 is qualified for usage and operation.
On the contrary, if the comparing circuit 123 determines that the hash data HD2 does not match the hash data HD1, the processing circuit 120 determines that the off-line authentication is unsuccessful. The unsuccessful off-line authentication indicates that the device 100 has been tampered and is not secure enough for usage. Accordingly, when the comparing circuit 123 determines that the hash data HD2 does not match the hash data HD1, the device 100 is not authenticated, in operation S410. As a result, the device 100 is not qualified for usage and operation.
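The off-line check itself reduces to an equality test between the stored digest HD1 and the regenerated digest HD2, as sketched below. The constant-time comparison is a common hardening choice assumed here, not something the text requires.

```python
import hmac

def offline_authenticate(stored_hd1: bytes, regenerated_hd2: bytes) -> bool:
    """Return True when HD2, regenerated from the on-chip IDs, matches HD1.

    A match corresponds to a successful off-line authentication (operation
    S409); a mismatch means the device is treated as tampered (operation S410).
    """
    # hmac.compare_digest avoids leaking timing information during the comparison.
    return hmac.compare_digest(stored_hd1, regenerated_hd2)
```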
Based on the above, when the device 100 is powered on or receives a request, the processing circuit 120 is able to perform the off-line authentication internally without being connected to, for example, the remote database 230. Therefore, usage of the device 100 is more secure and convenient.
In various embodiments, the processing circuit 120 is also able to perform an online authentication. As discussed above, for illustration in
On the contrary, if the comparing circuit 123 determines that the factory ID 111, the PUF ID 112 and the program ID 113 do not match the read only ID 231 and the program ID 232, the processing circuit 120 then determines that the online authentication is unsuccessful. Accordingly, the device 100 is not authenticated, as illustrated in operation S410. As a result, the device 100 is not qualified for usage and operation.
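For completeness, a sketch of the online path is given below, under the assumption that the remote database 230 can be queried for a record holding the read-only ID 231 and the program ID 232. The record layout and the idea that the read-only ID corresponds to the factory ID and PUF ID pair are hypothetical, since the text does not define the database interface.

```python
from typing import NamedTuple, Optional

class RemoteRecord(NamedTuple):
    """Hypothetical layout of one entry in the remote database 230."""
    factory_id: int   # assumed part of the read-only ID 231
    puf_id: int       # assumed part of the read-only ID 231
    program_id: int   # program ID 232

def online_authenticate(factory_id: int, puf_id: int, program_id: int,
                        record: Optional[RemoteRecord]) -> bool:
    """Compare the on-chip IDs with the copy kept in the remote database."""
    if record is None:            # no matching entry: authentication fails (S410)
        return False
    return (record.factory_id == factory_id
            and record.puf_id == puf_id
            and record.program_id == program_id)
```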
In some other approaches, the size of the PUF ID is fixed and unchangeable. Accordingly, more data than necessary, corresponding to the PUF ID, is stored. As a result, the device area used for storing the data is over-designed and wasted. Moreover, the IDs of the device are only able to be checked when the device is connected to a remote database (e.g., a cloud database), which results in inconvenience during the usage of the device.
Compared to the aforementioned approaches, the processing circuit 120 defines the minimum bit length PUF_length (in bits) of the PUF ID 112 according to the volume of the device 100. Accordingly, the size of the PUF ID 112 is able to be adjusted. As a result, the area of the memory circuit 110 that stores the PUF ID 112 is effectively saved.
In addition, in order to check whether the device 100 is tampered during usage, the processing circuit 120 is able to perform the off-line authentication and/or the online authentication. Accordingly, security of the device 100 is able to be checked with and/or without being connected to, for illustration, the remote database 230. Alternatively stated, without connection to the remote database 230, it is still possible to check whether the device 100 is tampered, by comparing the hash data HD2, generated in response to the power-on event or the request, with the unchangeable hash data HD1 stored in the OTP memory circuit 122. Therefore, any change in the factory ID 111, the PUF ID 112 and/or the program ID 113 is internally checked by the device 100 itself. As a result, security performance of the device 100 is effectively improved.
The operations discussed above are given for illustrative purposes. Additional operations are within the contemplated scope of the present disclosure. For example, in various embodiments, additional operations are provided before, during, and/or after the operations in the method 400 illustrated in
In some embodiments, a device is disclosed that includes a first memory circuit and a processing circuit. The first memory circuit is configured to store first hash data. The processing circuit is coupled to the first memory circuit. The processing circuit is configured to: at least based on a volume of the device, define a size of a distinguishable identification (ID) and a size of second hash data; based on a combination of at least one bit of each of the distinguishable ID and IDs of the device, generate the second hash data; and compare the first hash data with the second hash data, in order to identify whether the device is tampered.
In some embodiments, the size of the distinguishable ID is defined further based on a collision rate of the IDs and 0/1 probability. The size of the second hash data is defined further based on the collision rate of the IDs.
In some embodiments, the processing circuit is further configured to: based on the volume of the device and a collision rate of the IDs, define a size of the first hash data. The size of the first hash data is equal to the size of the second hash data.
In some embodiments, the processing circuit is further configured to: select a hash algorithm; and based on the selected hash algorithm, process the distinguishable ID and the IDs, to generate the first hash data.
In some embodiments, the processing circuit is further configured to: select the at least one bit of each of the distinguishable ID, a factory ID and a program ID of the IDs; and combine the selected at least one bit of each of the distinguishable ID, the factory ID and the program ID of the IDs.
In some embodiments, the device further includes a second memory circuit. The second memory circuit is coupled to the processing circuit, and is configured to store the distinguishable ID and the IDs comprising a factory ID and a program ID.
In some embodiments, the distinguishable ID comprises a physical unclonable function (PUF) ID or a true random number generator (TRNG) ID.
Also disclosed is a method that includes the operations: calculating a bit length of a distinguishable identification (ID) based on at least one parameter that is associated with a volume of a device; combining at least one bit of each of the distinguishable ID, a factory ID and a program ID of the device, to generate first hash data having a bit length that is defined based on the volume of the device; and initiating authentication of the device, when the first hash data is matched with second hash data.
In some embodiments, the at least one parameter includes a collision rate of IDs and 0/1 probability.
In some embodiments, the method further includes the operation: calculating the bit length of the first hash data based on the volume of the device and a collision rate of IDs. The bit length of the first hash data is the same as a bit length of the second hash data.
In some embodiments, the method further includes the operation: comparing the first hash data with the second hash data, to perform an off-line authentication.
In some embodiments, the method further includes the operation: selecting at least one bit of the distinguishable ID, at least one bit of the factory ID and at least one bit of the program ID, to be combined to form ID data; and generating the second hash data based on the ID data.
In some embodiments, the operation of generating the second hash data includes the operations: selecting a hash algorithm based on a security requirement; and processing the ID data via the selected hash algorithm.
In some embodiments, the method further includes the operation: generating the second hash data based on the distinguishable ID, the factory ID and the program ID, in response to a power-on event or a request.
Also disclosed is a method that includes the operations: receiving first hash data that has a defined size associated with a volume of a device; combining at least one bit of each of a distinguishable identification (ID), a factory ID and a program ID of the device, to form ID data; processing the ID data through a selected hash algorithm, to generate second hash data; and comparing the first hash data with the second hash data, to perform an authentication of the device.
In some embodiments, the method further includes the operation: defining a size of the distinguishable ID based on the volume of the device, a collision rate of IDs and 0/1 probability. The defined size of the first hash data is further associated with the collision rate of IDs.
In some embodiments, the method further includes the operations: selecting a first number of bits from the distinguishable ID, to be combined to form the ID data; selecting a second number of bits from the factory ID, to be combined to form the ID data; and selecting a third number of bits from the program ID, to be combined to form the ID data.
In some embodiments, each of the first number, the second number and the third number is greater than one. At least one of the first number, the second number and the third number is different from the others.
In some embodiments, the method further includes the operation: defining a size of the second hash data based on the volume of the device and a collision rate of IDs. The size of the second hash data is the same as the defined size of the first hash data.
In some embodiments, the method further includes the operation: when the first hash data and the second hash data are mismatched, determining that the device has been tampered.
The foregoing outlines features of several embodiments so that those skilled in the art may better understand the aspects of the present disclosure. Those skilled in the art should appreciate that they may readily use the present disclosure as a basis for designing or modifying other processes and structures for carrying out the same purposes and/or achieving the same advantages of the embodiments introduced herein. Those skilled in the art should also realize that such equivalent constructions do not depart from the spirit and scope of the present disclosure, and that they may make various changes, substitutions, and alterations herein without departing from the spirit and scope of the present disclosure.