Techniques are provided for anonymizing streamed data. In various embodiments, data are anonymized by receiving a data element of a data stream including a plurality of said data elements (pi, si), where pi comprises an identifying portion and si comprises an associated sensitive information portion; obtaining a partitioned space S including t regions; assigning the identifying portion, pi, to a selected region; encrypting the associated sensitive information si as e(si); and storing the encrypted associated sensitive information e(si) in a list associated with the selected region but not storing the associated identifying portion, pi, in the list. The regions have corresponding center points, and a nearest center to pi is optionally determined. The encrypted associated sensitive information e(si) may be stored in a list associated with the nearest center.
1. A method, comprising:
receiving a data element of a data stream including a plurality of said data elements (pi, si), where pi comprises an identifying portion of an ith data element and si comprises an associated sensitive information portion of said ith data element;
obtaining, by at least one processing device, a partitioned space S including t regions S1, S2, . . . , St;
assigning, by at least one processing device, said identifying portion, pi, to a selected one of said regions;
encrypting, by at least one processing device, said associated sensitive information portion si as e(si); and
storing, by at least one processing device, said encrypted associated sensitive information portion e(si) in a non-transitory electronic memory device in a list associated with said selected region but not storing said associated identifying portion, pi, in said list after using said associated identifying portion, pi, to assign said associated identifying portion, pi, to said selected region.
11. A system, comprising:
a memory; and
at least one hardware processing device, coupled to the memory, operative to:
receive a data element of a data stream including a plurality of said data elements (pi, si), where pi comprises an identifying portion of an ith data element and si comprises an associated sensitive information portion of said ith data element;
obtain, by at least one processing device, a partitioned space S including t regions S1, S2, . . . , St;
assign, by at least one processing device, said identifying portion, pi, to a selected one of said regions;
encrypt, by at least one processing device, said associated sensitive information portion si as e(si); and
store, by at least one processing device, said encrypted associated sensitive information portion e(si) in a non-transitory electronic memory device in a list associated with said selected region but not storing said associated identifying portion, pi, in said list after using said associated identifying portion, pi, to assign said associated identifying portion, pi, to said selected region.
19. A method for configuring a processing system to operate as a streaming data anonymization system, comprising:
configuring at least one processing system element to receive a data element of a data stream including a plurality of said data elements (pi, si), where pi comprises an identifying portion of an ith data element and si comprises an associated sensitive information portion of said ith data element;
configuring at least one processing system element to obtain a partitioned space S including t regions S1, S2, . . . , St;
configuring at least one processing system element to assign said identifying portion, pi, to a selected one of said regions;
configuring at least one processing system element to encrypt said associated sensitive information portion si as e(si); and
configuring at least one processing system element to store said encrypted associated sensitive information portion e(si) in a non-transitory electronic memory device in a list associated with said selected region but not storing said associated identifying portion, pi, in said list after using said associated identifying portion, pi, to assign said associated identifying portion, pi, to said selected region.
2. The method of
3. The method of
4. The method of
obtaining a number of center points C1, C2, . . . , Ct corresponding to regions S1, S2, . . . , St in said partitioned space S; and
computing a nearest center to pi,
wherein said list is associated with said computed nearest center.
5. The method of
6. The method of
7. The method of
8. The method of
9. The method of
adding dummy entries to one or more lists;
deleting entries from one or more lists; and
maintaining one or more entries in one or more lists that have been output.
10. A non-transitory machine-readable recordable storage medium for anonymizing data in a data stream, wherein one or more software programs when executed by one or more processing devices implement the steps of the method of
12. The system of
13. The system of
14. The system of
obtain a number of center points C1, C2, . . . , Ct in said partitioned space S comprising regions S1, S2, . . . , St; and
compute a nearest center to pi,
wherein said list is associated with said computed nearest center.
15. The system of
16. The system of
17. The system of
18. The system of
add dummy entries to one or more lists;
delete entries from one or more lists; and
maintain one or more entries in one or more lists that have been output.
20. The method of
21. The method of
configuring at least one processing system element to obtain a number of center points C1, C2, . . . , Ct corresponding to regions S1, S2, . . . , St in said partitioned space S; and
configuring at least one processing system element to compute a nearest center to pi,
wherein said list is associated with said computed nearest center.
22. The method of
The present invention relates generally to techniques, apparatus and systems for anonymizing data.
This section introduces aspects that may be helpful in facilitating a better understanding of the inventions. Accordingly, the statements of this section are to be read in this light and are not to be understood as admissions about what is in the prior art or what is not in the prior art.
It is often desirable to transform, pre-process and store a stream of sensitive data so that the transformed data can be analyzed without compromising the privacy of the data of any individual. Each data item in the streamed data typically comprises a first element identifying an individual, such as a name or an address, and a second element containing some private and/or sensitive information about the individual, such as a disease that the individual has. The identifying part of the data should be transformed so that the processed stream can be saved for later analysis in a manner that allows the data to be analyzed while maintaining the privacy of the individuals. Generally, researchers and/or analysts viewing the transformed data and associated sensitive data should be able to analyze the data and make reasonable (though approximate) conclusions about the data without being able to identify the sensitive information of any particular individual. For example, researchers may wish to study diseases in a particular neighborhood.
Data anonymization techniques can address the privacy concerns and aid compliance with applicable legal requirements. A number of data anonymization techniques have been proposed or suggested that achieve various privacy goals by ensuring that the transformed data has certain properties. For example, k-anonymity techniques require that each individual in the data set must be indistinguishable from k−1 other individuals. In addition, l-diversity techniques provide sufficient diversity in the sensitive information associated with individuals.
A need remains for improved techniques for effectively anonymizing data so that portions of the data can be published and shared with others.
Generally, methods and apparatus are provided for anonymizing data in a data stream. According to one embodiment, data in a data stream is anonymized by receiving a data element (pi, si) of the data stream, where pi comprises an identifying portion and si comprises an associated sensitive information portion; obtaining a partitioned space S including t regions; assigning the identifying portion, pi, to a selected region; encrypting the associated sensitive information portion si as e(si); and storing the encrypted associated sensitive information portion e(si) in a list associated with the selected region but not storing the associated identifying portion, pi, in the list.
According to a further embodiment, a permutation function π optionally randomizes the order in which regions S1, S2, . . . , St in the partitioned space S are stored so that an adversary cannot obtain information by observing the data being stored in particular regions. Thus, the list associated with the region Sj is optionally mapped to a storage location using one or more of a permutation function and a hash table.
In one example embodiment, the space S is partitioned into regions S1, S2, . . . , St, having corresponding center points C1, C2, . . . , Ct, and a nearest center Cj to pi is computed; and the encrypted associated sensitive information e(si) is stored in a list associated with the computed nearest center Cj. Another embodiment provides a user-specified distance parameter d, such that for a fixed distance d there are enough center points C1, C2, . . . , Ct so that for any point p in S there is some center Cj such that p is at most distance d from Cj. The distance d is generally a limit on how different the transformed identifying information can be from the identifying portion of each data element.
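The nearest-center assignment described above can be sketched as follows. This is an illustrative fragment only, not part of any claimed embodiment; the function name is hypothetical.

```python
import math

def nearest_center(p, centers):
    """Return the index j of the center Cj closest to point p (Euclidean distance)."""
    return min(range(len(centers)), key=lambda j: math.dist(p, centers[j]))

# Example: three centers; a point near the origin maps to region 0.
centers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
nearest_center((1.0, 2.0), centers)  # -> 0
```

Any metric could be substituted for `math.dist` (e.g., L1 or L-infinity), consistent with the distance measures discussed below.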
A more complete understanding of the present disclosure, as well as further features and advantages of various embodiments, will be obtained by reference to the following detailed description and drawings.
Embodiments described herein provide methods, apparatus and systems for anonymizing streaming data.
As shown in
Consider the case where the transformed identifying data p′ along with the associated sensitive data s can only be stored or transmitted when it is written as part of a set of such data all of which have the same transformed identifying part p′ and with the property that no individual's sensitive data s can be determined (e.g., so-called k-anonymity requirement). This goal can be accomplished with the constraint that there is a fixed sized RAM buffer into which untransformed data (p, s) can be stored. According to a further embodiment, a limit is optionally specified on how different the transformed identifying information p′ must be from each individual's true identifying data p. In this manner, the approximate transformed data (p′, s) can be a good representation of the true data (p, s).
As discussed hereinafter, the example embodiment uses a combination of semantically secure encryption, a (randomly chosen) permutation function, π, or a hash table, and a clustering heuristic. Various embodiments reflect the recognition that intermediate sensitive data can be stored on a storage device, such as a storage disk, flash memory device or a network storage device, or transmitted over a network, as long as this intermediate sensitive data are encrypted. The example system 100 employs disk space on one or more storage disks 170 as a secure extension of the RAM to store the transformed identifying data p′ and encrypted sensitive data e(s) on a particular list L(π(i)) associated with a particular sub-region Si in a partitioned space S, as discussed further below in conjunction with
Once a particular list, such as list L(j), satisfies a predefined anonymity criterion (e.g., the list has at least k elements), then the elements of the list are decrypted and the following values are output to the disk 170 (or another storage or transmission device) in an example embodiment: the center Cj of the list and the decrypted sensitive data values from the list, as discussed further below in conjunction with
While the example embodiment is illustrated herein by finding a nearest center Cj of a region in the partitioned space S, any partitioning into regions can be employed, as would be apparent to a person of ordinary skill in the art. In another example variation, the space S can be partitioned into a grid, and when untransformed data (p, s) enters the system 100, the data can be classified into a particular region Si of the space S, for example, based on the x and y range of the grid cell.
The example embodiment assumes that the identifying part of the data p comes from a space where distance is defined, such as the Euclidean space. Thus, for example, the data could be location data or any other tuples where each component of the tuple has a distance measure (since then the distance between tuples could be defined by any multidimensional metric such as L1, L2 (Euclidean metric), . . . , or L-infinity). Let S denote the space of all possible identification data.
As indicated above, it is often desirable to transform, pre-process and store a stream of sensitive data so that the transformed data (p′, s) can be analyzed without compromising the privacy of the data of any individual. Generally, researchers and/or analysts viewing the transformed identifying data p′ and associated sensitive data s should be able to study the data s and make reasonable (though approximate) conclusions about the data s without being able to identify the sensitive information s of any particular individual. For example, researchers may wish to study diseases in a particular neighborhood.
Some embodiments encrypt the sensitive data s before it is written to a disk, such as the example disk 170. Further embodiments partition a space S into a plurality of regions S1, S2, . . . , St so that all of the points in a particular region Si are at least as close to a center Ci of the region Si as to any other center.
In the example embodiment of
Consider a data point (p, s) of the stream where p is the identifying part of the data, and s is the associated sensitive information. Then, when (p, s) enters the system 100, the nearest center Ci to p is computed during step 320, as well as the value π(i), where π is a randomly chosen permutation function on [1, t]. On the disk 170, t lists L(1), L(2), . . . , L(t) of encrypted data are created as follows. The sensitive data s is encrypted as e(s) and e(s) is added to the list L(π(i)) on the disk 170 during step 330. The encryption can be any semantically secure encryption (i.e., an adversary cannot tell whether two encryptions encrypt the same value). An example of a semantically secure encryption is Enc_k(x)=AES_k(x, r), where r is a randomly chosen value.
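The per-element processing of steps 320 and 330 might be sketched as follows. The `toy_encrypt` routine is only an illustrative stand-in for a real semantically secure scheme such as AES_k(x, r); all function and variable names here are hypothetical.

```python
import hashlib
import math
import secrets
from collections import defaultdict

def toy_encrypt(key: bytes, s: bytes) -> bytes:
    # Stand-in for semantically secure Enc_k(x) = AES_k(x, r): a fresh random
    # value r makes two encryptions of the same plaintext look unrelated.
    r = secrets.token_bytes(16)
    pad = hashlib.sha256(key + r).digest()
    return r + bytes(a ^ b for a, b in zip(s.ljust(32, b"\0"), pad))

def nearest_center(p, centers):
    # Step 320: find the index i of the center Ci nearest to p.
    return min(range(len(centers)), key=lambda j: math.dist(p, centers[j]))

def ingest(p, s, centers, perm, lists, key):
    """One stream element (p, s): cluster p, then store only e(s) in L(perm[i])."""
    i = nearest_center(p, centers)
    lists[perm[i]].append(toy_encrypt(key, s))   # step 330: p itself is not stored
```

Note that the identifying part p is used only to select the region and is then discarded, matching the claims above.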
Generally, the permutation function π randomizes the order in which the regions S1, S2, . . . , St are stored so that an adversary cannot obtain information by observing the data being stored in particular regions. In other words, the permutation function π maps the centers Ci to disk locations.
In this manner, an adversary cannot inject data into the stream, monitor the portion of the disk that gets updated and thereby learn which part of the disk contains data about that particular region. In addition, timing attacks are optionally mitigated by adding/deleting dummy data points to keep all regions growing/shrinking randomly so an adversary cannot determine where real data are going or being written from.
The permutation function may be embodied as any randomly chosen permutation function. For small sets of Si, the permutation can be generated and stored in RAM by enumerating the entire map π(1), π(2), etc. For larger sets, only the key needs to be stored and the mapping can be generated from the stored key. For a discussion of example techniques for generating a pseudorandom permutation on arbitrary size domain, see, for example, Ben Morris and Phillip Rogaway, “Sometimes-Recurse Shuffle: Almost-Random Permutations in Logarithmic Expected Time,” (unpublished manuscript, August 2013), see, for example, http://eprint.iacr.org/2013/560.pdf).
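For the small-t case described above, a fully enumerated permutation might be generated and held in RAM as follows. This is an illustrative sketch; a production system would use a keyed pseudorandom permutation such as the cited shuffle construction, regenerating π from a stored key.

```python
import random

def make_permutation(t, seed):
    # For a small number of regions t, enumerate the whole map
    # pi(0), pi(1), ..., pi(t-1) and keep it in RAM. The same seed
    # (playing the role of the stored key) reproduces the same map.
    rng = random.Random(seed)
    perm = list(range(t))
    rng.shuffle(perm)
    return perm
```

The table itself occupies O(t) memory, which is why only the key is stored for larger sets.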
A hash table implementation can employ standard hash table data structures, such as Cuckoo hashes. See, for example, Rasmus Pagh, et al., “Cuckoo Hashing,” Algorithms—ESA 2001, Lecture Notes in Computer Science 2161, pp. 121-133 (2001).
Finally, if at any time it is detected during step 340 that one of the lists, such as list L(j), satisfies the predefined anonymity criterion that is employed (e.g., the list has at least k elements), then the elements of the list are decrypted and the following values are output to the disk 170 during step 350 (or transmitted over a network): the center Cj of the list and the decrypted sensitive data values from the list.
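The flush of steps 340 and 350 can be sketched as follows. The toy cipher here is a decryptable stand-in for a real semantically secure scheme, and all names are hypothetical.

```python
import hashlib
import secrets

KEY = b"demo-key"  # hypothetical symmetric key held in trusted RAM

def toy_encrypt(s: bytes) -> bytes:
    # Stand-in for AES_k(x, r); plaintexts up to 32 bytes for this sketch.
    r = secrets.token_bytes(16)
    pad = hashlib.sha256(KEY + r).digest()
    return r + bytes(a ^ b for a, b in zip(s.ljust(32, b"\0"), pad))

def toy_decrypt(c: bytes) -> bytes:
    r, body = c[:16], c[16:]
    pad = hashlib.sha256(KEY + r).digest()
    return bytes(a ^ b for a, b in zip(body, pad)).rstrip(b"\0")

def maybe_flush(center, lst, k):
    """Step 340/350: if L(j) holds at least k entries, output (Cj, s1, ..., sn)."""
    if len(lst) < k:
        return None          # anonymity criterion not yet satisfied
    out = (center, [toy_decrypt(c) for c in lst])
    lst.clear()              # entries leave the encrypted buffer once output
    return out
```

Only the region center Cj and the decrypted sensitive values are emitted; no individual identifying portion ever reaches the output.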
In one example embodiment, the lists L(1), L(2), . . . , L(t) of encrypted data are optionally kept at an approximately equal size by adding dummy entries to one or more lists, deleting entries from one or more lists, as well as keeping some entries that have already been written out to disk. In this manner, any leaking of sensitive information using timing techniques is reduced.
The mapping of centers Ci to disk locations via a permutation is efficient if the number of points Ci is relatively small. In the case where there may be many such centers (e.g., more than disk storage locations), but not all of Ci are populated, a hash table can be employed as an implementation of the disk location mapping of Ci. In general, the hash table may reveal Ci from the disk address, so Ci is deterministically encrypted before using it as a hash function key for locations. Deterministic encryption is needed as probabilistic encryption would not allow consistently allocating the same disk location for a specific Ci.
Finally, using a hash table as a mechanism for disk location allows easy update/re-randomization of the disk location used for storing the Ci information. Indeed, simply encrypting Ci concatenated with a counter allows the allocation of a new and unlinkable location for Ci (the current counter per Ci needs to be stored in RAM).
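The counter-based re-randomization of a center's hash-table slot might be sketched as follows, with HMAC used as an illustrative stand-in for deterministic encryption of Ci (an assumption for this sketch; a real system would also resolve slot collisions, e.g., with Cuckoo hashing as cited above).

```python
import hashlib
import hmac

def slot_for_center(key: bytes, ci: str, counter: int, table_size: int) -> int:
    # Deterministic mapping: the same (key, Ci, counter) always yields the
    # same slot, so a specific Ci is consistently allocated one location.
    # Bumping the per-center counter (kept in RAM) moves Ci to a fresh,
    # unlinkable slot, as described above.
    tag = hmac.new(key, f"{ci}|{counter}".encode(), hashlib.sha256).digest()
    return int.from_bytes(tag[:8], "big") % table_size
```

Probabilistic encryption could not be used here, since it would map the same Ci to a different slot on every lookup.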
Consider a feed of data (p, s) where p contains the geographic endpoints of a communication (e.g., text message, phone call) and s is the type and duration of the communication. This stream of data comes into a computer (or computers) run by a service provider (SP). In many geographic areas, the SP is not legally allowed to record data points from this feed. That is, the SP cannot write (p, s) to disk. The SP wishes to perform some analytic computation on the data, for example, to optimize network configuration, to build better pricing plans or to prevent churn. The computer has limited memory (i.e., limited buffer size). In various embodiments the SP can write to disk “anonymized” data to be used later for analytics. For example, the SP may print out a value representing the approximate location of k (or more) actual communication endpoint pairs along with the accompanying sensitive data about those k (or more) communicating pairs. Such sets of points could periodically be written to disk as the limited memory fills up.
A unique feature of the example method is that it allows a user to specify an upper bound on the approximation of the identifying data. That is, a user can specify that whenever data such as (p1, s1), (p2, s2), . . . , (pk, sk) is written to disk in the form (Ci, s1, s2, . . . , sk), then the maximum distance between any pj and Ci is some distance d. Thus, among other benefits, the disclosed method allows the user to specify the degree of approximation that will be allowed.
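The user-specified bound d can be satisfied by a simple covering construction: on a two-dimensional region, grid centers spaced d·√2 apart leave no point farther than d from its nearest center. An illustrative sketch, with hypothetical names:

```python
import math

def grid_centers(xmax, ymax, d):
    """Centers covering [0, xmax] x [0, ymax] so every point is within d of one."""
    g = d * math.sqrt(2)   # cell side: farthest corner-to-center = g*sqrt(2)/2 = d
    nx, ny = math.ceil(xmax / g), math.ceil(ymax / g)
    return [((i + 0.5) * g, (j + 0.5) * g) for i in range(nx) for j in range(ny)]
```

Smaller d yields more centers t, trading storage and anonymization latency for approximation quality.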
Also, by carefully storing encrypted data on disk, the bounded size of the trusted buffer (i.e., of the RAM) is no longer a limitation.
System and Article of Manufacture Details
While
While various embodiments of the present inventions have been described with respect to processing steps in a software program, as would be apparent to one skilled in the art, various functions may be implemented in the digital domain as processing steps in a software program, in hardware by circuit elements or state machines, or in combination of both software and hardware. Such software may be employed in, for example, a digital signal processor, application specific integrated circuit, micro-controller, or general-purpose computer. Such hardware and software may be embodied within circuits implemented within an integrated circuit.
Thus, the functions of the present inventions can be embodied in the form of methods and apparatuses for practicing those methods. One or more aspects of the present inventions can be embodied in the form of program code, for example, whether stored in a storage medium, loaded into and/or executed by a machine, or transmitted over some transmission medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus configured according to one or more embodiments of the invention. When implemented on a general-purpose processor, the program code segments combine with the processor to provide a device that operates analogously to specific logic circuits. Embodiments can also be implemented in one or more of an integrated circuit, a digital signal processor, a microprocessor, and a micro-controller.
As is known in the art, the methods and apparatus discussed herein may be distributed as an article of manufacture that itself comprises a tangible computer readable recordable medium having computer readable code means embodied thereon. The computer readable program code means is operable, in conjunction with a computer system, to carry out all or some of the steps to perform the methods or create the apparatuses discussed herein. The computer readable medium may be a recordable medium (e.g., floppy disks, hard drives, compact disks, memory cards, semiconductor devices, chips, application specific integrated circuits (ASICs)) or may be a transmission medium (e.g., a network comprising fiber-optics, the world-wide web, cables, or a wireless channel using time-division multiple access, code-division multiple access, or other radio-frequency channel). Any medium known or developed that can store information suitable for use with a computer system may be used. The computer-readable code means is any mechanism for allowing a computer to read instructions and data, such as magnetic variations on a magnetic media or height variations on the surface of a compact disk.
The computer systems and servers described herein each contain a memory that will configure associated processors to implement the methods, steps, and functions disclosed herein. The memories could be distributed or local and the processors could be distributed or singular. The memories could be implemented as an electrical, magnetic or optical memory, or any combination of these or other types of storage devices. Moreover, the term “memory” should be construed broadly enough to encompass any information able to be read from or written to an address in the addressable space accessed by an associated processor. With this definition, information on a network is still within a memory because the associated processor can retrieve the information from the network. The present inventions may be embodied in other specific apparatus and/or methods. The described embodiments are to be considered in all respects as only illustrative and not restrictive. In particular, the scope of the invention is indicated by the appended claims rather than by the description and figures herein. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope.
Wilfong, Gordon T., Kolesnikov, Vladimir Y.