{"id":1723,"date":"2025-02-14T21:51:16","date_gmt":"2025-02-15T04:51:16","guid":{"rendered":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/?p=1723"},"modified":"2025-02-14T21:51:19","modified_gmt":"2025-02-15T04:51:19","slug":"privacy-enhancing-technologies-pets-a-deep-dive","status":"publish","type":"post","link":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/2025\/02\/14\/privacy-enhancing-technologies-pets-a-deep-dive\/","title":{"rendered":"Privacy-Enhancing Technologies (PETs): A Deep Dive"},"content":{"rendered":"\n<h1 class=\"wp-block-heading has-secondary-color has-text-color has-background has-link-color wp-elements-023fa366b26ce8d0d378b47bff102a42\" style=\"background:linear-gradient(37deg,rgb(255,245,203) 22%,rgb(182,227,212) 43%,rgb(51,167,181) 94%);font-size:45px\"><strong>Introduction<\/strong><\/h1>\n\n\n\n<p style=\"font-size:20px\">In today\u2019s data-driven world, protecting our personal information has never been more critical. Hardly a day passes without news of a privacy breach, unauthorised surveillance, or large-scale data theft. Privacy-Enhancing Technologies (PETs) are a collection of tools that keep personal data secure without preventing its legitimate use. <br><br>This post explains the core concepts behind Privacy-Enhancing Technologies (PETs) and analyses active threats to data privacy in contemporary settings. 
Whether you\u2019re a privacy-conscious individual, a developer, or someone just curious about the future of data security, you\u2019ll find something valuable here!<br>[1]<\/p>\n\n\n\n<figure class=\"wp-block-image size-large is-resized\"><img decoding=\"async\" width=\"1024\" height=\"680\" data-src=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-5-1024x680.png\" alt=\"\" class=\"wp-image-1730 lazyload\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/680;width:727px;height:auto\" title=\"\" data-srcset=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-5-1024x680.png 1024w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-5-300x199.png 300w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-5-768x510.png 768w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-5.png 1400w\" data-sizes=\"(max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" \/><figcaption class=\"wp-element-caption\"><strong>Deep Dive [2]<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"has-secondary-color has-text-color has-background has-link-color wp-elements-81f1a7cfb2e68dfc91142e8b1059ea51\" style=\"background:linear-gradient(135deg,rgb(255,245,203) 49%,rgb(182,227,212) 100%,rgb(51,167,181) 100%);font-size:29px\"><strong>What Are Privacy-Enhancing Technologies (PETs)?<\/strong><\/p>\n\n\n\n<p style=\"font-size:20px\">Privacy-Enhancing Technologies are tools and methodologies that protect personal data by minimising what is exposed and by processing it securely. Used together, PETs make data analysis possible while keeping identities and sensitive information safe from unauthorised access. 
<br>Let\u2019s break down some of the most important PETs currently transforming how we protect data.<br>[1] [3]<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"640\" data-src=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-6-1024x640.png\" alt=\"\" class=\"wp-image-1731 lazyload\" data-srcset=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-6-1024x640.png 1024w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-6-300x188.png 300w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-6-768x480.png 768w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-6-1536x960.png 1536w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-6-2048x1280.png 2048w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-6-1568x980.png 1568w\" data-sizes=\"(max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/640;\" \/><figcaption class=\"wp-element-caption\"><strong>Data Privacy [3]<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"has-secondary-color has-text-color has-background has-link-color wp-elements-3341b2b5eee50a8eb8e48b6a7733d5ed\" style=\"background-color:#f4f6c5;font-size:24px\"><strong>1. Homomorphic Encryption (HE)<\/strong><\/p>\n\n\n\n<p style=\"font-size:20px\">Homomorphic encryption allows computations to be executed directly on encrypted data, which stays encrypted throughout the entire process. 
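To see the idea in action, here is a toy sketch of the Paillier cryptosystem, a well-known additively homomorphic scheme, written in plain Python. The primes are deliberately tiny and insecure; this is an illustration of the principle, not a usable implementation:

```python
import math
import secrets

# Toy Paillier cryptosystem: additively homomorphic, i.e. multiplying two
# ciphertexts yields a ciphertext of the SUM of the plaintexts.
# Tiny primes chosen purely for illustration -- NOT secure.
p, q = 293, 433
n = p * q                        # public modulus
n2 = n * n
lam = math.lcm(p - 1, q - 1)     # private key (Carmichael's lambda)
mu = pow(lam, -1, n)             # precomputed inverse; valid because g = n + 1

def encrypt(m: int) -> int:
    r = secrets.randbelow(n - 1) + 1                 # fresh random blinding
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Homomorphic addition: E(20) * E(22) decrypts to 20 + 22
c_sum = (encrypt(20) * encrypt(22)) % n2
print(decrypt(c_sum))            # -> 42
```

Because the product of ciphertexts decrypts to the sum of plaintexts, a server could total encrypted values without ever seeing them, which is exactly the property the healthcare example above relies on (real deployments use fully homomorphic schemes with far larger parameters).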
<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li style=\"font-size:20px\"><strong>How It Works:<\/strong>\u00a0Mathematical operations are performed directly on ciphertext, and the result, when decrypted, matches the result of operations performed on plaintext.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Applications:<\/strong> Secure data analysis in cloud computing, privacy-preserving machine learning, and encrypted search engines.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Real-World Example:<\/strong> Healthcare providers can perform medical data analysis on encrypted patient records without ever accessing the underlying data.<br>[1][4][5]<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"702\" data-src=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-8-1024x702.png\" alt=\"\" class=\"wp-image-1736 lazyload\" data-srcset=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-8-1024x702.png 1024w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-8-300x206.png 300w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-8-768x527.png 768w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-8-1536x1053.png 1536w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-8-1568x1075.png 1568w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-8.png 1999w\" data-sizes=\"(max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/702;\" \/><figcaption class=\"wp-element-caption\"><strong>HE<\/strong> 
<strong>Outsourced Cloud Storage [4]<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"has-secondary-color has-text-color has-background has-link-color wp-elements-b079de4ea2a1beea0b26a7819ca0fbba\" style=\"background-color:#f4f6c5;font-size:24px\"><strong>2. Differential Privacy<\/strong><\/p>\n\n\n\n<p style=\"font-size:20px\">Differential privacy adds controlled noise to query results, preserving accurate aggregate insights without compromising the privacy of any individual participant. It lets researchers analyse extensive datasets effectively while avoiding privacy violations. <\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li style=\"font-size:20px\"><strong>How It Works:<\/strong>\u00a0A privacy parameter (\u03b5) controls the amount of noise added. Smaller \u03b5 values provide stronger privacy guarantees but reduce data accuracy.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Applications:<\/strong> Data analytics for health research, census data release, and market analysis.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Real-World Example:<\/strong> Apple uses differential privacy to collect aggregate usage statistics from iPhones without learning what any individual user does.<br>[1][2][6]<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"850\" height=\"633\" data-src=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-7.png\" alt=\"\" class=\"wp-image-1735 lazyload\" data-srcset=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-7.png 850w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-7-300x223.png 300w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-7-768x572.png 768w\" data-sizes=\"(max-width: 850px) 100vw, 850px\" 
src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 850px; --smush-placeholder-aspect-ratio: 850\/633;\" \/><figcaption class=\"wp-element-caption\"><strong>Differential Privacy Using The Laplace Distribution [6]<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"has-secondary-color has-text-color has-background has-link-color wp-elements-ca305081098567800080c76cf5a87e13\" style=\"background-color:#f4f6c5;font-size:24px\"><strong>3. Federated Learning<\/strong><\/p>\n\n\n\n<p style=\"font-size:20px\">Imagine a machine learning model that lets you train it without exposing your data. Federated learning makes this possible: it enables many parties to work together to train a shared model without any of them exposing their raw data.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li style=\"font-size:20px\"><strong>How It Works:<\/strong> Enables model training across multiple devices without sharing raw data. 
Local devices train the model and send only updates to a central server, ensuring privacy while improving model accuracy.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Real-World Example:<\/strong> Google uses federated learning on mobile devices, so your data is processed locally on the device instead of being sent to centralized servers, preserving privacy.<br>[1][7][16]<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"900\" height=\"600\" data-src=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-9.png\" alt=\"\" class=\"wp-image-1741 lazyload\" data-srcset=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-9.png 900w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-9-300x200.png 300w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-9-768x512.png 768w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-9-600x400.png 600w\" data-sizes=\"(max-width: 900px) 100vw, 900px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 900px; --smush-placeholder-aspect-ratio: 900\/600;\" \/><figcaption class=\"wp-element-caption\"><strong>Federated Learning Applications [7]<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"has-secondary-color has-text-color has-background has-link-color wp-elements-93efa1673ff87b2bbe0d228abd03a445\" style=\"background-color:#f4f6c5;font-size:24px\"><strong>4.\u00a0Secure Multi-Party Computation (SMPC)<\/strong><\/p>\n\n\n\n<p style=\"font-size:20px\">SMPC allows multiple parties to jointly compute a function over their inputs while keeping those inputs private.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li style=\"font-size:20px\"><strong>How 
It Works:<\/strong>\u00a0Each party encrypts their input, and computations are performed on the encrypted data. The final result is revealed without exposing individual inputs.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Applications:<\/strong>\u00a0Collaborative fraud detection, privacy-preserving auctions, and joint financial analysis.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Example:<\/strong>\u00a0Banks can use SMPC to detect money laundering patterns without sharing customer transaction data.<br>[1][8]<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"850\" height=\"612\" data-src=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-10.png\" alt=\"\" class=\"wp-image-1742 lazyload\" data-srcset=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-10.png 850w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-10-300x216.png 300w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-10-768x553.png 768w\" data-sizes=\"(max-width: 850px) 100vw, 850px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 850px; --smush-placeholder-aspect-ratio: 850\/612;\" \/><figcaption class=\"wp-element-caption\"><strong>System scheme with Secure Multiparty Computation [8]<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"has-secondary-color has-text-color has-background has-link-color wp-elements-affad1589963586ddfc5b414ff0a196c\" style=\"background-color:#f4f6c5;font-size:24px\"><strong>5. Zero-Knowledge Proofs (ZKPs)<\/strong><\/p>\n\n\n\n<p style=\"font-size:20px\">ZKPs allow one party to prove to another that a statement is true without revealing any additional information. 
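As a concrete sketch of the prover-verifier interaction, here is the classic Schnorr identification protocol: a toy interactive zero-knowledge proof of knowledge of a discrete logarithm, with tiny illustrative parameters (real systems such as Zcash's zk-SNARKs are far more elaborate):

```python
import secrets

# Toy Schnorr identification protocol: the prover demonstrates knowledge of a
# secret x with public key y = g^x mod p, without revealing x itself.
# Tiny parameters for illustration only -- NOT secure.
p, q, g = 2039, 1019, 4      # p = 2q + 1; g generates the subgroup of order q

x = secrets.randbelow(q)     # prover's secret
y = pow(g, x, p)             # public key, known to the verifier

# One round of the interactive proof
r = secrets.randbelow(q)     # prover's ephemeral secret nonce
t = pow(g, r, p)             # 1) prover sends commitment t
c = secrets.randbelow(q)     # 2) verifier replies with a random challenge c
s = (r + c * x) % q          # 3) prover answers with response s

# Verifier accepts iff g^s == t * y^c (mod p); the transcript leaks nothing about x
print(pow(g, s, p) == (t * pow(y, c, p)) % p)   # -> True
```

The response s looks uniformly random on its own, yet it can only be produced by someone who knows x; answering two different challenges for the same commitment would reveal x, which is where soundness comes from.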
This technology is a game-changer for ensuring privacy in digital transactions.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li style=\"font-size:20px\"><strong>How It Works:<\/strong>\u00a0The prover convinces the verifier of the truth of a statement using cryptographic protocols, without disclosing the underlying data.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Applications:<\/strong>\u00a0Blockchain transactions, identity verification, and secure authentication.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Example:<\/strong>\u00a0Zcash uses ZKPs to enable private cryptocurrency transactions.<br>[1][9][10]<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"633\" data-src=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-11-1024x633.png\" alt=\"\" class=\"wp-image-1743 lazyload\" data-srcset=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-11-1024x633.png 1024w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-11-300x186.png 300w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-11-768x475.png 768w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-11.png 1200w\" data-sizes=\"(max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/633;\" \/><figcaption class=\"wp-element-caption\"><strong>Zero Knowledge Proof [9]<\/strong><\/figcaption><\/figure>\n\n\n\n<p class=\"has-secondary-color has-text-color has-background has-link-color wp-elements-627c542fe4dde1c953e1aebc1c8dcba7\" style=\"background:linear-gradient(135deg,rgb(255,245,203) 49%,rgb(182,227,212) 100%,rgb(51,167,181) 
100%);font-size:29px\"><strong>Examples of Threats to PETs You Need to Know<\/strong><\/p>\n\n\n\n<p style=\"font-size:20px\"><strong>1. Model Extraction Attacks on Federated Learning<br><\/strong>Federated learning allows multiple parties to collaboratively train machine learning models without sharing raw data. However, a new\u00a0<strong>model extraction attack<\/strong>\u00a0emerged, where an attacker exploits the model updates shared during federated learning to reconstruct the training data.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li style=\"font-size:20px\"><strong>Attack Vector:<\/strong>\u00a0The attacker participates in the federated learning process and uses gradient inversion techniques to reverse-engineer the model updates, extracting sensitive training data.<\/li>\n<\/ul>\n\n\n\n<h4 class=\"wp-block-heading\" style=\"font-size:20px\">Mitigation:<\/h4>\n\n\n\n<ul class=\"wp-block-list\">\n<li style=\"font-size:20px\">Apply differential privacy to model updates to add noise and prevent inversion.<\/li>\n\n\n\n<li style=\"font-size:20px\">Use secure aggregation protocols to obscure individual updates.<br>[11]<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image size-full\"><img decoding=\"async\" width=\"685\" height=\"498\" data-src=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-12.png\" alt=\"\" class=\"wp-image-1747 lazyload\" data-srcset=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-12.png 685w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-12-300x218.png 300w\" data-sizes=\"(max-width: 685px) 100vw, 685px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 685px; --smush-placeholder-aspect-ratio: 685\/498;\" \/><figcaption class=\"wp-element-caption\"><strong>Multi-phases framework of 
trusted FL [11]<\/strong><\/figcaption><\/figure>\n\n\n\n<p style=\"font-size:20px\"><strong>2. Data Poisoning Attacks on Differential Privacy <br><\/strong>Differential privacy relies on adding noise to data to protect individual privacy. A\u00a0<strong>data poisoning attack<\/strong>\u00a0was discovered in which malicious actors inject carefully crafted data into the dataset to manipulate the noise distribution.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li style=\"font-size:20px\"><strong>Attack Vector:<\/strong>\u00a0The attacker injects outliers or biased samples that skew the statistics to which the differential-privacy mechanism adds its noise, weakening the mechanism\u2019s protective effect.<\/li>\n<\/ul>\n\n\n\n<h4 class=\"wp-block-heading\" style=\"font-size:20px\">Mitigation:<\/h4>\n\n\n\n<ul class=\"wp-block-list\">\n<li style=\"font-size:20px\">Use robust statistical methods to detect and filter out poisoned data.<\/li>\n\n\n\n<li style=\"font-size:20px\">Implement multi-layered privacy mechanisms to enhance resilience.<br>[12][13]<\/li>\n<\/ul>\n\n\n\n<figure class=\"wp-block-image size-large\"><img decoding=\"async\" width=\"1024\" height=\"474\" data-src=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-13-1024x474.png\" alt=\"\" class=\"wp-image-1751 lazyload\" data-srcset=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-13-1024x474.png 1024w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-13-300x139.png 300w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-13-768x356.png 768w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-13-1536x711.png 1536w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-13-2048x948.png 2048w, https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/image-13-1568x726.png 1568w\" 
data-sizes=\"(max-width: 1024px) 100vw, 1024px\" src=\"data:image\/svg+xml;base64,PHN2ZyB3aWR0aD0iMSIgaGVpZ2h0PSIxIiB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciPjwvc3ZnPg==\" style=\"--smush-placeholder-width: 1024px; --smush-placeholder-aspect-ratio: 1024\/474;\" \/><figcaption class=\"wp-element-caption\"><strong>Poisoning attack [14]<\/strong><\/figcaption><\/figure>\n\n\n\n<p style=\"font-size:20px\"><strong>3. Zero-Knowledge Proof Vulnerabilities in Blockchain<br><\/strong>A comprehensive security audit of the o1js library, a TypeScript framework for zk-SNARKs and zkApps, identified three critical vulnerabilities.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li style=\"font-size:20px\"><strong>Attack Vector:<\/strong>\u00a0The vulnerabilities stemmed from flaws in the implementation of zero-knowledge proofs within the o1js library.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Impact:<\/strong>\u00a0These issues posed significant risks to applications utilizing o1js, potentially allowing unauthorized access or manipulation of sensitive data.<\/li>\n<\/ul>\n\n\n\n<h4 class=\"wp-block-heading\" style=\"font-size:20px\">Mitigation:<\/h4>\n\n\n\n<p style=\"font-size:20px\">Developers are advised to promptly update to the latest versions of the o1js library and conduct thorough security audits of their ZKP implementations.<br>[9][15]<\/p>\n\n\n\n<p class=\"has-secondary-color has-text-color has-background has-link-color wp-elements-2b37a449078f4cdeef25ee701988b75b\" style=\"background:linear-gradient(135deg,rgb(255,245,203) 49%,rgb(182,227,212) 100%,rgb(51,167,181) 100%);font-size:29px\"><strong><strong><strong>Challenges and Emerging Threats to PETs<\/strong><\/strong><\/strong><\/p>\n\n\n\n<p style=\"font-size:20px\">While Privacy-Enhancing Technologies (PETs) offer promising solutions, several challenges remain:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li style=\"font-size:20px\"><strong>Evolving Cyber Threats:<\/strong> Attacks such as model extraction 
and data poisoning present continuous risks.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Implementation Complexity:<\/strong> Many PETs are complex to implement, requiring ongoing research to improve scalability and usability.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Privacy vs. Utility:<\/strong> Balancing privacy protection with the need for accurate and useful data can be difficult in some applications.<\/li>\n\n\n\n<li style=\"font-size:20px\"><strong>Vulnerabilities in Systems:<\/strong> Issues like zero-knowledge proof vulnerabilities, which can allow unauthorized access to data, need to be mitigated.<br>[1]<\/li>\n<\/ul>\n\n\n\n<p class=\"has-secondary-color has-text-color has-background has-link-color wp-elements-c5635776626a4a174b6dd24709f015e1\" style=\"background:linear-gradient(135deg,rgb(255,245,203) 49%,rgb(182,227,212) 100%,rgb(51,167,181) 100%);font-size:29px\"><strong>Conclusion<\/strong><\/p>\n\n\n\n<p style=\"font-size:20px\">Realising the full potential of Privacy-Enhancing Technologies (PETs) requires several efforts in parallel: research to enhance performance and scalability, work to simplify deployment, and public education to encourage PETs adoption.<\/p>\n\n\n\n<p style=\"font-size:20px\">Addressing conflicts between privacy and utility, and finding the right balance between anonymity and accountability, requires continuous innovation and ethical commitment. Emerging threats, such as AI-powered attacks and model extraction vulnerabilities, require organisations to sustain their innovation while staying vigilant. <\/p>\n\n\n\n<p style=\"font-size:20px\">At the end of the day, the future of privacy depends on how well we adapt to new tech and emerging risks. 
By staying informed, using privacy tools, and working with experts, we can make sure our data stays safe for years to come.<\/p>\n\n\n\n<p class=\"has-secondary-color has-text-color has-background has-link-color wp-elements-adcd6dd3844f3b1a5f075cedfed58c8a\" style=\"background-color:#e0cfcf7d;font-size:29px\"><strong><strong><strong>References<\/strong><\/strong><\/strong><\/p>\n\n\n\n<p style=\"font-size:18px\">[1] Information Commissioner&#8217;s Office. (2022, September). <em>Chapter 5: Privacy-enhancing technologies (PETs): Draft anonymisation, pseudonymisation and privacy enhancing technologies guidance<\/em>. <a href=\"https:\/\/ico.org.uk\/media\/about-the-ico\/consultations\/4021464\/chapter-5-anonymisation-pets.pdf\">https:\/\/ico.org.uk\/media\/about-the-ico\/consultations\/4021464\/chapter-5-anonymisation-pets.pdf<\/a><br>[2] Dulanga, C. (2021, May 10). <em>What are the 3 variants of differential privacy?<\/em> Medium. <a href=\"https:\/\/chameeradulanga.medium.com\/what-are-the-3-variants-of-differential-privacy-d65f780f43b8\">https:\/\/chameeradulanga.medium.com\/what-are-the-3-variants-of-differential-privacy-d65f780f43b8<\/a><br>[3] Le Callonnec, S. (2023, January 26). <em>Introduction to privacy enhancing technologies (PETs)<\/em>. Medium. https:\/\/developer.mastercard.com\/blog\/introduction-to-privacy-enhancing-technologies\/<br>[4] Ghosh, A. (2020, December 29). <em>What homomorphic encryption can do<\/em>. Customize Windows. <a href=\"https:\/\/thecustomizewindows.com\/2020\/12\/what-homomorphic-encryption-can-do\/\">https:\/\/thecustomizewindows.com\/2020\/12\/what-homomorphic-encryption-can-do\/<\/a><br>[5] Vengadapurvaja, A. M., Nisha, G., Aarthy, R., &amp; Sasikaladevi, N. (2017). An efficient homomorphic medical image encryption algorithm for cloud storage security. <em>Procedia Computer Science, 115<\/em>, 643-650. 
<a href=\"https:\/\/doi.org\/10.1016\/j.procs.2017.09.150\">https:\/\/doi.org\/10.1016\/j.procs.2017.09.150<\/a><br>[6] Achar, Sandesh. (2018). Data Privacy-Preservation: A Method of Machine Learning. ABC Journal of Advanced Research. 7. 123-129. 10.18034\/abcjar.v7i2.654.<br>[7] Cloud Hacks. (2024, March 1). <em>Federated learning: A paradigm shift in data privacy and model training<\/em>. Medium. <a href=\"https:\/\/medium.com\/@cloudhacks_\/federated-learning-a-paradigm-shift-in-data-privacy-and-model-training-a41519c5fd7e\">https:\/\/medium.com\/@cloudhacks_\/federated-learning-a-paradigm-shift-in-data-privacy-and-model-training-a41519c5fd7e<\/a><br>[8] Dodiya, K., Radadia, S., &amp; Parikh, D. (2024). Differential privacy techniques in machine learning for enhanced privacy preservation. <em>Journal of Emerging Technologies and Innovative Research, 11<\/em>, 148. <a href=\"https:\/\/doi.org\/10.0208\/jetir.2024456892\">https:\/\/doi.org\/10.0208\/jetir.2024456892<\/a><br>[9] Lavrenov, D. (2019, January 25). <em>Using zero-knowledge proof, a blockchain transaction can be verified while maintaining user anonymity<\/em>. Altoros. <a href=\"https:\/\/www.altoros.com\/blog\/zero-knowledge-proof-improving-privacy-for-a-blockchain\/\">https:\/\/www.altoros.com\/blog\/zero-knowledge-proof-improving-privacy-for-a-blockchain\/<\/a><br>[10] Nnam, D. (2022, August 20). <em>Beginner\u2019s guide to understanding zero-knowledge proofs<\/em>. Medium. <a href=\"https:\/\/medium.com\/@darlingtonnnam\/beginners-guide-to-understanding-zero-knowledge-proofs-cadc4e2c23a8\">https:\/\/medium.com\/@darlingtonnnam\/beginners-guide-to-understanding-zero-knowledge-proofs-cadc4e2c23a8<\/a><br>[11] Liu, P., Xu, X. &amp; Wang, W. Threats, attacks and defenses to federated learning: issues, taxonomy and perspectives.\u00a0<em>Cybersecurity<\/em>\u00a0<strong>5<\/strong>, 4 (2022). 
<a href=\"https:\/\/doi.org\/10.1186\/s42400-021-00105-6\">https:\/\/doi.org\/10.1186\/s42400-021-00105-6<\/a><br>[12] Lyons-Cunha, J. (2024, November 19). <em>What is data poisoning?<\/em> Built In. <a href=\"https:\/\/builtin.com\/artificial-intelligence\/data-poisoning#:~:text=Data%20poisoning%20occurs%20when%20bad,significantly%20compromise%20the%20model's%20integrity\">https:\/\/builtin.com\/artificial-intelligence\/data-poisoning#:~:text=Data%20poisoning%20occurs%20when%20bad,significantly%20compromise%20the%20model&#8217;s%20integrity<\/a><br>[13] Cao, X., Jia, J., &amp; Gong, N. Z. (2021). Data poisoning attacks to local differential privacy protocols. <em>USENIX Security Symposium<\/em>. <a href=\"https:\/\/www.usenix.org\/system\/files\/sec21fall-cao.pdf\">https:\/\/www.usenix.org\/system\/files\/sec21fall-cao.pdf<\/a><br>[14] Horigome, H., Kikuchi, H., Fujita, M., &amp; Yu, C.-M. (2024). Robust estimation method against poisoning attacks for key-value data with local differential privacy. <em>Applied Sciences, 14<\/em>(14), 6368. <a href=\"https:\/\/doi.org\/10.3390\/app14146368\">https:\/\/doi.org\/10.3390\/app14146368<\/a><br>[15] Veridise. (2024, February 14). <em>Highlights from the Veridise O1JS v1 audit: Three zero-knowledge security bugs explained<\/em>. Medium. <a href=\"https:\/\/medium.com\/veridise\/highlights-from-the-veridise-o1js-v1-audit-three-zero-knowledge-security-bugs-explained-2f5708f13681\">https:\/\/medium.com\/veridise\/highlights-from-the-veridise-o1js-v1-audit-three-zero-knowledge-security-bugs-explained-2f5708f13681<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Introduction In today\u2019s data-driven world, protecting our personal information has never been more critical. Hardly a day passes without news of a privacy breach, unauthorised surveillance, or large-scale data theft. 
Privacy-Enhancing Technologies (PETs) enable data protection through a collection of tools that maintain personal data security without impacting its valid use. The &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/2025\/02\/14\/privacy-enhancing-technologies-pets-a-deep-dive\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Privacy-Enhancing Technologies (PETs): A Deep Dive&#8221;<\/span><\/a><\/p>\n","protected":false},"author":691,"featured_media":1726,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"ngg_post_thumbnail":0,"footnotes":""},"categories":[1],"tags":[40],"class_list":["post-1723","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-uncategorized","tag-isec-611","entry"],"featured_image_src":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/v-600x400.png","featured_image_src_square":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-content\/uploads\/sites\/119\/2025\/02\/v-600x600.png","author_info":{"display_name":"Firas 
Shama","author_link":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/author\/firas-shama\/"},"_links":{"self":[{"href":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-json\/wp\/v2\/posts\/1723","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-json\/wp\/v2\/users\/691"}],"replies":[{"embeddable":true,"href":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-json\/wp\/v2\/comments?post=1723"}],"version-history":[{"count":17,"href":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-json\/wp\/v2\/posts\/1723\/revisions"}],"predecessor-version":[{"id":1758,"href":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-json\/wp\/v2\/posts\/1723\/revisions\/1758"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-json\/wp\/v2\/media\/1726"}],"wp:attachment":[{"href":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-json\/wp\/v2\/media?parent=1723"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-json\/wp\/v2\/categories?post=1723"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/wpsites.ucalgary.ca\/jacobson-cpsc\/wp-json\/wp\/v2\/tags?post=1723"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}